US20190011921A1 - Systems and methods for UAV interactive instructions and control - Google Patents
- Publication number
- US20190011921A1 (application US16/067,577 / US201516067577A)
- Authority
- US
- United States
- Prior art keywords
- target
- movable object
- uav
- mode
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0094—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/78—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
- G01S3/782—Systems for determining direction or deviation from predetermined direction
- G01S3/785—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
- G01S3/786—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
- G01S3/7864—T.V. type tracking systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/106—Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/12—Target-seeking control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/685—Vibration or motion blur correction performed by mechanical compensation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
- B64U2101/31—UAVs specially adapted for particular uses or applications for imaging, photography or videography for surveillance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
- B64U2201/104—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] using satellite radio beacon positioning systems, e.g. GPS
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/20—Remote controls
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/12—Bounding box
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
Definitions
- An aerial vehicle carrying a payload can be used to track objects, or controlled to move in a certain direction.
- Tracking and flight navigation methods may be based on global positioning system (GPS) data or camera vision.
- one or more operators may have to manually select a target object, and manually control the aerial vehicle/camera to move to the target object or follow the target object.
- the operators may also have to manually control the aerial vehicle such that it flies in a desired direction and/or avoid obstacles along the way.
- Presently known flight control systems generally require the operators to have some level of aviation experience or manual skill to operate the aerial vehicle, and offer limited real-time automatic control capability.
- the lack of an easy-to-use interactive control and guidance system may reduce the usefulness of aerial vehicles in certain applications.
- the burden of manually piloting the aerial vehicle on the user can be significantly reduced, thus allowing the user to more readily focus on payload or mission operation, such as visually monitoring and/or taking aerial imagery of a stationary target or a moving target.
- Improved flight control and tracking capabilities may allow a movable object to automatically detect one or more stationary/moving target objects and to autonomously track the target objects, without requiring manual input and/or operation by a user.
- the improved flight control and tracking capabilities may be particularly useful when the movable object is used to track a target object, move towards a target object, and/or move in a selected direction.
- the improved tracking capabilities can be incorporated into an aerial vehicle, such as an unmanned aerial vehicle (UAV).
- a target object may be tracked using an imaging device located on an aerial vehicle.
- Vision-based tracking methods can be manual or automatic.
- an image may be first captured using the imaging device, and an operator may manually select a target object to be tracked from the image.
- the manual selection may be performed using an input device, for example, a tablet, a mobile device, or a personal computer (PC).
- the aerial vehicle may be configured to automatically track the target object after it has been manually selected by the operator using the input device. In other instances, the operator may continue to manually control the aerial vehicle to track the target object even after it has been selected.
- automatic tracking may be implemented using tracking algorithms that can automatically detect a particular type of object, or an object carrying a marker.
- the type of object may be based on different object classes (e.g., people, buildings, landscape, etc.).
- the marker may include one or more optical markers comprising unique patterns.
- a target object may be defined based on predetermined features (e.g., color, structure, salient features, etc.) and/or by modeling (e.g., object class). After the target object has been defined, movement of the features and/or model may be detected and calculated in real-time as the target object moves.
- A high level of consistency in the features and/or model is typically required for precise tracking of the target object.
- The level of tracking precision may depend on the spatial relations between the features and/or on errors in the model.
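As an illustration of the feature-based tracking described above, the sketch below re-detects a target in each frame using a single predetermined feature (here, a reference colour) and reports its centroid, returning None when feature consistency is lost. The function names, the colour representation, and the tolerance are hypothetical and for illustration only; this is not the algorithm of the disclosure.

```python
def locate_target(frame, ref_color, tol=30):
    """Locate a target in an RGB frame (a list of rows of (r, g, b) pixels)
    by a predetermined colour feature: return the (row, col) centroid of all
    pixels within `tol` of `ref_color`, or None if the feature is absent
    (i.e., tracking is lost and the target must be re-acquired)."""
    hits = [(r, c)
            for r, row in enumerate(frame)
            for c, px in enumerate(row)
            if all(abs(p - q) <= tol for p, q in zip(px, ref_color))]
    if not hits:
        return None
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

def track(frames, ref_color):
    """Re-detect the colour feature in every frame of a sequence."""
    return [locate_target(f, ref_color) for f in frames]
```

Real trackers use richer features (corners, templates, learned models) precisely because a single colour cue breaks down under illumination change, which is one source of the consistency problem noted above.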
- Although vision-based tracking methods can be used to track an object, they may be inadequate when obstacles appear in the flight path of the aerial vehicle.
- the obstacles may be stationary or capable of movement.
- the obstacles may be a fast-moving group of objects, whereby the size and/or shape of the group may be amorphous and change over time as the objects move.
- Groups of objects may include, but are not limited to, groups of moving animals (e.g., a herd of horses running on the plains, or a flock of birds flying in different formations), groups of people (e.g., a large crowd moving in a parade), groups of vehicles (e.g., a squadron of airplanes performing aerial acrobatics), or groups comprising different types of objects moving in different formations (e.g., a mixed group of moving animals, people, and vehicles to be tracked).
- an imaging device and a target object may each be provided with GPS apparatus (e.g., a GPS receiver).
- a spatial relation between the imaging device and the target object may be calculated based on estimates of their real-time locations.
- the imaging device may be configured to track the target object based on their spatial relation.
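The spatial relation between two GPS fixes can be sketched with standard great-circle formulas. The function name `spatial_relation` and the spherical-Earth assumption are illustrative; they are not the implementation of the disclosure.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius (spherical model)

def spatial_relation(lat1, lon1, lat2, lon2):
    """Distance (m) and initial bearing (deg, clockwise from north) from the
    imaging device at (lat1, lon1) to the target at (lat2, lon2), with GPS
    fixes in decimal degrees (haversine formula on a spherical Earth)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    dist = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    y = math.sin(dlmb) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing
```

Given periodic fixes from both receivers, the vehicle could steer along the returned bearing and regulate the returned distance, which is exactly where the GPS-quality limitations discussed below bite.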
- this method may be limited by GPS signal quality and availability of GPS signals.
- global positioning system (GPS)-based tracking methods may not work indoors, or when GPS signal reception is blocked by buildings and/or natural terrain features such as valleys, mountains, etc.
- GPS-based tracking methods are predicated on the target carrying a GPS apparatus, and thus cannot be used when the target object(s) (e.g., a group of animals) do not carry one.
- GPS-based tracking methods are unable to account for obstacles in the path of the movable object.
- Accordingly, there exists a need for flight control systems that can operate under a variety of conditions and applications. The conditions may include both indoor and outdoor environments, places without GPS signals or with poor GPS signal reception, a variety of different terrain, stationary obstacles, dynamically appearing obstacles, etc.
- the applications may include tracking of a stationary object, a moving target object or a group of moving target objects, or moving in a selected direction.
- the target objects may include target objects that do not carry GPS apparatus, and target objects that do not have well-defined features or that do not fall into known object classes.
- The obstacles may collectively form a group whereby the size and/or shape of the group is amorphous and changes over time (such as a flock of birds), may comprise different obstacles moving in different formations (e.g., other aerial vehicles), or any combination of the above.
- Systems, methods, and devices are provided herein to address at least the above needs.
- a method for controlling a movable object may comprise: receiving an input indicative of a selected mode, wherein the selected mode is selected from a plurality of modes; and effecting movement of the movable object based on the selected mode.
- an apparatus for controlling a movable object may comprise one or more processors that are, individually or collectively, configured to: receive an input indicative of a selected mode, wherein the selected mode is selected from a plurality of modes; and effect movement of the movable object based on the selected mode.
- A non-transitory computer-readable medium may store instructions that, when executed, cause a computer to perform a method for controlling a movable object.
- the method may comprise: receiving an input indicative of a selected mode, wherein the selected mode is selected from a plurality of modes; and effecting movement of the movable object based on the selected mode.
- An unmanned aerial vehicle (UAV) system may be provided in accordance with another aspect of the invention. The system may comprise a flight control module for controlling the UAV.
- the flight control module may comprise one or more processors that are, individually or collectively, configured to: receive an input indicative of a selected mode, wherein the selected mode is selected from a plurality of modes; and effect movement of the UAV based on the selected mode.
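The "receive a selected mode, then effect movement based on it" behaviour above can be sketched as a mode-dispatch controller. The mode names, the three-mode set, and the velocity-command convention below are invented for illustration; the disclosure does not specify them.

```python
from enum import Enum, auto

class FlightMode(Enum):
    TARGET_MODE = auto()     # move toward / track a selected target
    DIRECTION_MODE = auto()  # move along a selected direction
    HOVER = auto()           # hold position

class FlightController:
    """Receives a mode-selection input and effects movement accordingly."""

    def __init__(self):
        self.mode = FlightMode.HOVER

    def receive_input(self, selected_mode):
        """Accept an input indicative of a mode selected from FlightMode."""
        if not isinstance(selected_mode, FlightMode):
            raise ValueError("unknown mode")
        self.mode = selected_mode

    def effect_movement(self, state):
        """Return a velocity command (vx, vy, vz) for the selected mode."""
        if self.mode is FlightMode.HOVER:
            return (0.0, 0.0, 0.0)
        if self.mode is FlightMode.DIRECTION_MODE:
            return state["direction"]
        # TARGET_MODE: head toward the target from the current position.
        return tuple(t - p for t, p in zip(state["target"], state["position"]))
```

The point of the dispatch structure is that the user's single mode-selection input, rather than continuous stick input, determines how every subsequent movement command is generated.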
- Further aspects of the invention may be directed to a method for controlling a movable object.
- the method may comprise: obtaining target information for the movable object while the movable object is at a first location, wherein said target information is indicative of a second location different from the first location; and generating a path for the movable object from the first location to the second location.
- an apparatus for controlling a movable object may comprise one or more processors that are, individually or collectively, configured to: obtain target information for the movable object while the movable object is at a first location, wherein said target information is indicative of a second location different from the first location; and generate a path for the movable object from the first location to the second location.
- A non-transitory computer-readable medium may store instructions that, when executed, cause a computer to perform a method for controlling a movable object.
- the method may comprise: obtaining target information for the movable object while the movable object is at a first location, wherein said target information is indicative of a second location different from the first location; and generating a path for the movable object from the first location to the second location.
- An unmanned aerial vehicle (UAV) system may be provided in accordance with an additional aspect of the invention.
- the system may comprise a flight control module for controlling the UAV.
- the flight control module may comprise one or more processors that are, individually or collectively, configured to: obtain target information for the UAV while the UAV is at a first location, wherein said target information is indicative of a second location different from the first location; and generate a path for the UAV from the first location to the second location.
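Path generation from a first location to a second location can be sketched, under the simplifying assumption of an obstacle-free straight segment, as waypoint interpolation. `generate_path` and its parameters are illustrative; the disclosure's planner would additionally account for obstacles.

```python
import math

def generate_path(first, second, step=1.0):
    """Waypoints along the straight segment from `first` to `second`
    (3D coordinate tuples), spaced at most `step` apart, with both
    endpoints included. A real planner would also deform the path
    around known obstacles rather than fly a straight line."""
    delta = [b - a for a, b in zip(first, second)]
    dist = math.sqrt(sum(d * d for d in delta))
    n = max(1, math.ceil(dist / step))  # number of equal segments
    return [tuple(a + d * i / n for a, d in zip(first, delta))
            for i in range(n + 1)]
```

For example, a 4 m eastward displacement sampled at 1 m yields five waypoints that a flight controller can consume one at a time.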
- Further aspects of the invention may be directed to a method for controlling a movable object.
- the method may comprise: acquiring, when the movable object is at a first location, a target from one or more images captured by an imaging device that is carried by the movable object; and controlling the movable object to track the acquired target.
- an apparatus for controlling a movable object may comprise one or more processors that are, individually or collectively, configured to: acquire, when the movable object is at a first location, a target from one or more images captured by an imaging device that is carried by the movable object; and control the movable object to track the acquired target.
- A non-transitory computer-readable medium may store instructions that, when executed, cause a computer to perform a method for controlling a movable object.
- the method may comprise: acquiring, when the movable object is at a first location, a target from one or more images captured by an imaging device that is carried by the movable object; and controlling the movable object to track the acquired target.
- An unmanned aerial vehicle (UAV) system may be provided in accordance with another aspect of the invention. The system may comprise a flight control module for controlling the UAV.
- the flight control module may comprise one or more processors that are, individually or collectively, configured to: acquire, when the UAV is at a first location, a target from one or more images captured by an imaging device that is carried by the UAV; and control the UAV to track the acquired target.
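Controlling the vehicle to track an acquired target is commonly done by regulating the target's pixel offset from the image centre with a proportional law. The bounding-box representation, gain, and sign conventions below are assumptions for illustration, not the disclosure's control law.

```python
def tracking_command(bbox, frame_w, frame_h, gain=0.005):
    """Proportional 'keep the acquired target centred' command.

    bbox: (x, y, w, h) of the target in pixels, origin at top-left.
    Returns (yaw_rate, pitch_rate); by assumption, positive yaw steers
    right and positive pitch steers the camera up.
    """
    cx = bbox[0] + bbox[2] / 2.0
    cy = bbox[1] + bbox[3] / 2.0
    err_x = cx - frame_w / 2.0   # pixels the target sits right of centre
    err_y = frame_h / 2.0 - cy   # pixels the target sits above centre
    return gain * err_x, gain * err_y
```

A target drifting to the right of the frame thus produces a positive yaw command until the error, and with it the command, returns to zero.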
- Further aspects of the invention may be directed to a method for controlling a movable object.
- the method may comprise: obtaining target information for the movable object while the movable object is at a first location, wherein said target information is indicative of a second location different from the first location; and directing the movable object to move from the first location along a target direction toward the second location.
- an apparatus for controlling a movable object may comprise one or more processors that are, individually or collectively, configured to: obtain target information for the movable object while the movable object is at a first location, wherein said target information is indicative of a second location different from the first location; and direct the movable object to move from the first location along a target direction toward the second location.
- A non-transitory computer-readable medium may store instructions that, when executed, cause a computer to perform a method for controlling a movable object.
- the method may comprise: obtaining target information for the movable object while the movable object is at a first location, wherein said target information is indicative of a second location different from the first location; and directing the movable object to move from the first location along a target direction toward the second location.
- An unmanned aerial vehicle (UAV) system may be provided in accordance with an additional aspect of the invention.
- the system may comprise a flight control module for controlling the UAV.
- The flight control module may comprise one or more processors that are, individually or collectively, configured to: obtain target information for the UAV while the UAV is at a first location, wherein said target information is indicative of a second location different from the first location; and direct the UAV to move from the first location along a target direction toward the second location.
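Directing the vehicle along a target direction reduces to normalising the displacement between the two locations and integrating motion along it. The function names and the constant-velocity kinematics are illustrative assumptions.

```python
import math

def target_direction(first, second):
    """Unit vector pointing from `first` toward `second` (3D tuples)."""
    delta = [b - a for a, b in zip(first, second)]
    norm = math.sqrt(sum(d * d for d in delta))
    if norm == 0:
        raise ValueError("locations coincide; direction is undefined")
    return tuple(d / norm for d in delta)

def advance(position, direction, speed, dt):
    """Advance `position` along unit `direction` at `speed` for `dt` s."""
    return tuple(p + u * speed * dt for p, u in zip(position, direction))
```

Because only the direction is fixed, the second location acts as a heading reference: the vehicle can keep moving along the direction even past the indicated point.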
- Further aspects of the invention may be directed to a method for controlling a movable object using a computer-implemented graphical display.
- the method may comprise: receiving an input when a user selects a point that is visually depicted within an image on the graphical display, wherein the movable object is positioned at a first location, and wherein the point corresponds to: (1) a second location different from the first location and/or (2) a direction from the first location; and processing the input to generate: (1) a path for the movable object to move from the first location toward the second location, and/or (2) the direction in which the movable object moves from the first location.
- an apparatus for controlling a movable object using a computer-implemented graphical display may comprise one or more processors that are, individually or collectively, configured to: receive an input when a user selects a point that is visually depicted within an image on the graphical display, wherein the movable object is positioned at a first location, and wherein the point corresponds to: (1) a second location different from the first location and/or (2) a direction from the first location; and process the input to generate: (1) a path for the movable object to move from the first location toward the second location, and/or (2) the direction in which the movable object moves from the first location.
- a non-transitory computer-readable medium storing instructions that, when executed, cause a computer to perform a method for controlling a movable object using a computer-implemented graphical display.
- the method may comprise: receiving an input when a user selects a point that is visually depicted within an image on the graphical display, wherein the movable object is positioned at a first location, and wherein the point corresponds to: (1) a second location different from the first location and/or (2) a direction from the first location; and processing the input to generate: (1) a path for the movable object to move from the first location toward the second location, and/or (2) the direction in which the movable object moves from the first location.
- the system may comprise a computer-implemented graphical display and a flight control module for controlling the UAV.
- the flight control module may comprise one or more processors that are, individually or collectively, configured to: receive an input when a user selects a point that is visually depicted within an image on the graphical display, wherein the UAV is positioned at a first location, and wherein the point corresponds to: (1) a second location different from the first location and/or (2) a direction from the first location; and process the input to generate: (1) a path for the movable object to move from the first location toward the second location, and/or (2) the direction in which the movable object moves from the first location.
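Processing a user-selected on-screen point into a direction of travel, as in the aspects above, can be sketched with a pinhole camera model. This is an illustrative sketch only; the function and parameter names (`tap_to_direction`, `fx`, `fy`, `cx`, `cy`) are assumptions and are not part of the disclosure.

```python
import math

def tap_to_direction(u, v, fx, fy, cx, cy):
    """Convert a tapped pixel (u, v) into a unit direction vector in the
    camera frame, assuming a pinhole model with focal lengths (fx, fy)
    and principal point (cx, cy). Names and values are illustrative."""
    x, y, z = (u - cx) / fx, (v - cy) / fy, 1.0
    n = math.sqrt(x * x + y * y + z * z)
    return (x / n, y / n, z / n)

# Tapping the principal point yields the camera's optical axis.
direction = tap_to_direction(640, 360, 1000.0, 1000.0, 640.0, 360.0)
```

The resulting vector would still need to be rotated from the camera frame into the world frame using the vehicle's attitude before it can serve as a target direction.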
- FIG. 1 shows an example of a system used in visual interactive navigation, in accordance with some embodiments
- FIG. 2 shows an example of communications that may occur within a visual interactive navigation system, in accordance with some embodiments
- FIG. 3 shows an example in which the position of the target may be determined using a plurality of imaging devices, in accordance with some embodiments
- FIG. 4 shows an exemplary method for generating a flight path using a 3D map and avoiding obstacles, in accordance with some embodiments
- FIG. 5 shows an example of an occupancy grid in accordance with some embodiments
- FIG. 6 illustrates flowcharts of different flight modes in which a UAV can operate, in accordance with some embodiments
- FIG. 7 shows an example of a user interface (UI) through which a user may select a target and cause the UAV to move towards the target, in accordance with some embodiments;
- FIG. 8 shows an example of a user interface (UI) through which a user may select a target by selecting different points and cause the UAV to move towards the target, in accordance with some embodiments;
- FIG. 9 shows an example of a user interface (UI) through which a user may select a target by drawing a shape around the target and cause the UAV to move towards the target, in accordance with some embodiments;
- FIG. 10 shows an example of a user interface (UI) comprising a first person view (FPV) photographic/video image and a 2D map through which a user may select a target and cause the UAV to move towards the target, in accordance with some embodiments;
- FIG. 11 shows an example of a user interface (UI) through which a user may select a target from among a plurality of objects and cause the UAV to move towards the target, in accordance with some embodiments;
- FIG. 12 shows an example of a user interface (UI) through which a user may select a new target and cause the UAV to move towards the new target, in accordance with some embodiments;
- FIGS. 13 and 14 show an example of a user interface (UI) through which a user may select a moving target and cause the UAV to track the moving target, in accordance with some embodiments;
- FIG. 15 illustrates UAV tracking of a moving target, in accordance with some embodiments
- FIG. 16 shows the avoidance of obstacles as the UAV is moving towards and/or tracking a target, in accordance with some embodiments
- FIG. 17 shows an example of a user interface (UI) through which a user may select a target direction, in accordance with some embodiments
- FIG. 18 shows an example of a user interface (UI) through which a user may adjust a target direction, in accordance with some embodiments
- FIG. 19 shows an example of a change in flight path of a UAV when a user adjusts a target direction, in accordance with some embodiments
- FIG. 20 shows an example of a UAV traveling in a target direction within an environment, in accordance with some embodiments
- FIG. 21 shows an example of a UAV traveling in a target direction within an environment, where the UAV and/or the imaging device has changed orientation relative to the environment, in accordance with some embodiments;
- FIG. 22 shows a geometry model of camera imaging whereby the geometry model is used for transforming from camera coordinates to world coordinates, in accordance with some embodiments
- FIG. 23 shows an example of selecting a target direction within an environment where an obstacle may be within the path of the UAV when traveling along the target direction, in accordance with some embodiments
- FIG. 24 shows an example of a flight path of a UAV when avoiding an obstacle, in accordance with some embodiments
- FIG. 25 illustrates an exemplary target tracking system in a movable object environment, in accordance with some embodiments
- FIG. 26 illustrates supporting target tracking in a movable object environment, in accordance with various embodiments
- FIG. 27 illustrates initializing target tracking in a movable object environment, in accordance with various embodiments
- FIG. 28 illustrates tracking a target in a movable object environment, in accordance with various embodiments
- FIG. 29 illustrates supporting target tracking and redetecting in a movable object environment, in accordance with various embodiments
- FIG. 30 illustrates using positioning devices for aiding target tracking in a movable object environment, in accordance with various embodiments
- FIG. 31 illustrates tracking a target based on distance measuring in a movable object environment, in accordance with various embodiments.
- FIG. 32 is a schematic block diagram of a system for controlling a movable object, in accordance with some embodiments.
- Systems, methods, and devices provided herein can be used to improve the ease of operation of movable objects such as unmanned aerial vehicles (UAVs).
- the flight control and tracking systems provided herein are intuitive and easy to use, and allow a human to manage and operate a UAV through interaction with a graphical human-system interface.
- the burden of manually piloting the UAV on the user can be significantly reduced, thus allowing the user to more readily focus on payload or mission operation, such as visually monitoring and/or taking aerial imagery of a stationary target or a moving target.
- the burden of manually piloting the UAV on the user may also be significantly reduced by controlling the aerial vehicle to automatically fly in any desired direction via the graphical human-system interface.
- the improved flight control and tracking capabilities may further allow a UAV to automatically detect one or more stationary/moving target objects and to autonomously track the target objects, without requiring manual input and/or operation by a user.
- the improved flight control and tracking capabilities may be particularly useful when the UAV is used to track a target object, move towards a target object, and/or move in a selected direction.
- the improved tracking capabilities can be incorporated into any type of aerial vehicle.
- FIG. 1 shows an example of a system used in visual navigation.
- the visual navigation system 100 may include a movable object 102 and a user terminal 106 capable of communicating with the movable object.
- the movable object may be configured to carry a payload 104 .
- the user terminal can be used to control one or more motion characteristics of the movable object and/or the payload.
- the user terminal can be used to control the movable object such that the movable object is able to navigate towards a target object 108 within an environment.
- the user terminal can also be used to control the movable object such that the movable object is able to track or follow the target object within the environment.
- the user terminal can be used to control the movable object such that the movable object is able to navigate in a specified direction 110 within the environment.
- the movable object 102 may be any object capable of traversing an environment.
- the movable object may be capable of traversing air, water, land, and/or space.
- the environment may include objects that are incapable of motion (stationary objects) and objects that are capable of motion.
- stationary objects may include geographic features, plants, landmarks, buildings, monolithic structures, or any fixed structures.
- objects that are capable of motion include people, vehicles, animals, projectiles, etc.
- the environment may be an inertial reference frame.
- the inertial reference frame may be used to describe time and space homogeneously, isotropically, and in a time-independent manner.
- the inertial reference frame may be established relative to the movable object, and move in accordance with the movable object. Measurements in the inertial reference frame can be converted to measurements in another reference frame (e.g., a global reference frame) by a transformation (e.g., Galilean transformation in Newtonian physics).
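The Galilean transformation mentioned above can be illustrated with a short sketch: positions pick up a `velocity × time` offset, while velocities pick up the frame velocity itself. The helper name and 3-tuple representation are illustrative assumptions.

```python
def galilean_to_global(x_local, v_local, frame_velocity, t):
    """Galilean transformation: map a position and velocity measured in
    a reference frame translating at constant `frame_velocity` into the
    global frame at time t. Vectors are plain 3-tuples (illustrative)."""
    x_global = tuple(x + V * t for x, V in zip(x_local, frame_velocity))
    v_global = tuple(v + V for v, V in zip(v_local, frame_velocity))
    return x_global, v_global

# A point fixed 1 m ahead in a frame moving at 2 m/s along x, after 3 s:
x_g, v_g = galilean_to_global((1.0, 0.0, 0.0), (0.0, 0.0, 0.0),
                              (2.0, 0.0, 0.0), 3.0)
```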
- the movable object 102 may be a vehicle.
- the vehicle may be a self-propelled vehicle.
- the vehicle may traverse an environment with aid of one or more propulsion units.
- the vehicle may be an aerial vehicle, a land-based vehicle, a water-based vehicle, or a space-based vehicle.
- the vehicle may be an unmanned vehicle.
- the vehicle may be capable of traversing an environment without a human passenger onboard. Alternatively, the vehicle may carry a human passenger.
- the movable object may be an unmanned aerial vehicle (UAV).
- any description herein of a UAV or any other type of movable object may apply to any other type of movable object or various categories of movable objects in general, or vice versa.
- any description herein of a UAV may apply to any unmanned land-bound, water-based, or space-based vehicle. Further examples of movable objects are provided in greater detail elsewhere herein.
- the movable object may be capable of traversing an environment.
- the movable object may be capable of flight within three dimensions.
- the movable object may be capable of spatial translation along one, two, or three axes.
- the one, two or three axes may be orthogonal to one another.
- the axes may be along a pitch, yaw, and/or roll axis.
- the movable object may be capable of rotation about one, two, or three axes.
- the one, two, or three axes may be orthogonal to one another.
- the axes may be a pitch, yaw, and/or roll axis.
- the movable object may be capable of movement along up to 6 degrees of freedom.
- the movable object may include one or more propulsion units that may aid the movable object in movement.
- the movable object may be a UAV with one, two or more propulsion units.
- the propulsion units may be configured to generate lift for the UAV.
- the propulsion units may include rotors.
- the movable object may be a multi-rotor UAV.
- the movable object may have any physical configuration.
- the movable object may have a central body with one or more arms or branches extending from the central body.
- the arms may extend laterally or radially from the central body.
- the arms may be movable relative to the central body or may be stationary relative to the central body.
- the arms may support one or more propulsion units.
- each arm may support one, two or more propulsion units.
- the movable object may have a housing.
- the housing may be formed from a single integral piece, two integral pieces, or multiple pieces.
- the housing may include a cavity within where one or more components are disposed.
- the components may be electrical components, such as a flight controller, one or more processors, one or more memory storage units, one or more sensors (e.g., one or more inertial sensors or any other type of sensor described elsewhere herein), one or more navigational units (e.g., a global positioning system (GPS) unit), one or more communication units, or any other type of component.
- the housing may have a single cavity or multiple cavities.
- a flight controller may be in communication with one or more propulsion units and/or may control operation of the one or more propulsion units.
- the flight controller may communicate and/or control operation of the one or more propulsion units with aid of one or more electronic speed control (ESC) modules.
- the flight controller may communicate with the ESC modules to control operation of the propulsion units.
- the movable object may support an on-board payload 104 .
- the payload may have a fixed position relative to the movable object, or may be movable relative to the movable object.
- the payload may spatially translate relative to the movable object. For instance, the payload may move along one, two or three axes relative to the movable object.
- the payload may rotate relative to the movable object. For instance, the payload may rotate about one, two or three axes relative to the movable object.
- the axes may be orthogonal to one another.
- the axes may be a pitch, yaw, and/or roll axis.
- the payload may be fixed or integrated into the movable object.
- the payload may be movable relative to the movable object with aid of a carrier.
- the carrier may include one or more gimbal stages that may permit movement of the carrier relative to the movable object.
- the carrier may include a first gimbal stage that may permit rotation of the carrier relative to the movable object about a first axis, a second gimbal stage that may permit rotation of the carrier relative to the movable object about a second axis, and/or a third gimbal stage that may permit rotation of the carrier relative to the movable object about a third axis. Any descriptions and/or characteristics of carriers as described elsewhere herein may apply.
- the payload may include a device capable of sensing the environment about the movable object, a device capable of emitting a signal into the environment, and/or a device capable of interacting with the environment.
- One or more sensors may be provided as a payload, and may be capable of sensing the environment.
- the one or more sensors may include an imaging device.
- An imaging device may be a physical imaging device.
- An imaging device can be configured to detect electromagnetic radiation (e.g., visible, infrared, and/or ultraviolet light) and generate image data based on the detected electromagnetic radiation.
- An imaging device may include a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor that generates electrical signals in response to wavelengths of light.
- the resultant electrical signals can be processed to produce image data.
- the image data generated by an imaging device can include one or more images, which may be static images (e.g., photographs), dynamic images (e.g., video), or suitable combinations thereof.
- the image data can be polychromatic (e.g., RGB, CMYK, HSV) or monochromatic (e.g., grayscale, black-and-white, sepia).
- the imaging device may include a lens configured to direct light onto an image sensor.
- the imaging device can be a camera.
- a camera can be a movie or video camera that captures dynamic image data (e.g., video).
- a camera can be a still camera that captures static images (e.g., photographs).
- a camera may capture both dynamic image data and static images.
- a camera may switch between capturing dynamic image data and static images.
- a camera can be used to generate 2D images of a 3D scene (e.g., an environment, one or more objects, etc.).
- the images generated by the camera can represent the projection of the 3D scene onto a 2D image plane. Accordingly, each point in the 2D image corresponds to a 3D spatial coordinate in the scene.
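The projection of a 3D scene point onto the 2D image plane described above can be sketched with a pinhole model; the function name and parameter values below are illustrative, not from the disclosure.

```python
def project_point(p_cam, fx, fy, cx, cy):
    """Project a 3D point (in camera coordinates, metres) onto the 2D
    image plane (pixels) with a pinhole model: u = fx*x/z + cx, etc."""
    x, y, z = p_cam
    if z <= 0:
        raise ValueError("point must lie in front of the camera")
    return (fx * x / z + cx, fy * y / z + cy)

# A point 5 m ahead, 0.5 m right and 0.25 m below the optical axis:
u, v = project_point((0.5, 0.25, 5.0), 1000.0, 1000.0, 640.0, 360.0)
```

Note that the projection discards depth: every point on the ray through `(u, v)` maps to the same pixel, which is why a single 2D image alone cannot recover 3D coordinates without additional views or sensors.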
- the camera may comprise optical elements (e.g., lens, mirrors, filters, etc).
- the camera may capture color images, greyscale images, infrared images, and the like.
- the camera may be a thermal imaging device when it is configured to capture infrared images.
- the payload may include multiple imaging devices, or an imaging device with multiple lenses and/or image sensors.
- the payload may be capable of taking multiple images substantially simultaneously.
- the multiple images may aid in the creation of a 3D scene, a 3D virtual environment, a 3D map, or a 3D model.
- a right image and a left image may be taken and used for stereo-mapping.
- a depth map may be calculated from a calibrated binocular image. Any number of images (e.g., 2 or more, 3 or more, 4 or more, 5 or more, 6 or more, 7 or more, 8 or more, 9 or more) may be taken simultaneously to aid in the creation of a 3D scene/virtual environment/model, and/or for depth mapping.
- the images may be directed in substantially the same direction or may be directed in slightly different directions.
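For the stereo case above, a depth-map entry can be computed from the disparity between the left and right images of a calibrated, rectified pair using the standard relation Z = f·B/d. The function and parameter names are illustrative.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth of a point seen by a rectified stereo pair: inversely
    proportional to disparity. focal_px is the focal length in pixels,
    baseline_m the distance between the two cameras in metres."""
    if disparity_px <= 0:
        return float("inf")  # no measurable parallax: effectively at infinity
    return focal_px * baseline_m / disparity_px

# 20 px of disparity with a 1000 px focal length and 10 cm baseline:
z = depth_from_disparity(20.0, 1000.0, 0.1)
```

Applying this per pixel over the disparity image yields the depth map mentioned above; farther objects produce smaller disparities and therefore noisier depth estimates.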
- data from other sensors (e.g., ultrasonic data, LIDAR data, data from any other sensors described elsewhere herein, or data from external devices) may also aid in the creation of the 3D scene, 3D virtual environment, 3D map, or 3D model.
- the imaging device may capture an image or a sequence of images at a specific image resolution.
- the image resolution may be defined by the number of pixels in an image.
- the image resolution may be greater than or equal to about 352×420 pixels, 480×320 pixels, 720×480 pixels, 1280×720 pixels, 1440×1080 pixels, 1920×1080 pixels, 2048×1080 pixels, 3840×2160 pixels, 4096×2160 pixels, 7680×4320 pixels, or 15360×8640 pixels.
- the camera may be a 4K camera or a camera with a higher resolution.
- the imaging device may capture a sequence of images at a specific capture rate.
- the sequence of images may be captured at standard video frame rates such as about 24p, 25p, 30p, 48p, 50p, 60p, 72p, 90p, 100p, 120p, 300p, 50i, or 60i.
- the sequence of images may be captured at a rate less than or equal to about one image every 0.0001 seconds, 0.0002 seconds, 0.0005 seconds, 0.001 seconds, 0.002 seconds, 0.005 seconds, 0.01 seconds, 0.02 seconds, 0.05 seconds, 0.1 seconds, 0.2 seconds, 0.5 seconds, 1 second, 2 seconds, 5 seconds, or 10 seconds.
- the capture rate may change depending on user input and/or external conditions (e.g. rain, snow, wind, unobvious surface texture of environment).
- the imaging device may have adjustable parameters. Under differing parameters, different images may be captured by the imaging device while subject to identical external conditions (e.g., location, lighting).
- the adjustable parameter may comprise exposure (e.g., exposure time, shutter speed, aperture, film speed), gain, gamma, area of interest, binning/subsampling, pixel clock, offset, triggering, ISO, etc.
- Parameters related to exposure may control the amount of light that reaches an image sensor in the imaging device.
- shutter speed may control the amount of time light reaches an image sensor and aperture may control the amount of light that reaches the image sensor in a given time.
- Parameters related to gain may control the amplification of a signal from the optical sensor.
- ISO may control the level of sensitivity of the camera to light.
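The interplay of aperture and shutter speed noted above is often summarized by the exposure value, EV = log₂(N²/t), where N is the f-number and t the exposure time in seconds. The sketch below assumes this standard EV definition (at base ISO); the names and example settings are illustrative, not from the disclosure.

```python
import math

def exposure_value(f_number, shutter_s):
    """Exposure value: EV = log2(N^2 / t). Opening the aperture (smaller
    f-number) or lengthening the shutter lowers EV, i.e. admits more
    light; two settings with equal EV admit equal total light."""
    return math.log2(f_number ** 2 / shutter_s)

ev_a = exposure_value(8.0, 1 / 125)   # f/8 at 1/125 s
ev_b = exposure_value(5.6, 1 / 250)   # one stop wider, one stop faster
```

The near-equality of `ev_a` and `ev_b` (they differ only because f/5.6 is a rounded stop) illustrates why the adjustable parameters listed above trade off against one another under identical external lighting.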
- an imaging device may extend beyond a physical imaging device.
- an imaging device may include any technique that is capable of capturing and/or generating images or video frames.
- the imaging device may refer to an algorithm that is capable of processing images obtained from another physical device.
- a payload may include one or more types of sensors.
- types of sensors may include location sensors (e.g., global positioning system (GPS) sensors, mobile device transmitters enabling location triangulation), vision sensors (e.g., imaging devices capable of detecting visible, infrared, or ultraviolet light, such as cameras), proximity or range sensors (e.g., ultrasonic sensors, lidar, time-of-flight or depth cameras), inertial sensors (e.g., accelerometers, gyroscopes, and/or gravity detection sensors, which may form inertial measurement units (IMUs)), altitude sensors, attitude sensors (e.g., compasses), pressure sensors (e.g., barometers), temperature sensors, humidity sensors, vibration sensors, audio sensors (e.g., microphones), and/or field sensors (e.g., magnetometers, electromagnetic sensors, radio sensors).
- the payload may include one or more devices capable of emitting a signal into an environment.
- the payload may include an emitter along an electromagnetic spectrum (e.g., visible light emitter, ultraviolet emitter, infrared emitter).
- the payload may include a laser or any other type of electromagnetic emitter.
- the payload may emit one or more vibrations, such as ultrasonic signals.
- the payload may emit audible sounds (e.g., from a speaker).
- the payload may emit wireless signals, such as radio signals or other types of signals.
- the payload may be capable of interacting with the environment.
- the payload may include a robotic arm.
- the payload may include an item for delivery, such as a liquid, gas, and/or solid component.
- the payload may include pesticides, water, fertilizer, fire-repellant materials, food, packages, or any other item.
- payloads may apply to devices that may be carried by the movable object or that may be part of the movable object.
- one or more sensors may be part of the movable object.
- the one or more sensors may or may not be provided in addition to the payload. This may apply for any type of payload, such as those described herein.
- the movable object may be capable of communicating with the user terminal 106 .
- the user terminal may communicate with the movable object itself, with a payload of the movable object, and/or with a carrier of the movable object, wherein the carrier is used to support the payload.
- Any description herein of communications with the movable object may also apply to communications with the payload of the movable object, the carrier of the movable object, and/or one or more individual components of the movable object (e.g., communication unit, navigation unit, propulsion units, power source, processors, memory storage units, and/or actuators).
- the communications between the movable object and the user terminal may be wireless communications.
- Direct communications may be provided between the movable object and the user terminal.
- the direct communications may occur without requiring any intermediary device or network.
- Indirect communications may be provided between the movable object and the user terminal.
- the indirect communications may occur with aid of one or more intermediary devices or networks.
- indirect communications may utilize a telecommunications network.
- Indirect communications may be performed with aid of one or more routers, communication towers, satellites, or any other intermediary devices or networks.
- Examples of types of communications may include, but are not limited to: communications via the Internet, Local Area Networks (LANs), Wide Area Networks (WANs), Bluetooth, Near Field Communication (NFC) technologies, networks based on mobile data protocols such as General Packet Radio Services (GPRS), GSM, Enhanced Data GSM Environment (EDGE), 3G, 4G, or Long Term Evolution (LTE) protocols, Infra-Red (IR) communication technologies, and/or Wi-Fi, and may be wireless, wired, or a combination thereof.
- the user terminal may be any type of external device.
- Examples of user terminals may include, but are not limited to, smartphones/cellphones, tablets, personal digital assistants (PDAs), laptop computers, desktop computers, media content players, video gaming station/system, virtual reality systems, augmented reality systems, wearable devices (e.g., watches, glasses, gloves, headgear (such as hats, helmets, virtual reality headsets, augmented reality headsets, head-mounted devices (HMD), headbands), pendants, armbands, leg bands, shoes, vests), gesture-recognition devices, microphones, any electronic device capable of providing or rendering image data, or any other type of device.
- the user terminal may be a handheld object.
- the user terminal may be portable.
- the user terminal may be carried by a human user. In some cases, the user terminal may be located remotely from a human user, and the user can control the user terminal using wireless and/or wired communications.
- Various examples and/or characteristics of user terminals are provided in greater detail elsewhere herein.
- the user terminals may include one or more processors that may be capable of executing non-transitory computer readable media that may provide instructions for one or more actions.
- the user terminals may include one or more memory storage devices comprising non-transitory computer readable media including code, logic, or instructions for performing the one or more actions.
- the user terminal may include software applications that allow the user terminal to communicate with and receive imaging data from a movable object.
- the user terminals may include a communication unit, which may permit the communications with the movable object.
- the communication unit may include a single communication module, or multiple communication modules.
- the user terminal may be capable of interacting with the movable object using a single communication link or multiple different types of communication links.
- the user terminal may include a display.
- the display may be a screen.
- the display may or may not be a touchscreen.
- the display may be a light-emitting diode (LED) screen, OLED screen, liquid crystal display (LCD) screen, plasma screen, or any other type of screen.
- the display may be configured to show a graphical user interface (GUI).
- the GUI may show an image that may permit a user to control actions of the UAV. For instance, the user may select a target from the image.
- the target may be a stationary target or a moving target.
- the user may select a direction of travel from the image.
- the user may select a portion of the image (e.g., point, region, and/or object) to define the target and/or direction.
- the user may select the target and/or direction by directly touching the screen (e.g., touchscreen).
- the user may touch a portion of the screen.
- the user may touch the portion of the screen by touching a point on the screen.
- the user may select a region on a screen from a pre-existing set of regions, or may draw a boundary for a region, a diameter of a region, or specify a portion of the screen in any other way.
- the user may select the target and/or direction by selecting the portion of the image with aid of a user interactive device (e.g., mouse, joystick, keyboard, trackball, touchpad, button, verbal commands, gesture-recognition, attitude sensor, thermal sensor, touch-capacitive sensors, or any other device).
- a touchscreen may be configured to detect location of the user's touch, length of touch, pressure of touch, and/or touch motion, whereby each of the aforementioned manners of touch may be indicative of a specific input command from the user.
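A minimal sketch of mapping such touch attributes to commands might look like the following. The thresholds and command names are illustrative assumptions; the disclosure only states that location, duration, pressure, and motion of a touch may each indicate a distinct input command.

```python
def interpret_touch(duration_s, path_px):
    """Map basic touch attributes to an input command. The 20 px and
    0.5 s thresholds are illustrative values chosen to absorb finger
    jitter, not parameters from the disclosure."""
    if path_px > 20:
        return "drag"   # e.g., draw a boundary or shape around a target
    if duration_s > 0.5:
        return "hold"   # e.g., confirm or lock a selection
    return "tap"        # e.g., select a target point or direction

command = interpret_touch(0.1, 5)  # a quick, nearly stationary touch
```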
- the image on the display may show a view collected with aid of a payload of the movable object.
- an image collected by the imaging device may be shown on the display.
- This may be considered a first person view (FPV).
- a single imaging device may be provided and a single FPV may be provided.
- multiple imaging devices having different fields of view may be provided.
- the views may be toggled between the multiple FPVs, or the multiple FPVs may be shown simultaneously.
- the multiple FPVs may correspond to (or be generated by) different imaging devices, which may have different fields of view.
- a user at a user terminal may select a portion of the image collected by the imaging device to specify a target and/or direction of motion by the movable object.
- the image on the display may show a map that may be generated with aid of information from a payload of the movable object.
- the map may optionally be generated with aid of multiple imaging devices (e.g., right camera, left camera, or more cameras), which may utilize stereo-mapping techniques.
- the map may be generated based on positional information about the UAV relative to the environment, the imaging device relative to the environment, and/or the UAV relative to the imaging device.
- Positional information may include posture information, spatial location information, angular velocity, linear velocity, angular acceleration, and/or linear acceleration.
- the map may be optionally generated with aid of one or more additional sensors, as described in greater detail elsewhere herein.
- the map may be a two-dimensional map or a three-dimensional map.
- the views may be toggled between a two-dimensional and a three-dimensional map view, or the two-dimensional and three-dimensional map views may be shown simultaneously.
- a user at a user terminal may select a portion of the map to specify a target and/or direction of motion by the movable object.
- the views may be toggled between one or more FPV and one or more map view, or the one or more FPV and one or more map view may be shown simultaneously.
- the user may make a selection of a target or direction using any of the views.
- the portion selected by the user may include the target and/or direction.
- the user may select the portion using any of the selection techniques as described.
- the image may be provided in a 3D virtual environment that is displayed on the user terminal (e.g., virtual reality system or augmented reality system).
- the 3D virtual environment may optionally correspond to a 3D map.
- the virtual environment may comprise a plurality of points or objects that can be manipulated by a user.
- the user can manipulate the points or objects through a variety of different actions in the virtual environment. Examples of those actions may include selecting one or more points or objects, drag-and-drop, translate, rotate, spin, push, pull, zoom-in, zoom-out, etc. Any type of movement action of the points or objects in a three-dimensional virtual space may be contemplated.
- a user at a user terminal can manipulate the points or objects in the virtual environment to control a flight path of the UAV and/or motion characteristic(s) of the UAV.
- the user terminal may optionally be used to control the movement of the movable object, such as the flight of a UAV.
- the user terminal may permit a user to manually directly control flight of the movable object.
- a separate device may be provided that may allow a user to manually directly control flight of the movable object.
- the separate device may or may not be in communication with the user terminal.
- the flight of the movable object may optionally be fully autonomous or semi-autonomous.
- the user terminal may optionally be used to control any component of the movable object (e.g., operation of the payload, operation of the carrier, one or more sensors, communications, navigation, landing stand, actuation of one or more components, power supply control, or any other function).
- a separate device may be used to control one or more components of the movable object.
- the separate device may or may not be in communication with the user terminal.
- One or more components may be controlled automatically with aid of one or more processors.
- a target object 108 may be selected by a user.
- the movable object 102 may travel toward the target object and/or visually track the target object.
- the target object may be a stationary target or a moving target.
- the user may specify whether the target is a stationary or moving target.
- a user may specify by selecting a mode of targeting (e.g., select a fly-to mode or a tracking mode).
- the user may provide any other type of indicator of whether the target is a stationary or moving target.
- alternatively, no indication may be provided, and a determination of whether the target is a stationary target or a moving target, along with selection of a mode of targeting (e.g., a fly-to mode or a tracking mode), may be made automatically with aid of one or more processors, optionally without requiring user input.
- a target object may be classified as a stationary target or a moving target depending on its state of motion. In some cases, a target object may be moving or stationary at any given point in time. When the target object is moving, the target object may be classified as a moving target. Conversely, when the same target object is stationary, the target object may be classified as a stationary target.
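The classification described above can be illustrated with a minimal sketch. The displacement threshold and the mapping from classification to mode are assumptions for illustration, not values from the specification:

```python
# Hypothetical sketch: classify a target as stationary or moving from its
# tracked positions over time, then select a targeting mode accordingly.
# The displacement threshold is an assumed tuning parameter.

def classify_target(positions, threshold=0.5):
    """Return 'moving' if total displacement between consecutive
    observed positions exceeds the threshold, else 'stationary'."""
    displacement = 0.0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        displacement += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return "moving" if displacement > threshold else "stationary"

def select_mode(positions):
    """Map the classification onto a mode of targeting:
    tracking mode for a moving target, fly-to mode otherwise."""
    return "tracking" if classify_target(positions) == "moving" else "fly-to"
```

A target that drifts only within sensor noise would thus be treated as stationary and assigned the fly-to mode, while a target with sustained displacement would be assigned the tracking mode.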
- a stationary target may remain substantially stationary within an environment.
- stationary targets may include, but are not limited to landscape features (e.g., trees, plants, mountains, hills, rivers, streams, creeks, valleys, boulders, rocks, etc.) or manmade features (e.g., structures, buildings, roads, bridges, poles, fences, unmoving vehicles, signs, lights, etc.).
- Stationary targets may include large targets or small targets.
- a user may select a stationary target.
- the stationary target may be recognized.
- the stationary target may be mapped.
- the movable object may travel to the stationary target.
- a path (e.g., flight path) may be planned for the movable object to travel to the stationary target.
- the movable object may travel to the stationary target without requiring a planned path.
- the stationary target may correspond to a selected portion of a structure or object.
- the stationary target may correspond to a particular section (e.g., top floor) of a skyscraper.
- a moving target may be capable of moving within the environment.
- the moving target may always be in motion, or may be in motion for portions of time.
- the moving target may move in a fairly steady direction or may change direction.
- the moving target may move in the air, on land, underground, on or in the water, and/or in space.
- the moving target may be a living moving target (e.g., human, animal) or a non-living moving target (e.g., moving vehicle, moving machinery, object blowing in wind or carried by water, object carried by living target).
- the moving target may include a single moving object or a group of moving objects.
- the moving target may include a single human or a group of moving humans.
- Moving targets may be large targets or small targets. A user may select a moving target.
- the moving target may be recognized.
- the moving target may be mapped.
- the movable object may travel to the moving target and/or visually track the moving object.
- a path (e.g., a flight path) may be planned for the movable object to travel to the moving target.
- the path may be changed or updated as the moving object moves.
- the movable object may travel to the moving object and/or visually track the moving object without requiring a planned path.
- a moving target may be any object configured to move within any suitable environment, such as in air (e.g., a fixed-wing aircraft, a rotary-wing aircraft, or an aircraft having neither fixed wings nor rotary wings), in water (e.g., a ship or a submarine), on ground (e.g., a motor vehicle, such as a car, truck, bus, van, motorcycle; a movable structure or frame such as a stick, fishing pole; or a train), under the ground (e.g., a subway), in space (e.g., a spaceplane, a satellite, or a probe), or any combination of these environments.
- a moving target may be capable of moving freely within the environment with respect to six degrees of freedom (e.g., three degrees of freedom in translation and three degrees of freedom in rotation).
- the movement of the moving target can be constrained with respect to one or more degrees of freedom, such as by a predetermined path, track, or orientation.
- the movement can be actuated by any suitable actuation mechanism, such as an engine or a motor.
- the actuation mechanism of the moving target can be powered by any suitable energy source, such as electrical energy, magnetic energy, solar energy, wind energy, gravitational energy, chemical energy, nuclear energy, or any suitable combination thereof.
- the moving target may be self-propelled via a propulsion system, such as described further below.
- the propulsion system may optionally run on an energy source, such as electrical energy, magnetic energy, solar energy, wind energy, gravitational energy, chemical energy, nuclear energy, or any suitable combination thereof.
- the moving target can be a vehicle, such as a remotely controlled vehicle.
- Suitable vehicles may include water vehicles, aerial vehicles, space vehicles, or ground vehicles.
- aerial vehicles may be fixed-wing aircraft (e.g., airplane, gliders), rotary-wing aircraft (e.g., helicopters, rotorcraft), aircraft having both fixed wings and rotary wings, or aircraft having neither (e.g., blimps, hot air balloons).
- a vehicle can be self-propelled, such as self-propelled through the air, on or in water, in space, or on or under the ground.
- a self-propelled vehicle can utilize a propulsion system, such as a propulsion system including one or more engines, motors, wheels, axles, magnets, rotors, propellers, blades, nozzles, or any suitable combination thereof.
- the propulsion system can be used to enable the movable object to take off from a surface, land on a surface, maintain its current position and/or orientation (e.g., hover), change orientation, and/or change position.
- a direction 110 may be selected by the user.
- the movable object 102 may travel in the direction selected by the user.
- the direction may be selected by a user selecting a portion of an image (e.g., in FPV or map view).
- the movable object may travel in the selected direction until a countermanding instruction is received or a countermanding condition is realized. For instance, the movable object may automatically travel in the selected direction until a new direction is input, or a new target is input.
- the movable object may travel in the selected direction until a different flight mode is selected. For instance, the user may take manual control over the flight of the movable object.
- Restrictions may be provided for the travel of the movable object.
- a condition may be detected in which a flight restriction may apply.
- obstacle avoidance may occur when undergoing target or direction tracking. Additional limitations such as flight ceilings, flight floors, limited range, or other types of flight restrictions may apply.
- FIG. 2 shows an example of communications that may occur within a visual navigation system.
- a user terminal 202 may be provided that may accept an input from a user.
- the user terminal may include an output device 204 .
- the user terminal may also communicate with a flight controller 206 , which may communicate with an image analyzer 208 .
- the image analyzer may communicate with an imaging device 210 .
- the imaging device may capture images which may include portions indicative of one or more target objects 212 and/or one or more target direction(s) 214 .
- a user terminal 202 may include an output device 204 of the user terminal.
- the output device may be a display, such as a screen.
- a user may interact with the user terminal via the output screen.
- when the output device is a touchscreen, a user may manipulate visual objects in a GUI on the touchscreen by selecting (touching) the visual objects through a variety of actions. Examples of those actions may include selecting one or more points or objects, drawing a shape, drag-and-drop, translating, rotating, spinning, pushing, pulling, zooming in, zooming out, etc. Any type of user action in the GUI may be contemplated.
- a user at a user terminal can manipulate the visual objects in the GUI to control flight path, flight direction, tracking function, and/or motion characteristic(s) of the UAV.
- the display may have any characteristics as described elsewhere herein.
- the display may be incorporated into the user device or may be provided separately from the rest of the user terminal. If provided separately from the rest of the user terminal, the display device may communicate with the user terminal. Two-way communications may optionally be provided between the output device and the rest of the user terminal.
- the user terminal may be configured to display, on the output device, one or more images through which a user may select a target and/or direction.
- the images may include FPVs and/or map views.
- the image may include a live-image or visual representation of a target and/or direction.
- a target object and/or direction may be identified by a user that may make a selection in the image. For example, a portion of the image selected by the user may become a target object. A portion of the image selected by the user may become a target direction.
- One or more imaging devices 210 may be provided.
- the one or more imaging devices may have substantially the same field of view or different fields of view.
- One or more imaging devices may be movable relative to the movable object while one or more imaging devices may be stationary relative to the movable object.
- one or more of the imaging devices may be supported by a carrier that may permit movement of the imaging device relative to the movable object.
- One or more of the imaging devices may be directly on the movable object, move in the same direction and speed as the movable object, and/or may not move relative to the movable object.
- One or more imaging devices may capture images of an environment.
- the environment may include one or more target objects 212 and/or target directions 214 .
- the target objects and/or directions may be defined or determined by the user who may make a selection within the image.
- the image data captured by the one or more imaging devices may correspond to, for example, still images or video frames of one or more objects.
- the objects may include any physical object or structure that can be optically identified and/or tracked in real-time by the movable object.
- Optical tracking has several advantages. For example, optical tracking allows for wireless ‘sensors’, is less susceptible to noise, and allows for many objects (e.g., different types of objects) to be tracked simultaneously.
- the objects can be depicted in still images and/or video frames in a 2D or 3D format, can be real-life and/or animated, can be in color, black/white, or grayscale, can be in any color space, or can be in a wireframe model.
- Images from the one or more imaging devices may optionally be received by an image analyzer 208 .
- the image analyzer may be on-board the imaging device, on-board a carrier, on-board a movable object, or an external device (e.g., user terminal, server, etc.).
- the image analyzer may be located remotely from the imaging device.
- the image analyzer may be disposed in a remote server that is in communication with the imaging device.
- the image analyzer may be provided at any other type of external device (e.g., a remote controller for a tracking device, an object carried by the target object, a reference location such as a base station, or another tracking device), or may be distributed on a cloud computing infrastructure.
- the image analyzer and the flight controller may be located on a same device. In other embodiments, the image analyzer and the flight controller may be located on different devices. The image analyzer and the flight controller may communicate either via wired or wireless connections. In some embodiments, the image analyzer may be located on a movable object. For example, the image analyzer may be disposed in a housing of the movable object. In some further embodiments, the image analyzer may be disposed at a base station that is in communication with the movable object. The image analyzer may be located anywhere, as long as the image analyzer is capable of: (i) receiving a plurality of image frames captured at different times using an imaging device, and (ii) analyzing the plurality of image frames.
- the image data captured by the imaging device may be stored in a media storage (not shown) before the image data is provided to the image analyzer.
- the image analyzer may be configured to receive the image data directly from the media storage.
- the image analyzer may be configured to receive image data concurrently from both the imaging device and the media storage.
- the media storage can be any type of storage medium capable of storing image data of a plurality of objects.
- the image data may include video or still images. The video or still images may be processed and analyzed by the image analyzer, as described later in the specification.
- the media storage can be provided as a CD, DVD, Blu-ray disc, hard disk, magnetic tape, flash memory card/drive, solid state drive, volatile or non-volatile memory, holographic data storage, and any other type of storage medium.
- the media storage can also be a computer capable of providing image data to the image analyzer.
- the media storage can be a web server, an enterprise server, or any other type of computer server.
- the media storage can be computer programmed to accept requests (e.g., HTTP, or other protocols that can initiate data transmission) from the image analyzer and to serve the image analyzer with requested image data.
- the media storage can be a broadcasting facility, such as free-to-air, cable, satellite, and other broadcasting facility, for distributing image data.
- the media storage may also be a server in a data network (e.g., a cloud computing network).
- the media storage may be located on-board the imaging device. In some other embodiments, the media storage may be located on-board the movable object but off-board the imaging device. In some further embodiments, the media storage may be located on one or more external devices off-board the movable object and/or the imaging device. In those further embodiments, the media storage may be located on a remote controller, a ground station, a server, etc. Any arrangement or combination of the above components may be contemplated. In some embodiments, the media storage may communicate with the imaging device and the tracking device via a peer-to-peer network architecture. In some embodiments, the media storage may be implemented using a cloud computing architecture.
- the image data may be provided (e.g., in the form of image signals) to the image analyzer for image processing/analysis.
- the image analyzer can be implemented as a software program executing in a processor and/or as hardware that analyzes the plurality of image frames to identify a target object and/or direction.
- the image analyzer may be configured to analyze the image frames to identify a target object, such as a stationary target or a moving target. This may include detecting the object based on an input from the user, such as a portion of the image that is selected. For instance, even if a single point is selected, an object corresponding to the point may be determined.
- the image analyzer may be configured to analyze the image frames to identify a target direction.
- the image analyzer may be configured to determine the relative positions between the movable object and the target object or direction. In some instances, the image analyzer may determine a position of the imaging device and/or movable object with respect to the environment (e.g., an inertial reference frame) and/or one another. The image analyzer may determine a position of the target object with respect to the environment (e.g., an inertial reference frame) and/or with respect to the movable object (which may include an imaging device supported by the movable object).
- data from one or more additional sensors and/or external devices may be used to aid in determination of positional information by the image analyzer (for example, IMU data or data from any other sensors as described elsewhere herein).
- positional information may include spatial location (e.g., in reference to one, two or three axes), attitude (e.g., relative to one, two or three axes), linear velocity, angular velocity, linear acceleration, and/or angular acceleration.
- the resulting analysis of the image frames may be provided (in the form of analyzed signals) to be displayed on an output device of a user terminal.
- a map may be generated indicative of the environment and/or positions of various objects and/or the movable object within the environment.
- the map may be a 2D or a 3D map.
- the map may be displayed on the output device.
- data from the image analyzer may be provided directly to a user terminal which may display it on its output device without requiring any intermediary analysis or processing.
- the data from the image analyzer may optionally be transmitted to be displayed on an output device of the user terminal without going through a flight controller.
- data from the image analyzer may be provided to a flight controller 206 .
- the flight controller may be provided on-board the movable object, on-board the carrier, on-board the imaging device, and/or on an external device or network.
- the flight controller may be provided using any exemplary devices or configurations provided elsewhere herein for other components, such as the image analyzer or memory storage.
- the flight controller may control flight of the movable object.
- the flight controller may generate one or more flight instructions to be provided to one or more propulsion units of the movable object.
- the flight controller may optionally generate a flight path for the movable object.
- the flight path may be substantially fixed, or may be variable or dynamic.
- the flight path may be toward a target object.
- a flight path may be toward a stationary object.
- the flight path may optionally be heading towards a moving object, but the heading and/or path may be altered as the object is moving. Alternatively, no flight path is generated for the moving object.
- the flight path may include a heading in a target direction.
- the flight path may remain heading in the target direction until a countermanding condition is detected (e.g., a further input is detected or a flight restriction applies).
- the flight controller may be in communication with one or more propulsion units of the movable object (not pictured).
- information from one or more sensors may be provided to the flight controller.
- information from one or more sets of IMUs may be provided to the flight controller.
- the one or more sets of IMUs may be on-board the movable object, on-board a carrier and/or on-board a payload.
- the data from the IMUs may be indicative of positional information of the movable object, the carrier, and/or the payload.
- the flight controller may optionally use the information from the one or more sensors in controlling flight of the UAV.
- the information from the one or more sensors may be used to control position of the imaging device relative to the UAV and/or its environment.
- the flight controller may receive information from the user terminal.
- the flight controller may receive information indicative of the user selection of a target and/or direction.
- the flight controller may generate a flight path and/or control flight of the UAV in response to the selection of the target and/or direction.
- Information from the flight controller may optionally be provided to the user terminal.
- a user terminal may receive information about a flight path.
- the flight path and/or heading may optionally be displayed on the output device.
- FIG. 2 While shown in FIG. 2 as separate components that are operatively connected, it is understood that the as-shown configuration is for illustrative purposes only. Certain components or devices may be removed or combined, and other components or devices may be added.
- a method for controlling a movable object may be implemented using the system of FIG. 2 .
- the method may include obtaining target information for the movable object while the movable object is at a first location.
- the target information may be indicative of a second location different from the first location.
- the method may further include generating a path for the movable object from the first location to the second location.
- the target information may be obtained using one or more imaging devices.
- the path for the movable object may be a flight path, and may be generated by the flight controller and/or the user terminal.
- the second location may be based on one or more selected points in at least one image captured by the movable object at the first location.
- the image may be captured using at least one imaging device located on the movable object.
- the imaging device may be a payload carried by the movable object.
- the image may be displayed on the output device of the user terminal.
- the one or more selected points may be associated with a target. When a user selects one or more points in the image on the display, at least a portion of the target that is displayed in the image may be selected. In some cases, selecting the one or more points may cause the entire target that is displayed in the image to be selected.
- the position of the target in the real world may be determined using a single imaging device, or a plurality of imaging devices.
- the position of the target may be determined using a triangulation method.
- the imaging device may be translated (by moving the movable object) in a lateral manner relative to the target, and perpendicular to a direction from the imaging device to the target.
- the imaging device may capture a plurality of images.
- the plurality of images may be provided to the image analyzer, which then calculates a distance from the target to the movable object based on: (1) a change in position of the target in the plurality of images, and (2) distances traveled by the movable object during the lateral translation.
- the distances covered during the lateral translation may be recorded by an IMU on the imaging device and/or the movable object.
- the distances covered during the lateral translation may be obtained from one or more global navigation satellite systems (GNSS).
- GNSS receivers on the imaging device and/or the movable object can determine estimated position, velocity, and precise time (PVT) by processing signals broadcast by the satellites.
- PVT information can be used to calculate the distances covered during the lateral translation.
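The triangulation method described above can be sketched as follows, assuming a pinhole camera model: as the movable object translates laterally by a known baseline (recorded by an IMU or obtained from GNSS), the target shifts in the image by a disparity measured in pixels, and depth follows from similar triangles. The parameter values below are hypothetical:

```python
# Illustrative triangulation from lateral translation (pinhole model).
# baseline_m: lateral distance traveled, perpendicular to the line of sight.
# focal_px: focal length expressed in pixels (an assumed intrinsic).
# disparity_px: shift of the target's position between the two images.

def distance_to_target(baseline_m, focal_px, disparity_px):
    """Depth from lateral translation: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("target must shift in the image for triangulation")
    return focal_px * baseline_m / disparity_px
```

For example, a 2 m lateral translation with a 1000-pixel focal length and a 40-pixel shift of the target would place the target at 50 m.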
- the IMU may be an electronic device that is configured to measure and report the UAV's velocity, orientation, and gravitational forces, using a combination of accelerometers and gyroscopes. Magnetometers may be optionally included.
- the IMU may detect current rate of acceleration using one or more accelerometers, and detect changes in rotational attributes like pitch, roll and yaw using one or more gyroscopes.
- a magnetometer may be included to assist calibration against orientation drift.
- the position of the target may be determined using a single imaging device that is a time-of-flight (TOF) camera.
- A TOF camera may be a range imaging camera system that can resolve distances based on the known speed of light, by measuring the time-of-flight of a light signal between the camera and the subject for each point of the image. In some cases, tracking accuracy may be improved using a TOF camera.
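The range computation underlying a TOF camera reduces to a round-trip time measurement, which can be sketched as:

```python
# Per-pixel range from a time-of-flight measurement: the light pulse
# travels to the subject and back, so distance = (c * round-trip time) / 2.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_s):
    """Distance to the subject from the measured round-trip time."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0
```

A round-trip time of 200 nanoseconds, for instance, corresponds to a subject roughly 30 m away.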
- the position of the target may be determined using a plurality of imaging devices.
- FIG. 3 shows an example in which the position of the target may be determined using a plurality of imaging devices.
- a first imaging device 304 and a second imaging device 306 may be provided.
- the first imaging device and the second imaging device may be disposed at different locations.
- the first imaging device may be a payload carried by a movable object 302
- the second imaging device may be located on or within the movable object.
- the first imaging device may be a camera and the second imaging device may be a binocular vision sensor.
- the first imaging device and the second imaging device may be part of a same binocular camera.
- a first IMU may be disposed on the payload, for example on the first imaging device itself, or on a carrier that couples the payload to the movable object.
- a second IMU may be located within a body of the movable object.
- the first imaging device and the second imaging device may have different optical axes.
- the first imaging device may have a first optical axis 305 and the second imaging device may have a second optical axis 307 .
- the first imaging device and the second imaging device may belong to different inertial reference frames that move independently of each other.
- the first imaging device and the second imaging device may belong to a same inertial reference frame.
- the first imaging device may be configured to capture an image 310 that is displayed on an output device of a user terminal.
- the second imaging device may be configured to capture a binocular image 314 comprising a left-eye image 314 - 1 and a right-eye image 314 - 2 .
- the first imaging device and the second imaging device may capture images of a target 308 .
- the position of the target in the captured images may be different since the first imaging device and the second imaging device are at different locations.
- the position 308 ′ of the target in the image 310 may be located on a bottom right corner of the image.
- the position 308 - 1 ′ of the target in the left-eye image 314 - 1 and the position 308 - 2 ′ of the target in the right-eye image 314 - 2 may be located in a left portion of the respective left and right eye images.
- the positions 308 - 1 ′ and 308 - 2 ′ in the left and right eye images may also be slightly different due to the binocular vision.
- a positional difference between the first imaging device and the second imaging device may be determined based on real-time positional information obtained from the first IMU and the second IMU.
- the real-time positional information from the first IMU may be indicative of the actual position of the first imaging device since the first IMU is mounted on the payload.
- the real-time positional information from the second IMU may be indicative of the actual position of the second imaging device since the second IMU is located at the second imaging device on the body of the movable object.
- the flight controller may adjust an attitude of the movable object and/or the payload based on the calculated positional difference.
- the image analyzer may be configured to correlate or map the images obtained by the second imaging device to the images obtained by the first imaging device, based on the calculated positional difference.
- the position of the target may be determined based on the correlation of the images between the first and second imaging devices, and the positional difference of the first and second imaging devices at different time instances.
- the actual position of the target need not be known.
- the tracking may be based primarily from the size and/or position of the target in the image.
- the movable object may be configured to move towards the target until a size of the target in the image reaches a predetermined threshold.
- the imaging device of the movable object may zoom in onto the target without the movable object moving, until a size of the target in the image reaches a predetermined threshold.
- the imaging device may zoom in and the movable object may move towards the target object simultaneously, until a size of the target in the image reaches a predetermined threshold.
- the actual position of the target may be known.
- the size of the target in the image may include a characteristic length of the target in the image.
- the characteristic length of the target in the image may be based on a most significant dimensional scale of the target.
- the most significant dimensional scale of the target may be represented by a length, width, height, thickness, arc, and/or circumference of a substantial portion of the target.
- the predetermined threshold may be defined based on a width of the image.
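The size-based approach above can be sketched with a simple model in which the target's apparent width is inversely proportional to range; the movable object advances until that width reaches a predetermined fraction of the image width. The apparent-size model, threshold, and step size are all illustrative assumptions:

```python
# Hypothetical sketch of size-threshold approach: advance toward the target
# until target_width_px / image_width_px >= threshold. Apparent width is
# modeled as focal_px * target_width_m / range_m (pinhole approximation).

def steps_until_threshold(target_width_m, range_m, focal_px,
                          image_width_px, threshold=0.3, step_m=1.0):
    """Count forward steps until the target fills the threshold
    fraction of the image width."""
    steps = 0
    while (focal_px * target_width_m / range_m) / image_width_px < threshold:
        range_m -= step_m          # movable object advances toward the target
        steps += 1
        if range_m <= step_m:      # stop short of reaching the target
            break
    return steps
```

The same loop could instead drive the imaging device's zoom, or combine zoom with forward motion, as the text describes.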
- the movable object may be configured to move towards the target and/or the imaging device may be actuated until the target in the image is displayed in a target region.
- the movable object may be configured to move along the path from the first location to the second location.
- the surrounding environment may include obstacles in the path between the movable object and the target. These obstacles may be stationary, capable of movement, or in motion.
- information about the external environment may be necessary for the movable object to avoid such obstacles by re-planning its path in real-time.
- information about the external environment may be provided in a 3D map based on one or more images captured by one or more imaging devices. The flight path for the movable object may be generated by using the 3D map.
- FIG. 4 shows an exemplary method for generating a flight path using a 3D map and avoiding obstacles in accordance with some embodiments.
- an image may be captured by an imaging device.
- the imaging device may be a binocular vision sensor on the movable object.
- the image may be a binocular image comprising a left-eye image and a right-eye image.
- the binocular image may be correlated/calibrated with the image captured by another camera (payload).
- a depth map may be generated using the binocular image, by stereo matching of the left-eye image and the right-eye image.
- the left-eye image and right-eye image may be matched to obtain a depth image in which the position of obstacles/objects in the environment can be detected.
- a depth map may be generated using multiple cameras disposed at a plurality of locations.
- the stereo matching may be performed using real-time block matching (BM) or semi-global block matching (SGBM) algorithms implemented using one or more processors.
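As a toy illustration of the block-matching idea, the sketch below computes a per-pixel disparity for one scanline by minimizing a sum-of-absolute-differences cost over a search range. It stands in for the real-time BM/SGBM algorithms named above; a production system would use an optimized library implementation (e.g., OpenCV's StereoBM/StereoSGBM):

```python
import numpy as np

# Toy 1-D block matching via sum of absolute differences (SAD).
# left_row / right_row: one rectified scanline from each eye image.
# block: odd window size; max_disp: maximum disparity searched.

def block_match_row(left_row, right_row, block=3, max_disp=8):
    """Per-pixel disparity for one scanline; larger disparity = closer."""
    w = len(left_row)
    half = block // 2
    disp = np.zeros(w, dtype=np.int32)
    for x in range(half, w - half):
        patch = left_row[x - half:x + half + 1]
        best_cost, best_d = None, 0
        for d in range(0, min(max_disp, x - half) + 1):
            cand = right_row[x - d - half:x - d + half + 1]
            cost = np.abs(patch - cand).sum()
            if best_cost is None or cost < best_cost:
                best_cost, best_d = cost, d
        disp[x] = best_d
    return disp
```

Disparity relates inversely to depth, which is how the depth map used for obstacle detection is obtained from the matched pair.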
- ultrasonic data from an ultrasonic sensor may be additionally used to detect the position/distance of an object having no obvious texture (e.g., a binocular vision sensor may not be capable of detecting the position of a white-colored wall, or a glass wall).
- a 3D map of the external environment may be generated by correlating the binocular image or any image to the depth map.
- the left-eye image and/or the right-eye image may be mapped to the depth map.
- the image captured by the other camera (payload) may be mapped to the depth map.
- the depth map may comprise a plurality of pixel points.
- a valid pixel point may correspond to an obstacle in the external environment.
- the relationship between pixel points and obstacles may be one-to-many or many-to-one.
- a valid pixel point may correspond to a plurality of obstacles.
- a plurality of valid pixel points may correspond to an obstacle.
- a group of valid pixel points may correspond to a group of obstacles.
- a valid pixel point has a value that is greater than 0.
- an invalid pixel point is a point that is unidentifiable from the mapped image.
- An invalid pixel point has a value that is equal to or less than 0.
- Objects that have no obvious texture or are transparent may show up as invalid pixel points in the image.
- ultrasonic data from ultrasonic imaging may be used to supplement the visual correlation to identify those invalid pixel points.
- the ultrasonic imaging may be performed, for example, using an ultrasonic sensor located on the movable object. Ultrasonic data from the ultrasonic sensor can be used to detect the position/distance of an object that has no obvious texture or that is transparent.
- 3D spatial points corresponding to the pixel points in the depth map may be generated.
- a 3D spatial point corresponding to a pixel point in the depth map may be given by:
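The expression itself is not reproduced in the text above. Under a standard pinhole camera model (an assumption here; the application's exact formula may differ), with focal lengths f_x and f_y, principal point (c_x, c_y), and depth d(u, v) at pixel (u, v), the back-projection is:

```latex
\begin{aligned}
Z &= d(u, v),\\
X &= \frac{(u - c_x)\, Z}{f_x},\\
Y &= \frac{(v - c_y)\, Z}{f_y}.
\end{aligned}
```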
- a plurality of 3D spatial points may be distributed into a plurality of cells of an occupancy grid.
- the position of the movable object may be located at the center of the occupancy grid. In some cases, the position of the movable object may be located at another portion (e.g., edge) of the occupancy grid.
- the occupancy grid may be used to define a 3D map of the spatial environment surrounding the movable object.
- the occupancy grid may have a plurality of cells.
- the occupancy grid may have a size of n x ×n y ×n z , where n x is the number of cells along an x-axis, n y is the number of cells along a y-axis, and n z is the number of cells along a z-axis.
- n x , n y , and n z may be any integer, and may be the same or different.
- n x and n y may be less than 80 or greater than 80.
- n z may be less than 40 or greater than 40.
- Each cell in the occupancy grid may have a size of m×m×m, where m may be any dimension.
- m may be less than or equal to 0.1 meters, 0.2 meters, 0.3 meters, 0.4 meters, 0.5 meters, or 1 meter.
- m may be greater than 1 meter, 1.1 meters, 1.2 meters, 1.3 meters, 1.4 meters, 1.5 meters, or 2 meters.
- Each cell may be denoted as an i-th cell.
- the number of 3D spatial points falling into the cell may be determined.
- a 3D map of the environment may be generated by determining, for each i-th cell, whether a number of 3D spatial points falling within the i-th cell is greater than a predetermined threshold value T.
- Each i-th cell may have a binary state C i .
- the predetermined threshold value T may be determined based on a sampling frequency of the captured images, and an accuracy of the 3D spatial point as obtained from the depth map.
- the predetermined threshold value T may increase when the sampling frequency increases and when the number of 3D spatial points falling within the cell increases.
- the predetermined threshold value T may decrease when the accuracy of the 3D spatial point increases.
- the predetermined threshold value T may have a range of values. For example, the predetermined threshold value may range from about 5 to about 30. In some cases, the predetermined threshold value may range from less than 5 to more than 30.
- ultrasonic data may be used to supplement the visual correlation to identify invalid pixel points.
- the state C i of all cells lying at the detected distance d s within the sonar range may be set to 1.
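The counting-and-thresholding steps above can be sketched as follows; the grid dimensions, cell size m, and threshold T shown are illustrative defaults, and a real system would update the grid incrementally as new images arrive:

```python
import numpy as np

def build_occupancy_grid(points, nx=80, ny=80, nz=40, m=0.2, threshold=10):
    """Distribute 3D spatial points (metres, in the movable object's frame)
    into an nx x ny x nz grid of m-sized cells centred on the object, then
    mark a cell occupied (binary state C_i = 1) when the number of points
    falling in it exceeds the threshold T."""
    counts = np.zeros((nx, ny, nz), dtype=np.int32)
    # shift coordinates so the movable object sits at the centre of the grid
    offset = np.array([nx, ny, nz]) / 2.0
    idx = np.floor(points / m + offset).astype(int)
    # keep only the points that fall inside the grid bounds
    ok = np.all((idx >= 0) & (idx < [nx, ny, nz]), axis=1)
    for i, j, k in idx[ok]:
        counts[i, j, k] += 1
    return (counts > threshold).astype(np.uint8)
```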
- FIG. 5 shows an example of an occupancy grid.
- the occupancy grid may include i number of cells, and the cells may be arranged in a 3D configuration.
- Each i-th cell may have a binary state C i (0 or 1).
- the occupancy grid may include regions with two different grayscale levels.
- a cell state of 0 may be represented by a grayscale value of 255
- a cell state of 1 may be represented by a grayscale value that is substantially less than 255, in order to distinguish the different cells.
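The grayscale rendering described above can be sketched as follows; the occupied grayscale value is an illustrative choice, picked only to be substantially less than 255:

```python
import numpy as np

def grid_to_grayscale(state, occupied_value=64):
    """Render the binary occupancy grid for display: free cells (state 0)
    map to grayscale 255 and occupied cells (state 1) to a clearly darker
    value, so the two cell states are visually distinguishable."""
    return np.where(state == 1, occupied_value, 255).astype(np.uint8)
```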
- the path may be generated based on the 3D map.
- the path may be generated based on the 3D map, a current position of the movable object, and a position of the target.
- the path may be generated by taking into account obstacles surrounding the movable object, or that lie between the movable object and the target, or that lie in the vicinity of the movable object and/or the target.
- the flight controller may be configured to generate a path passing through passable (open) space within the 3D map.
- the path may be generated using a Rapidly-Exploring Random Tree (RRT) algorithm.
- the RRT algorithm may include connecting a plurality of lines in X, Y, and Z directions (or other directions) between the current position of the movable object and the position of the target, and applying a polynomial smooth process to the plurality of lines to generate the path.
- the smoothing of the lines in each direction (X, Y, or Z) may be processed independently of the other directions.
- the RRT algorithm may further include discretizing the path into a plurality of control points.
- the movable object may be configured to move from one point to the next point along the path.
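A bare-bones version of the tree-growing step of RRT can be sketched as follows; the sampling region, step size, goal bias, and iteration cap are all illustrative, and a real planner would check collisions against the occupancy grid along the whole extension segment rather than only at the new node:

```python
import math
import random

def rrt_path(start, goal, is_free, step=1.0, goal_tol=1.0, max_iters=5000, seed=0):
    """Bare-bones RRT in 3D: repeatedly sample a point, extend the nearest
    tree node one step towards it, and keep the new node only if is_free
    reports the location as passable; stop once the tree reaches the goal."""
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(max_iters):
        # goal-biased sampling: occasionally steer straight at the goal
        sample = goal if rng.random() < 0.1 else tuple(rng.uniform(-10, 10) for _ in range(3))
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        d = math.dist(nodes[i], sample)
        if d == 0:
            continue
        new = tuple(n + min(step, d) * (s - n) / d for n, s in zip(nodes[i], sample))
        if not is_free(new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) <= goal_tol:
            # walk back through the parent links to recover the path
            path, j = [], len(nodes) - 1
            while j is not None:
                path.append(nodes[j])
                j = parent[j]
            return path[::-1]
    return None
```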
- a number of n-order polynomials may be solved to ensure that the location, velocity, and acceleration are continuous at the starting point and ending point of the path, using the following known values: (1) the location of the starting point, (2) the location of the ending point, (3) the velocity, and (4) the acceleration.
- T is the time that the movable object takes to travel from the starting point to the ending point, and is an adjustable parameter. A smaller T results in a sharper curve in the path, and a larger T results in a more gradual curve.
- the following 5th-order polynomial equation may be solved:
- x(t) = a_5 t^5 + a_4 t^4 + a_3 t^3 + a_2 t^2 + a_1 t + a_0
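Given the boundary conditions listed above (position, velocity, and acceleration at both endpoints), the six coefficients of the 5th-order polynomial follow from a small linear system. A sketch for a single axis, assuming the boundary values are known:

```python
import numpy as np

def quintic_coeffs(p0, v0, acc0, pT, vT, accT, T):
    """Solve for [a5, a4, a3, a2, a1, a0] of x(t) = a5*t^5 + ... + a0 so that
    position, velocity, and acceleration match the given values at t=0 and t=T."""
    M = np.array([
        [0, 0, 0, 0, 0, 1],                     # x(0)   = a0
        [0, 0, 0, 0, 1, 0],                     # x'(0)  = a1
        [0, 0, 0, 2, 0, 0],                     # x''(0) = 2*a2
        [T**5, T**4, T**3, T**2, T, 1],         # x(T)
        [5*T**4, 4*T**3, 3*T**2, 2*T, 1, 0],    # x'(T)
        [20*T**3, 12*T**2, 6*T, 2, 0, 0],       # x''(T)
    ], dtype=float)
    return np.linalg.solve(M, np.array([p0, v0, acc0, pT, vT, accT], dtype=float))
```

The X, Y, and Z axes can each be solved this way independently, consistent with the per-direction smoothing described above; shrinking T tightens the resulting curve.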
- FIG. 6 illustrates flowcharts of different flight modes in which a movable object (such as a UAV) can operate, in accordance with some embodiments.
- a method for controlling a UAV may be implemented using the system of FIG. 2 .
- the method may include receiving, at the user terminal, an input indicative of a selected mode.
- a user may select a mode from a plurality of modes.
- the plurality of modes may include at least a target mode (“target-pointing flight”) and a directional mode (“direction-pointing flight”).
- the flight controller may effect movement of the UAV based on the selected mode.
- the UAV may move towards and/or follow a selected target when the selected mode is the target mode.
- the UAV position and/or imaging device position may be adjusted when the selected mode is the target mode.
- the UAV may move in a selected direction when the selected mode is the directional mode.
- the target may be selected based on one or more selected points in at least one image that is captured by the movable object (for example, an imaging device carried by the movable object).
- the different modes may be selected one at a time.
- the different modes may be selectable simultaneously.
- the modes may be selected via a user input.
- the modes may be selected based on the external environment, the locations of the UAV and a target, and/or a direction.
- Part A of FIG. 6 shows a flowchart of a method for implementing target-pointing flight.
- a user may select one or more points on an image.
- the image may be provided in a GUI rendered on the output device of the user terminal.
- the selection may extend to a target associated with that point. In some cases, the selection may extend to a portion of the target.
- the point may be located on or proximate to the target in the image.
- the UAV may then fly towards and/or track the target. For example, the UAV may fly to a predetermined distance, position, and/or orientation relative to the target. In some instances, the UAV may track the target by following it at the predetermined distance, position, and/or orientation.
- the UAV may continue to move towards the target, track the target, or hover at the predetermined distance, position, and/or orientation to the target, until a new target instruction is received at the user terminal.
- a new target instruction may be received when the user selects another different one or more points on the image.
- the target selection may switch from the original target to a new target that is associated with the new one or more points.
- the UAV may then change its flight path and fly towards and/or track the new target.
- Part B of FIG. 6 shows a flowchart of a method for implementing direction-pointing flight.
- a user may select a point on an image.
- the image may be provided in a GUI rendered on the output device of the user terminal.
- the selection may extend to a target direction associated with that point.
- the UAV may then fly in the direction.
- the UAV may continue to move in the direction until a countermanding condition is detected. For instance, the UAV may fly in the target direction until a new target direction instruction is received at the user terminal.
- a new target direction instruction may be received when the user selects another different point on the image.
- the target direction selection may switch from the original direction to a new target direction that is associated with the new point.
- the UAV may then change its flight path and fly in the new target direction.
- the image analyzer may further determine whether the target is stationary or moving based on a plurality of captured images.
- the target mode may comprise a fly-to mode and a tracking mode.
- the UAV may be configured to fly towards the target when the UAV is in the fly-to mode.
- the UAV may be configured to track the target when the UAV is in the tracking mode.
- the UAV may maintain a predetermined distance to the target or maintain the target in its field of view, and may or may not fly towards the target.
- the motion state of the target may determine which mode of the target mode will be selected. For example, the fly-to mode may be selected when the target is determined to be stationary. The fly-to mode may be selected when a relatively direct path exists between the UAV and the target. When the relatively direct path exists, the UAV may be configured to move in a substantially straight line along the path. In some cases, the path may be established by a clear line of sight between the UAV and the target.
- the fly-to mode may also be selected when no obstacles are determined to be present as the UAV is moving towards the selected target. In some cases, the fly-to mode may be selected when fewer than a predetermined number and/or type of obstacles are determined to be present as the UAV is moving towards the selected target.
- the UAV may be configured to move towards the stationary target when the selected mode is the fly-to mode.
- the UAV may be configured to move to a predetermined distance from the stationary target.
- one or more motion characteristics of the UAV may be adjusted when the UAV is moving towards the stationary object.
- the one or more motion characteristics of the UAV may be adjusted when one or more obstacles appear in a path between the UAV and the stationary target.
- the motion state of the target may determine which mode of the target mode will be selected.
- the tracking mode may be selected when the target is determined to be moving.
- the tracking mode may be selected when a relatively complicated flight path exists between the UAV and the target.
- the complicated flight path may require the movable object to move in a zigzag manner, in different directions, and/or at different altitudes along the path. In such cases, a clear line of sight may often be absent between the movable object and the target.
- the tracking mode may be selected when at least one obstacle is determined to be present as the UAV is moving towards the selected target.
- the tracking mode may be selected when greater than a predetermined number and/or type of obstacles are determined to be present as the UAV is moving towards the selected target.
- the UAV may be configured to follow the moving target when the selected mode is the tracking mode.
- the UAV may be configured to follow the moving target at a predetermined distance.
- one or more motion characteristics of the UAV may be adjusted when the UAV is following the moving target.
- the one or more motion characteristics of the UAV may be adjusted when one or more obstacles appear in a path between the UAV and the moving target.
- the flight controller and/or the user terminal may automatically toggle/switch between the fly-to mode or the tracking mode based on whether the target is determined to be stationary or moving. In some cases, automatically toggling/switching between the fly-to mode or the tracking mode may take place depending on a number and/or type of obstacles that are present as the UAV is moving towards the selected target. For example, the selected mode may be switched to the fly-to mode when no obstacles are determined to be present as the UAV is moving towards the selected target. The selected mode may be switched to the fly-to mode when fewer than a predetermined number and/or type of obstacles are determined to be present as the UAV is moving towards the selected target.
- the selected mode may be switched to the tracking mode when at least one obstacle is determined to be present as the UAV is moving towards the selected target. In other cases, the selected mode may be switched to the tracking mode when greater than a predetermined number and/or type of obstacles are determined to be present as the UAV is moving towards the selected target.
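The switching heuristics described above can be condensed into a small decision rule; the obstacle threshold is an illustrative parameter, and a real flight controller would also weigh obstacle types and path complexity:

```python
def select_target_submode(target_is_moving, obstacle_count,
                          max_obstacles_for_flyto=0):
    """Pick between the fly-to and tracking sub-modes of the target mode:
    fly-to for a stationary target on a clear or simple path, tracking for
    a moving target or a cluttered path."""
    if target_is_moving:
        return "tracking"
    if obstacle_count > max_obstacles_for_flyto:
        return "tracking"
    return "fly-to"
```

Re-evaluating this rule as new images arrive gives the automatic toggling between the two sub-modes described above.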
- the target mode may encompass simultaneous operation of both the fly-to mode and the tracking mode.
- the UAV may be configured to fly to the target while simultaneously tracking the target.
- the UAV may also track the target regardless of the motion state of the target (e.g., regardless of whether the target is stationary or moving).
- the UAV may be configured to move in a selected direction when the selected mode is the directional mode.
- the direction may be selected based on one or more selected points in at least one image that is captured by the movable object (for example, an imaging device carried by the movable object).
- one or more motion characteristics of the UAV may be adjusted when the UAV is moving in the selected direction.
- the one or more motion characteristics of the UAV may be adjusted when one or more obstacles appear in the selected direction that the UAV is moving.
- the UAV may be configured to switch course from one direction to another direction depending on which point in the image is being selected.
- the UAV may be configured to move in a first direction when a first point is selected, and to move in a second direction when a second point is selected.
- the selection of the second point may replace the selection of the first point.
- the first point and the second point are located at different portions of the image.
- the second direction may be different from the first direction.
- An attitude and/or orientation of the UAV may be changed when the UAV is switching course from one direction to another direction.
- the flight controller may be configured to generate a transition path that allows the UAV to switch course from one direction to another direction in a curvilinear manner. Switching course in the curvilinear manner may provide certain benefits, such as a reduction in power consumption of the UAV and/or an improvement in flight stability of the UAV.
- the UAV may be configured to move in a path when the selected mode is either in the target mode or the directional mode.
- the flight controller may generate a detour from the path when one or more obstacles are detected in the path.
- the one or more obstacles may be stationary, capable of movement, or in motion.
- the UAV may be configured to automatically avoid the one or more obstacles by moving along the detour.
- the detour may exit the path at a first point and rejoin the path at a second point.
- the original flight path may be substantially replaced by the detour.
- the detour may be shaped around, above, and/or underneath the one or more obstacles.
- the detour may be in a lateral and/or vertical direction, or any direction in 3-dimensional space.
- the detour may be a straight line, a curve, curvilinear path, or any combination thereof.
- an orientation of the UAV and/or an imaging device located thereon may be changed during the detour, such that the target remains in a field-of-view of the imaging device.
- FIG. 7 shows an example of a user interface (UI) through which a user may select a target and cause the UAV to move towards the target.
- Part A shows an initial display of an environment comprising the target.
- Part B shows a user selecting the target within the initial display.
- Part C shows a box indicating the selected target.
- Part D shows an image of the target after the UAV has moved towards the target and is at a distance from the target.
- Part A shows an initial display of an environment comprising the target.
- a FPV may be provided as illustrated.
- the FPV may include a live streaming image from an imaging device.
- the imaging device may be a payload of the UAV.
- the imaging device may be mounted on a body of the UAV. In some instances, the imaging device may be located remotely from the UAV at a different location. In some instances, the imaging device may be located on another UAV.
- the FPV may alternatively be a graphical depiction or representation of the image from the imaging device.
- the target lies within the field of view of the imaging device. In some cases, the target may be a stand-alone object. In other cases, the target may be surrounded by or proximate to one or more other objects.
- the target may be stationary and/or capable of movement.
- the UAV may be stationary or moving while the initial display of the environment is occurring.
- a map view may be provided.
- the map view may include a 2D map, such as an overhead map.
- the map view may include a 3D map.
- the 3D map may be alterable to view the 3D environment from various angles. Solid renderings, wireframes, or other types of imaging may be shown, as described previously herein.
- the display may be shown on a user terminal.
- the user may optionally hold the user terminal.
- the user may interact with the display by selecting different points or objects in the FPV.
- Part B shows a user selecting a target within the initial display.
- the user may select a portion of the image to select the target.
- the image may include a FPV and/or a map.
- the map may be a 2D map or a 3D map.
- the image may optionally include a plurality of FPVs.
- the user may select a portion of the FPV or the map to select the target.
- the portion of the image selected by the user may optionally be a point.
- the point may be located on the target as shown on the display. In some embodiments, the point may be located proximate to the target as shown on the display.
- the target may be automatically selected when the user selects the point.
- the target may be marked using one or more types of marking schemes (e.g., shading, coloring, highlighted, etc.) to indicate that the target has been selected.
- a pop-up window may appear at the target on the display requesting confirmation from the user whether the user wishes to select the target.
- a plurality of bounding boxes may be generated in the vicinity of the selected point. Each bounding box may be associated with a different target. A user may be presented with the option to select a target by selecting the respective bounding box. In some instances, a user may select more than one target. In those instances, the UAV may be configured to fly first to a nearer target and then to a target that is further away.
- the target may be identified from a 2D image or a 3D map.
- Identification of a target from a 3D map may be based on spatial information of objects/features obtained from, for example the 3D map described in FIG. 4 and/or an occupancy grid similar to that shown in FIG. 5 .
- Identification of a target by selecting a point on the image may be performed using a category-independent segmentation algorithm. For example, when a user selects a point on or in the vicinity of the target on the image, the target may be segmented from adjacent or surrounding objects. The segmentation may proceed without knowing which object category the target may fall into. In some cases, the segmentation algorithm may generate a plurality of seed regions in the image(s) and rank each region, such that top-ranked regions are likely to be good segmentations of different objects (i.e., correspond to different objects).
- the target may be selected based on moving target detection.
- the UAV and the surrounding environment are assumed to be static/stationary, and the target to be tracked may be the only moving object in the image.
- the target can be identified and selected through background subtraction.
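Under the static-camera assumption, background subtraction can be sketched with a per-pixel median background model; the frame window and threshold below are illustrative:

```python
import numpy as np

def moving_target_mask(frames, thresh=25):
    """Simple background subtraction assuming a static camera and scene:
    estimate the background as the per-pixel median over a window of frames,
    then flag pixels in the latest frame that deviate from it as belonging
    to the (only) moving object."""
    stack = np.stack(frames).astype(np.float32)
    background = np.median(stack, axis=0)
    return np.abs(stack[-1] - background) > thresh
```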
- a feature point can be a portion of an image (e.g., an edge, corner, interest point, blob, ridge, etc.) that is uniquely distinguishable from the remaining portions of the image and/or other feature points in the image.
- a feature point may be relatively invariant to transformations of the imaged object (e.g., translation, rotation, scaling) and/or changes in the characteristics of the image (e.g., brightness, exposure).
- a feature point may be detected in portions of an image that are rich in terms of informational content (e.g., significant 2D texture).
- a feature point may be detected in portions of an image that are stable under perturbations (e.g., when varying illumination and brightness of an image).
- Feature points can be detected using various algorithms (e.g., texture detection algorithm) which may extract one or more feature points from image data.
- the algorithms may additionally make various calculations regarding the feature points. For example, the algorithms may calculate a total number of feature points, or “feature point number.”
- the algorithms may also calculate a distribution of feature points. For example, the feature points may be widely distributed within an image (e.g., image data) or a subsection of the image. For example, the feature points may be narrowly distributed within an image (e.g., image data) or a subsection of the image.
- the algorithms may also calculate a quality of the feature points. In some instances, the quality of feature points may be determined or evaluated based on a value calculated by algorithms mentioned herein (e.g., FAST, Corner detector, Harris, etc.).
- the algorithm may be an edge detection algorithm, a corner detection algorithm, a blob detection algorithm, or a ridge detection algorithm.
- the corner detection algorithm may be a “Features from accelerated segment test” (FAST).
- the feature detector may extract feature points and make calculations regarding feature points using FAST.
- the feature detector can be a Canny edge detector, Sobel operator, Harris & Stephens/Plessey/Shi-Tomasi corner detection algorithm, the SUSAN corner detector, Level curve curvature approach, Laplacian of Gaussian, Difference of Gaussians, Determinant of Hessian, MSER, PCBR, Grey-level blobs, ORB, FREAK, or suitable combinations thereof.
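As one concrete example from the list above, a minimal Harris corner response can be computed from the image structure tensor; the 3x3 window and the constant k = 0.04 are conventional illustrative choices, and a production detector would add Gaussian weighting and non-maximum suppression:

```python
import numpy as np

def harris_response(img, k=0.04):
    """Minimal Harris corner response: build the structure tensor from
    finite-difference gradients, sum it over a 3x3 window, and score each
    pixel with R = det(S) - k * trace(S)^2 (large positive R at corners,
    negative R along edges, near zero in flat regions)."""
    img = img.astype(np.float32)
    Iy, Ix = np.gradient(img)  # gradients along rows (y) and columns (x)

    def box3(a):               # 3x3 box sum via zero padding
        p = np.pad(a, 1)
        h, w = a.shape
        return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    return Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2
```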
- a feature point may comprise one or more non-salient features.
- non-salient features may refer to non-salient regions or non-distinct (e.g., non-recognizable) objects within an image.
- Non-salient features may refer to elements within an image that are unlikely to stand out or catch attention of a human observer. Examples of non-salient features may include individual pixels or groups of pixels that are non-distinct or non-identifiable to a viewer, when viewed outside of the context of their surrounding pixels.
- a feature point may comprise one or more salient features.
- Salient features may refer to salient regions or distinct (e.g., recognizable) objects within an image.
- Salient features may refer to elements within an image that are likely to stand out or catch attention of a human observer.
- a salient feature may have semantic meaning.
- Salient features may refer to elements that may be identified consistently under computer vision processes.
- a salient feature may refer to animate objects, inanimate objects, landmarks, marks, logos, obstacles, and the like within an image.
- a salient feature may be persistently observed under differing conditions.
- a salient feature may be persistently identified (e.g., by a human observer or by computer programs) in images acquired from different points of view, during different times of the day, under different lighting conditions, under different weather conditions, under different image acquisition settings (e.g., different gain, exposure, etc), and the like.
- salient features may include humans, animals, faces, bodies, structures, buildings, vehicles, planes, signs, and the like.
- Salient features may be identified or determined using any existing saliency calculating methods.
- salient features may be identified by contrast based filtering (e.g., color, intensity, orientation, size, motion, depth based, etc), using a spectral residual approach, via frequency-tuned salient region detection, via a binarized normed gradients for objectness estimation, using a context-aware top down approach, by measuring visual saliency by site entropy rate, and the like.
- salient features may be identified in a saliency map that is generated by subjecting one or more images to contrast based filtering (e.g., color, intensity, orientation, etc).
- a saliency map may represent areas with feature contrasts.
- a saliency map may be a predictor where people will look.
- a saliency map may comprise a spatial heat map representation of features or fixations.
- salient regions may have a higher luminance contrast, color contrast, edge content, intensities, etc than non-salient regions.
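A deliberately crude intensity-contrast saliency map illustrates the contrast idea; the methods named above (spectral residual, frequency-tuned detection, etc.) are far more sophisticated:

```python
import numpy as np

def contrast_saliency(img):
    """Crude intensity-contrast saliency: score each pixel by its absolute
    deviation from the global mean intensity, normalized to [0, 1], so that
    high-contrast regions stand out as in a saliency heat map."""
    img = img.astype(np.float32)
    s = np.abs(img - img.mean())
    return s / s.max() if s.max() > 0 else s
```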
- salient features may be identified using object recognition algorithms (e.g., feature based methods, appearance based methods, etc).
- one or more objects or types of patterns, objects, figures, colors, logos, outlines, etc may be pre-stored as possible salient features.
- An image may be analyzed to identify salient features that are pre-stored (e.g., an object or types of objects).
- the pre-stored salient features may be updated. Alternatively, salient features may not need to be pre-stored.
- Salient features may be recognized on a real time basis independent to pre-stored information.
- the precision to which the user may specify a point may be on the order of 0.01 degrees or less, 0.05 degrees or less, 0.1 degrees or less, 0.5 degrees or less, 1 degree or less, 2 degrees or less, 3 degrees or less, 5 degrees or less, 7 degrees or less, 10 degrees or less, 15 degrees or less, 20 degrees or less, or 30 degrees or less.
- the UAV may travel towards the target that is indicated by the selected point.
- a flight path for the UAV may be defined from the current location of the UAV to the location of the target.
- the flight path may be denoted by a vector between the current location of the UAV to the location of the target.
- a flight path to the selected target may or may not be visually indicated on the screen.
- a visual marker may be provided within the image indicative of the flight path to the target object.
- the visual marker may be a point, region, icon, line, or vector.
- the line or vector may be indicative of a direction of the flight path towards the target.
- the line or vector may be indicative of the direction that the UAV is heading.
- a user may specify that the UAV is in a target mode.
- the portion of the image selected by the user may determine the target towards which the UAV will travel until it encounters an obstacle, until a different target is selected, or until the UAV encounters a flight restriction.
- the UAV may travel towards the target object until it encounters a stop or change criteria, such as a target change, flight restriction, flight mode change, low power supply or obstacle.
- the user may specify that the UAV is in a target mode by selecting the target mode from one or more available modes, such as a directional mode as previously mentioned.
- Any other user interface tools or techniques may be provided that may allow a user to specify a target object using the user interface.
- part C of FIG. 7 shows a box 702 surrounding the selected target on the display.
- the box may be in any shape, for example an n-sided polygon where n may be any integer greater than 2.
- the box is a 4-sided polygon (quadrilateral-shaped).
- the box may serve as a visual indicator to the user, to distinguish the selected target from adjacent objects.
- a prompt window (not shown) may appear in or near the box, requesting confirmation from the user on whether the selected target corresponds to an intended target of the user. A user may confirm the selected target by clicking on the box.
- Part D of FIG. 7 shows an image of the target object after the UAV has moved towards the target. For instance, from a FPV, when a UAV is traveling towards the target, an object that was once far away may appear closer. From a map view, a distance between the UAV and the target is reduced after the UAV has moved closer towards the target.
- the UAV may move towards the target until it is offset from the target by a predetermined distance.
- the predetermined distance may include a horizontal distance component and/or a vertical distance component.
- the UAV may stay at the predetermined distance from the target. In some cases, the UAV may remain outside of the predetermined distance to the target.
- the predetermined distance may be determined based on a size of the target and an initial distance from the UAV to the target.
- the predetermined distance may be automatically generated, or optionally adjustable by a user. For example, if a user desires to move the UAV closer to the target, the user may select (e.g., “click”) the target in the image multiple times to adjust the predetermined distance.
- Adjustment of the distance may optionally depend on a length of time which the user selects (e.g., touches) the target in the image.
- the predetermined distance may be dynamically calculated based on factors such as a size of the target and an initial distance of the UAV from the target.
- a user may control flight of the UAV by interacting with the GUI of the display in a number of different configurations. For example, when a user selects a point on the target in the image, the UAV may fly towards the target. Optionally, when the user selects a point located below the target in the image, the UAV may fly backward along its original flight path and away from the target. Alternatively, selecting a point above the target in the image may cause the UAV to fly forward. In some cases, double-clicking (or touching) the target in the image multiple times may cause the UAV to fly closer to the target. It is noted that any form of interaction of the user with the user terminal/output device to control various functions of the UAV flight may be contemplated.
- the UAV may travel towards the target at a fixed velocity or a variable velocity.
- a standard target travel velocity may be provided.
- a variable target travel velocity may also be provided.
- the user may specify the velocity and/or acceleration at which the UAV may be traveling towards the target. Any description herein of affecting the velocity of the UAV may also apply to affecting acceleration of the UAV when moving towards the target.
- the user may affect the velocity at the same time at which the user is specifying the target. For instance, when a user selects a target, the number of times the user clicks or touches the target may affect the velocity of the UAV.
- if a user selects the target once, the UAV may travel at a first velocity; if the user selects the target multiple times, the UAV may travel at a second velocity.
- the second velocity may be greater than the first velocity.
- the velocity of the UAV travel may correspond to the number of touches or selections of the point indicative of the target.
- a positive proportional relationship may be provided between the number of selections and the velocity of the UAV.
- a linear relationship may be provided between the number of selections and the velocity of the UAV.
- the velocity of the UAV may be X+N*Y, where X is a velocity value, Y is a velocity multiplier, and N is the number of times the target was selected. Any other mathematical relation may be provided.
- the user may make a selection a first time to get a first velocity, and then make the selection again to speed up the UAV. The user may keep making the selection to keep speeding up the UAV.
- a length of time associated with the selection of the target may affect the velocity of the UAV. For instance, if a user touches a point indicative of a target for a first period of time, the UAV may travel at a first velocity, and if the user touches for a second period of time greater than the first period of time, the UAV may travel at a second velocity. The second velocity may be greater than the first velocity. The velocity of the UAV travel may correspond to the length of the touch or selection of the point indicative of the target. A positive proportional relationship may be provided between the length of the selection and the velocity of the UAV. In some instances, a linear relationship may be provided between the length of the selection and the velocity of the UAV.
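- the two mappings above (number of selections and length of touch) can both be folded into a single linear relation of the form v = X + N*Y; the base velocity, per-click multiplier, per-second multiplier, and velocity cap in this sketch are illustrative values, none of which are specified in the disclosure:

```python
def travel_velocity(n_selections, hold_seconds=0.0,
                    base=2.0, per_click=1.0, per_second=0.5, v_max=10.0):
    # Linear mapping v = X + N*Y from the number of selections, optionally
    # increased further by how long the target is held. All constants here
    # are illustrative assumptions, not values from the disclosure.
    v = base + n_selections * per_click + hold_seconds * per_second
    return min(v, v_max)  # cap the commanded velocity
```

Capping at v_max reflects the practical limit of the UAV's propulsion rather than anything stated in the text.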
- swiping motions may affect the velocity of the UAV.
- different regions may be touched to affect the velocity of the UAV.
- a separate control may be provided for velocity control. For instance, a user may adjust a velocity using a manual control while the UAV is traveling towards the target. The velocity may be adjusted in accordance with the manual control in real-time.
- a user may enter a numerical value for the desired velocity, or select the velocity from a plurality of pre-selected options.
- FIG. 8 shows an example of a user interface (UI) through which a user may select a target by selecting different points and cause the UAV to move towards the target.
- FIG. 8 is similar to FIG. 7 except for the following difference.
- a user may select a target by touching a plurality of points on the image to generate a box containing the target.
- Part A shows an initial display of an environment comprising the target.
- Part B shows a user selecting a first point proximate to the target within the initial display.
- Part C shows a user selecting a second point proximate to the target within the initial display.
- Part D shows an image of the target after the UAV has moved towards the target and is at a distance from the target.
- a box may be generated to contain the target therein.
- the box may be in any shape, for example an n-sided polygon where n may be any integer greater than 2.
- the box is a 4-sided polygon (quadrilateral-shaped).
- the target may be selected when the target substantially lies within the box in the image.
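- the test for whether the target "substantially lies within" the box can be sketched as an area-overlap check on bounding boxes; the disclosure does not quantify "substantially", so any acceptance threshold is an assumption:

```python
def overlap_fraction(target_box, selection_box):
    # Fraction of the target's bounding-box area that lies inside the
    # user-drawn selection box. Boxes are (x_min, y_min, x_max, y_max)
    # in image coordinates; the representation is an illustrative choice.
    tx0, ty0, tx1, ty1 = target_box
    sx0, sy0, sx1, sy1 = selection_box
    ix0, iy0 = max(tx0, sx0), max(ty0, sy0)  # intersection corners
    ix1, iy1 = min(tx1, sx1), min(ty1, sy1)
    inter = max(0.0, ix1 - ix0) * max(0.0, iy1 - iy0)
    target_area = (tx1 - tx0) * (ty1 - ty0)
    return inter / target_area if target_area > 0 else 0.0
```

A selection might then be accepted when the fraction exceeds a chosen threshold, e.g. 0.8 (an assumed value).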
- FIG. 9 shows an example of a user interface (UI) through which a user may select a target by drawing a shape around the target and cause the UAV to move towards the target.
- FIG. 9 is similar to FIG. 8 except for the following difference.
- a user may select a target by drawing a box around the target.
- Part A shows an initial display of an environment comprising the target.
- Part B shows a user drawing a box around the target within the initial display.
- Part C shows an image of the target after the UAV has moved towards the target and is at a distance from the target.
- a user may draw a box around the target on the image by touching the display in a circular manner around the target.
- the box may contain the target therein.
- the box may be in any shape, for example an n-sided polygon, an ellipse, an irregular shape, etc.
- the box may be an ellipse.
- the target may be selected when the target substantially lies within the ellipse in the image.
- the box may be generated when a user touches a point on or near the target on the image.
- the box may indicate that the target displayed therein has been selected.
- FIG. 10 shows an example of a user interface (UI) comprising a first person view (FPV) photographic/video image and a 2D map through which a user may select a target and cause the UAV to move towards the target.
- the UI may include a FPV 1002 substantially occupying the bulk of the display, and a 2D map 1004 (such as an overhead map) located on a portion (e.g., bottom left corner) of the display.
- the FPV may include an image captured by an imaging device on the UAV.
- a user may select a target by touching a point 1006 on the image.
- a balloon 1008 may be generated to display a magnified view of the target.
- the target corresponds to a portion of a building, as shown in the balloon.
- a user may further refine the target selection by selecting one or more points, or a region, within the balloon. For example, the user may click on a particular feature within the balloon. Alternatively, the user may draw a shape to enclose a region within the balloon. Additionally, the user may zoom in or zoom out of the view that is displayed within the balloon. The user may also navigate in any direction within the view that is displayed within the balloon. In some embodiments, the user may move the balloon around within the image to display magnified views of different parts of the image. As the user is moving the balloon around within the image, the user may notice features or points of interest, and select those features or points of interest as the target.
- a prompt window may appear next to the selected point, requesting confirmation from the user on whether the selected point corresponds to an intended target of the user.
- the positions of the target and the UAV may be displayed in the 2D map 1004 on the bottom left corner of the display.
- the UAV may move towards the target.
- the UAV may move towards the buildings shown in balloon 1008.
- the size of the target in the image increases as the UAV moves towards the target.
- the positions of the target and the UAV may also be updated in real-time on the 2D map. For example, as the UAV moves towards the target, a distance between the UAV and target on the 2D map starts to decrease.
- a user may select the target from the balloon pre-flight (i.e., prior to operation of the UAV or when the UAV is hovering at a fixed point).
- a user may refine the selection of the target during flight.
- a user may select a new target by selecting a different point in the displayed image, e.g. from the balloon.
- the displayed image may include more details about (and also around) the original target, when the UAV is flying towards and/or tracking the target.
- a user may refine his target selection based on the additional details about (and also around) the original target, when the UAV is flying towards and/or tracking the target.
- the user may select a different point or select a different region to refine his target selection.
- the UAV may modify its course slightly and fly towards and/or track the refined target.
- a user may select an entirely new target by moving the balloon to another location on the image, during UAV flight.
- the UAV may change course and fly towards and/or track the new target.
- the map view may include a 3D map instead of a 2D map.
- the 3D map may be alterable to view the 3D environment from various angles.
- the 3D environment may comprise a plurality of virtual objects.
- the virtual objects may be graphical solid objects or graphical wireframes.
- the virtual objects may comprise points or objects that may be of interest to a user. Points or objects that may be of less interest to the user may be omitted from the 3D virtual environment to reduce object clutter and to more clearly delineate points/objects of interest. The reduced clutter makes it easier for the user to select or identify a desired point or object of interest from the 3D virtual environment.
- FIG. 11 shows an example of a user interface (UI) through which a user may select a target and cause the UAV to move towards the target.
- Part A shows an initial display of an environment comprising a plurality of objects.
- the objects may comprise stationary objects (e.g., buildings, trees, golf course, gas station, etc.) and objects that are capable of movement (e.g., a group of people).
- Part B shows a user selecting the target within the initial display.
- Part C shows an image of the target after the UAV has moved towards the target and is at a distance from the target.
- a user may select a point on the image.
- the point may be at or proximate to a golf course. Selection of the point may cause the golf course to be selected as the target.
- the UAV may move towards the golf course.
- the size of the golf course has now increased since the UAV is now closer to the golf course (target).
- FIG. 12 shows an example of a user interface (UI) through which a user may select a new target and cause the UAV to move towards the new target.
- Part A of FIG. 12 shows an initial display corresponding to part C of FIG. 11 .
- Part B shows a user manipulating the image to increase a field of view to generate an updated display.
- Part C shows a user selecting a new target within the updated display.
- Part D shows an image of the new target after the UAV has moved towards the new target and is at a distance from the new target.
- Part A of FIG. 12 shows an image of a currently selected target 1202 .
- a user may increase a field-of-view by manipulating the image. For example, the user may perform a “pinching” motion on the touch display to increase the field of view, which allows more objects to be displayed in the image.
- the user may select a new point in the updated display.
- the new point may be associated with a different target object 1204 (e.g., a building). Selection of the new point may cause the building to be selected as the new target 1204 .
- the UAV may move towards the building.
- the size of the building has now increased since the UAV is now closer to the building (new target).
- FIGS. 13 and 14 show an example of a user interface (UI) through which a user may select a target and cause the UAV to track the target.
- Part A of FIG. 13 shows an initial display of an environment comprising a plurality of objects.
- the objects may comprise stationary objects (e.g., buildings, trees, golf course, gas station, etc.) and objects that are capable of movement (e.g., a group of people).
- Part B of FIG. 13 shows a user selecting the target within the initial display.
- Part A of FIG. 14 shows an image of the selected target.
- Part B of FIG. 14 shows an image of the target after the UAV has moved towards the target and is at a distance from the target.
- Part C of FIG. 14 shows an image of the target as it is being tracked by the UAV.
- a user may select a point on the image.
- the point may be at or proximate to a group of people. Selection of the point may cause the group of people to be selected as the target.
- the UAV may move towards and/or track the group of people.
- the group of people may be stationary and/or moving at different points in time.
- the image may continually update as the UAV is tracking the target.
- the group of people may disperse after some time.
- the UAV may be configured to track a person that has a position closest to the selected point. In some cases, the UAV may track a substantial portion of the group that remains.
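- choosing the person whose position is closest to the selected point can be sketched as a nearest-neighbor search over detected positions in image coordinates; the detection representation here is an illustrative assumption:

```python
def closest_person(selected_point, detections):
    # Return the detected position nearest the user-selected point,
    # by squared Euclidean distance in image coordinates. The (x, y)
    # tuple representation of detections is an illustrative choice.
    sx, sy = selected_point
    return min(detections, key=lambda p: (p[0] - sx) ** 2 + (p[1] - sy) ** 2)
```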
- FIG. 15 illustrates UAV tracking of a target.
- a user may select a target on a display.
- the target may be a group of people.
- the UAV may move at a velocity Vi and the target may move at a velocity Vt during the tracking.
- the velocities Vi and Vt may be substantially the same. In some cases, the velocities Vi and Vt may be different. For example, Vi&lt;Vt such that the UAV is moving slower than the target, or Vi&gt;Vt such that the UAV is moving faster than the target.
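- one way to realize such tracking is a proportional rule that matches the target's velocity Vt and corrects the standoff-distance error, so that the UAV speeds up when it lags behind and slows down when it is too close; the gain and velocity limit are illustrative assumptions, as the disclosure specifies no particular controller:

```python
def tracking_velocity(distance, desired_distance, v_target, kp=0.5, v_max=15.0):
    # Command Vi = Vt + kp * (distance error): match the target's velocity
    # and close the gap toward the desired standoff distance.
    # kp and v_max are illustrative assumptions, not from the disclosure.
    v = v_target + kp * (distance - desired_distance)
    return max(-v_max, min(v, v_max))  # clamp to the UAV's velocity limits
```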
- a method for controlling a UAV may be implemented using the system of FIG. 2 .
- the method may include acquiring, when the UAV is at a first location, a target from one or more images captured by an imaging device that is carried by the UAV; and controlling the UAV to track the acquired target.
- the target may be acquired based on a selected point in the one or more images.
- the images may be captured by an imaging device on the movable object at the first location.
- the selected point in the one or more images may be associated with a set of image coordinates.
- the target may be positioned at a second location that is associated with a set of world coordinates.
- a transformation may be generated from the set of image coordinates to the set of world coordinates.
- a direction vector from the first location to the second location may be calculated based on the transformation.
- a path may be generated for the UAV to track the acquired target based on the direction vector.
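- the transformation from the selected image coordinates to a world-frame direction vector can be sketched with a pinhole camera model; the intrinsics (fx, fy, cx, cy) and the camera-to-world rotation, which would be derived from the UAV's position/attitude and the imaging device's attitude, are illustrative assumptions:

```python
import numpy as np

def direction_from_pixel(u, v, fx, fy, cx, cy, R_world_cam):
    # Back-project the selected pixel (u, v) through a pinhole camera model.
    # fx, fy, cx, cy are camera intrinsics; R_world_cam rotates camera-frame
    # vectors into the world frame (from UAV attitude plus gimbal attitude).
    # All names here are illustrative assumptions, not from the disclosure.
    ray_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    ray_world = R_world_cam @ ray_cam
    return ray_world / np.linalg.norm(ray_world)  # unit direction vector
```

A path toward the target could then be generated along this direction vector, with the range to the target recovered separately (e.g., by triangulation across views).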
- a selected point in an initialization image may be received from a user.
- the initialization image may be included in the one or more images.
- a plurality of object candidates may be provided for the user to select, whereby each object candidate may be denoted using a bounding box.
- a selected object candidate may be received as the target when the user selects the bounding box associated with the selected object candidate.
- a projective transformation of the target may be obtained in the one or more images, based on state information of the imaging device.
- the state information of the imaging device may be determined based on position and attitude information of the UAV and attitude information of the imaging device.
- a selected point in an initialization image may be received from a user.
- the initialization image may be included in the one or more images.
- a target direction may be determined for the UAV to move in based on the selected point.
- the target direction may be dynamically adjusted so that the UAV avoids one or more obstacles lying in the target direction.
- An attitude of the imaging device and/or the UAV may be adjusted to maintain the target in the field of view of the imaging device when the UAV is avoiding the one or more obstacles. For example, a yaw angle movement and a translational movement of the UAV may be controlled to maintain the target in the field of view.
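- the yaw angle needed to keep the target in the field of view during a detour can be sketched from the relative positions of the UAV and the target; world-frame (x, y) positions and the function name are illustrative:

```python
import math

def yaw_to_face(uav_pos, target_pos):
    # Yaw angle (radians) that points the imaging device at the target,
    # used to keep the target in the field of view while detouring.
    # Positions are (x, y) in a world frame; names are illustrative.
    return math.atan2(target_pos[1] - uav_pos[1], target_pos[0] - uav_pos[0])
```

The gimbal pitch could be computed analogously from the altitude difference and horizontal range.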
- a tracking failure may be determined to have occurred when the target is no longer in the one or more images and/or the field-of-view of the imaging device.
- the position and attitude of the movable object and/or the attitude of the imaging device may be adjusted in order to recapture the target in one or more subsequent images.
- the one or more subsequent images may be analyzed to detect the target, and the target may be tracked once it is detected.
- a distance and/or a velocity of the target may be obtained relative to the UAV.
- the target may be tracked based on the distance and/or the velocity of the target relative to the UAV.
- the path may be an optimized route between the first location (associated with UAV) and the second location (associated with target).
- the path may be optimized based on one or more parameters including flight distance, flight time, energy consumption, altitude, weather effects including wind directions and speed, and/or tracking of the target (such as speed and direction of target).
- the path may also be optimized for the UAV to avoid one or more obstacles between the first location and the second location.
- the path may include straight lines and/or curvilinear lines.
- the path may be configured to minimize an energy consumption of the UAV as the UAV is moving from the first location to the second location.
- the path may be configured to minimize effects of weather on the movement of the UAV.
- the path may be configured based on a wind speed and direction.
- the path may be configured to reduce movement of the UAV into a headwind.
- the path may be configured to account for changes in altitude and pressure as the UAV moves towards the target.
- the path may be configured based on the surrounding landscape between the first location and the second location.
- the path may be configured to account for man-made structures and natural terrain that are present in the surrounding landscape.
- the path may be configured to navigate around/over/underneath obstacles such as man-made structures and natural terrain in the path between the first location and the second location.
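- the optimization criteria listed above (flight distance, flight time, energy consumption, weather effects, and so on) could be combined into a single weighted cost to be minimized over candidate paths; the weighted-sum form and the weights are illustrative assumptions, as the disclosure does not specify how the parameters are traded off:

```python
def path_cost(distance, time, energy, headwind_exposure, w=(1.0, 0.5, 0.2, 0.3)):
    # Weighted sum of the path-optimization criteria listed above.
    # The weights w are illustrative assumptions, not from the disclosure.
    return (w[0] * distance + w[1] * time
            + w[2] * energy + w[3] * headwind_exposure)
```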
- a 3-D model of the surrounding landscape may be obtained based on: (1) one or more images captured by one or more imaging devices on the movable object, and (2) geographical maps obtained from global positioning system (GPS) data.
- GPS global positioning system
- the GPS data may be provided from a server to the user terminal that is used to control the UAV.
- the path may be configured such that a point of interest maintains in a field-of-view of an imaging device on the UAV as the UAV is moving from the first location to the second location, whereby the point of interest may be the target and/or other objects.
- FIG. 16 shows the avoidance of obstacles as the UAV is moving towards and/or tracking a target.
- a UAV may be configured to fly in a path 1602 towards a target 1604 .
- the target may be associated with a point in an image on a display that is selected by a user.
- the target may be a stationary target, a moving target, or a direction.
- An obstacle 1606 may be detected in the flight path 1602 .
- a detour 1608 from the path may be generated for the UAV to avoid the obstacle.
- the UAV may be configured to automatically avoid the obstacle by moving along the detour.
- the detour may exit the path at a first point 1609-1 and rejoin the path at a second point 1609-2.
- Part B of FIG. 16 shows different configurations for generating (for example, re-planning) a path after the UAV has successfully navigated around an obstacle 1606 .
- the UAV may be initially at a first location 1616 and the target may be at a second location 1618 .
- a vector v1 may be defined between the first location and the second location.
- the vector v1 may be associated with an initial path from the UAV to the target.
- the UAV may make a detour to avoid the obstacle (by flying above, around, or underneath obstacle).
- the UAV may fly to a third location 1620 after having successfully navigated the obstacle.
- the UAV may fly to a third location 1620-1 located to the right of the obstacle, as shown in part B of FIG. 16.
- a vector v2-1 may be defined between the first location and the third location.
- a new path for moving towards and/or tracking the target may be generated for the UAV when the UAV is at the third location 1620-1.
- the new path may be defined by a vector v3-1.
- the vector v3-1 may be determined based on the vector v1 and the vector v2-1.
- the new path may be generated using a triangulation method based on one or more images obtained at the third location, and one or more images obtained at one or more previously known locations (e.g., the first location and/or the second location).
- the UAV may be controlled to fly back to its initial location (e.g., the first location, or any other location that the UAV has passed) and the new path may be generated from the initial location.
- the new path may be generated from the third location to the second location without using the first location, for example using the method previously described in FIG. 6 .
- a shortest (or most direct) path for avoiding the obstacle may be determined.
- the UAV may fly to a fourth location 1620-2 located to the left of the obstacle, as shown in part B of FIG. 16.
- a distance from the first location to the fourth location may be less than a distance from the first location to the third location.
- the distance from the first location to the fourth location may be indicative of the shortest distance that the UAV travels to circumvent the obstacle.
- a vector v2-2 may be defined between the first location and the fourth location.
- a new path for moving towards and/or tracking the target may be generated (for example, re-planned) for the UAV when the UAV is at the fourth location.
- the new path may be defined by a vector v3-2.
- the vector v3-2 may be determined based on the vector v1 and the vector v2-2.
- the original path between the UAV and the target may be substantially replaced by the detour or a new path.
- the detour may be shaped around (1612 and 1614), above (1610), and/or underneath the obstacle.
- the detour may be in a lateral and/or vertical direction.
- an orientation of the UAV and/or an imaging device located thereon may be changed during the detour, such that a position of the target remains in a field-of-view of the imaging device.
- the UAV may be initially at a first location and the target may be at a second location.
- the UAV may be at a third location after having successfully navigated around the obstacle.
- a new path for the UAV may be generated based on a vector between the first location and the second location, and a vector between the first location and the third location.
- the new path may be defined by a vector between the third location and the second location.
- the vector between the first location and the second location may be obtained based on the target information (e.g., from imaging data).
- the vector between the first location and the third location may be obtained from an IMU of the UAV.
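- the vector arithmetic above reduces to a subtraction: with the first-to-second vector obtained from the imaging data and the first-to-third vector from the IMU, the re-planned third-to-second path vector is their difference; the numeric values below are illustrative:

```python
import numpy as np

# first -> second (UAV to target), estimated from imaging data (illustrative values)
v12 = np.array([100.0, 40.0, 0.0])
# first -> third (pre-detour to post-detour position), from the IMU (illustrative)
v13 = np.array([30.0, 25.0, 5.0])
# third -> second: the re-planned path vector toward the target
v32 = v12 - v13
```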
- the target may be selected based on a point in one or more images captured by an imaging device that is carried by the UAV.
- the one or more images may be provided from a FPV.
- the UAV may be configured to switch course from moving towards one target to moving towards another target, depending on which point in the one or more images is being selected.
- the UAV may be configured to move towards a first target when a first point is selected, and to move towards a second target when a second point selected.
- the selection of the second target may replace the selection of the first target.
- the first point and the second point may be located at different portions of the image.
- the second target may be different from the first target.
- An attitude and/or orientation of the UAV may be changed when the UAV is switching from moving towards one target to moving towards another target.
- a transition path may be generated that allows the UAV to switch course from one target to another target in a curvilinear manner. Switching course in the curvilinear manner may reduce power consumption of the UAV and may also improve its flight stability.
- FIG. 17 shows an example of a user interface (UI) through which a user may select a target direction.
- Part A shows an initial display of an environment.
- Part B shows a user selecting a target direction within the initial display.
- Part C shows an image of the movable object traveling in the target direction. Corresponding movements/headings of the UAV are shown in the compasses.
- Part A shows an initial display of an environment.
- a FPV may be provided as illustrated.
- the FPV may include a live streaming image from an imaging device.
- the FPV may alternatively be a graphical depiction or representation of the image from the imaging device.
- a horizon is shown, along with an object within the field of view.
- the UAV may be stationary or moving while the initial display of the environment is occurring.
- the corresponding compass shows a stationary UAV.
- a map view may be provided.
- the map view may include a 2D map, such as an overhead map.
- the map view may include a 3D map.
- the 3D map may be alterable to view the 3D environment from various angles. Solid renderings, wireframes, or other types of imaging may be shown, as described previously herein.
- the display may be shown on a user terminal.
- the user may optionally hold the user terminal.
- Part B shows a user selecting a target direction within the initial display.
- the user may select a portion of the image to select the target direction.
- the image may include a FPV and/or a map.
- the user may select a portion of the FPV or the map to select the target direction.
- the portion of the image selected by the user may optionally be a point.
- the UAV may travel in the direction indicated by the selected point.
- a directional heading of the UAV may be determined from the current location of the UAV and an angle such that the resulting trajectory passes through the selected point.
- the user may select a target direction that is northeast of the current position of the UAV.
- the corresponding compass shows the UAV may move in a corresponding northeastern direction.
- the user selection of a target direction may include a lateral selection of the target direction.
- the target direction may be within a two-dimensional plane.
- the user may specify whether the UAV is to move north, south, east, west, or anywhere in between.
- the UAV may remain at substantially the same altitude while traveling in the specified two-dimensional direction.
- the UAV may encounter flight restrictions that may affect the flight path of the UAV. For instance, some lateral flight restrictions may apply.
- the UAV may remain within a certain range of the user terminal. If the UAV is traveling in a target direction, and is about to exceed the range of the user terminal, the UAV may stop and hover, or may return toward the user terminal.
- the UAV may remain within a geo-fenced region. If the UAV is traveling in a target direction, and is about to pass outside of the geo-fenced region, the UAV may stop and hover, or may return toward the user terminal.
- An obstacle may be a flight restricted area. Alternatively a flight restricted area may or may not contain any obstacle. Any other type of flight restriction may apply.
- the user selection of a target direction may include a three-dimensional selection of the target direction.
- the target direction may be anywhere within a three-dimensional space.
- the user may specify whether the UAV is to move north, south, east, west, up, or down, or anywhere in between.
- the UAV may be capable of changing altitude while traveling within the specified three-dimensional direction.
- the UAV may encounter flight restrictions that may affect the flight path of the UAV. Lateral flight restrictions, such as those previously described, may be provided. Additional altitude flight restrictions may be provided that may limit altitude change of the UAV. For instance, if the target direction is upwards, the UAV may travel in that target direction indefinitely, all the while increasing its altitude. Alternatively, a flight restriction, such as a flight ceiling, may take effect. When the UAV reaches the flight ceiling, the UAV may level out and remain at substantially the same altitude, while continuing to travel in the same specified lateral direction. Similarly, if the target direction is downwards, the UAV may travel in that direction, all the while decreasing its altitude, until it reaches the ground. Alternatively, a flight restriction, such as a flight floor, may take effect. When the UAV reaches the flight floor, the UAV may level out and remain at substantially the same altitude, while continuing to travel in the same specified lateral direction.
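- the lateral-range and floor/ceiling restrictions described above can be sketched as a clamp applied to the commanded position; all names and limit values here are illustrative assumptions:

```python
import math

def apply_flight_restrictions(pos, home, max_range, floor, ceiling):
    # Keep the UAV between a flight floor and ceiling, and within a maximum
    # lateral range of the user terminal / geo-fence center. pos and home
    # are (x, y, z); names and limits are illustrative assumptions.
    x, y, z = pos
    hx, hy, _ = home
    z = max(floor, min(z, ceiling))      # level out at the ceiling or floor
    dx, dy = x - hx, y - hy
    dist = math.hypot(dx, dy)
    if dist > max_range:                 # stop at the lateral range boundary
        scale = max_range / dist
        x, y = hx + dx * scale, hy + dy * scale
    return (x, y, z)
```

A flight controller could apply such a clamp each control cycle, so the UAV hovers at the boundary while any unrestricted component of the motion continues.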
- visual indicators, such as a compass and/or a vector, may be displayed on the screen.
- heading information indicative of flight angles, compass direction, future/target destination, etc. may optionally be displayed on a 2D map and/or a 3D map.
- the precision to which the user may specify a direction may be on the order of 0.01 degrees or less, 0.05 degrees or less, 0.1 degrees or less, 0.5 degrees or less, 1 degree or less, 2 degrees or less, 3 degrees or less, 5 degrees or less, 7 degrees or less, 10 degrees or less, 15 degrees or less, 20 degrees or less, or 30 degrees or less.
- the selected target direction may or may not be visually indicated on the screen.
- a visual marker may be provided within the image indicative of the target direction.
- the visual marker may be a point, region, icon, line, or vector.
- the point may be indicative of a selection of the target direction.
- the vector may be indicative of the direction that the UAV is heading.
- a user may specify that the UAV is in a directional mode.
- the portion of the image selected by the user may determine the direction in which the UAV will travel until it receives a new direction or encounters a flight restriction.
- the UAV may travel indefinitely in that direction until it encounters a stop or change criteria, such as a direction change, flight restriction, flight mode change, low power supply or obstacle.
- the user may specify that the UAV is in a directional mode by selecting the directional mode from one or more available modes, such as a target tracking mode.
- the UAV may fly in a target direction when a user selects a user-interface tool that indicates that the portion of the image that the user will select will be the target direction.
- the target direction tool may be a one-use tool (e.g., the user may need to reselect the tool in order to select another target direction), or may be used multiple times (the user can keep specifying target direction without having to re-select the tool unless the user has switched tools).
- one or more images (e.g., FPV, 2D map and/or 3D map) on the screen may have one or more predetermined regions indicative of flight direction.
- the regions may be visually distinguishable from other regions.
- the regions may include borders, or arrows, or any other type of features that may distinguish the region.
- the regions may be provided in a border surrounding the image.
- one or more arrow buttons may be provided that may allow the target direction of the UAV to be adjusted.
- a user may indicate one or more values or coordinates indicative of the target direction that the UAV is to travel.
- angles may provide a target direction for the UAV to head. The angles may be provided for two dimensional or three dimensional direction control.
- the values may include spatial coordinates which are along a vector descriptive of the target direction.
- Any other user interface tools or techniques may be provided that may allow a user to specify a target direction using the user interface.
- Part C shows an image of the movable object traveling in the target direction. For instance, from a FPV, when a UAV is traveling in the specified direction, an object that was once further away may become closer up. From a map view, objects may be shown to be passed by the UAV as the UAV follows the target direction. As shown on the corresponding compass, the UAV may be continuing to travel in the target direction.
- when a user specifies a target direction, the UAV may travel in that target direction at a fixed velocity or a variable velocity.
- a standard target travel velocity may be provided.
- a variable target travel velocity may also be provided.
- the user may specify the velocity and/or acceleration at which the UAV may be traveling in the target direction. Any description herein of affecting the velocity of the UAV may also apply to affecting acceleration of the UAV in the target direction.
- the user may affect the velocity at the same time at which the user is specifying the target direction. For instance, when a user selects a target direction, the number of clicks or touches that a user touches the target direction may affect the velocity of the UAV.
- for instance, if the user selects the point indicative of the target direction once, the UAV may travel at a first velocity; if the user selects the point a second time, the UAV may travel at a second velocity. The second velocity may be greater than the first velocity.
- the velocity of the UAV travel may correspond to the number of touches or selections of the point indicative of the target direction.
- a positive proportional relationship may be provided between the number of selections and the velocity of the UAV.
- a linear relationship may be provided between the number of selections and the velocity of the UAV.
- the velocity of the UAV may be X+N*Y, where X is a velocity value, Y is a velocity multiplier, and N is the number of times the target direction was selected. Any other mathematical relation may be provided.
- the user may make a selection a first time to get a first velocity, and then make the selection again to speed up the UAV. The user may keep making the selection to keep speeding up the UAV.
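The relation velocity = X + N*Y described above can be sketched as a small helper; the base velocity, multiplier, and maximum used here are illustrative values, not figures from the specification:

```python
def velocity_for_selections(n_selections, base_velocity=2.0, multiplier=1.0, v_max=10.0):
    """Map the number of times a user selects the target direction to a
    commanded travel velocity, using the linear relation X + N*Y above.
    base_velocity (X), multiplier (Y), and v_max are illustrative values."""
    if n_selections < 1:
        return 0.0  # no selection yet: hold position
    v = base_velocity + n_selections * multiplier
    return min(v, v_max)  # clamp to a safe maximum
```

Each repeated selection raises the commanded velocity by one multiplier step until the clamp is reached, mirroring the "keep making the selection to keep speeding up" behavior.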
- the length of the selection of the target direction may affect the velocity of the UAV. For instance, if a user touches a point indicative of a target direction for a first period of time, the UAV may travel at a first velocity, and if the user touches it for a second period of time greater than the first period of time, the UAV may travel at a second velocity. The second velocity may be greater than the first velocity. The velocity of the UAV travel may correspond to the length of the touch or selection of the point indicative of the target direction. A positive proportional relationship may be provided between the length of the selection and the velocity of the UAV. In some instances, a linear relationship may be provided between the length of the selection and the velocity of the UAV.
- swiping motions may affect the velocity of the UAV.
- different regions may be touched to affect the velocity of the UAV.
- a separate control may be provided for velocity control. For instance, a user may adjust a velocity using a manual control while the UAV is traveling in the target direction. The velocity may be adjusted in accordance with the manual control in real-time.
- a user may enter a numerical value for the desired velocity, or select the velocity from a plurality of pre-selected options.
- FIG. 18 shows an example of a user interface (UI) through which a user may adjust a target direction.
- Part A shows a display of an environment while a UAV is heading in a first target direction.
- Part B shows a user selecting a second target direction different from the first target direction within the display.
- Part C shows an image of the movable object traveling in the second target direction. Corresponding movements/headings of the UAV are shown in the compasses.
- Part A shows a display of an environment while a UAV is heading in a first target direction.
- the corresponding compass shows that the UAV is traveling in a northeast direction.
- the UAV may continue along that target direction until it encounters a situation that requires a change in direction, such as those previously described (e.g., a direction change, flight restriction, flight mode change, low power supply or obstacle).
- the UAV may continue in the target direction along a constant velocity and/or acceleration or a varying velocity and/or acceleration.
- Part B shows a user selecting a second target direction different from the first target direction within the display.
- the second target direction may be in a northwest direction.
- the second target direction may be selected in the same manner as the first target direction.
- the second target direction may be selected while the UAV is traversing the first target direction.
- the corresponding compass shows that the UAV is now traveling in the northwest direction.
- Part C shows an image of the movable object traveling in the second target direction.
- the UAV may transition from the traveling in the first target direction to traveling in the second target direction.
- the transition from the first to second target direction may be relatively abrupt or gradual.
- the image is illustrated as a FPV, but may also be a map in conjunction with or as an alternative to the FPV.
- FIG. 19 shows an example of a flight path of a UAV.
- a UAV may be initially traveling in a first target direction (i.e., original direction) illustrated by a first vector 1902 .
- the UAV may receive an instruction to head in a second target direction (i.e., new direction) illustrated by a second vector 1904 .
- the flight path of the UAV may be curved 1906 to transition from the first direction to the second direction.
- the curvature of the flight path may depend on one or more factors, such as speed and/or acceleration of the UAV when it receives the instruction to change direction, the degree of directional change, types of propulsion units, configuration of the UAV, specifications by the user of the UAV, or any other factor. In some instances, a standard curvature or gradual change of the flight path may be provided. Alternatively, the curvature may vary in accordance with one or more of the factors described. For instance, if the UAV is traveling very quickly, it may not be able to make as sharp a turn as if it were traveling more slowly. A flight controller may make a calculation to effect change of the direction in the flight path. The flight controller may have any of the characteristics as described elsewhere herein.
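The dependence of the curve on speed can be illustrated with a simple sketch: the commanded heading rotates from the first direction to the second at a bounded turn rate, and the bound shrinks as speed grows, so a fast-moving UAV traces a wider curve. The turn-rate law `max_rate = k / speed` and the constants are illustrative assumptions, not taken from the specification:

```python
def transition_headings(h1_deg, h2_deg, speed, dt=0.1, k=200.0):
    """Gradually rotate the commanded heading from h1_deg to h2_deg,
    limiting the turn rate so that faster flight yields a wider (less
    sharp) curve.  Returns the sequence of headings, one per time step."""
    max_rate = k / max(speed, 1e-6)  # deg/s; lower when flying fast (assumed law)
    # shortest signed angular difference, in (-180, 180]
    diff = (h2_deg - h1_deg + 180.0) % 360.0 - 180.0
    headings = [h1_deg]
    h = h1_deg
    while abs(diff) > 1e-9:
        # clamp each step to the allowed turn per time slice
        step = max(-max_rate * dt, min(max_rate * dt, diff))
        h = (h + step) % 360.0
        diff -= step
        headings.append(h)
    return headings
```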
- FIG. 20 shows an example of a UAV traveling in a target direction within an environment.
- the environment 2000 may optionally include one or more objects 2002 .
- a UAV 2004 may be capable of traversing the environment.
- a field of view of an imaging device of the UAV may be provided 2008 .
- one or more objects may be captured within the field of view.
- a target direction 2006 of the UAV may be selected. The UAV may be capable of traveling in the target direction.
- Part A shows an initial position of the UAV 2004 within the environment 2000 .
- a target direction for the UAV may be specified 2006 .
- a user may specify a target direction by selecting a portion of an image captured by the imaging device.
- the target direction may include a point that is within the field of view captured by the imaging device.
- the target direction may optionally be selected by selecting a portion of an image based on information captured by the imaging device.
- the image may be an image captured by the imaging device or rendered image based on the image captured by the imaging device.
- the image may be a FPV or may be a map representative of the environment within which the UAV is traveling. The user may select a target direction from a map view that need not be within the field of view of the imaging device.
- Part B shows a subsequent position of the UAV 2004 within the environment 2000 as the UAV travels in the target direction 2006 .
- the UAV may move relative to the environment, such as one or more objects 2002 within the environment.
- the UAV and/or the imaging device of the UAV may maintain its direction relative to the environment while traveling in the target direction. For instance, if a UAV and/or imaging device of the UAV is initially facing north, the UAV and/or imaging device may remain facing north while traveling in the target direction. In other embodiments, the UAV and/or the imaging device may change orientation.
- FIG. 21 shows an example of a UAV traveling in a target direction within an environment, where the UAV and/or the imaging device has changed orientation relative to the environment.
- the environment 2100 may optionally include one or more objects 2102 .
- a UAV 2104 may be capable of traversing the environment.
- a field of view of an imaging device of the UAV may be provided 2108 .
- one or more objects may be captured within the field of view.
- a target direction 2106 of the UAV may be selected.
- the UAV may be capable of traveling in the target direction.
- Part A shows an initial position of the UAV 2104 within the environment 2100 .
- a target direction for the UAV may be specified 2106 .
- the target direction may be specified using any technique or user interface as described elsewhere herein.
- Part B shows a change in orientation of the UAV 2104 and/or the imaging device in response to the selection of the target direction 2106 .
- the field of view 2108 of the imaging device may be adjusted in response to the selection of the target direction.
- the orientation of the UAV and/or imaging device may be selected based on the target direction. For instance, the orientation of the UAV and/or imaging device may be selected to provide the target direction within a central region of the field of view of the imaging device.
- the target direction may be at a center point, or along a lateral and/or longitudinal central line of the field of view.
- the orientation of the UAV may be selected to permit easy traversal of the UAV in the target direction (e.g., if the UAV has a ‘front’ orientation, it may orient the UAV to have the front orientation of the UAV in the target direction).
- the imaging device may remain stationary with respect to the UAV and/or may change orientation/position with respect to the UAV.
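Centering the target direction in the field of view amounts to computing a yaw and pitch from the direction vector. A minimal sketch, assuming an East-North-Up world frame and a yaw/pitch gimbal (both assumptions for illustration):

```python
import math

def gimbal_angles_for_direction(direction):
    """Compute the yaw and pitch (degrees) that point the imaging device
    along a world-frame direction vector, so the target direction falls
    at the center of the field of view.  Axes follow an assumed
    East-North-Up convention."""
    e, n, u = direction
    yaw = math.degrees(math.atan2(e, n))        # 0 deg = north, 90 deg = east
    horiz = math.hypot(e, n)                    # horizontal component magnitude
    pitch = math.degrees(math.atan2(u, horiz))  # positive = tilted upward
    return yaw, pitch
```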
- Part C shows a subsequent position of the UAV 2104 within the environment 2100 as the UAV travels in the target direction 2106 .
- the UAV may move relative to the environment, such as one or more objects 2102 within the environment.
- the UAV may travel in the target direction after adjusting the orientation of the UAV and/or the imaging device.
- the UAV may travel in the target direction while adjusting the orientation of the UAV and/or the imaging device, or prior to adjusting the orientation of the UAV and/or imaging device.
- various types of coordinate systems may be employed in selecting and flying a UAV in a target direction.
- one or more coordinate systems may be local to an imaging device, and/or UAV.
- One or more coordinate systems may be global coordinate systems provided relative to an inertial reference frame, such as the environment.
- a position of the imaging device and/or UAV may be determined in reference to a global coordinate system.
- the position of the imaging device and/or UAV may be determined in reference to a local coordinate system.
- the position of the imaging device and/or UAV may be converted between global and local coordinates.
- a target direction in an image may be determined in relation to a local coordinate system of the imaging device and/or the image captured within the device.
- the local coordinate of the target direction may be converted to global coordinates for the target direction.
- a vector in a local coordinate system denoting the target direction may be converted to a vector in the global coordinate systems.
- the screen position (x_screen, y_screen) at which the user makes a selection may be obtained.
- the user terminal may convert the coordinate (x_screen, y_screen) of the selected point into the coordinate (x_rawimage, y_rawimage), which is the coordinate of the selected point in the camera raw image, based on a position and a percentage of the current preview image within the camera raw image, and normalize it into (x_percentage, y_percentage).
- the equation for the normalizing process is given by: x_percentage = x_rawimage/ImageWidth, y_percentage = y_rawimage/ImageHeight, where ImageWidth and ImageHeight are the dimensions of the camera raw image.
- the coordinate (x_percentage, y_percentage) may be transmitted to the UAV via a communications system.
- the controller may receive the transmitted data, calculate a spatial flight direction (x_space, y_space, z_space), and transmit the flight direction (x_space, y_space, z_space) back to the user terminal via the communications system.
- the user terminal receives the flight direction, re-projects it onto the image to obtain (x_dir, y_dir), and displays it.
- Step 1 above obtains a user's input coordinate (x_screen, y_screen) via an API (e.g., an iOS API or an Android API), and step 2 above obtains a normalized coordinate based on a percentage of the preview image within the camera raw image.
- Step 3, which is described next, calculates the spatial flight direction from the user-selected point.
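Steps 1 and 2 above can be sketched as follows; the parameter layout (a preview origin and extent expressed in raw-image pixels) is an assumption for illustration:

```python
def normalize_selected_point(x_screen, y_screen,
                             screen_size, preview_origin, preview_size,
                             raw_size):
    """Map a touch point on the screen into the camera raw image, given
    the position (preview_origin) and extent (preview_size) of the
    current preview inside the raw image, then normalize the result to
    percentages of the raw image dimensions."""
    sw, sh = screen_size        # screen resolution in pixels
    ox, oy = preview_origin     # preview's top-left corner inside the raw image
    pw, ph = preview_size       # preview extent, in raw-image pixels
    rw, rh = raw_size           # raw image dimensions
    # screen -> raw image: the screen displays the preview region of the raw image
    x_raw = ox + (x_screen / sw) * pw
    y_raw = oy + (y_screen / sh) * ph
    # raw image -> percentage (the normalizing equation above)
    return x_raw / rw, y_raw / rh
```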
- FIG. 22 shows a geometry model of camera imaging (assuming that the optical axis strictly aligns with the center of image).
- (x_i, y_i) is the coordinate of the selected point in the camera coordinate system.
- the following relation may be obtained:
- f = ImageWidth/(2·tan(FOV_h/2)) = ImageHeight/(2·tan(FOV_v/2)), where f is the focal length and FOV_h and FOV_v are the horizontal and vertical fields of view of the camera.
- the (x_w, y_w, z_w) contains an unknown value D.
- a normalization may be possible since the selected direction is a direction vector.
- the direction vector of the selected direction in camera coordinate system has been obtained from the above.
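Step 3's geometry can be sketched in Python: from the pinhole relation above, the focal length in pixels follows from the horizontal field of view (for square pixels the vertical relation yields the same f), and the unknown depth D drops out once the vector is normalized. The axis convention (x right, y down, z along the optical axis) is an assumption:

```python
import math

def camera_ray(x_pct, y_pct, image_w, image_h, fov_h_deg):
    """Turn the normalized selected point into a unit direction vector in
    the camera coordinate system, using f = ImageWidth/(2*tan(FOV_h/2)).
    The unknown depth D cancels because only the direction matters."""
    f = image_w / (2.0 * math.tan(math.radians(fov_h_deg) / 2.0))
    x = x_pct * image_w - image_w / 2.0  # pixel offset from the optical axis
    y = y_pct * image_h - image_h / 2.0
    norm = math.sqrt(x * x + y * y + f * f)
    return (x / norm, y / norm, f / norm)
```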
- a translation matrix from UAV body to world coordinate system (East, North, Ground) is provided by a gimbal as:
- M(φ, θ, ψ) =
  [ cosθ·cosψ    sinφ·sinθ·cosψ − cosφ·sinψ    cosφ·sinθ·cosψ + sinφ·sinψ ]
  [ cosθ·sinψ    sinφ·sinθ·sinψ + cosφ·cosψ    cosφ·sinθ·sinψ − sinφ·cosψ ]
  [ −sinθ        sinφ·cosθ                     cosφ·cosθ                  ]
  where φ, θ, and ψ are the roll, pitch, and yaw angles of the attitude, respectively.
- the spatial direction vector of the clicked direction in the world coordinate system (East, North, Ground) may be given as: (x_space, y_space, z_space)^T = M(φ, θ, ψ)·(x_c, y_c, z_c)^T, where (x_c, y_c, z_c) is the direction vector of the selected direction in the camera coordinate system.
- the re-projection of the direction vector onto the preview image in the next step 4 is the reverse of step 3.
- the translation matrix can be obtained by other methods, for example, by providing a sensor on the camera that is capable of measuring attitude.
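Applying the attitude to the camera-frame direction vector can be sketched as follows. The standard Z-Y-X (yaw-pitch-roll) Euler convention is an assumption here; the actual matrix depends on how the gimbal reports attitude:

```python
import math

def direction_in_world(ray_cam, yaw_deg, pitch_deg, roll_deg):
    """Rotate a camera-frame direction vector into the world frame using
    a standard Z-Y-X (yaw-pitch-roll) rotation matrix built from the
    attitude angles (an assumed Euler convention)."""
    cy, sy = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    cp, sp = math.cos(math.radians(pitch_deg)), math.sin(math.radians(pitch_deg))
    cr, sr = math.cos(math.radians(roll_deg)), math.sin(math.radians(roll_deg))
    # rows of M = Rz(yaw) * Ry(pitch) * Rx(roll)
    m = [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
    x, y, z = ray_cam
    return tuple(r[0] * x + r[1] * y + r[2] * z for r in m)
```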
- FIG. 23 shows an example of selecting a target direction within an environment where an obstacle may be within the path of the UAV when traveling along the target direction.
- Part A shows an initial display of an environment.
- Part B shows a user selecting a target direction within the display; an obstacle may be in the way.
- Part C shows an image of the movable object traveling in the target direction, having avoided the obstacle. Corresponding movements/headings of the UAV are shown in the compasses.
- Part A shows an initial display of an environment.
- the initial display may include an image which may include a FPV, and/or a map view.
- an image of the environment may be presented before a user makes a selection for a target direction.
- One or more objects within the environment may be displayed, such as a tree.
- the corresponding compass may show that the UAV may optionally not be moving.
- a UAV may already be moving when a user makes a selection for a target direction.
- Part B shows a user selecting a target direction within the display.
- the target direction may be in a northeast direction.
- the target direction may be selected in any manner as described in greater detail elsewhere herein.
- one or more objects may become obstacles when in the UAV's path when the UAV is traveling.
- a user may select a target direction in the northeastern direction.
- a tree may lie in the path in the northeastern direction.
- the UAV may automatically adjust the flight path of the UAV to avoid the obstacle presented.
- the corresponding compass shows that the UAV may go around the obstacle.
- a flight controller such as a flight controller described elsewhere herein, may aid in determining how the UAV is to avoid the obstacle.
- the UAV may circumnavigate the obstacle laterally and/or may travel above and/or below the obstacle.
- the obstacle may be a stationary obstacle.
- the obstacle may be a moving obstacle. Any obstacle avoidance technique, such as those described in greater detail elsewhere herein, may be employed.
- Part C shows an image of the movable object traveling in the target direction, having avoided the obstacle.
- a UAV may fly around a tree that was an obstacle, permitting the UAV to continue in its original target direction (e.g., northeast), as shown by the corresponding compass.
- FIG. 24 shows an example of a flight path of a UAV when avoiding an obstacle.
- a UAV may be initially traveling in a target direction (i.e., original direction) illustrated by a vector 2402 .
- the target direction may be determined based on a portion of an image (e.g., point 2404 ) which is selected by a user on a display.
- the selected portion may be used to determine an angle or heading of the UAV, which may determine the target direction.
- the UAV may travel toward the selected point indefinitely.
- the selected point may be a virtual point indicative of a heading within the image.
- the target direction 2402 may intersect an obstacle 2406 .
- the obstacle may be a stationary obstacle.
- the obstacle may be a moving obstacle.
- the moving obstacle may be in the target direction, or may be predicted to intersect with the target direction.
- a flight path of the UAV may be initially along the target direction.
- the flight path may be altered to avoid the obstacle.
- a curved portion of the path may be provided to avoid the obstacle.
- the curved portion of the path may be within a two-dimensional lateral plane, or may be within three-dimensional space.
- the curved portion of the path may be contained in a single plane or may require multiple planes.
- the curvature of the flight path may depend on one or more factors, such as speed and/or acceleration of the UAV when it receives the instruction to avoid the obstacle, the size of the obstacle, the amount of warning regarding the obstacle, the nearness of the obstacle, motion of the obstacle, shape of the obstacle, types of propulsion units, configuration of the UAV, specifications by the user of the UAV, or any other factor.
- a standard curvature or gradual change of the flight path may be provided.
- the curvature may vary in accordance with one or more of the factors described. For instance, if the UAV is traveling very quickly, it may not be able to make as sharp a turn as if it were traveling more slowly.
- the flight path may be a relatively tight curved path to avoid the obstacle and quickly return to the original flight path along the target direction.
- a flight controller may make a calculation to effect change of the direction in the flight path.
- the flight controller may have any of the characteristics as described elsewhere herein.
- Part B of FIG. 24 shows different configurations for generating (for example, re-planning) a path after the UAV has successfully navigated around an obstacle 2406 .
- the UAV may be initially at a first location 2410 and moving towards a target direction 2412 .
- a vector v 1 may be defined in the direction 2412 .
- the vector v 1 may be associated with a path of the UAV in the target direction.
- the UAV may make a detour to avoid the obstacle (by flying above, around, or underneath obstacle).
- the UAV may fly to a second location 2414 after having successfully navigated the obstacle.
- the UAV may fly to a second location 2414 - 1 located to the right of the obstacle, as shown in part B of FIG. 24 .
- a vector v 2 - 1 may be defined between the first location and the second location.
- a new path for rejoining the original path to move in the selected direction may be generated for the UAV when the UAV is at the second location 2414 - 1 .
- the UAV may move towards a third location 2416 that is located along the original path.
- the new path may be defined by a vector v 3 - 1 .
- the vector v 3 - 1 may be determined based on a vector v 1 - 1 between the first location and the third location, and the vector v 2 - 1 .
- the new path may be configured such that the UAV flies in a smooth curvilinear manner to rejoin the original path and to proceed in the selected direction. In some instances, the UAV need not rejoin the original path and may proceed on an entirely new path in a direction 2418 .
- the direction 2418 may or may not be parallel to the direction 2412 .
- the new path may be generated using a triangulation method based on one or more images obtained at the second location, and one or more images obtained at one or more previously known locations (e.g., the first location).
- the UAV may be controlled to fly back to its initial location (e.g., the first location, or any other location that the UAV has passed) and a new target direction may be generated from the initial location.
- the new path may be generated from the second location in the target direction without using the first location, for example using the method previously described in FIG. 6 .
- a shortest (or most direct) path for avoiding the obstacle may be determined.
- the UAV may fly to a fourth location 2414 - 2 located to the left of the obstacle, as shown in part B of FIG. 24 .
- a distance from the first location to the fourth location may be less than a distance from the first location to the third location.
- the distance from the first location to the fourth location may be indicative of the shortest distance that the UAV travels to circumvent the obstacle.
- a vector v 2 - 2 may be defined between the first location and the fourth location.
- a new path for moving towards and/or tracking the target may be generated for the UAV when the UAV is at the fourth location.
- the new path may be defined by a vector v 3 - 2 .
- the vector v 3 - 2 may be determined based on the vector v 1 - 1 and the vector v 2 - 2 .
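One reading of the vector relationship above is a simple vector difference: with v1 running from the first location to the rejoin point on the original path and v2 from the first location to the UAV's position after the detour, the new path vector v3 runs from the UAV's current position to the rejoin point. This interpretation is an assumption for illustration:

```python
def rejoin_vector(v1, v2):
    """Determine the new path vector v3 from v1 (first location -> rejoin
    point on the original path) and v2 (first location -> current position
    after the detour).  v3 = v1 - v2 points from the current position to
    the rejoin point."""
    return tuple(a - b for a, b in zip(v1, v2))
```

For example, with a rejoin point 10 m ahead along the original path and a detour that put the UAV 4 m forward and 3 m to the side, v3 points the remaining 6 m forward and 3 m back toward the original path.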
- Parts C and D of FIG. 24 show an additional example of an obstacle avoidance path.
- Part C shows a UAV 2402 having a target direction 2404 .
- An obstacle 2406 may intersect a path of the UAV in the target direction.
- Part D shows alternative paths 2404 a , 2404 b that may be taken by the UAV 2402 to avoid the obstacle 2406 .
- the UAV may continue traveling in the target direction 2404 .
- the curvature and/or the alternative path may be selected with aid of a flight controller.
- the alternative path may include a relatively tight curve as shown in part A, or a looser curve as shown in part D.
- the alternative path may be selected in accordance with any factors, such as those previously listed. Obstacle avoidance algorithms and techniques may be used to calculate the alternative path as described in greater detail elsewhere herein.
- a UAV may track a status of a power supply of the UAV. If the power supply falls beneath a threshold level, an instruction may be provided for a UAV to automatically return to a home reference position.
- the threshold level may be a predetermined level. In some instances, the threshold level may be determined based on a distance of the UAV from the home reference position. For instance, if the UAV is further from the home reference position, the threshold may be higher. The threshold may be sufficient to ensure that there is sufficient power for the UAV to return to the home reference position.
- the home reference position may be a location of a start of the UAV flight. In another example, the home reference position may be a location of a user terminal. The home reference position may be static, or may be moving. The UAV may travel in a target direction until a condition is detected, such as a low power supply.
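A distance-dependent threshold of this kind can be sketched as follows; the energy-per-meter figure and the safety margin are illustrative assumptions, not values from the specification:

```python
def return_home_threshold(distance_m, capacity_wh, wh_per_m=0.005, margin=1.5):
    """Estimate the energy needed to fly distance_m back to the home
    reference position, apply a safety margin, and express the result as
    a fraction of battery capacity.  wh_per_m and margin are assumed
    illustrative constants."""
    needed_wh = distance_m * wh_per_m * margin
    return min(needed_wh / capacity_wh, 1.0)  # fraction of a full charge

def should_return_home(level_fraction, distance_m, capacity_wh):
    """Trigger the automatic return when the tracked power level falls
    beneath the threshold for the current distance from home."""
    return level_fraction < return_home_threshold(distance_m, capacity_wh)
```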
- a controller (e.g., an image analyzer and/or a flight controller) may be provided.
- the controller can extract one or more features from one or more images that are captured by an imaging device carried by a movable object, and can apply the feature model on said one or more features to determine similarity.
- the controller can apply a feature model for a target on an image that is captured for tracking a target to generate a tracking output.
- the controller can also determine that a tracking failure occurs based on the tracking output, and can check one or more subsequent images to detect the target.
- the controller can obtain a relative distance between the movable object and the target, and can generate one or more control signals to track the target.
- FIG. 25 illustrates an exemplary target tracking system in a movable object environment, in accordance with some embodiments.
- a movable object 2500 includes a controller 2510 , which can receive various types of information, such as imagery information, from a camera 2502 , which is carried by a carrier 2501 , and other sensors 2503 on board.
- the controller can perform an initialization operation 2511 based on the imagery information received from the camera.
- the controller can use a specialized or general detector 2513 for detecting the target (i.e. the object to be tracked) in an initialization image.
- the controller can acquire the target and set up corresponding tracking strategies.
- the movable object can use a tracker 2512 for tracking the target.
- the system can use the detector for redetecting the lost target to continue the tracking process.
- the controller can perform further operations, such as position estimation 2514 and navigation operations 2515 , based on the information received from the sensors 2503 .
- FIG. 26 illustrates supporting target tracking in a movable object environment, in accordance with various embodiments.
- a movable object may start tracking a target.
- the movable object can capture one or more images, such as videos, in real time.
- the movable object can take advantage of the camera carried by a gimbal or other image sensors on board of the movable object (such as a UAV).
- the movable object can perform the initialization operation for acquiring a target.
- the movable object can acquire a target from an initialization image and obtain a feature model for the target.
- the movable object may continually perform the initialization operation until the target has been successfully acquired.
- the movable object can perform the tracking process.
- the movable object can employ a vision-based tracker for tracking the target.
- the system can check whether a tracking failure occurs.
- the system can perform the target detection operation.
- the system can check whether the target is detected.
- the system can repeatedly perform the target detection operation until the target is redetected.
- the system can estimate the target position for continuously tracking the target as long as the tracking is successful (i.e. including the cases when the target is redetected).
- the system can perform the movement control operation, such as flight control for a UAV, which allows the moveable object to capture the images for continuously tracking.
- FIG. 27 illustrates initializing target tracking in a movable object environment, in accordance with various embodiments.
- a movable object 2703 can capture an image 2701 , which can be transmitted to an application 2702 (e.g. on a ground terminal).
- the application 2702 can use an image view 2704 for displaying the image 2701 , which includes one or more features, e.g. the objects A-C 2711 - 2713 .
- the movable object 2703 can use different types of object detector for detecting the target (i.e. the object that is desired by the user), e.g. after receiving a user input such as a point on a target and/or a target class to be tracked (e.g. a human being).
- the movable object 2703 can use an object proposal approach at the initialization stage. As shown in FIG. 27 , the system allows a user to select a point 2705 on an interested object, e.g. the object B 2712 in the image 2701 . Once receiving the selected point 2705 from the user, the system can use a bounding box 2706 for defining and proposing the object 2712 , which may have irregular shapes.
- the system may propose multiple object candidates, e.g. using different bounding boxes.
- the user is allowed to make a decision on which object candidate (i.e. bounding box) is desired.
- the system can generate a feature model 2710 based on the selected object proposal.
- the feature model 2710 can represent the imagery characteristics of the patch of image points within the bounding box 2706 .
- the movable object 2703 can start tracking the target based on the feature model 2710 for the target.
- the feature model 2710 can be constructed based on examining common objects such as human body, cars, and human faces.
- the feature model 2710 can include various discrete objects that are trained offline.
- the feature model 2710 can be constructed based on analyzing characteristics of the objects, such as the edge/contour and color information.
- the feature model 2710 can be generated using different methods, such as optical flow and/or correlation filter algorithms.
- the feature model 2710 may be represented in the spatial domain and/or the frequency domain.
- the data to be transmitted from a ground station (i.e. the application 2702 ) to the movable object 2703 is limited, since only the position of the selected point 2705 may be needed for initializing the tracking process.
- the transmission delay in the initialization step can be minimized.
- the movable object can continuously transmit image or video data to the application 2702 for user interaction, since the data link from the movable object 2703 (e.g. a UAV) to the ground station (i.e. the application 2702 ) often has a wider bandwidth and a higher speed than the data link from the ground station to the movable object.
- FIG. 28 illustrates tracking a target in a movable object environment, in accordance with various embodiments.
- a movable object 2810 can include a carrier 2801 , which carries an imaging device 2802 such as a camera.
- the imaging device 2802 can capture an image 2803 for the target 2806 .
- the movable object 2810 can include a controller 2805 , which can maintain a feature model 2804 for tracking the target 2806 and generates control signals for controlling the movable object 2810 .
- the system can track a target by following a feature that represents the target 2806 .
- the system can determine the similarity between the various features, such as features A-C 2811 - 2813 in the image 2803 , and the feature model 2804 .
- the similarity may be calculated as a result value (or score) of a function for each feature in the image 2803 . Based on the calculated score, the system can determine which feature represents the target 2806 . Alternatively, the system can directly compare each feature in the image 2803 with the feature model to determine whether the feature represents the target 2806 .
- the system can determine whether the similarity between the feature and the feature model is maintained during the tracking process, e.g. by checking whether the result value (or score) of the function remains above a previously determined threshold.
- the system may consider the target lost when the value falls below the previously determined threshold.
- the system can examine every subsequent image and look for the target. The examination can be based on the original feature model or the last updated feature model, and may be performed by traversing different scales and locations in every subsequent image.
- the system can maintain the tracking accuracy, which is beneficial for long term target tracking since small errors may accumulate and make the whole tracking system unstable. Also, the system can perform failure-detection and target re-detection, which also benefits long term target tracking both in terms of robustness and practicability. For example, the system can maintain the tracking of a target, once the target re-appears after being occluded for a while.
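The failure check described above can be sketched as scoring each candidate feature against the feature model and declaring the target lost when even the best score falls below the threshold. The function names and the threshold value here are illustrative:

```python
def best_match(features, feature_model, similarity, threshold=0.5):
    """Score each candidate feature in the image against the feature
    model, pick the best, and report a tracking failure (None) when even
    the best score falls below a previously determined threshold.
    similarity is any scoring function returning values in [0, 1]."""
    scored = [(similarity(f, feature_model), f) for f in features]
    best_score, best_feature = max(scored, key=lambda s: s[0])
    if best_score < threshold:
        return None, best_score  # target considered lost; trigger redetection
    return best_feature, best_score
```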
- FIG. 29 illustrates supporting target tracking and redetecting in a movable object environment, in accordance with various embodiments.
- the system can use a vision based tracker for performing target tracking based on a captured image 2901 .
- the vision based tracker can take advantage of different tracking algorithms, such as the optical flow algorithms and/or the correlation filter algorithms.
- the system can automatically track and detect a target over a long period of time.
- the tracking can be performed by taking advantage of a search window that is a local search range, within which the tracking algorithm can find an optimal position/scale of the target, i.e. the tracking can be performed locally instead of globally for the whole image.
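- Restricting the search to a local range can be sketched as cropping a window around the target's last known position (a simple illustration; the window sizing policy is not specified in the text):

```python
import numpy as np

def search_window(image, last_pos, win_size):
    """Crop a local search range centered on the target's last known
    (y, x) position, clamped to the image bounds, so the tracker
    searches locally instead of globally over the whole image.
    Returns the cropped window and its top-left offset."""
    h, w = image.shape[:2]
    cy, cx = last_pos
    wh, ww = win_size
    y0 = min(max(cy - wh // 2, 0), max(h - wh, 0))
    x0 = min(max(cx - ww // 2, 0), max(w - ww, 0))
    return image[y0:y0 + wh, x0:x0 + ww], (y0, x0)
```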
- a vision based tracker can generate a tracking output 2903 by applying a feature model 2902 on the image 2901 , which includes various features such as features A-C 2911 - 2913 , while tracking a target.
- the tracker can follow the feature B with a bounding box 2906 in the searching window 2910 .
- the system can perform failure detection 2904 and target redetection 2905 operations.
- the tracking output 2903 can be a feature response map, which is generated for the searching window 2910 in the image 2901 .
- Each point in the feature response map can represent a correlation, or similarity, between one or more features extracted from a patch of image points (i.e. a sliding window) in the searching window 2910 and a feature model 2902 for the target.
- the system can move a sliding window around in the searching window 2910 to obtain the whole feature response map in the spatial domain.
- the system can obtain the feature response map in the frequency domain, e.g. using correlation filter method, without a need for actually moving the sliding window all over the searching window 2910 in the image 2901 .
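- The frequency-domain shortcut can be illustrated with plain FFT cross-correlation: multiplying the transform of the search window by the conjugate transform of a target template yields the whole response map at once, without actually moving a sliding window. This sketch omits the windowing and regularization a practical correlation filter tracker would add.

```python
import numpy as np

def response_map(window, template):
    """Feature response map computed in the frequency domain: circular
    cross-correlation of the search window with a target template,
    equivalent to evaluating the template at every position."""
    W = np.fft.fft2(window)
    T = np.fft.fft2(template, s=window.shape)
    return np.real(np.fft.ifft2(W * np.conj(T)))

def peak(resp):
    """Location of the strongest response, i.e. the best match."""
    return np.unravel_index(np.argmax(resp), resp.shape)
```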
- the system can incorporate the target detection with the correlation framework, by taking advantage of both a correlation filter based tracking algorithm and a fast object proposal algorithm.
- the correlation-based object tracking algorithm is fast and effective, which is beneficial for a movable object such as an unmanned aerial vehicle (UAV) since the movable object often has limited computing capability and power resource.
- the system can redetect the target once the target is lost.
- the system can calculate the position of the tracked object on the fly using a single camera, e.g. estimating the object size using the correlation filter based tracking algorithm, with continuity and stability.
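- One way such a single-camera estimate can work is the pinhole relation between apparent size and distance: the tracker's scale estimate supplies the bounding-box size in pixels, and the physical target size must be known or assumed. A minimal sketch:

```python
def distance_from_scale(focal_px, target_height_m, bbox_height_px):
    """Pinhole-camera estimate of the relative distance to the target
    from its apparent size. `bbox_height_px` is the bounding-box scale
    reported by the correlation-filter tracker; `target_height_m` is a
    known or assumed physical target size."""
    return focal_px * target_height_m / bbox_height_px
```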
- FIG. 30 illustrates using positioning devices for aiding target tracking in a movable object environment, in accordance with various embodiments.
- a movable object 3010 in a movable object environment 3000 can include a carrier 3001 , which carries an imaging device such as a camera 3002 .
- the movable object 3010 can include a positioning device, such as a GPS device 3005 .
- the target 3006 may carry a positioning device, such as a GPS device 3015 .
- the target 3006 can be a person who carries a mobile device with GPS functionality, such as a watch, a band, a hat, and/or a pair of shoes.
- the movable object 3010 (e.g. the controller 3003 ) can obtain the relative distance and orientation of the target 3006 , in order to maintain the target within a proximity 3008 (e.g. a predefined circular range).
- the system may rely on the positioning devices for maintaining the target 3006 within the proximity 3008 when it determines that the target is lost.
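- Deriving the relative distance and offset from the two GPS fixes can be sketched with a flat-earth approximation, which is adequate over typical tracking ranges (the proximity radius below is illustrative):

```python
import math

def gps_offset(uav, target):
    """Relative east/north offset (meters) and straight-line distance
    from the UAV to the target, given two GPS fixes as (lat, lon) in
    degrees, using a local flat-earth approximation."""
    R = 6371000.0  # mean Earth radius, meters
    lat0 = math.radians(uav[0])
    dlat = math.radians(target[0] - uav[0])
    dlon = math.radians(target[1] - uav[1])
    north = dlat * R
    east = dlon * R * math.cos(lat0)
    return east, north, math.hypot(east, north)

def within_proximity(uav, target, radius_m):
    """True if the target lies inside the predefined circular range."""
    return gps_offset(uav, target)[2] <= radius_m
```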
- the system (e.g. a controller 3003 ) can use the vision based tracking system 1204 for indoor scenes when the GPS signal is not available.
- the system can take advantage of both the vision based tracking technologies and the positioning devices for performing a long term tracking task.
- FIG. 31 illustrates tracking a target based on distance measuring in a movable object environment, in accordance with various embodiments.
- a movable object 3110 in a movable object environment 3100 can include a carrier 3101 , which carries an imaging device such as a camera 3102 .
- the movable object 3110 can acquire a target 3106 from an image 3103 , which is captured by the camera 3102 .
- the image 3103 may include multiple features, such as objects A-C 3111 - 3113 , and a user can select the object 3111 as the target to follow (or track).
- the system can obtain the (3D) position of the target from the (2D) tracking information.
- the position of the target 3106 , which is important for tracking the target, can be determined based on the direction toward the target 3106 and the relative distance 3115 between the movable object 3110 (e.g. a UAV) and the target 3106 .
- the direction of the object can be obtained by calculating the direction vector from a calibrated camera.
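- Obtaining the direction vector from a calibrated camera, and combining it with the relative distance to recover the 3D position, can be sketched as a pinhole back-projection. This assumes a known intrinsic matrix K; the computation is standard, not specific to this disclosure.

```python
import numpy as np

def pixel_to_direction(u, v, K):
    """Unit direction vector toward the target in the camera frame,
    obtained by back-projecting the target's pixel coordinates through
    the calibrated intrinsic matrix K (pinhole model)."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    return ray / np.linalg.norm(ray)

def target_position(u, v, K, distance):
    """3D target position in the camera frame: the direction toward the
    target scaled by the estimated relative distance."""
    return pixel_to_direction(u, v, K) * distance
```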
- the system can estimate the relative distance between the tracking device and the target based on state information associated with the imaging device and the movable object.
- the state information includes altitude information of the imaging device carried by the movable object.
- the altitude information of the imaging device can be received from a control module for the movable object.
- the state information can include attitude information of the imaging device that is carried by a movable object.
- the attitude information of the imaging device can be received from a payload stabilization control module, wherein the payload stabilization control module controls a stabilization system, which stabilizes the imaging device on the movable object.
- the controller 3105 can obtain a relative distance 3115 between the movable object 3110 and the target 3106 from the image 3103 . Also, the controller 3105 can generate one or more flight control signals 3104 to direct the movable object 3110 to track the target 3106 .
- the control signals 3104 can include acceleration/deceleration signals and gimbal attitude adjustment signals. For example, when the movable object 3110 is tracking the target 3106 , the controller 3105 can adjust the movable object or the gimbal to rotate about the yaw direction based on the distance between the target and the center point of the image.
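- The yaw adjustment based on the target's offset from the image center can be sketched as a proportional law; the gain and rate limit below are illustrative values, not taken from the disclosure:

```python
def yaw_rate_command(target_cx, image_width, gain=0.002, max_rate=1.0):
    """Proportional yaw adjustment: the further the target sits from
    the horizontal center of the image, the faster the movable object
    (or its gimbal) is commanded to rotate about the yaw axis to
    re-center it. Output is clamped to a maximum rate."""
    error_px = target_cx - image_width / 2.0
    rate = gain * error_px
    return max(-max_rate, min(max_rate, rate))
```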
- the controller 3105 can maintain a desired tracking distance (which can be a constant distance or dynamically configured distance) from the target 3106 .
- the system can calculate the speed, v_target, of the target relative to the movable object 3110 , based on the relative distances of the target from the movable object 3110 at different time points. Then, the system can determine the necessary movement change of the movable object 3110 based on the speed of the movable object 3110 , v_uav, and the current relative distance 3115 .
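- The relative-speed estimate and the resulting movement change can be sketched as follows; the proportional gain that closes the distance error is an illustrative choice, not a disclosed value:

```python
def target_speed(d_prev, d_curr, dt):
    """Speed of the target relative to the movable object, computed
    from the relative distances measured at two time points."""
    return (d_curr - d_prev) / dt

def speed_command(v_uav, d_curr, d_desired, d_prev, dt, gain=0.5):
    """Movement change for the movable object: match the target's
    relative speed and close the gap between the current and desired
    tracking distance with a simple proportional term."""
    v_rel = target_speed(d_prev, d_curr, dt)
    return v_uav + v_rel + gain * (d_curr - d_desired)
```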
- a movable object of the present invention can be configured to move within any suitable environment, such as in air (e.g., a fixed-wing aircraft, a rotary-wing aircraft, or an aircraft having neither fixed wings nor rotary wings), in water (e.g., a ship or a submarine), on ground (e.g., a motor vehicle, such as a car, truck, bus, van, motorcycle; a movable structure or frame such as a stick, fishing pole; or a train), under the ground (e.g., a subway), in space (e.g., a spaceplane, a satellite, or a probe), or any combination of these environments.
- the movable object can be a vehicle, such as a vehicle described elsewhere herein.
- the movable object can be mounted on a living subject, such as a human or an animal.
- Suitable animals can include avines, canines, felines, equines, bovines, ovines, porcines, delphines, rodents, or insects.
- the movable object may be capable of moving freely within the environment with respect to six degrees of freedom (e.g., three degrees of freedom in translation and three degrees of freedom in rotation). Alternatively, the movement of the movable object can be constrained with respect to one or more degrees of freedom, such as by a predetermined path, track, or orientation.
- the movement can be actuated by any suitable actuation mechanism, such as an engine or a motor.
- the actuation mechanism of the movable object can be powered by any suitable energy source, such as electrical energy, magnetic energy, solar energy, wind energy, gravitational energy, chemical energy, nuclear energy, or any suitable combination thereof.
- the movable object may be self-propelled via a propulsion system, as described elsewhere herein.
- the propulsion system may optionally run on an energy source, such as electrical energy, magnetic energy, solar energy, wind energy, gravitational energy, chemical energy, nuclear energy, or any suitable combination thereof.
- the movable object may be carried by a living being.
- the movable object can be a vehicle.
- Suitable vehicles may include water vehicles, aerial vehicles, space vehicles, or ground vehicles.
- aerial vehicles may be fixed-wing aircraft (e.g., airplane, gliders), rotary-wing aircraft (e.g., helicopters, rotorcraft), aircraft having both fixed wings and rotary wings, or aircraft having neither (e.g., blimps, hot air balloons).
- a vehicle can be self-propelled, such as self-propelled through the air, on or in water, in space, or on or under the ground.
- a self-propelled vehicle can utilize a propulsion system, such as a propulsion system including one or more engines, motors, wheels, axles, magnets, rotors, propellers, blades, nozzles, or any suitable combination thereof.
- the propulsion system can be used to enable the movable object to take off from a surface, land on a surface, maintain its current position and/or orientation (e.g., hover), change orientation, and/or change position.
- the movable object can be controlled remotely by a user or controlled locally by an occupant within or on the movable object.
- the movable object is an unmanned movable object, such as a UAV.
- An unmanned movable object, such as a UAV, may not have an occupant onboard the movable object.
- the movable object can be controlled by a human or an autonomous control system (e.g., a computer control system), or any suitable combination thereof.
- the movable object can be an autonomous or semi-autonomous robot, such as a robot configured with an artificial intelligence.
- the movable object can have any suitable size and/or dimensions.
- the movable object may be of a size and/or dimensions to have a human occupant within or on the vehicle.
- the movable object may be of a size and/or dimensions smaller than those capable of having a human occupant within or on the vehicle.
- the movable object may be of a size and/or dimensions suitable for being lifted or carried by a human.
- the movable object may be larger than a size and/or dimensions suitable for being lifted or carried by a human.
- the movable object may have a maximum dimension (e.g., length, width, height, diameter, diagonal) of less than or equal to about: 2 cm, 5 cm, 10 cm, 50 cm, 1 m, 2 m, 5 m, or 10 m.
- the maximum dimension may be greater than or equal to about: 2 cm, 5 cm, 10 cm, 50 cm, 1 m, 2 m, 5 m, or 10 m.
- the distance between shafts of opposite rotors of the movable object may be less than or equal to about: 2 cm, 5 cm, 10 cm, 50 cm, 1 m, 2 m, 5 m, or 10 m.
- the distance between shafts of opposite rotors may be greater than or equal to about: 2 cm, 5 cm, 10 cm, 50 cm, 1 m, 2 m, 5 m, or 10 m.
- the movable object may have a volume of less than 100 cm×100 cm×100 cm, less than 50 cm×50 cm×30 cm, or less than 5 cm×5 cm×3 cm.
- the total volume of the movable object may be less than or equal to about: 1 cm³, 2 cm³, 5 cm³, 10 cm³, 20 cm³, 30 cm³, 40 cm³, 50 cm³, 60 cm³, 70 cm³, 80 cm³, 90 cm³, 100 cm³, 150 cm³, 200 cm³, 300 cm³, 500 cm³, 750 cm³, 1000 cm³, 5000 cm³, 10,000 cm³, 100,000 cm³, 1 m³, or 10 m³.
- the total volume of the movable object may be greater than or equal to about: 1 cm³, 2 cm³, 5 cm³, 10 cm³, 20 cm³, 30 cm³, 40 cm³, 50 cm³, 60 cm³, 70 cm³, 80 cm³, 90 cm³, 100 cm³, 150 cm³, 200 cm³, 300 cm³, 500 cm³, 750 cm³, 1000 cm³, 5000 cm³, 10,000 cm³, 100,000 cm³, 1 m³, or 10 m³.
- the movable object may have a footprint (which may refer to the lateral cross-sectional area encompassed by the movable object) less than or equal to about: 32,000 cm², 20,000 cm², 10,000 cm², 1,000 cm², 500 cm², 100 cm², 50 cm², 10 cm², or 5 cm².
- the footprint may be greater than or equal to about: 32,000 cm², 20,000 cm², 10,000 cm², 1,000 cm², 500 cm², 100 cm², 50 cm², 10 cm², or 5 cm².
- the movable object may weigh no more than 1000 kg.
- the weight of the movable object may be less than or equal to about: 1000 kg, 750 kg, 500 kg, 200 kg, 150 kg, 100 kg, 80 kg, 70 kg, 60 kg, 50 kg, 45 kg, 40 kg, 35 kg, 30 kg, 25 kg, 20 kg, 15 kg, 12 kg, 10 kg, 9 kg, 8 kg, 7 kg, 6 kg, 5 kg, 4 kg, 3 kg, 2 kg, 1 kg, 0.5 kg, 0.1 kg, 0.05 kg, or 0.01 kg.
- the weight may be greater than or equal to about: 1000 kg, 750 kg, 500 kg, 200 kg, 150 kg, 100 kg, 80 kg, 70 kg, 60 kg, 50 kg, 45 kg, 40 kg, 35 kg, 30 kg, 25 kg, 20 kg, 15 kg, 12 kg, 10 kg, 9 kg, 8 kg, 7 kg, 6 kg, 5 kg, 4 kg, 3 kg, 2 kg, 1 kg, 0.5 kg, 0.1 kg, 0.05 kg, or 0.01 kg.
- a movable object may be small relative to a load carried by the movable object.
- the load may include a payload and/or a carrier, as described in further detail below.
- a ratio of a movable object weight to a load weight may be greater than, less than, or equal to about 1:1.
- a ratio of a carrier weight to a load weight may be greater than, less than, or equal to about 1:1.
- the ratio of a movable object weight to a load weight may be less than or equal to: 1:2, 1:3, 1:4, 1:5, 1:10, or even less.
- the ratio of a movable object weight to a load weight can also be greater than or equal to: 2:1, 3:1, 4:1, 5:1, 10:1, or even greater.
- the movable object may have low energy consumption.
- the movable object may use less than about: 5 W/h, 4 W/h, 3 W/h, 2 W/h, 1 W/h, or less.
- a carrier of the movable object may have low energy consumption.
- the carrier may use less than about: 5 W/h, 4 W/h, 3 W/h, 2 W/h, 1 W/h, or less.
- a payload of the movable object may have low energy consumption, such as less than about: 5 W/h, 4 W/h, 3 W/h, 2 W/h, 1 W/h, or less.
- a UAV can include a propulsion system having a plurality of rotors. Any number of rotors may be provided (e.g., one, two, three, four, five, six, or more).
- the rotors, rotor assemblies, or other propulsion systems of the unmanned aerial vehicle may enable the unmanned aerial vehicle to hover/maintain position, change orientation, and/or change location.
- the distance between shafts of opposite rotors can be any suitable length.
- the length can be less than or equal to 2 m, or less than or equal to 5 m.
- the length can be within a range from 40 cm to 1 m, from 10 cm to 2 m, or from 5 cm to 5 m. Any description herein of a UAV may apply to a movable object, such as a movable object of a different type, and vice versa.
- the movable object can be configured to carry a load.
- the load can include one or more of passengers, cargo, equipment, instruments, and the like.
- the load can be provided within a housing.
- the housing may be separate from a housing of the movable object, or be part of a housing for a movable object.
- the load can be provided with a housing while the movable object does not have a housing.
- portions of the load or the entire load can be provided without a housing.
- the load can be rigidly fixed relative to the movable object.
- the load can be movable relative to the movable object (e.g., translatable or rotatable relative to the movable object).
- the load includes a payload.
- the payload can be configured not to perform any operation or function.
- the payload can be a payload configured to perform an operation or function, also known as a functional payload.
- the payload can include one or more sensors for surveying one or more targets. Any suitable sensor can be incorporated into the payload, such as an image capture device (e.g., a camera), an audio capture device (e.g., a parabolic microphone), an infrared imaging device, or an ultraviolet imaging device.
- the sensor can provide static sensing data (e.g., a photograph) or dynamic sensing data (e.g., a video). In some embodiments, the sensor provides sensing data for the target of the payload.
- the payload can include one or more emitters for providing signals to one or more targets. Any suitable emitter can be used, such as an illumination source or a sound source.
- the payload includes one or more transceivers, such as for communication with a module remote from the movable object.
- the payload can be configured to interact with the environment or a target.
- the payload can include a tool, instrument, or mechanism capable of manipulating objects, such as a robotic arm.
- the load may include a carrier.
- the carrier can be provided for the payload and the payload can be coupled to the movable object via the carrier, either directly (e.g., directly contacting the movable object) or indirectly (e.g., not contacting the movable object).
- the payload can be mounted on the movable object without requiring a carrier.
- the payload can be integrally formed with the carrier.
- the payload can be releasably coupled to the carrier.
- the payload can include one or more payload elements, and one or more of the payload elements can be movable relative to the movable object and/or the carrier, as described above.
- the carrier can be integrally formed with the movable object. Alternatively, the carrier can be releasably coupled to the movable object. The carrier can be coupled to the movable object directly or indirectly. The carrier can provide support to the payload (e.g., carry at least part of the weight of the payload).
- the carrier can include a suitable mounting structure (e.g., a gimbal platform) capable of stabilizing and/or directing the movement of the payload. In some embodiments, the carrier can be adapted to control the state of the payload (e.g., position and/or orientation) relative to the movable object.
- the carrier can be configured to move relative to the movable object (e.g., with respect to one, two, or three degrees of translation and/or one, two, or three degrees of rotation) such that the payload maintains its position and/or orientation relative to a suitable reference frame regardless of the movement of the movable object.
- the reference frame can be a fixed reference frame (e.g., the surrounding environment).
- the reference frame can be a moving reference frame (e.g., the movable object, a payload target).
- the carrier can be configured to permit movement of the payload relative to the carrier and/or movable object.
- the movement can be a translation with respect to up to three degrees of freedom (e.g., along one, two, or three axes) or a rotation with respect to up to three degrees of freedom (e.g., about one, two, or three axes), or any suitable combination thereof.
- the carrier can include a carrier frame assembly and a carrier actuation assembly.
- the carrier frame assembly can provide structural support to the payload.
- the carrier frame assembly can include individual carrier frame components, some of which can be movable relative to one another.
- the carrier actuation assembly can include one or more actuators (e.g., motors) that actuate movement of the individual carrier frame components.
- the actuators can permit the movement of multiple carrier frame components simultaneously, or may be configured to permit the movement of a single carrier frame component at a time. The movement of the carrier frame components can produce a corresponding movement of the payload.
- the carrier actuation assembly can actuate a rotation of one or more carrier frame components about one or more axes of rotation (e.g., roll axis, pitch axis, or yaw axis).
- the rotation of the one or more carrier frame components can cause a payload to rotate about one or more axes of rotation relative to the movable object.
- the carrier actuation assembly can actuate a translation of one or more carrier frame components along one or more axes of translation, and thereby produce a translation of the payload along one or more corresponding axes relative to the movable object.
- the movement of the movable object, carrier, and payload relative to a fixed reference frame (e.g., the surrounding environment) and/or to each other, can be controlled by a terminal.
- the terminal can be a remote control device at a location distant from the movable object, carrier, and/or payload.
- the terminal can be disposed on or affixed to a support platform.
- the terminal can be a handheld or wearable device.
- the terminal can include a smartphone, tablet, laptop, computer, glasses, gloves, helmet, microphone, or suitable combinations thereof.
- the terminal can include a user interface, such as a keyboard, mouse, joystick, touchscreen, or display. Any suitable user input can be used to interact with the terminal, such as manually entered commands, voice control, gesture control, or position control (e.g., via a movement, location or tilt of the terminal).
- the terminal can be used to control any suitable state of the movable object, carrier, and/or payload.
- the terminal can be used to control the position and/or orientation of the movable object, carrier, and/or payload relative to a fixed reference frame and/or to each other.
- the terminal can be used to control individual elements of the movable object, carrier, and/or payload, such as the actuation assembly of the carrier, a sensor of the payload, or an emitter of the payload.
- the terminal can include a wireless communication device adapted to communicate with one or more of the movable object, carrier, or payload.
- the terminal can include a suitable display unit for viewing information of the movable object, carrier, and/or payload.
- the terminal can be configured to display information of the movable object, carrier, and/or payload with respect to position, translational velocity, translational acceleration, orientation, angular velocity, angular acceleration, or any suitable combinations thereof.
- the terminal can display information provided by the payload, such as data provided by a functional payload (e.g., images recorded by a camera or other image capturing device).
- the same terminal may both control the movable object, carrier, and/or payload, or a state of the movable object, carrier and/or payload, as well as receive and/or display information from the movable object, carrier and/or payload.
- a terminal may control the positioning of the payload relative to an environment, while displaying image data captured by the payload, or information about the position of the payload.
- different terminals may be used for different functions. For example, a first terminal may control movement or a state of the movable object, carrier, and/or payload while a second terminal may receive and/or display information from the movable object, carrier, and/or payload.
- a first terminal may be used to control the positioning of the payload relative to an environment while a second terminal displays image data captured by the payload.
- Various communication modes may be utilized between a movable object and an integrated terminal that both controls the movable object and receives data, or between the movable object and multiple terminals that both control the movable object and receive data.
- at least two different communication modes may be formed between the movable object and the terminal that both controls the movable object and receives data from the movable object.
- FIG. 32 illustrates a movable object 3200 including a carrier 3202 and a payload 3204 , in accordance with embodiments.
- the movable object 3200 is depicted as an aircraft, this depiction is not intended to be limiting, and any suitable type of movable object can be used, as previously described herein.
- the payload 3204 may be provided on the movable object 3200 without requiring the carrier 3202 .
- the movable object 3200 may include propulsion mechanisms 3206 , a sensing system 3208 , and a communication system 3210 .
- the propulsion mechanisms 3206 can include one or more of rotors, propellers, blades, engines, motors, wheels, axles, magnets, or nozzles, as previously described.
- the propulsion mechanisms 3206 may be self-tightening rotors, rotor assemblies, or other rotary propulsion units, as disclosed elsewhere herein.
- the movable object may have one or more, two or more, three or more, or four or more propulsion mechanisms.
- the propulsion mechanisms may all be of the same type. Alternatively, one or more propulsion mechanisms can be different types of propulsion mechanisms.
- the propulsion mechanisms 3206 can be mounted on the movable object 3200 using any suitable means, such as a support element (e.g., a drive shaft) as described elsewhere herein.
- the propulsion mechanisms 3206 can be mounted on any suitable portion of the movable object 3200 , such as on the top, bottom, front, back, sides, or suitable combinations thereof.
- the propulsion mechanisms 3206 can enable the movable object 3200 to take off vertically from a surface or land vertically on a surface without requiring any horizontal movement of the movable object 3200 (e.g., without traveling down a runway).
- the propulsion mechanisms 3206 can be operable to permit the movable object 3200 to hover in the air at a specified position and/or orientation.
- One or more of the propulsion mechanisms 3206 may be controlled independently of the other propulsion mechanisms.
- the propulsion mechanisms 3206 can be configured to be controlled simultaneously.
- the movable object 3200 can have multiple horizontally oriented rotors that can provide lift and/or thrust to the movable object.
- the multiple horizontally oriented rotors can be actuated to provide vertical takeoff, vertical landing, and hovering capabilities to the movable object 3200 .
- one or more of the horizontally oriented rotors may spin in a clockwise direction, while one or more of the horizontally oriented rotors may spin in a counterclockwise direction.
- the number of clockwise rotors may be equal to the number of counterclockwise rotors.
- the rotation rate of each of the horizontally oriented rotors can be varied independently in order to control the lift and/or thrust produced by each rotor, and thereby adjust the spatial disposition, velocity, and/or acceleration of the movable object 3200 (e.g., with respect to up to three degrees of translation and up to three degrees of rotation).
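- How independently varied rotor rates map to lift, roll, pitch, and yaw can be illustrated with a conventional X-configuration four-rotor mixer; the sign conventions below are illustrative, not taken from the disclosure.

```python
def quad_mixer(thrust, roll, pitch, yaw):
    """Illustrative X-configuration quadrotor mixer: a commanded
    collective thrust and roll/pitch/yaw moments are mapped to four
    independent rotor commands. Rotors 0/2 spin clockwise and rotors
    1/3 counterclockwise (equal numbers of each), so varying each
    rotor independently adjusts the vehicle about every axis."""
    return [
        thrust + roll + pitch + yaw,   # front-left  (CW)
        thrust - roll + pitch - yaw,   # front-right (CCW)
        thrust - roll - pitch + yaw,   # rear-right  (CW)
        thrust + roll - pitch - yaw,   # rear-left   (CCW)
    ]
```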
- the sensing system 3208 can include one or more sensors that may sense the spatial disposition, velocity, and/or acceleration of the movable object 3200 (e.g., with respect to up to three degrees of translation and up to three degrees of rotation).
- the one or more sensors can include global positioning system (GPS) sensors, motion sensors, inertial sensors, proximity sensors, or image sensors.
- the sensing data provided by the sensing system 3208 can be used to control the spatial disposition, velocity, and/or orientation of the movable object 3200 (e.g., using a suitable processing unit and/or control module, as described below).
- the sensing system 3208 can be used to provide data regarding the environment surrounding the movable object, such as weather conditions, proximity to potential obstacles, location of geographical features, location of manmade structures, and the like.
- the communication system 3210 enables communication with terminal 3212 having a communication system 3214 via wireless signals 3216 .
- the communication systems 3210 , 3214 may include any number of transmitters, receivers, and/or transceivers suitable for wireless communication.
- the communication may be one-way communication, such that data can be transmitted in only one direction.
- one-way communication may involve only the movable object 3200 transmitting data to the terminal 3212 , or vice-versa.
- the data may be transmitted from one or more transmitters of the communication system 3210 to one or more receivers of the communication system 3214 , or vice-versa.
- the communication may be two-way communication, such that data can be transmitted in both directions between the movable object 3200 and the terminal 3212 .
- the two-way communication can involve transmitting data from one or more transmitters of the communication system 3210 to one or more receivers of the communication system 3214 , and vice-versa.
- the terminal 3212 can provide control data to one or more of the movable object 3200 , carrier 3202 , and payload 3204 and receive information from one or more of the movable object 3200 , carrier 3202 , and payload 3204 (e.g., position and/or motion information of the movable object, carrier or payload; data sensed by the payload such as image data captured by a payload camera).
- control data from the terminal may include instructions for relative positions, movements, actuations, or controls of the movable object, carrier and/or payload.
- control data may result in a modification of the location and/or orientation of the movable object (e.g., via control of the propulsion mechanisms 3206 ), or a movement of the payload with respect to the movable object (e.g., via control of the carrier 3202 ).
- the control data from the terminal may result in control of the payload, such as control of the operation of a camera or other image capturing device (e.g., taking still or moving pictures, zooming in or out, turning on or off, switching imaging modes, changing image resolution, changing focus, changing depth of field, changing exposure time, changing viewing angle or field of view).
- the communications from the movable object, carrier and/or payload may include information from one or more sensors (e.g., of the sensing system 3208 or of the payload 3204 ).
- the communications may include sensed information from one or more different types of sensors (e.g., GPS sensors, motion sensors, inertial sensor, proximity sensors, or image sensors). Such information may pertain to the position (e.g., location, orientation), movement, or acceleration of the movable object, carrier and/or payload.
- Such information from a payload may include data captured by the payload or a sensed state of the payload.
- the control data transmitted by the terminal 3212 can be configured to control a state of one or more of the movable object 3200 , carrier 3202 , or payload 3204 .
- the carrier 3202 and payload 3204 can also each include a communication module configured to communicate with terminal 3212 , such that the terminal can communicate with and control each of the movable object 3200 , carrier 3202 , and payload 3204 independently.
- the movable object 3200 can be configured to communicate with another remote device in addition to the terminal 3212 , or instead of the terminal 3212 .
- the terminal 3212 may also be configured to communicate with another remote device as well as the movable object 3200 .
- the movable object 3200 and/or terminal 3212 may communicate with another movable object, or a carrier or payload of another movable object.
- the remote device may be a second terminal or other computing device (e.g., computer, laptop, tablet, smartphone, or other mobile device).
- the remote device can be configured to transmit data to the movable object 3200 , receive data from the movable object 3200 , transmit data to the terminal 3212 , and/or receive data from the terminal 3212 .
- the remote device can be connected to the Internet or other telecommunications network, such that data received from the movable object 3200 and/or terminal 3212 can be uploaded to a website or server.
- a system for controlling a movable object may be provided in accordance with embodiments.
- the system can be used in combination with any suitable embodiment of the systems, devices, and methods disclosed herein.
- the system can include a sensing module, processing unit, non-transitory computer readable medium, control module, and communication module.
- the sensing module can utilize different types of sensors that collect information relating to the movable object in different ways. Different types of sensors may sense different types of signals or signals from different sources.
- the sensors can include inertial sensors, GPS sensors, proximity sensors (e.g., lidar), or vision/image sensors (e.g., a camera).
- the sensing module can be operatively coupled to a processing unit having a plurality of processors.
- the sensing module can be operatively coupled to a transmission module (e.g., a Wi-Fi image transmission module) configured to directly transmit sensing data to a suitable external device or system.
- the transmission module can be used to transmit images captured by a camera of the sensing module to a remote terminal.
- the processing unit can have one or more processors, such as a programmable processor (e.g., a central processing unit (CPU)).
- the processing unit can be operatively coupled to a non-transitory computer readable medium.
- the non-transitory computer readable medium can store logic, code, and/or program instructions executable by the processing unit for performing one or more steps.
- the non-transitory computer readable medium can include one or more memory units (e.g., removable media or external storage such as an SD card or random access memory (RAM)).
- data from the sensing module can be directly conveyed to and stored within the memory units of the non-transitory computer readable medium.
- the memory units of the non-transitory computer readable medium can store logic, code and/or program instructions executable by the processing unit to perform any suitable embodiment of the methods described herein.
- the processing unit can be configured to execute instructions causing one or more processors of the processing unit to analyze sensing data produced by the sensing module.
- the memory units can store sensing data from the sensing module to be processed by the processing unit.
- the memory units of the non-transitory computer readable medium can be used to store the processing results produced by the processing unit.
- the processing unit can be operatively coupled to a control module configured to control a state of the movable object.
- the control module can be configured to control the propulsion mechanisms of the movable object to adjust the spatial disposition, velocity, and/or acceleration of the movable object with respect to six degrees of freedom.
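As a toy illustration of adjusting velocity with respect to six degrees of freedom, the sketch below clamps a commanded six-axis velocity (three translational, three rotational) to fixed limits before it would be handed to the propulsion mechanisms. The limit values are invented for the example and do not come from the disclosure.

```python
# Sketch of a 6-DOF velocity command limiter: three translational
# axes (x, y, z) and three rotational axes (roll, pitch, yaw).
MAX_LINEAR = 10.0    # m/s, illustrative limit
MAX_ANGULAR = 90.0   # deg/s, illustrative limit

def clamp_command(cmd):
    """cmd = (vx, vy, vz, roll_rate, pitch_rate, yaw_rate)."""
    linear = [max(-MAX_LINEAR, min(MAX_LINEAR, v)) for v in cmd[:3]]
    angular = [max(-MAX_ANGULAR, min(MAX_ANGULAR, w)) for w in cmd[3:]]
    return tuple(linear + angular)

print(clamp_command((15.0, 2.0, -12.0, 30.0, -120.0, 45.0)))
# clamps to (10.0, 2.0, -10.0, 30.0, -90.0, 45.0)
```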
- the control module can control one or more of a state of a carrier, payload, or sensing module.
- the processing unit can be operatively coupled to a communication module configured to transmit and/or receive data from one or more external devices (e.g., a terminal, display device, or other remote controller). Any suitable means of communication can be used, such as wired communication or wireless communication.
- the communication module can utilize one or more of local area networks (LAN), wide area networks (WAN), infrared, radio, WiFi, point-to-point (P2P) networks, telecommunication networks, cloud communication, and the like.
- relay stations such as towers, satellites, or mobile stations, can be used.
- Wireless communications can be proximity dependent or proximity independent. In some embodiments, line-of-sight may or may not be required for communications.
- the communication module can transmit and/or receive one or more of sensing data from the sensing module, processing results produced by the processing unit, predetermined control data, user commands from a terminal or remote controller, and the like.
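One common way to organize a communication module that must route several message categories (sensing data, processing results, control data, user commands) is a handler registry. The message tags and handler bodies below are illustrative assumptions, not part of the described system.

```python
# Minimal dispatch sketch for message categories a communication module
# might exchange; the registry pattern and tags are illustrative.
handlers = {}

def on(message_type):
    """Decorator registering a handler for one message category."""
    def register(fn):
        handlers[message_type] = fn
        return fn
    return register

@on("sensing_data")
def handle_sensing(payload):
    return f"stored {len(payload)} samples"

@on("user_command")
def handle_command(payload):
    return f"executing {payload}"

def dispatch(message_type, payload):
    try:
        return handlers[message_type](payload)
    except KeyError:
        return "dropped unknown message"

print(dispatch("user_command", "land"))   # executing land
print(dispatch("telemetry", None))        # dropped unknown message
```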
- the components of the system can be arranged in any suitable configuration.
- one or more of the components of the system can be located on the movable object, carrier, payload, terminal, sensing system, or an additional external device in communication with one or more of the above.
- one or more of the plurality of processing units and/or non-transitory computer readable media can be situated at different locations, such as on the movable object, carrier, payload, terminal, sensing module, additional external device in communication with one or more of the above, or suitable combinations thereof, such that any suitable aspect of the processing and/or memory functions performed by the system can occur at one or more of the aforementioned locations.
- A and/or B encompasses one or more of A or B, and combinations thereof such as A and B. It will be understood that although the terms “first,” “second,” “third” etc. may be used herein to describe various elements, components, regions and/or sections, these elements, components, regions and/or sections should not be limited by these terms. These terms are merely used to distinguish one element, component, region or section from another element, component, region or section. Thus, a first element, component, region or section discussed below could be termed a second element, component, region or section without departing from the teachings of the present invention.
- relative terms such as “lower” or “bottom” and “upper” or “top” may be used herein to describe one element's relationship to other elements as illustrated in the figures. It will be understood that relative terms are intended to encompass different orientations of the elements in addition to the orientation depicted in the figures. For example, if the element in one of the figures is turned over, elements described as being on the “lower” side of other elements would then be oriented on the “upper” side of the other elements. The exemplary term “lower” can, therefore, encompass both an orientation of “lower” and “upper,” depending upon the particular orientation of the figure.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Remote Sensing (AREA)
- Theoretical Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- Astronomy & Astrophysics (AREA)
- Electromagnetism (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Image Analysis (AREA)
- Studio Devices (AREA)
- Selective Calling Equipment (AREA)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CNPCT/CN2015/089594 | 2015-09-15 | ||
PCT/CN2015/089594 WO2017045116A1 (fr) | 2015-09-15 | 2015-09-15 | System and method for supporting smooth target following
PCT/CN2015/093459 WO2017045251A1 (fr) | 2015-09-15 | 2015-10-30 | Systems and methods for UAV interactive instructions and control
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2015/093459 A-371-Of-International WO2017045251A1 (fr) | 2015-09-15 | 2015-10-30 | Systems and methods for UAV interactive instructions and control
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/991,739 Continuation US11635775B2 (en) | 2015-09-15 | 2020-08-12 | Systems and methods for UAV interactive instructions and control |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190011921A1 true US20190011921A1 (en) | 2019-01-10 |
Family
ID=58288098
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/067,577 Abandoned US20190011921A1 (en) | 2015-09-15 | 2015-10-30 | Systems and methods for uav interactive instructions and control |
US15/396,022 Active US10129478B2 (en) | 2015-09-15 | 2016-12-30 | System and method for supporting smooth target following |
US15/922,023 Active 2036-10-16 US10928838B2 (en) | 2015-09-15 | 2018-03-15 | Method and device of determining position of target, tracking device and tracking system |
US16/188,144 Active US10976753B2 (en) | 2015-09-15 | 2018-11-12 | System and method for supporting smooth target following |
US16/991,739 Active 2035-11-01 US11635775B2 (en) | 2015-09-15 | 2020-08-12 | Systems and methods for UAV interactive instructions and control |
US17/224,391 Pending US20210223795A1 (en) | 2015-09-15 | 2021-04-07 | System and method for supporting smooth target following |
Family Applications After (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/396,022 Active US10129478B2 (en) | 2015-09-15 | 2016-12-30 | System and method for supporting smooth target following |
US15/922,023 Active 2036-10-16 US10928838B2 (en) | 2015-09-15 | 2018-03-15 | Method and device of determining position of target, tracking device and tracking system |
US16/188,144 Active US10976753B2 (en) | 2015-09-15 | 2018-11-12 | System and method for supporting smooth target following |
US16/991,739 Active 2035-11-01 US11635775B2 (en) | 2015-09-15 | 2020-08-12 | Systems and methods for UAV interactive instructions and control |
US17/224,391 Pending US20210223795A1 (en) | 2015-09-15 | 2021-04-07 | System and method for supporting smooth target following |
Country Status (5)
Country | Link |
---|---|
US (6) | US20190011921A1 (fr) |
EP (2) | EP3353706A4 (fr) |
JP (1) | JP6735821B2 (fr) |
CN (7) | CN107209854A (fr) |
WO (3) | WO2017045116A1 (fr) |
Cited By (77)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170193666A1 (en) * | 2016-01-05 | 2017-07-06 | Microsoft Technology Licensing, Llc | Motion capture from a mobile self-tracking device |
US20170285631A1 (en) * | 2016-03-31 | 2017-10-05 | Unmanned Innovation, Inc. | Unmanned aerial vehicle modular command priority determination and filtering system |
US20170359561A1 (en) * | 2016-06-08 | 2017-12-14 | Uber Technologies, Inc. | Disparity mapping for an autonomous vehicle |
US20180155023A1 (en) * | 2016-12-05 | 2018-06-07 | Samsung Electronics Co., Ltd | Flight control method and electronic device for supporting the same |
US20190004524A1 (en) * | 2016-08-31 | 2019-01-03 | Faraday&Future Inc. | System and method for planning a vehicle path |
US20190066524A1 (en) * | 2017-08-10 | 2019-02-28 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for obstacle avoidance in aerial systems |
US20190073025A1 (en) * | 2016-03-07 | 2019-03-07 | Sensomotoric Instruments Gesellschaft Fur Innovative Sensorik Mbh | Method and device for carrying out eye gaze mapping |
US20190154871A1 (en) * | 2017-11-21 | 2019-05-23 | Reliance Core Consulting LLC | Methods and systems for detecting motion corresponding to a field of interest |
US20190158755A1 (en) * | 2017-11-20 | 2019-05-23 | Chiun Mai Communication Systems, Inc. | Aerial vehicle and target object tracking method |
US20190156496A1 (en) * | 2017-11-21 | 2019-05-23 | Reliance Core Consulting LLC | Methods, systems, apparatuses and devices for facilitating motion analysis in an environment |
US20190197335A1 (en) * | 2017-12-25 | 2019-06-27 | Autel Robotics Co., Ltd. | Distance measurement method and apparatus, and unmanned aerial vehicle |
US20190335153A1 (en) * | 2016-11-17 | 2019-10-31 | Nokia Technologies Oy | Method for multi-camera device |
US20190362473A1 (en) * | 2018-05-23 | 2019-11-28 | International Business Machines Corporation | Selectively Redacting Unrelated Objects from Images of a Group Captured within a Coverage Area |
US10538326B1 (en) * | 2016-08-31 | 2020-01-21 | Amazon Technologies, Inc. | Flare detection and avoidance in stereo vision systems |
US10576968B2 (en) * | 2014-08-27 | 2020-03-03 | Renesas Electronics Corporation | Control system, relay device and control method |
US20200090501A1 (en) * | 2018-09-19 | 2020-03-19 | International Business Machines Corporation | Accident avoidance system for pedestrians |
US10656650B2 (en) * | 2015-01-09 | 2020-05-19 | Korean Air Lines Co., Ltd. | Method for guiding and controlling drone using information for controlling camera of drone |
US20200156255A1 (en) * | 2018-11-21 | 2020-05-21 | Ford Global Technologies, Llc | Robotic manipulation using an independently actuated vision system, an adversarial control scheme, and a multi-tasking deep learning architecture |
US20200211404A1 (en) * | 2018-12-27 | 2020-07-02 | Subaru Corporation | Optimal-route generating system |
CN111596692A (zh) * | 2020-06-09 | 2020-08-28 | 北京航空航天大学 | Method and system for controlling a stratospheric airship to circle and track a moving target
CN111600644A (zh) * | 2020-04-09 | 2020-08-28 | 西安理工大学 | Ultraviolet-light-assisted method for generating an optimal rigid topology for a UAV formation
US10768639B1 (en) | 2016-06-30 | 2020-09-08 | Snap Inc. | Motion and image-based control system |
US10846880B1 (en) * | 2020-01-03 | 2020-11-24 | Altec Industries, Inc. | Camera embedded joystick |
US20210009270A1 (en) * | 2018-04-04 | 2021-01-14 | SZ DJI Technology Co., Ltd. | Methods and system for composing and capturing images |
US10902634B2 (en) * | 2018-12-04 | 2021-01-26 | Here Global B.V. | Method and apparatus for providing feature triangulation |
US20210025998A1 (en) * | 2018-04-03 | 2021-01-28 | Kaarta, Inc. | Methods and systems for real or near real-time point cloud map data confidence evaluation |
US10921825B2 (en) * | 2017-11-04 | 2021-02-16 | Automodality, Inc. | System and method for perceptive navigation of automated vehicles |
WO2021052893A1 (fr) * | 2019-09-17 | 2021-03-25 | Atlas Elektronik Gmbh | Optical detection of mines in shallow waters
US10962650B2 (en) | 2017-10-31 | 2021-03-30 | United States Of America As Represented By The Administrator Of Nasa | Polyhedral geofences |
US10967862B2 (en) | 2017-11-07 | 2021-04-06 | Uatc, Llc | Road anomaly detection for autonomous vehicle |
US20210171193A1 (en) * | 2017-11-27 | 2021-06-10 | Beijing Jingdong Shangke Information Technology Co., Ltd. | Unmanned aerial vehicle control method, unmanned aerial vehicle control device, and computer readable storage medium |
US11036216B2 (en) * | 2018-09-26 | 2021-06-15 | International Business Machines Corporation | Voice-controllable unmanned aerial vehicle for object retrieval and delivery |
US20210206491A1 (en) * | 2020-01-03 | 2021-07-08 | Tencent America LLC | Unmanned aerial system communication |
US11084581B2 (en) * | 2016-04-29 | 2021-08-10 | Lg Electronics Inc. | Mobile terminal and control method therefor |
US11107506B2 (en) * | 2016-07-08 | 2021-08-31 | SZ DJI Technology Co., Ltd. | Method and system for combining and editing UAV operation data and video data |
US11117662B2 (en) * | 2016-04-18 | 2021-09-14 | Autel Robotics Co., Ltd. | Flight direction display method and apparatus, and unmanned aerial vehicle |
US11126182B2 (en) | 2016-08-12 | 2021-09-21 | Skydio, Inc. | Unmanned aerial image capture platform |
US11145078B2 (en) * | 2018-04-04 | 2021-10-12 | Tencent Technology (Shenzhen) Company Limited | Depth information determining method and related apparatus |
WO2021213737A1 (fr) * | 2020-04-22 | 2021-10-28 | Siemens Aktiengesellschaft | Automatic navigation system for a firefighting robot
US20210405646A1 (en) * | 2019-07-03 | 2021-12-30 | Lg Electronics Inc. | Marker, method of moving in marker following mode, and cart-robot implementing method |
US11277528B2 (en) * | 2018-05-14 | 2022-03-15 | Fujifilm Corporation | Mobile type apparatus and imaging system |
EP3968626A1 (fr) * | 2020-09-09 | 2022-03-16 | Beijing Xiaomi Mobile Software Co., Ltd. | Photographing method, photographing apparatus, electronic device, and storage medium
US11282225B2 (en) | 2018-09-10 | 2022-03-22 | Mapbox, Inc. | Calibration for vision in navigation systems |
US11295458B2 (en) * | 2016-12-01 | 2022-04-05 | Skydio, Inc. | Object tracking by an unmanned aerial vehicle using visual sensors |
US11314254B2 (en) * | 2019-03-26 | 2022-04-26 | Intel Corporation | Methods and apparatus for dynamically routing robots based on exploratory on-board mapping |
US11321942B2 (en) * | 2018-03-23 | 2022-05-03 | Guangzhou Xaircraft Technology Co., Ltd. | Method for measuring plant planting data, device and system |
US20220137648A1 (en) * | 2019-06-14 | 2022-05-05 | Autel Robotics Co., Ltd. | Method and apparatus for tracking moving target and unmanned aerial vehicle |
WO2022104489A1 (fr) * | 2020-11-20 | 2022-05-27 | Drovid Technologies | Method for transmitting and tracking parameters detected by drones via (PaaS) with (AI)
US11347217B2 (en) | 2014-06-19 | 2022-05-31 | Skydio, Inc. | User interaction paradigms for a flying digital assistant |
EP3989118A4 (fr) * | 2019-06-28 | 2022-06-29 | SZ DJI Technology Co., Ltd. | Target tracking method and system, readable storage medium, and mobile platform
WO2022136355A1 (fr) * | 2020-12-22 | 2022-06-30 | Naval Group | System for planning an optimized trajectory of a maritime vehicle
TWI771192B (zh) * | 2020-09-24 | 2022-07-11 | 大陸商深圳市海柔創新科技有限公司 | Task processing method, control terminal, robot, warehousing system, and storage medium
EP3998578A4 (fr) * | 2019-07-16 | 2022-07-20 | SZ DJI Technology Co., Ltd. | Photographing method, device, and system, and computer-readable storage medium
WO2022153065A1 (fr) * | 2021-01-18 | 2022-07-21 | Hybrid Drones Limited | Vision system and method for unmanned aerial vehicles
US11417088B2 (en) * | 2018-06-15 | 2022-08-16 | Sony Corporation | Information processing device, information processing method, program, and information processing system |
US11443639B2 (en) * | 2018-09-30 | 2022-09-13 | Moutong Science And Technology Co., Ltd | Methods of generating a unmanned aerial vehicle migration trajectory, electronic devices and storage mediums |
US11498676B2 (en) * | 2016-11-28 | 2022-11-15 | Guangzhou Xaircraft Technology Co., Ltd. | Method and apparatus for controlling flight of unmanned aerial vehicle |
EP3941693A4 (fr) * | 2019-03-20 | 2022-11-30 | Covidien LP | Robotic surgical collision detection systems
US20220390965A1 (en) * | 2021-06-02 | 2022-12-08 | FLIR Unmanned Aerial Systems AS | Mobile platform vision sensor systems and methods |
US20220390940A1 (en) * | 2021-06-02 | 2022-12-08 | Skydio, Inc. | Interfaces And Control Of Aerial Vehicle For Automated Multidimensional Volume Scanning |
US11531357B1 (en) | 2017-10-05 | 2022-12-20 | Snap Inc. | Spatial vector-based drone control |
US20230018021A1 (en) * | 2020-03-31 | 2023-01-19 | SZ DJI Technology Co., Ltd. | Movable platform control method and device, movable platform and storage medium |
US11565807B1 (en) | 2019-06-05 | 2023-01-31 | Gal Zuckerman | Systems and methods facilitating street-level interactions between flying drones and on-road vehicles |
US11573562B2 (en) | 2014-06-19 | 2023-02-07 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
US20230138206A1 (en) * | 2021-11-03 | 2023-05-04 | Amit Bahl | Methods and systems for detecting intravascular device failure |
US20230139242A1 (en) * | 2016-05-25 | 2023-05-04 | Siemens Mobility GmbH | Method, device and arrangement for tracking moving objects |
US11753142B1 (en) | 2017-09-29 | 2023-09-12 | Snap Inc. | Noise modulation for unmanned aerial vehicles |
US11801937B2 (en) * | 2018-07-26 | 2023-10-31 | California Institute Of Technology | Systems and methods for avian flock flight path modification using UAVs |
US20230367463A1 (en) * | 2022-05-11 | 2023-11-16 | Supercell Oy | Randomized movement control |
US11822346B1 (en) | 2018-03-06 | 2023-11-21 | Snap Inc. | Systems and methods for estimating user intent to launch autonomous aerial vehicle |
US11958183B2 (en) | 2019-09-19 | 2024-04-16 | The Research Foundation For The State University Of New York | Negotiation-based human-robot collaboration via augmented reality |
US11972521B2 (en) | 2022-08-31 | 2024-04-30 | Snap Inc. | Multisensorial presentation of volumetric content |
US11987383B2 (en) | 2015-10-31 | 2024-05-21 | Snap Inc. | Lighting apparatus for remote controlled device |
US12007763B2 (en) | 2014-06-19 | 2024-06-11 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
US12073577B2 (en) | 2017-09-15 | 2024-08-27 | Snap Inc. | Computing a point cloud from stitched images |
US12071228B1 (en) * | 2019-03-28 | 2024-08-27 | Snap Inc. | Drone with propeller guard configured as an airfoil |
KR102722952B1 (ko) | 2022-03-31 | 2024-10-29 | 주식회사 휴인스 | Drone capable of artificial-intelligence-based autonomous indoor flight
Families Citing this family (173)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10244504B2 (en) | 2013-03-15 | 2019-03-26 | DGS Global Systems, Inc. | Systems, methods, and devices for geolocation with deployable large scale arrays |
US10237770B2 (en) | 2013-03-15 | 2019-03-19 | DGS Global Systems, Inc. | Systems, methods, and devices having databases and automated reports for electronic spectrum management |
US10219163B2 (en) | 2013-03-15 | 2019-02-26 | DGS Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US11646918B2 (en) | 2013-03-15 | 2023-05-09 | Digital Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying open space |
US9288683B2 (en) | 2013-03-15 | 2016-03-15 | DGS Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US8750156B1 (en) | 2013-03-15 | 2014-06-10 | DGS Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying open space |
US10257729B2 (en) | 2013-03-15 | 2019-04-09 | DGS Global Systems, Inc. | Systems, methods, and devices having databases for electronic spectrum management |
US10299149B2 (en) | 2013-03-15 | 2019-05-21 | DGS Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US10271233B2 (en) | 2013-03-15 | 2019-04-23 | DGS Global Systems, Inc. | Systems, methods, and devices for automatic signal detection with temporal feature extraction within a spectrum |
US10257728B2 (en) | 2013-03-15 | 2019-04-09 | DGS Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management |
US10257727B2 (en) | 2013-03-15 | 2019-04-09 | DGS Global Systems, Inc. | Systems methods, and devices having databases and automated reports for electronic spectrum management |
US10231206B2 (en) | 2013-03-15 | 2019-03-12 | DGS Global Systems, Inc. | Systems, methods, and devices for electronic spectrum management for identifying signal-emitting devices |
EP4342792A3 (fr) * | 2015-12-09 | 2024-04-17 | SZ DJI Technology Co., Ltd. | Systems and methods for unmanned aerial vehicle (UAV) flight control
US10665115B2 (en) * | 2016-01-05 | 2020-05-26 | California Institute Of Technology | Controlling unmanned aerial vehicles to avoid obstacle collision |
US11461912B2 (en) | 2016-01-05 | 2022-10-04 | California Institute Of Technology | Gaussian mixture models for temporal depth fusion |
US9758246B1 (en) | 2016-01-06 | 2017-09-12 | Gopro, Inc. | Systems and methods for adjusting flight control of an unmanned aerial vehicle |
WO2017169516A1 (fr) * | 2016-03-28 | 2017-10-05 | 日本電気株式会社 | Unmanned flight device control system, unmanned flight device control method, and inspection device
TWI598143B (zh) * | 2016-06-03 | 2017-09-11 | 博泰科技有限公司 | Follow-me remote control method for an aircraft
WO2018023736A1 (fr) * | 2016-08-05 | 2018-02-08 | SZ DJI Technology Co., Ltd. | System and method for positioning a movable object
CN106504270B (zh) | 2016-11-08 | 2019-12-20 | 浙江大华技术股份有限公司 | Method and device for displaying a target object in a video
US10700794B2 (en) | 2017-01-23 | 2020-06-30 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within an electromagnetic spectrum |
US10459020B2 (en) | 2017-01-23 | 2019-10-29 | DGS Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time within a spectrum |
US10529241B2 (en) * | 2017-01-23 | 2020-01-07 | Digital Global Systems, Inc. | Unmanned vehicle recognition and threat management |
US10498951B2 (en) | 2017-01-23 | 2019-12-03 | Digital Global Systems, Inc. | Systems, methods, and devices for unmanned vehicle detection |
KR102275452B1 (ko) * | 2017-03-16 | 2021-07-12 | 한국전자통신연구원 | Real-time image tracking method considering color and shape simultaneously, and apparatus therefor
CN106909170B (zh) * | 2017-04-26 | 2020-04-07 | 北京小米移动软件有限公司 | Method and device for controlling an aircraft
WO2018195979A1 (fr) * | 2017-04-28 | 2018-11-01 | 深圳市大疆创新科技有限公司 | Tracking control method and apparatus, and flight vehicle
WO2018222945A1 (fr) | 2017-05-31 | 2018-12-06 | Geomni, Inc. | System and method for mission planning and flight automation for unmanned aircraft
CN109479088A (zh) | 2017-06-02 | 2019-03-15 | 深圳市大疆创新科技有限公司 | System and method for multi-target tracking and automatic focusing based on deep machine learning and lidar
CN107273937A (zh) * | 2017-07-10 | 2017-10-20 | 北京工业大学 | Method for marking a target at an arbitrary angle in videos and images
US10816354B2 (en) | 2017-08-22 | 2020-10-27 | Tusimple, Inc. | Verification module system and method for motion-based lane detection with multiple sensors |
US10565457B2 (en) | 2017-08-23 | 2020-02-18 | Tusimple, Inc. | Feature matching and correspondence refinement and 3D submap position refinement system and method for centimeter precision localization using camera-based submap and LiDAR-based global map |
JP7057637B2 (ja) * | 2017-08-23 | 2022-04-20 | キヤノン株式会社 | 制御装置、制御システム、制御方法、プログラム、及び記憶媒体 |
US10762673B2 (en) | 2017-08-23 | 2020-09-01 | Tusimple, Inc. | 3D submap reconstruction system and method for centimeter precision localization using camera-based submap and LiDAR-based global map |
US10719087B2 (en) | 2017-08-29 | 2020-07-21 | Autel Robotics Co., Ltd. | Target tracking method, unmanned aerial vehicle, and computer readable storage medium |
CN107505951B (zh) * | 2017-08-29 | 2020-08-21 | 深圳市道通智能航空技术有限公司 | Target tracking method, unmanned aerial vehicle, and computer-readable storage medium
CN107590450A (zh) * | 2017-09-01 | 2018-01-16 | 歌尔科技有限公司 | Method and device for marking a moving target, and unmanned aerial vehicle
US10953880B2 (en) | 2017-09-07 | 2021-03-23 | Tusimple, Inc. | System and method for automated lane change control for autonomous vehicles |
US10649458B2 (en) | 2017-09-07 | 2020-05-12 | Tusimple, Inc. | Data-driven prediction-based system and method for trajectory planning of autonomous vehicles |
US10953881B2 (en) | 2017-09-07 | 2021-03-23 | Tusimple, Inc. | System and method for automated lane change control for autonomous vehicles |
US10491824B2 (en) | 2017-09-26 | 2019-11-26 | Gopro, Inc. | Combined mechanical and electronic image stabilization |
CN108206941A (zh) * | 2017-09-27 | 2018-06-26 | 深圳市商汤科技有限公司 | Target tracking method and system, terminal device, and storage medium
JP6412998B1 (ja) * | 2017-09-29 | 2018-10-24 | 株式会社Qoncept | Moving-object tracking device, moving-object tracking method, and moving-object tracking program
WO2019069626A1 (fr) * | 2017-10-06 | 2019-04-11 | 株式会社豊田自動織機 | Mobile vehicle
FR3074950B1 (fr) * | 2017-10-19 | 2019-12-27 | Valeo Comfort And Driving Assistance | Data processing method and associated on-board system
CN109753076B (zh) * | 2017-11-03 | 2022-01-11 | 南京奇蛙智能科技有限公司 | Method for implementing visual tracking by an unmanned aerial vehicle
AU2018364811A1 (en) * | 2017-11-13 | 2020-05-28 | Geomni, Inc. | System and method for mission planning, flight automation, and capturing of high-resolution images by unmanned aircraft |
CN109154815B (zh) * | 2017-11-30 | 2022-06-21 | 深圳市大疆创新科技有限公司 | Method and device for tracking the highest-temperature point, and unmanned aerial vehicle
US11496684B2 (en) | 2017-12-11 | 2022-11-08 | Gopro, Inc. | Combined mechanical and electronic image stabilization |
CN108052901B (zh) * | 2017-12-13 | 2021-05-25 | 中国科学院沈阳自动化研究所 | Binocular-based gesture recognition method for remote operation of an intelligent unmanned aerial vehicle
US10827123B1 (en) | 2018-01-05 | 2020-11-03 | Gopro, Inc. | Modular image capture systems |
CN112004729B (zh) | 2018-01-09 | 2023-12-01 | 图森有限公司 | Real-time remote control of vehicles with high redundancy
CN111989716B (zh) | 2018-01-11 | 2022-11-15 | 图森有限公司 | Monitoring system for autonomous vehicle operation
US11009365B2 (en) | 2018-02-14 | 2021-05-18 | Tusimple, Inc. | Lane marking localization |
US11009356B2 (en) | 2018-02-14 | 2021-05-18 | Tusimple, Inc. | Lane marking localization and fusion |
CN110197097B (zh) * | 2018-02-24 | 2024-04-19 | 北京图森智途科技有限公司 | Port-area monitoring method and system, and central control system
US10685244B2 (en) * | 2018-02-27 | 2020-06-16 | Tusimple, Inc. | System and method for online real-time multi-object tracking |
WO2019175992A1 (fr) * | 2018-03-13 | 2019-09-19 | 日本電気株式会社 | Moving-body guidance device and method, and computer-readable recording medium
US11380054B2 (en) | 2018-03-30 | 2022-07-05 | Cae Inc. | Dynamically affecting tailored visual rendering of a visual element |
US10964106B2 (en) | 2018-03-30 | 2021-03-30 | Cae Inc. | Dynamically modifying visual rendering of a visual element comprising pre-defined characteristics |
CN110378185A (zh) | 2018-04-12 | 2019-10-25 | 北京图森未来科技有限公司 | Image processing method and device applied to autonomous-driving vehicles
CN108646787B (zh) * | 2018-04-12 | 2021-03-02 | 广州杰赛科技股份有限公司 | Target tracking method and device, and unmanned aerial vehicle
CN111801703A (zh) * | 2018-04-17 | 2020-10-20 | 赫尔实验室有限公司 | Hardware and system for bounding-box generation for an image processing pipeline
CN110291363A (zh) * | 2018-04-26 | 2019-09-27 | 深圳市大疆创新科技有限公司 | Navigation-sensor detection method for a movable platform, and related device
CN108596116B (zh) * | 2018-04-27 | 2021-11-05 | 深圳市商汤科技有限公司 | Distance measurement method, intelligent control method and device, electronic device, and storage medium
CN110458854B (zh) | 2018-05-02 | 2022-11-15 | 北京图森未来科技有限公司 | Road edge detection method and device
US20190346842A1 (en) * | 2018-05-11 | 2019-11-14 | Honeywell International Inc. | Transferring annotations to images captured by remote vehicles between displays |
CN113395450A (zh) * | 2018-05-29 | 2021-09-14 | 深圳市大疆创新科技有限公司 | Tracking photography method, device, and storage medium
CN108564787A (zh) * | 2018-05-31 | 2018-09-21 | 北京理工大学 | Traffic observation method, system, and device based on the floating-car method
CN108803655A (zh) * | 2018-06-08 | 2018-11-13 | 哈尔滨工程大学 | UAV flight control platform and target tracking method
CN109285179B (zh) * | 2018-07-26 | 2021-05-14 | 昆明理工大学 | Moving-target tracking method based on multi-feature fusion
CN108958297A (zh) * | 2018-08-03 | 2018-12-07 | 南京航空航天大学 | Ground station for cooperative target tracking by multiple unmanned aerial vehicles
CN109292099B (zh) * | 2018-08-10 | 2020-09-25 | 顺丰科技有限公司 | Method, device, equipment, and storage medium for determining unmanned aerial vehicle landing
CN109164825A (zh) * | 2018-08-13 | 2019-01-08 | 上海机电工程研究所 | Autonomous navigation and obstacle-avoidance method and device for multi-rotor unmanned aerial vehicles
EP3837492A1 (fr) * | 2018-08-21 | 2021-06-23 | SZ DJI Technology Co., Ltd. | Distance measurement method and device
US10943461B2 (en) | 2018-08-24 | 2021-03-09 | Digital Global Systems, Inc. | Systems, methods, and devices for automatic signal detection based on power distribution by frequency over time |
CN109376587A (zh) * | 2018-09-05 | 2019-02-22 | 福州日兆信息科技有限公司 | IoT-based intelligent inspection system and method for detecting and surveying communication towers
EP3849868A4 (fr) | 2018-09-13 | 2022-10-12 | Tusimple, Inc. | Procédés et systèmes de conduite sans danger à distance |
CN109358497B (zh) * | 2018-09-14 | 2020-04-21 | 北京航空航天大学 | Tracking method for satellite path planning and predictive control based on B-spline functions
CN109240320A (zh) * | 2018-09-27 | 2019-01-18 | 易瓦特科技股份公司 | Unmanned aerial vehicle control method and device
CN109240346A (zh) * | 2018-09-27 | 2019-01-18 | 易瓦特科技股份公司 | Method and device for tracking a target object
CN110892353A (zh) * | 2018-09-30 | 2020-03-17 | 深圳市大疆创新科技有限公司 | Control method, control device, and control terminal for an unmanned aerial vehicle
CN109101041A (zh) * | 2018-10-22 | 2018-12-28 | 深圳市智璟科技有限公司 | Dynamic following and dynamic return method for an unmanned aircraft
US10942271B2 (en) | 2018-10-30 | 2021-03-09 | Tusimple, Inc. | Determining an angle between a tow vehicle and a trailer |
CN109618131B (zh) * | 2018-11-22 | 2021-08-24 | 亮风台(上海)信息科技有限公司 | Method and device for presenting decision-assistance information
CN109561282B (zh) * | 2018-11-22 | 2021-08-06 | 亮风台(上海)信息科技有限公司 | Method and device for presenting ground-action assistance information
CN109656319B (zh) * | 2018-11-22 | 2021-06-15 | 亮风台(上海)信息科技有限公司 | Method and device for presenting ground-action assistance information
US20210323669A1 (en) * | 2018-11-22 | 2021-10-21 | Lorenz Technology Aps | Method for inducing an autonomous behavior into an unmanned vehicle, and a communication unit for use in such a method |
CN109633661A (zh) * | 2018-11-28 | 2019-04-16 | Hangzhou Lingxiang Technology Co., Ltd. | Glass detection system and method based on fusion of an RGB-D sensor and an ultrasonic sensor |
WO2020107475A1 (fr) * | 2018-11-30 | 2020-06-04 | SZ DJI Technology Co., Ltd. | Obstacle-avoidance control method, apparatus and device for an unmanned spraying aerial vehicle, and storage medium |
CN116184417A (zh) | 2018-12-10 | 2023-05-30 | Beijing Tusen Zhitu Technology Co., Ltd. | Method and device for measuring the angle of a trailer, and vehicle |
CN111319629B (zh) | 2018-12-14 | 2021-07-16 | Beijing Tusen Zhitu Technology Co., Ltd. | Method, device and system for forming an autonomous driving fleet |
EP3667696A1 (fr) * | 2018-12-14 | 2020-06-17 | ASML Netherlands B.V. | Stage apparatus suitable for an electron beam inspection apparatus |
TR201819906A2 (fr) * | 2018-12-20 | 2019-03-21 | Havelsan Hava Elektronik Sanayi Ve Ticaret Anonim Sirketi | |
KR102166326B1 (ko) * | 2018-12-21 | 2020-10-15 | Chungbuk National University Industry-Academic Cooperation Foundation | Three-dimensional path setting system for a drone |
CN109754420B (zh) | 2018-12-24 | 2021-11-12 | Shenzhen Autel Intelligent Aviation Technology Co., Ltd. | Target distance estimation method and apparatus, and UAV |
CN109828488A (zh) * | 2018-12-27 | 2019-05-31 | Beijing Aerospace Fudao High Technology Co., Ltd. | Dual-optical detection and tracking system with integrated acquisition and transmission |
CN111376239B (zh) * | 2018-12-29 | 2023-06-27 | SIMCom Information Technology (Shanghai) Co., Ltd. | Robot grasping method and system |
US11001991B2 (en) * | 2019-01-11 | 2021-05-11 | Caterpillar Inc. | Optimizing loading of a payload carrier of a machine |
CN111435259A (zh) * | 2019-01-14 | 2020-07-21 | Beijing Jingdong Shangke Information Technology Co., Ltd. | Movement control device for an intelligent following vehicle, and intelligent following vehicle |
US10950104B2 (en) * | 2019-01-16 | 2021-03-16 | PANASONIC I-PRO SENSING SOLUTIONS CO., LTD. | Monitoring camera and detection method |
CN111476827B (zh) * | 2019-01-24 | 2024-02-02 | Yaoke Intelligent Technology (Shanghai) Co., Ltd. | Target tracking method, system, electronic device and storage medium |
CN109934870B (zh) * | 2019-01-30 | 2021-11-30 | Xi'an Tianwei Electronic System Engineering Co., Ltd. | Target detection method, apparatus, device, computer equipment and storage medium |
US20220129017A1 (en) * | 2019-02-18 | 2022-04-28 | Sony Group Corporation | Flight body, information processing method, and program |
CN109947123B (zh) * | 2019-02-27 | 2021-06-22 | Nanjing University of Aeronautics and Astronautics | UAV path tracking and autonomous obstacle avoidance method based on a line-of-sight guidance law |
US11455742B2 (en) * | 2019-04-19 | 2022-09-27 | Thermoteknix Systems Ltd. | Imaging systems including real-time target-acquisition and triangulation features and human-machine interfaces therefor |
CN110876275A (zh) * | 2019-04-30 | 2020-03-10 | SZ DJI Technology Co., Ltd. | Aiming control method, mobile robot and computer-readable storage medium |
CN110162102A (zh) * | 2019-05-17 | 2019-08-23 | Guangdong Polytechnic Normal University | Automatic UAV recognition and tracking method and system based on a cloud platform and machine vision |
CN111656294A (zh) * | 2019-05-31 | 2020-09-11 | SZ DJI Technology Co., Ltd. | Control method for a movable platform, control terminal and movable platform |
US11823460B2 (en) | 2019-06-14 | 2023-11-21 | Tusimple, Inc. | Image fusion for autonomous vehicle operation |
WO2020258066A1 (fr) * | 2019-06-26 | 2020-12-30 | SZ DJI Technology Co., Ltd. | Unmanned aerial vehicle control method and device, unmanned aerial vehicle, and storage medium |
CN110262568B (zh) * | 2019-07-19 | 2021-10-22 | Shenzhen Autel Intelligent Aviation Technology Co., Ltd. | Target-tracking-based UAV obstacle avoidance method and apparatus, and UAV |
CN111328387B (zh) * | 2019-07-19 | 2024-02-20 | SZ DJI Technology Co., Ltd. | Gimbal control method, device and computer-readable storage medium |
CN110618700A (zh) * | 2019-08-23 | 2019-12-27 | Southwest Jiaotong University | Three-dimensional geographic information system for community delivery and its application to UAV flight path planning |
CN111752295B (zh) * | 2019-08-27 | 2021-09-10 | Guangzhou Xaircraft Technology Co., Ltd. | UAV flight trajectory planning method and related apparatus |
CN110758381B (zh) * | 2019-09-18 | 2021-05-04 | Beijing Automotive Group Co., Ltd. | Method and apparatus for generating a steering trajectory, storage medium and electronic device |
CN110928432B (zh) * | 2019-10-24 | 2023-06-23 | National Defense Technology Innovation Institute, PLA Academy of Military Sciences | Ring mouse, mouse control apparatus and mouse control system |
DE102019129182A1 (de) * | 2019-10-29 | 2021-04-29 | Krones Aktiengesellschaft | Method for monitoring a container flow in a filling plant, monitoring system, and placeholder for a container or outer packaging |
US20220383541A1 (en) * | 2019-11-13 | 2022-12-01 | Battelle Energy Alliance, Llc | Unmanned vehicle navigation, and associated methods, systems, and computer-readable medium |
CN110865650B (zh) * | 2019-11-19 | 2022-12-20 | Wuhan Institute of Technology | Adaptive UAV pose estimation method based on active vision |
JP7192748B2 (ja) * | 2019-11-25 | 2022-12-20 | Toyota Motor Corporation | Conveyance system, trained model generation method, trained model, control method and program |
CN113031511B (zh) * | 2019-12-24 | 2022-03-22 | Shenyang Intelligent Robot Innovation Center Co., Ltd. | Real-time guided trajectory planning method for multi-axis systems based on high-order B-splines |
CN111161323B (zh) * | 2019-12-31 | 2023-11-28 | Chongqing Innovation Center of Beijing Institute of Technology | Correlation-filter-based target tracking method and system for complex scenes |
CN111212456B (zh) * | 2020-01-16 | 2022-07-08 | PowerChina Chengdu Engineering Corporation Ltd. | Geolocation-based multipath routing method for low-power long-range IoT |
CN111260689B (zh) * | 2020-01-16 | 2022-10-11 | Donghua University | Correlation-filter visual tracking method based on confidence enhancement |
US20210247196A1 (en) * | 2020-02-10 | 2021-08-12 | Uber Technologies, Inc. | Object Detection for Light Electric Vehicles |
CN115362473A (zh) | 2020-02-18 | 2022-11-18 | Cognex Corporation | System and method for three-dimensional scanning of moving objects longer than the field of view |
CN112650235A (zh) * | 2020-03-11 | 2021-04-13 | Nanjing Aotuo Electronic Technology Co., Ltd. | Robot obstacle-avoidance control method and system, and robot |
CN111240342A (zh) * | 2020-03-12 | 2020-06-05 | Nanjing Aotuo Electronic Technology Co., Ltd. | Robot obstacle-avoidance control method and apparatus, robot, and robot system |
WO2021184359A1 (fr) * | 2020-03-20 | 2021-09-23 | SZ DJI Technology Co., Ltd. | Target tracking method, target tracking apparatus, movable device, and storage medium |
CN113515111B (zh) * | 2020-03-25 | 2023-08-25 | Yutong Bus Co., Ltd. | Vehicle obstacle-avoidance path planning method and apparatus |
CN111474953B (zh) * | 2020-03-30 | 2021-09-17 | Tsinghua University | Aerial target recognition method and system with cooperating dynamic viewpoints |
CN111455900B (zh) * | 2020-04-01 | 2021-11-12 | Wuxi Gewu Intelligent Technology Co., Ltd. | Roadblock placement method, terminal, computer device and storage medium |
EP3893150A1 (fr) | 2020-04-09 | 2021-10-13 | Tusimple, Inc. | Camera pose estimation techniques |
WO2021212518A1 (fr) * | 2020-04-24 | 2021-10-28 | SZ DJI Technology Co., Ltd. | Flight steering method, apparatus and system, control terminal, and readable storage medium |
US11415990B2 (en) | 2020-05-04 | 2022-08-16 | Honeywell International Inc. | Optical object tracking on focal plane with dynamic focal length |
CN112639652A (zh) * | 2020-05-07 | 2021-04-09 | SZ DJI Technology Co., Ltd. | Target tracking method and apparatus, movable platform, and imaging platform |
CN113688463B (zh) * | 2020-05-18 | 2024-01-23 | AECC Commercial Aircraft Engine Co., Ltd. | Method and apparatus for screening the wire collision angle range, and computer-readable storage medium |
CN111665490B (zh) * | 2020-06-02 | 2023-07-14 | Zhejiang Dahua Technology Co., Ltd. | Target tracking method and apparatus, storage medium and electronic device |
EP3926432A1 (fr) * | 2020-06-16 | 2021-12-22 | Hexagon Geosystems Services AG | Touch control of unmanned aerial vehicles |
AU2021203567A1 (en) | 2020-06-18 | 2022-01-20 | Tusimple, Inc. | Angle and orientation measurements for vehicles with multiple drivable sections |
CN111862154B (zh) * | 2020-07-13 | 2024-03-01 | China Mobile (Hangzhou) Information Technology Co., Ltd. | Robot visual tracking method and apparatus, robot, and storage medium |
JP2023117420A (ja) * | 2020-07-16 | 2023-08-24 | Sony Group Corporation | Information processing method, program, and system |
CN112050813B (zh) * | 2020-08-08 | 2022-08-02 | Zhejiang Kecong Control Technology Co., Ltd. | Laser navigation system for a Zone 1 explosion-proof mobile robot |
DE102020210618B4 (de) * | 2020-08-20 | 2022-03-17 | Top Seven Gmbh & Co. Kg | Method and system for object detection |
RU200639U1 (ru) * | 2020-09-02 | 2020-11-03 | Ilya Igorevich Bychkov | Automated device for controlling an unmanned aerial vehicle in flight over a moving ground object |
CN112162570B (zh) * | 2020-10-10 | 2022-12-06 | Naval Aviation University of the Chinese People's Liberation Army | Method for small-range dynamic tracking by a quadrotor helicopter |
WO2022126436A1 (fr) * | 2020-12-16 | 2022-06-23 | SZ DJI Technology Co., Ltd. | Delay detection method and apparatus, system, movable platform, and storage medium |
CN114556425A (zh) * | 2020-12-17 | 2022-05-27 | SZ DJI Technology Co., Ltd. | Positioning method and device, UAV, and storage medium |
EP4305594A2 (fr) * | 2021-03-08 | 2024-01-17 | Ridecell, Inc. | Framework for 3D object detection and depth prediction from 2D images |
CN112896551B (zh) * | 2021-05-08 | 2021-09-07 | Chengdu Aircraft Industry (Group) Co., Ltd. | Calibration assistance method for installing aircraft avionics equipment |
IL284872B2 (en) * | 2021-07-13 | 2023-03-01 | Allen Richter | Devices, systems and methods for mobile platform navigation |
CN113418522B (zh) * | 2021-08-25 | 2021-12-14 | Jihua Laboratory | AGV path planning method, following method, apparatus, device and storage medium |
TWI769915B (zh) | 2021-08-26 | 2022-07-01 | Industrial Technology Research Institute | Projection system and projection calibration method using the same |
TWI769924B (zh) * | 2021-09-15 | 2022-07-01 | TECO Electric & Machinery Co., Ltd. | Human-following system |
US11922606B2 (en) | 2021-10-04 | 2024-03-05 | Samsung Electronics Co., Ltd. | Multipass interference correction and material recognition based on patterned illumination without frame rate loss |
CN113848869B (zh) * | 2021-10-20 | 2023-03-07 | Beijing Sankuai Online Technology Co., Ltd. | Unmanned device control method and apparatus, storage medium and electronic device |
CN113961016B (zh) * | 2021-10-25 | 2023-11-10 | East China Institute of Computing Technology (32nd Research Institute of China Electronics Technology Group Corporation) | UAV dynamic target trajectory planning method and system based on the A* algorithm |
CN114167900B (zh) * | 2021-11-19 | 2023-06-30 | Beijing Institute of Environmental Features | Calibration method and apparatus for an electro-optical tracking system based on a UAV and differential GPS |
CN114119651B (zh) * | 2021-11-30 | 2022-10-25 | Chongqing Unisinsight Technology Co., Ltd. | Target tracking method, system, device and storage medium |
KR20230105162A (ko) * | 2022-01-03 | 2023-07-11 | Hanwha Aerospace Co., Ltd. | Unmanned combat vehicle and target detection method therefor |
US11417106B1 (en) | 2022-02-04 | 2022-08-16 | King Abdulaziz University | Crowd evacuation system based on real time perception, simulation, and warning |
CN114625170B (zh) * | 2022-03-24 | 2023-05-12 | Civil Aviation Flight University of China | Dynamic planning method for helicopter rescue flight paths in mountain fires |
CN114584928B (zh) * | 2022-05-05 | 2022-08-05 | Shenzhen Yuanzhongrui Technology Co., Ltd. | Geofence-based path inference method and apparatus, computer device and medium |
TWI817594B (zh) * | 2022-07-04 | 2023-10-01 | Hon Hai Precision Industry Co., Ltd. | Image depth recognition method, computer device and storage medium |
CN115599092B (zh) * | 2022-09-07 | 2024-09-24 | Gree Electric Appliances (Wuhan) Co., Ltd. | Workpiece handling control method, apparatus, device and storage medium |
CN115451919B (zh) * | 2022-09-28 | 2023-06-30 | Anhui University of Science and Technology | Intelligent unmanned surveying and mapping device and method |
TWI843251B (zh) | 2022-10-25 | 2024-05-21 | Industrial Technology Research Institute | Target tracking system and target tracking method using the same |
WO2024089890A1 (fr) * | 2022-10-28 | 2024-05-02 | Mitsubishi Electric Corporation | Remote operation system and remote operation method |
CN117021117B (zh) * | 2023-10-08 | 2023-12-15 | University of Electronic Science and Technology of China | Mixed-reality-based human-machine interaction and localization method for mobile robots |
CN117649426B (zh) * | 2024-01-29 | 2024-04-09 | Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences | Moving target tracking method robust to occlusion by a UAV landing gear |
CN117893933B (zh) * | 2024-03-14 | 2024-05-24 | State Grid Shanghai Electric Power Company | Unmanned inspection fault detection method and system for power transmission and transformation equipment |
CN118590612A (zh) * | 2024-05-23 | 2024-09-03 | Beijing Aipu Zhicheng Network Technology Co., Ltd. | Object movement monitoring system, method and device based on AI visual recognition |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130345920A1 (en) * | 2003-06-20 | 2013-12-26 | L-3 Unmanned Systems, Inc. | Autonomous control of unmanned aerial vehicles |
US20160304198A1 (en) * | 2014-12-03 | 2016-10-20 | Google Inc. | Systems and methods for reliable relative navigation and autonomous following between unmanned aerial vehicle and a target object |
Family Cites Families (120)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5155683A (en) | 1991-04-11 | 1992-10-13 | Wadiatur Rahim | Vehicle remote guidance with path control |
JP3600383B2 (ja) * | 1996-09-30 | 2004-12-15 | Mitsubishi Heavy Industries, Ltd. | Moving object tracking device |
JPH10141891A (ja) * | 1996-11-07 | 1998-05-29 | Mitsubishi Heavy Industries, Ltd. | Method for setting a flight path |
SE515655C2 (sv) | 1999-12-22 | 2001-09-17 | Saab Ab | System and method for collision avoidance between vehicles |
JP2001306144A (ja) * | 2000-04-21 | 2001-11-02 | Yamaha Motor Co Ltd | Flight control system for an unmanned helicopter |
US7173650B2 (en) | 2001-03-28 | 2007-02-06 | Koninklijke Philips Electronics N.V. | Method for assisting an automated video tracking system in reacquiring a target |
JP2003177120A (ja) | 2001-08-30 | 2003-06-27 | Toppan Printing Co Ltd | Method for analyzing out-ions, method for measuring particles or eluates, and clean film and laminate thereof |
JP4301861B2 (ja) * | 2002-05-20 | 2009-07-22 | Kawasaki Heavy Industries, Ltd. | Method and device for steering a moving object |
JP3861781B2 (ja) * | 2002-09-17 | 2006-12-20 | Nissan Motor Co., Ltd. | Forward vehicle tracking system and forward vehicle tracking method |
JP4213518B2 (ja) * | 2003-05-27 | 2009-01-21 | Kawasaki Heavy Industries, Ltd. | Control method and control device for a moving object |
JP3994950B2 (ja) * | 2003-09-19 | 2007-10-24 | Sony Corporation | Environment recognition device and method, path planning device and method, and robot device |
US8448858B1 (en) * | 2004-06-21 | 2013-05-28 | Stoplift, Inc. | Method and apparatus for detecting suspicious activity using video analysis from alternative camera viewpoint |
US7884849B2 (en) * | 2005-09-26 | 2011-02-08 | Objectvideo, Inc. | Video surveillance system with omni-directional camera |
US8855846B2 (en) * | 2005-10-20 | 2014-10-07 | Jason W. Grzywna | System and method for onboard vision processing |
US7602480B2 (en) | 2005-10-26 | 2009-10-13 | Alcatel-Lucent Usa Inc. | Method and system for tracking a moving station or target in free space communications |
US7835542B2 (en) | 2005-12-29 | 2010-11-16 | Industrial Technology Research Institute | Object tracking systems and methods utilizing compressed-domain motion-based segmentation |
US8902233B1 (en) | 2006-06-09 | 2014-12-02 | Pixar | Driving systems extension |
JP4709101B2 (ja) * | 2006-09-01 | 2011-06-22 | Canon Inc. | Automatic tracking camera device |
US7411167B2 (en) * | 2006-09-05 | 2008-08-12 | Honeywell International Inc. | Tracking a moving object from a camera on a moving platform |
IL183006A0 (en) * | 2007-05-06 | 2007-12-03 | Wave Group Ltd | A bilateral robotic omni-directional situational awareness system having a smart throwable transportation case |
EP2043045B1 (fr) * | 2007-09-20 | 2011-09-07 | Delphi Technologies, Inc. | Method for object tracking |
US8244469B2 (en) * | 2008-03-16 | 2012-08-14 | Irobot Corporation | Collaborative engagement for target identification and tracking |
CN101252687B (zh) * | 2008-03-20 | 2010-06-02 | Shanghai Jiao Tong University | Method for multi-channel joint region-of-interest video coding and transmission |
JP4497236B2 (ja) | 2008-08-11 | 2010-07-07 | Omron Corporation | Detection information registration device, electronic device, control method for the detection information registration device, control method for the electronic device, control program for the detection information registration device, and control program for the electronic device |
US8855819B2 (en) * | 2008-10-09 | 2014-10-07 | Samsung Electronics Co., Ltd. | Method and apparatus for simultaneous localization and mapping of robot |
ATE545924T1 (de) | 2008-11-04 | 2012-03-15 | Saab Ab | Avoidance maneuver generator for an aircraft |
CN101489147B (zh) * | 2009-01-16 | 2010-12-01 | Xidian University | Aspect-ratio transformation method based on regions of interest |
US20100228406A1 (en) | 2009-03-03 | 2010-09-09 | Honeywell International Inc. | UAV Flight Control Method And System |
CN101614816B (zh) * | 2009-07-24 | 2011-08-31 | Northeastern University | Pose detection device and control method for an indoor mobile robot |
WO2011013179A1 (fr) * | 2009-07-31 | 2011-02-03 | Fujitsu Limited | Moving object position detection device and moving object position detection method |
US8515596B2 (en) | 2009-08-18 | 2013-08-20 | Honeywell International Inc. | Incremental position-based guidance for a UAV |
US20110279682A1 (en) * | 2009-11-12 | 2011-11-17 | Le Li | Methods for Target Tracking, Classification and Identification by Using Foveal Sensors |
CN101769754B (zh) * | 2010-01-19 | 2012-04-25 | Hunan University | Global path planning method for mobile robots based on a quasi-three-dimensional map |
KR20110119118A (ko) * | 2010-04-26 | 2011-11-02 | LG Electronics Inc. | Robot cleaner and remote monitoring system using the same |
CN101860732B (zh) * | 2010-06-04 | 2014-08-27 | Tianjin Yaan Technology Co., Ltd. | Method for controlling a pan-tilt camera to automatically track a target |
US9681065B2 (en) * | 2010-06-15 | 2017-06-13 | Flir Systems, Inc. | Gimbal positioning with target velocity compensation |
JP5560978B2 (ja) * | 2010-07-13 | 2014-07-30 | Murata Machinery, Ltd. | Autonomous mobile body |
TWI420906B (zh) * | 2010-10-13 | 2013-12-21 | Ind Tech Res Inst | Tracking system and method for regions of interest, and computer program product |
IL208910A0 (en) | 2010-10-24 | 2011-02-28 | Rafael Advanced Defense Sys | Tracking and identification of a moving object from a moving sensor using a 3d model |
CN102087530B (zh) * | 2010-12-07 | 2012-06-13 | Southeast University | Visual navigation method for mobile robots based on hand-drawn maps and paths |
US8494766B2 (en) * | 2011-01-07 | 2013-07-23 | Ge Aviation Systems, Llc | Flight management system with integrated tactical commands for use with an aircraft and method of operating same |
TW201235808A (en) * | 2011-02-23 | 2012-09-01 | Hon Hai Prec Ind Co Ltd | System and method for controlling UAV to flight in predefined area |
JP5848507B2 (ja) | 2011-03-08 | 2016-01-27 | Canon Inc. | Imaging device with tracking function, and method |
EP2511659A1 (fr) * | 2011-04-14 | 2012-10-17 | Hexagon Technology Center GmbH | Geodetic marking system for marking target points |
JP5719230B2 (ja) * | 2011-05-10 | 2015-05-13 | Canon Inc. | Object recognition device, control method for the object recognition device, and program |
US20120316680A1 (en) | 2011-06-13 | 2012-12-13 | Microsoft Corporation | Tracking and following of moving objects by a mobile robot |
US9157750B2 (en) * | 2011-10-03 | 2015-10-13 | Furuno Electric Co., Ltd. | Device having touch panel, radar apparatus, plotter apparatus, ship network system, information display method and information display program |
CN104011772B (zh) * | 2011-10-19 | 2017-02-15 | Crown Equipment Corporation | Controlling truck forks based on identifying and tracking multiple objects in an image scene |
KR101887055B1 (ko) * | 2011-11-14 | 2018-09-11 | Samsung Electronics Co., Ltd. | Robot cleaner and control method therefor |
US20130226373A1 (en) * | 2012-02-27 | 2013-08-29 | Ge Aviation Systems Llc | Methods for in-flight adjusting of a flight plan |
CN102629385B (zh) * | 2012-02-28 | 2014-09-24 | Sun Yat-sen University | Target matching and tracking system and method based on multi-camera information fusion |
NO334183B1 (no) * | 2012-03-22 | 2014-01-13 | Prox Dynamics As | Method and device for controlling and monitoring the surrounding area of an unmanned aerial vehicle |
US9841761B2 (en) | 2012-05-04 | 2017-12-12 | Aeryon Labs Inc. | System and method for controlling unmanned aerial vehicles |
CN202758243U (zh) * | 2012-09-06 | 2013-02-27 | Beijing University of Technology | UAV flight control system |
JP2014063411A (ja) * | 2012-09-24 | 2014-04-10 | Casio Comput Co Ltd | Remote control system, control method, and program |
CN102902282B (zh) * | 2012-09-25 | 2015-08-12 | No. 205 Research Institute of China Ordnance Industries | Geographic tracking method based on coincidence of the optical axis and the inertial axis |
CN102955478B (zh) * | 2012-10-24 | 2016-01-20 | Shenzhen AEE Technology Co., Ltd. | UAV flight control method and system |
CN102967305B (zh) * | 2012-10-26 | 2015-07-01 | Nanjing University of Information Science and Technology | Pose acquisition method for multi-rotor UAVs based on large and small concentric-square markers |
CN103914068A (zh) * | 2013-01-07 | 2014-07-09 | Second Artillery Engineering University of the Chinese People's Liberation Army | Autonomous navigation method for service robots based on grid maps |
GB201301748D0 (en) * | 2013-01-31 | 2013-03-20 | Reiter Johannes | Aircraft for Vertical Take-off and Landing with a Wing Arrangement comprising an extendible lift increasing system |
CN103149939B (zh) * | 2013-02-26 | 2015-10-21 | Beihang University | Vision-based dynamic target tracking and localization method for UAVs |
US9367067B2 (en) * | 2013-03-15 | 2016-06-14 | Ashley A Gilmore | Digital tethering for tracking with autonomous aerial robot |
JP2014212479A (ja) * | 2013-04-19 | 2014-11-13 | Sony Corporation | Control device, control method and computer program |
US9253410B2 (en) | 2013-04-25 | 2016-02-02 | Canon Kabushiki Kaisha | Object detection apparatus, control method therefor, image capturing apparatus, and storage medium |
JP6250952B2 (ja) * | 2013-04-26 | 2017-12-20 | Furuno Electric Co., Ltd. | Information display device and course setting method |
CN104217417B (zh) * | 2013-05-31 | 2017-07-07 | Zhang Weiwei | Method and device for multi-target tracking in video |
US9715761B2 (en) | 2013-07-08 | 2017-07-25 | Vangogh Imaging, Inc. | Real-time 3D computer vision processing engine for object recognition, reconstruction, and analysis |
JP6296801B2 (ja) | 2013-07-24 | 2018-03-20 | Canon Inc. | Imaging device, control method for the imaging device, and control program for the imaging device |
US9237318B2 (en) | 2013-07-26 | 2016-01-12 | SkyBell Technologies, Inc. | Doorbell communication systems and methods |
CN103455797B (zh) * | 2013-09-07 | 2017-01-11 | Xidian University | Detection and tracking method for small moving targets in aerial video |
CN103530894B (zh) * | 2013-10-25 | 2016-04-20 | Hefei University of Technology | Video target tracking method based on multi-scale block sparse representation, and system therefor |
US10248886B2 (en) * | 2013-10-30 | 2019-04-02 | Pgs Geophysical As | System and method for underwater distance measurement |
CN103576692A (zh) * | 2013-11-07 | 2014-02-12 | Harbin Engineering University | Cooperative flight method for multiple UAVs |
CN103611324B (zh) * | 2013-11-14 | 2016-08-17 | Nanjing University of Aeronautics and Astronautics | Unmanned helicopter flight control system and control method therefor |
CN106030430A (zh) * | 2013-11-27 | 2016-10-12 | The Trustees of the University of Pennsylvania | Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments using rotorcraft micro aerial vehicles (MAVs) |
JP6429454B2 (ja) | 2013-11-28 | 2018-11-28 | Canon Inc. | Imaging device, control method for the imaging device, and control program for the imaging device |
CN103871075B (zh) * | 2013-12-30 | 2015-11-18 | Huazhong University of Science and Technology | Method for estimating relative motion of the Earth background for remote sensing satellites in highly elliptical orbits |
CN103822615B (zh) * | 2014-02-25 | 2016-01-20 | Beihang University | Real-time UAV ground target localization method with automatic extraction and aggregation of multiple control points |
WO2015127535A1 (fr) | 2014-02-26 | 2015-09-03 | Searidge Technologies Inc. | Image stitching and automatic color correction |
WO2015134391A1 (fr) | 2014-03-03 | 2015-09-11 | University Of Washington | Haptic virtual lighting tools |
US9165361B1 (en) | 2014-03-13 | 2015-10-20 | Raytheon Company | Video tracking with jitter, slewing, or zoom |
US9414153B2 (en) | 2014-05-08 | 2016-08-09 | Panasonic Intellectual Property Management Co., Ltd. | Directivity control apparatus, directivity control method, storage medium and directivity control system |
CN104035446B (zh) * | 2014-05-30 | 2017-08-25 | SZ DJI Technology Co., Ltd. | Heading generation method and system for unmanned aerial vehicles |
US9798324B2 (en) * | 2014-07-18 | 2017-10-24 | Helico Aerospace Industries Sia | Autonomous vehicle operation |
WO2016015251A1 (fr) * | 2014-07-30 | 2016-02-04 | SZ DJI Technology Co., Ltd. | Systems and methods for target tracking |
US10139819B2 (en) | 2014-08-22 | 2018-11-27 | Innovative Signal Analysis, Inc. | Video enabled inspection using unmanned aerial vehicles |
CN104197928B (zh) * | 2014-08-29 | 2017-01-18 | Northwestern Polytechnical University | Multi-camera cooperative UAV detection, localization and tracking method |
KR101940936B1 (ko) * | 2014-09-11 | 2019-01-21 | CyberOptics Corporation | Point cloud merging from multiple cameras and light sources in three-dimensional shape measurement |
CN104299244B (zh) * | 2014-09-26 | 2017-07-25 | Neusoft Corporation | Obstacle detection method and apparatus based on a monocular camera |
US9355320B2 (en) * | 2014-10-30 | 2016-05-31 | Toyota Motor Engineering & Manufacturing North America, Inc. | Blur object tracker using group lasso method and apparatus |
US10334158B2 (en) | 2014-11-03 | 2019-06-25 | Robert John Gove | Autonomous media capturing |
JP6222879B2 (ja) | 2014-11-14 | 2017-11-01 | SZ DJI Technology Co., Ltd. | Control method and apparatus for a mobile body, and mobile device |
CN104361770B (zh) * | 2014-11-18 | 2017-01-04 | Wuhan University of Technology | Automatic precision landing control method for traffic information acquisition UAVs |
CN204360218U (zh) | 2014-12-31 | 2015-05-27 | SZ DJI Technology Co., Ltd. | Movable object |
DK3164774T3 (da) | 2014-12-31 | 2021-02-08 | Sz Dji Technology Co Ltd | Vehicle altitude restrictions and control |
CN104537898B (zh) * | 2015-01-08 | 2017-11-28 | Northwestern Polytechnical University | Air-ground cooperative UAV sense-and-avoid system and avoidance method therefor |
US9373036B1 (en) | 2015-01-16 | 2016-06-21 | Toyota Motor Engineering & Manufacturing North America, Inc. | Collaborative distance metric learning for method and apparatus visual tracking |
CN104656665B (zh) * | 2015-03-06 | 2017-07-28 | Electric Power Research Institute of Yunnan Power Grid Co., Ltd. | Novel general-purpose UAV obstacle-avoidance module and procedure |
CN104714556B (zh) * | 2015-03-26 | 2017-08-11 | Tsinghua University | Intelligent heading control method for UAVs |
CN104786226A (zh) * | 2015-03-26 | 2015-07-22 | South China University of Technology | Robot pose and motion trajectory positioning system and method for grasping on-line workpieces |
CN104820428B (zh) * | 2015-04-20 | 2017-11-07 | Yu Jiang | Memory-based flight path reproduction method for a UAV, and device therefor |
CN104796611A (zh) * | 2015-04-20 | 2015-07-22 | Zerotech (Beijing) Intelligence Technology Co., Ltd. | Method and system for intelligent flight photography with a UAV remotely controlled from a mobile terminal |
CN106292720A (zh) * | 2015-04-21 | 2017-01-04 | Gaoyu (Beijing) Intelligent Technology Research Institute Co., Ltd. | Intelligent multi-mode flight photography device and flight control method therefor |
CN104834307A (zh) * | 2015-04-23 | 2015-08-12 | Yang Shanshan | Control method and control device for an unmanned aerial vehicle |
CN104899554A (zh) * | 2015-05-07 | 2015-09-09 | Northeastern University | Vehicle distance measurement method based on monocular vision |
CN111762136A (zh) | 2015-05-12 | 2020-10-13 | SZ DJI Technology Co., Ltd. | Apparatus and method for recognizing or detecting obstacles |
CN104853104B (zh) * | 2015-06-01 | 2018-08-28 | Shenzhen Weidui Information Technology Co., Ltd. | Method and system for automatically tracking and photographing a moving target |
CN104850134B (zh) * | 2015-06-12 | 2019-01-11 | Beijing Zhongfei Aiwei Aviation Technology Co., Ltd. | High-precision autonomous obstacle-avoidance flight method for UAVs |
CN107735290B (zh) | 2015-06-19 | 2021-05-04 | Nissan Motor Co., Ltd. | Parking assistance device and parking assistance method |
CN105371818A (zh) * | 2015-11-30 | 2016-03-02 | Hubei Ewatt Technology Co., Ltd. | Ranging obstacle-avoidance instrument and UAV ranging and obstacle-avoidance method |
US10012982B2 (en) | 2015-12-11 | 2018-07-03 | Fuji Xerox Co., Ltd. | System and method for focus and context views for telepresence and robotic teleoperation |
US10514711B2 (en) * | 2016-10-09 | 2019-12-24 | Airspace Systems, Inc. | Flight control using computer vision |
KR20180051996A (ko) * | 2016-11-09 | 2018-05-17 | Samsung Electronics Co., Ltd. | Unmanned flying device and method for photographing a subject using the same |
CN107636550A (zh) * | 2016-11-10 | 2018-01-26 | SZ DJI Technology Co., Ltd. | Flight control method and apparatus, and aircraft |
WO2018094741A1 (fr) * | 2016-11-28 | 2018-05-31 | SZ DJI Technology Co., Ltd. | Flight path editing method and apparatus, and control device |
JP6844235B2 (ja) * | 2016-12-08 | 2021-03-17 | Fujitsu Limited | Distance measuring device and distance measuring method |
US10409276B2 (en) * | 2016-12-21 | 2019-09-10 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for controller-free user drone interaction |
CN107102647A (zh) * | 2017-03-30 | 2017-08-29 | Qingdao Campus, Naval Aeronautical Engineering Institute of the Chinese People's Liberation Army | Image-based UAV target tracking control method |
WO2019100011A1 (fr) * | 2017-11-17 | 2019-05-23 | Divine Logic, Inc. | Systems and methods for tracking items |
US10429487B1 (en) * | 2018-05-18 | 2019-10-01 | Here Global B.V. | Drone localization |
- 2015
- 2015-09-15 EP EP15903807.4A patent/EP3353706A4/fr not_active Withdrawn
- 2015-09-15 CN CN201580074398.6A patent/CN107209854A/zh active Pending
- 2015-09-15 WO PCT/CN2015/089594 patent/WO2017045116A1/fr active Application Filing
- 2015-10-30 EP EP15903939.5A patent/EP3374836A4/fr not_active Ceased
- 2015-10-30 US US16/067,577 patent/US20190011921A1/en not_active Abandoned
- 2015-10-30 CN CN202210224021.6A patent/CN114594792A/zh active Pending
- 2015-10-30 CN CN201580084348.6A patent/CN108351649B/zh active Active
- 2015-10-30 WO PCT/CN2015/093459 patent/WO2017045251A1/fr active Application Filing
- 2015-12-31 CN CN201580060689.XA patent/CN107148639B/zh not_active Expired - Fee Related
- 2015-12-31 WO PCT/CN2015/100257 patent/WO2017045315A1/fr active Application Filing
- 2015-12-31 CN CN201910552106.5A patent/CN110276786B/zh not_active Expired - Fee Related
- 2016
- 2016-02-29 CN CN202210304319.8A patent/CN114815906A/zh active Pending
- 2016-02-29 CN CN201680060804.8A patent/CN108139759B/zh active Active
- 2016-02-29 JP JP2018521078A patent/JP6735821B2/ja not_active Expired - Fee Related
- 2016-12-30 US US15/396,022 patent/US10129478B2/en active Active
- 2018
- 2018-03-15 US US15/922,023 patent/US10928838B2/en active Active
- 2018-11-12 US US16/188,144 patent/US10976753B2/en active Active
- 2020
- 2020-08-12 US US16/991,739 patent/US11635775B2/en active Active
- 2021
- 2021-04-07 US US17/224,391 patent/US20210223795A1/en active Pending
Cited By (108)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11573562B2 (en) | 2014-06-19 | 2023-02-07 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
US12007763B2 (en) | 2014-06-19 | 2024-06-11 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
US11347217B2 (en) | 2014-06-19 | 2022-05-31 | Skydio, Inc. | User interaction paradigms for a flying digital assistant |
US11644832B2 (en) | 2014-06-19 | 2023-05-09 | Skydio, Inc. | User interaction paradigms for a flying digital assistant |
US10576968B2 (en) * | 2014-08-27 | 2020-03-03 | Renesas Electronics Corporation | Control system, relay device and control method |
US10656650B2 (en) * | 2015-01-09 | 2020-05-19 | Korean Air Lines Co., Ltd. | Method for guiding and controlling drone using information for controlling camera of drone |
US11987383B2 (en) | 2015-10-31 | 2024-05-21 | Snap Inc. | Lighting apparatus for remote controlled device |
US10845188B2 (en) * | 2016-01-05 | 2020-11-24 | Microsoft Technology Licensing, Llc | Motion capture from a mobile self-tracking device |
US20170193666A1 (en) * | 2016-01-05 | 2017-07-06 | Microsoft Technology Licensing, Llc | Motion capture from a mobile self-tracking device |
US20190073025A1 (en) * | 2016-03-07 | 2019-03-07 | Sensomotoric Instruments Gesellschaft Fur Innovative Sensorik Mbh | Method and device for carrying out eye gaze mapping |
US11579686B2 (en) * | 2016-03-07 | 2023-02-14 | Apple Inc. | Method and device for carrying out eye gaze mapping |
US20170285631A1 (en) * | 2016-03-31 | 2017-10-05 | Unmanned Innovation, Inc. | Unmanned aerial vehicle modular command priority determination and filtering system |
US11874656B2 (en) * | 2016-03-31 | 2024-01-16 | Skydio, Inc. | Unmanned aerial vehicle modular command priority determination and filtering system |
US11117662B2 (en) * | 2016-04-18 | 2021-09-14 | Autel Robotics Co., Ltd. | Flight direction display method and apparatus, and unmanned aerial vehicle |
US11084581B2 (en) * | 2016-04-29 | 2021-08-10 | Lg Electronics Inc. | Mobile terminal and control method therefor |
US20230139242A1 (en) * | 2016-05-25 | 2023-05-04 | Siemens Mobility GmbH | Method, device and arrangement for tracking moving objects |
US11972683B2 (en) * | 2016-05-25 | 2024-04-30 | Yunex Gmbh | Method, device and arrangement for tracking moving objects |
US20170359561A1 (en) * | 2016-06-08 | 2017-12-14 | Uber Technologies, Inc. | Disparity mapping for an autonomous vehicle |
US11126206B2 (en) | 2016-06-30 | 2021-09-21 | Snap Inc. | Motion and image-based control system |
US11404056B1 (en) * | 2016-06-30 | 2022-08-02 | Snap Inc. | Remoteless control of drone behavior |
US11892859B2 (en) | 2016-06-30 | 2024-02-06 | Snap Inc. | Remoteless control of drone behavior |
US11720126B2 (en) | 2016-06-30 | 2023-08-08 | Snap Inc. | Motion and image-based control system |
US10768639B1 (en) | 2016-06-30 | 2020-09-08 | Snap Inc. | Motion and image-based control system |
US11107506B2 (en) * | 2016-07-08 | 2021-08-31 | SZ DJI Technology Co., Ltd. | Method and system for combining and editing UAV operation data and video data |
US11797009B2 (en) | 2016-08-12 | 2023-10-24 | Skydio, Inc. | Unmanned aerial image capture platform |
US11126182B2 (en) | 2016-08-12 | 2021-09-21 | Skydio, Inc. | Unmanned aerial image capture platform |
US11460844B2 (en) | 2016-08-12 | 2022-10-04 | Skydio, Inc. | Unmanned aerial image capture platform |
US10538326B1 (en) * | 2016-08-31 | 2020-01-21 | Amazon Technologies, Inc. | Flare detection and avoidance in stereo vision systems |
US20190004524A1 (en) * | 2016-08-31 | 2019-01-03 | Faraday&Future Inc. | System and method for planning a vehicle path |
US20190335153A1 (en) * | 2016-11-17 | 2019-10-31 | Nokia Technologies Oy | Method for multi-camera device |
US11498676B2 (en) * | 2016-11-28 | 2022-11-15 | Guangzhou Xaircraft Technology Co., Ltd. | Method and apparatus for controlling flight of unmanned aerial vehicle |
US20220309687A1 (en) * | 2016-12-01 | 2022-09-29 | Skydio, Inc. | Object Tracking By An Unmanned Aerial Vehicle Using Visual Sensors |
US11295458B2 (en) * | 2016-12-01 | 2022-04-05 | Skydio, Inc. | Object tracking by an unmanned aerial vehicle using visual sensors |
US11861892B2 (en) * | 2016-12-01 | 2024-01-02 | Skydio, Inc. | Object tracking by an unmanned aerial vehicle using visual sensors |
US20180155023A1 (en) * | 2016-12-05 | 2018-06-07 | Samsung Electronics Co., Ltd | Flight control method and electronic device for supporting the same |
US10800522B2 (en) * | 2016-12-05 | 2020-10-13 | Samsung Electronics Co., Ltd. | Flight control method and electronic device for supporting the same |
US11423792B2 (en) * | 2017-08-10 | 2022-08-23 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for obstacle avoidance in aerial systems |
US10515560B2 (en) * | 2017-08-10 | 2019-12-24 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for obstacle avoidance in aerial systems |
US20190066524A1 (en) * | 2017-08-10 | 2019-02-28 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for obstacle avoidance in aerial systems |
US12073577B2 (en) | 2017-09-15 | 2024-08-27 | Snap Inc. | Computing a point cloud from stitched images |
US11753142B1 (en) | 2017-09-29 | 2023-09-12 | Snap Inc. | Noise modulation for unmanned aerial vehicles |
US11531357B1 (en) | 2017-10-05 | 2022-12-20 | Snap Inc. | Spatial vector-based drone control |
US10962650B2 (en) | 2017-10-31 | 2021-03-30 | United States Of America As Represented By The Administrator Of Nasa | Polyhedral geofences |
US12079011B2 (en) | 2017-11-04 | 2024-09-03 | Farmx, Inc. | System and method for perceptive navigation of automated vehicles |
US10921825B2 (en) * | 2017-11-04 | 2021-02-16 | Automodality, Inc. | System and method for perceptive navigation of automated vehicles |
US11726501B2 (en) | 2017-11-04 | 2023-08-15 | Automodality, Inc. | System and method for perceptive navigation of automated vehicles |
US10967862B2 (en) | 2017-11-07 | 2021-04-06 | Uatc, Llc | Road anomaly detection for autonomous vehicle |
US11731627B2 (en) | 2017-11-07 | 2023-08-22 | Uatc, Llc | Road anomaly detection for autonomous vehicle |
US20190158755A1 (en) * | 2017-11-20 | 2019-05-23 | Chiun Mai Communication Systems, Inc. | Aerial vehicle and target object tracking method |
US20190154871A1 (en) * | 2017-11-21 | 2019-05-23 | Reliance Core Consulting LLC | Methods and systems for detecting motion corresponding to a field of interest |
US20190156496A1 (en) * | 2017-11-21 | 2019-05-23 | Reliance Core Consulting LLC | Methods, systems, apparatuses and devices for facilitating motion analysis in an environment |
US10921484B2 (en) * | 2017-11-21 | 2021-02-16 | Reliance Core Consulting | Methods and systems for detecting motion corresponding to a field of interest |
US10867398B2 (en) * | 2017-11-21 | 2020-12-15 | Reliance Core Consulting LLC | Methods, systems, apparatuses and devices for facilitating motion analysis in an environment |
US11634223B2 (en) * | 2017-11-27 | 2023-04-25 | Beijing Jingdong Qianshi Technology Co., Ltd. | Unmanned aerial vehicle control method, unmanned aerial vehicle control device, and computer readable storage medium |
US20210171193A1 (en) * | 2017-11-27 | 2021-06-10 | Beijing Jingdong Shangke Information Technology Co., Ltd. | Unmanned aerial vehicle control method, unmanned aerial vehicle control device, and computer readable storage medium |
US20190197335A1 (en) * | 2017-12-25 | 2019-06-27 | Autel Robotics Co., Ltd. | Distance measurement method and apparatus, and unmanned aerial vehicle |
US10621456B2 (en) * | 2017-12-25 | 2020-04-14 | Autel Robotics Co., Ltd. | Distance measurement method and apparatus, and unmanned aerial vehicle |
US11822346B1 (en) | 2018-03-06 | 2023-11-21 | Snap Inc. | Systems and methods for estimating user intent to launch autonomous aerial vehicle |
US11321942B2 (en) * | 2018-03-23 | 2022-05-03 | Guangzhou Xaircraft Technology Co., Ltd. | Method for measuring plant planting data, device and system |
US12014533B2 (en) * | 2018-04-03 | 2024-06-18 | Carnegie Mellon University | Methods and systems for real or near real-time point cloud map data confidence evaluation |
US20210025998A1 (en) * | 2018-04-03 | 2021-01-28 | Kaarta, Inc. | Methods and systems for real or near real-time point cloud map data confidence evaluation |
US20210009270A1 (en) * | 2018-04-04 | 2021-01-14 | SZ DJI Technology Co., Ltd. | Methods and system for composing and capturing images |
US11145078B2 (en) * | 2018-04-04 | 2021-10-12 | Tencent Technology (Shenzhen) Company Limited | Depth information determining method and related apparatus |
US11277528B2 (en) * | 2018-05-14 | 2022-03-15 | Fujifilm Corporation | Mobile type apparatus and imaging system |
US20190362473A1 (en) * | 2018-05-23 | 2019-11-28 | International Business Machines Corporation | Selectively Redacting Unrelated Objects from Images of a Group Captured within a Coverage Area |
US10839492B2 (en) * | 2018-05-23 | 2020-11-17 | International Business Machines Corporation | Selectively redacting unrelated objects from images of a group captured within a coverage area |
US11417088B2 (en) * | 2018-06-15 | 2022-08-16 | Sony Corporation | Information processing device, information processing method, program, and information processing system |
US11801937B2 (en) * | 2018-07-26 | 2023-10-31 | California Institute Of Technology | Systems and methods for avian flock flight path modification using UAVs |
US11282225B2 (en) | 2018-09-10 | 2022-03-22 | Mapbox, Inc. | Calibration for vision in navigation systems |
US20200090501A1 (en) * | 2018-09-19 | 2020-03-19 | International Business Machines Corporation | Accident avoidance system for pedestrians |
US11036216B2 (en) * | 2018-09-26 | 2021-06-15 | International Business Machines Corporation | Voice-controllable unmanned aerial vehicle for object retrieval and delivery |
US11443639B2 (en) * | 2018-09-30 | 2022-09-13 | Moutong Science And Technology Co., Ltd | Methods of generating a unmanned aerial vehicle migration trajectory, electronic devices and storage mediums |
US10926416B2 (en) * | 2018-11-21 | 2021-02-23 | Ford Global Technologies, Llc | Robotic manipulation using an independently actuated vision system, an adversarial control scheme, and a multi-tasking deep learning architecture |
US20200156255A1 (en) * | 2018-11-21 | 2020-05-21 | Ford Global Technologies, Llc | Robotic manipulation using an independently actuated vision system, an adversarial control scheme, and a multi-tasking deep learning architecture |
US10902634B2 (en) * | 2018-12-04 | 2021-01-26 | Here Global B.V. | Method and apparatus for providing feature triangulation |
US11631337B2 (en) * | 2018-12-27 | 2023-04-18 | Subaru Corporation | Optimal-route generating system |
US20200211404A1 (en) * | 2018-12-27 | 2020-07-02 | Subaru Corporation | Optimal-route generating system |
EP3941693A4 (fr) * | 2019-03-20 | 2022-11-30 | Covidien LP | Robotic surgical collision detection systems |
US11314254B2 (en) * | 2019-03-26 | 2022-04-26 | Intel Corporation | Methods and apparatus for dynamically routing robots based on exploratory on-board mapping |
US12071228B1 (en) * | 2019-03-28 | 2024-08-27 | Snap Inc. | Drone with propeller guard configured as an airfoil |
US11565807B1 (en) | 2019-06-05 | 2023-01-31 | Gal Zuckerman | Systems and methods facilitating street-level interactions between flying drones and on-road vehicles |
US20220137648A1 (en) * | 2019-06-14 | 2022-05-05 | Autel Robotics Co., Ltd. | Method and apparatus for tracking moving target and unmanned aerial vehicle |
US12007794B2 (en) * | 2019-06-14 | 2024-06-11 | Autel Robotics Co., Ltd. | Method and apparatus for tracking moving target and unmanned aerial vehicle |
EP3989118A4 (fr) * | 2019-06-28 | 2022-06-29 | SZ DJI Technology Co., Ltd. | Target tracking method and system, readable storage medium, and mobile platform |
US11748968B2 (en) | 2019-06-28 | 2023-09-05 | SZ DJI Technology Co., Ltd. | Target tracking method and system, readable storage medium, and mobile platform |
US20210405646A1 (en) * | 2019-07-03 | 2021-12-30 | Lg Electronics Inc. | Marker, method of moving in marker following mode, and cart-robot implementing method |
EP3998578A4 (fr) * | 2019-07-16 | 2022-07-20 | SZ DJI Technology Co., Ltd. | Photographing method, device and system, and computer-readable storage medium |
WO2021052893A1 (fr) * | 2019-09-17 | 2021-03-25 | Atlas Elektronik Gmbh | Optical detection of mines in shallow water |
US11958183B2 (en) | 2019-09-19 | 2024-04-16 | The Research Foundation For The State University Of New York | Negotiation-based human-robot collaboration via augmented reality |
US10846880B1 (en) * | 2020-01-03 | 2020-11-24 | Altec Industries, Inc. | Camera embedded joystick |
US20210206491A1 (en) * | 2020-01-03 | 2021-07-08 | Tencent America LLC | Unmanned aerial system communication |
US20230018021A1 (en) * | 2020-03-31 | 2023-01-19 | SZ DJI Technology Co., Ltd. | Movable platform control method and device, movable platform and storage medium |
CN111600644A (zh) * | 2020-04-09 | 2020-08-28 | 西安理工大学 | Method for generating an optimal rigid topology for ultraviolet-light-assisted UAV formations |
WO2021213737A1 (fr) * | 2020-04-22 | 2021-10-28 | Siemens Aktiengesellschaft | Automatic navigation system for a firefighting robot |
CN111596692A (zh) * | 2020-06-09 | 2020-08-28 | 北京航空航天大学 | Method and system for controlling a stratospheric airship to circle and track a moving target |
US11457139B2 (en) | 2020-09-09 | 2022-09-27 | Beijing Xiaomi Mobile Software Co., Ltd. | Photography method, electronic device, and storage medium |
EP3968626A1 (fr) * | 2020-09-09 | 2022-03-16 | Beijing Xiaomi Mobile Software Co., Ltd. | Photographing method, photographing apparatus, electronic device, and storage medium |
TWI771192B (zh) * | 2020-09-24 | 2022-07-11 | 大陸商深圳市海柔創新科技有限公司 | Task processing method, control terminal, robot, warehousing system, and storage medium |
WO2022104489A1 (fr) * | 2020-11-20 | 2022-05-27 | Drovid Technologies | Method for transmitting and tracking parameters detected by drones via (PaaS) with (AI) |
WO2022136355A1 (fr) * | 2020-12-22 | 2022-06-30 | Naval Group | System for planning an optimized trajectory of a maritime vehicle |
WO2022153065A1 (fr) * | 2021-01-18 | 2022-07-21 | Hybrid Drones Limited | Vision system and method for unmanned aerial vehicles |
US20220390965A1 (en) * | 2021-06-02 | 2022-12-08 | FLIR Unmanned Aerial Systems AS | Mobile platform vision sensor systems and methods |
US20220390940A1 (en) * | 2021-06-02 | 2022-12-08 | Skydio, Inc. | Interfaces And Control Of Aerial Vehicle For Automated Multidimensional Volume Scanning |
US11791049B2 (en) * | 2021-11-03 | 2023-10-17 | A Little Cold Gel, Llc | Methods and systems for detecting intravascular device failure |
US20230138206A1 (en) * | 2021-11-03 | 2023-05-04 | Amit Bahl | Methods and systems for detecting intravascular device failure |
KR102722952B1 (ko) | 2022-03-31 | 2024-10-29 | 주식회사 휴인스 | Drone capable of AI-based autonomous indoor flight |
US20230367463A1 (en) * | 2022-05-11 | 2023-11-16 | Supercell Oy | Randomized movement control |
US11972521B2 (en) | 2022-08-31 | 2024-04-30 | Snap Inc. | Multisensorial presentation of volumetric content |
Also Published As
Publication number | Publication date |
---|---|
CN108351649B (zh) | 2022-03-18 |
CN114815906A (zh) | 2022-07-29 |
CN107148639A (zh) | 2017-09-08 |
CN114594792A (zh) | 2022-06-07 |
WO2017045315A1 (fr) | 2017-03-23 |
US20190082088A1 (en) | 2019-03-14 |
EP3374836A4 (fr) | 2018-12-05 |
JP6735821B2 (ja) | 2020-08-05 |
US20210223795A1 (en) | 2021-07-22 |
US10129478B2 (en) | 2018-11-13 |
US20170134631A1 (en) | 2017-05-11 |
CN108139759A (zh) | 2018-06-08 |
CN107209854A (zh) | 2017-09-26 |
WO2017045116A1 (fr) | 2017-03-23 |
EP3374836A1 (fr) | 2018-09-19 |
CN108351649A (zh) | 2018-07-31 |
EP3353706A1 (fr) | 2018-08-01 |
JP2018535487A (ja) | 2018-11-29 |
WO2017045251A1 (fr) | 2017-03-23 |
CN110276786B (zh) | 2021-08-20 |
CN108139759B (zh) | 2022-04-15 |
EP3353706A4 (fr) | 2019-05-08 |
US20210116943A1 (en) | 2021-04-22 |
US20180203467A1 (en) | 2018-07-19 |
US10928838B2 (en) | 2021-02-23 |
CN110276786A (zh) | 2019-09-24 |
US10976753B2 (en) | 2021-04-13 |
US11635775B2 (en) | 2023-04-25 |
CN107148639B (zh) | 2019-07-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11635775B2 (en) | Systems and methods for UAV interactive instructions and control | |
US20210072745A1 (en) | Systems and methods for uav flight control | |
US20210116944A1 (en) | Systems and methods for uav path planning and control | |
US11263761B2 (en) | Systems and methods for visual target tracking | |
US11704812B2 (en) | Methods and system for multi-target tracking | |
US10650235B2 (en) | Systems and methods for detecting and tracking movable objects | |
US11106203B2 (en) | Systems and methods for augmented stereoscopic display |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: SZ DJI TECHNOLOGY CO., LTD., CHINA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WANG, MINGYU;ZHAO, TAO;YU, YUN;AND OTHERS;SIGNING DATES FROM 20180802 TO 20180821;REEL/FRAME:047868/0769
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |