EP3452881B1 - Imaging using multiple unmanned aerial vehicles - Google Patents
Imaging using multiple unmanned aerial vehicles
- Publication number
- EP3452881B1
- Authority
- EP
- European Patent Office
- Prior art keywords
- uav
- camera
- target
- light
- processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000003384 imaging method Methods 0.000 title claims description 35
- 238000005286 illumination Methods 0.000 claims description 36
- 238000000034 method Methods 0.000 claims description 26
- 230000003213 activating effect Effects 0.000 claims description 13
- 238000003032 molecular docking Methods 0.000 claims description 9
- 230000000007 visual effect Effects 0.000 claims description 8
- 238000002329 infrared spectrum Methods 0.000 claims description 4
- 238000001931 thermography Methods 0.000 claims description 4
- 238000004891 communication Methods 0.000 description 58
- 230000001413 cellular effect Effects 0.000 description 10
- 230000008569 process Effects 0.000 description 7
- 230000004044 response Effects 0.000 description 7
- 230000007246 mechanism Effects 0.000 description 6
- 230000008859 change Effects 0.000 description 5
- 230000006870 function Effects 0.000 description 5
- 230000002093 peripheral effect Effects 0.000 description 5
- 230000004913 activation Effects 0.000 description 4
- 238000004422 calculation algorithm Methods 0.000 description 4
- 238000010586 diagram Methods 0.000 description 4
- 230000001360 synchronised effect Effects 0.000 description 4
- 230000003287 optical effect Effects 0.000 description 3
- 238000012545 processing Methods 0.000 description 3
- 230000001133 acceleration Effects 0.000 description 2
- 230000008878 coupling Effects 0.000 description 2
- 238000010168 coupling process Methods 0.000 description 2
- 238000005859 coupling reaction Methods 0.000 description 2
- 238000003306 harvesting Methods 0.000 description 2
- 238000011022 operating instruction Methods 0.000 description 2
- 230000006641 stabilisation Effects 0.000 description 2
- 239000000126 substance Substances 0.000 description 2
- 230000000153 supplemental effect Effects 0.000 description 2
- 230000005540 biological transmission Effects 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 238000004590 computer program Methods 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000004146 energy storage Methods 0.000 description 1
- 230000005484 gravity Effects 0.000 description 1
- 230000000977 initiatory effect Effects 0.000 description 1
- 238000007689 inspection Methods 0.000 description 1
- 230000003993 interaction Effects 0.000 description 1
- 239000000463 material Substances 0.000 description 1
- 239000002184 metal Substances 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000003019 stabilising effect Effects 0.000 description 1
- 239000003381 stabilizer Substances 0.000 description 1
- 230000007704 transition Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/104—Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/02—Arrangements or adaptations of signal or lighting devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/006—Apparatus mounted on flying objects
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/02—Illuminating scene
- G03B15/03—Combinations of cameras with lighting apparatus; Flash units
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/02—Illuminating scene
- G03B15/03—Combinations of cameras with lighting apparatus; Flash units
- G03B15/05—Combinations of cameras with electronic flash apparatus; Electronic flash units
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0094—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U20/00—Constructional aspects of UAVs
- B64U20/80—Arrangement of on-board electronics, e.g. avionics systems or wiring
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/20—Remote controls
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U30/00—Means for producing lift; Empennages; Arrangements thereof
- B64U30/20—Rotors; Rotor supports
Definitions
- Unmanned aerial vehicles (UAVs), also referred to as "drones," are used for aerial photography and/or video surveillance.
- In poor daylight or low/no ambient light conditions, UAV cameras often depend on a built-in flash or an onboard light as a primary light source. However, the illumination provided by such lighting declines with the distance between the target and the light source. While stationary remote lights may be used to illuminate a scene, such lights require advance setup, and cannot be easily reconfigured or moved.
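- The decline of illumination with distance noted above follows the inverse-square law. The following Python sketch is illustrative only and is not part of the patent; the function and variable names are assumptions.

```python
# Illustrative sketch of the inverse-square law that motivates moving the
# light source closer to the target than the camera.
def illuminance(intensity_cd: float, distance_m: float) -> float:
    """Illuminance (lux) at distance_m from a point source of
    luminous intensity intensity_cd (candela): E = I / d**2."""
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    return intensity_cd / distance_m ** 2

# A light kept on the camera UAV 20 m from the target delivers only
# 1/16th of the illuminance the same light delivers from 5 m away.
print(illuminance(1000.0, 20.0))  # 2.5 lux
print(illuminance(1000.0, 5.0))   # 40.0 lux
```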
- Systems, devices, and methods of various embodiments include an aerial imaging system including a first unmanned aerial vehicle (UAV) and a second UAV in which the first UAV includes a camera and may be configured to receive input from an operator.
- the second UAV may be configured to dock with and deploy from the first UAV, and may include a light configured to provide illumination for the camera.
- the first UAV may be configured to fly while the second UAV is docked on the first UAV, and the second UAV may be configured to deploy and fly independently from the first UAV.
- the first UAV may include a processor configured to determine a position of the second UAV from a first image.
- the processor may be further configured to determine the position of the second UAV from camera images.
- the processor may be configured to determine an aerial position of the second UAV flying separate from the first UAV based on a comparison of camera images and images received from the second UAV.
- the second UAV may be configured to fly separate from the first UAV to a predetermined aerial position relative to the first UAV without input from an operator of the first UAV.
- the second UAV may include a microphone configured to record sounds.
- the second UAV may be configured to use signals received from a proximity sensor to maintain a determined aerial position of the second UAV relative to a target of photography by the camera.
- the second UAV may include a processor configured to recognize a target of photography by the camera in images obtained from an image capture device on the second UAV, and to maintain an aerial position relative to a target of photography by the camera.
- the second UAV may be controlled to fly to a position relative to a target of photography by the camera in order to provide a determined amount of illumination of the target of photography.
- the amount of illumination provided by the light on the second UAV may be adjustable by changing the aerial position of the second UAV or changing a level of light emitted from the second UAV.
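- The two adjustment mechanisms described above (repositioning the second UAV or changing its emitted light level) can be sketched as follows. This is an illustrative sketch only; the names and units are assumptions, not part of the claims.

```python
import math

def distance_for_illuminance(intensity_cd: float, target_lux: float) -> float:
    """Distance (m) at which a source of intensity_cd candela delivers
    target_lux lux, from E = I / d**2  =>  d = sqrt(I / E)."""
    return math.sqrt(intensity_cd / target_lux)

def intensity_for_illuminance(distance_m: float, target_lux: float) -> float:
    """Light output (cd) needed to deliver target_lux at a fixed
    distance_m -- the alternative to repositioning the UAV."""
    return target_lux * distance_m ** 2

# Either fly the illumination UAV in to 5 m, or quadruple its output
# to achieve the same 40 lux from 10 m.
print(distance_for_illuminance(1000.0, 40.0))  # 5.0
print(intensity_for_illuminance(10.0, 40.0))   # 4000.0
```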
- the light may emit in an infrared spectrum and the camera may be configured for thermal imaging.
- a third UAV may be configured to dock with and deploy from the first UAV.
- the first UAV may be configured to fly while supporting both the second UAV and the third UAV.
- the first UAV may be configured to use images from the camera on the first UAV for controlling the second UAV.
- the camera may be configured to contemporaneously capture two or more different images.
- the camera may be configured so that the two or more different images overlap.
- Some embodiments may include deploying, from a first unmanned aerial vehicle (UAV) including a camera, a second UAV to fly separate from the first UAV, activating a light on the second UAV to illuminate a target of photography by the camera, and activating the camera to photograph the target of photography illuminated by the light.
- Some embodiments may include flying the second UAV separate from the first UAV without input from a remotely controlling operator of the first UAV.
- Some embodiments may include flying the first UAV while the second UAV is docked on the first UAV.
- Some embodiments may include activating the camera to photograph the target of photography to contemporaneously capture two or more different images, which may overlap.
- Some embodiments may include re-docking the second UAV with the first UAV after activating the camera to photograph the target of photography. Some embodiments may include activating a microphone on the second UAV for recording sounds emanating from a target of a sound recording.
- Some embodiments may include deploying a third UAV from the first UAV, and flying the first UAV while supporting both the second UAV and the third UAV. Some embodiments may include determining a position of the second UAV using camera images from the camera for controlling the second UAV. Some embodiments may include receiving by the first UAV remotely captured visual images taken by another camera on the second UAV. Some embodiments may include determining an aerial position of the second UAV flying separate from the first UAV based on comparing an onboard visual image captured by the camera on the first UAV and the remotely captured visual image taken by the other camera on the second UAV. Some embodiments may include transmitting from the first UAV to the second UAV a command for the second UAV to maintain a predetermined aerial position relative to the first UAV.
- Some embodiments may include transmitting from the first UAV to the second UAV a command for the second UAV to maintain a predetermined aerial position relative to the target of photography by the camera. Some embodiments may include receiving location information for determining an aerial position of the second UAV from the second UAV flying remote from the first UAV. Some embodiments may include determining a relative position of the second UAV relative to the first UAV. Some embodiments may include determining an amount of illumination provided by the light on the second UAV. Some embodiments may include determining an adjustment needed for the amount of illumination provided by the light on the second UAV and transmitting instructions to the second UAV for making the adjustment needed for the amount of illumination provided by the light on the second UAV.
- Further embodiments may include an aerial imaging system including a first UAV and a second UAV in which the first and second UAVs include means for performing functions of the methods summarized above. Further embodiments may include non-transitory processor-readable storage media having stored thereon processor-executable instructions configured to cause a processor of a first UAV to perform operations of the methods summarized above.
- Various embodiments include an aerial imaging system that includes at least two UAVs.
- a first unmanned aerial vehicle (UAV) includes a camera and is configured to receive input from an operator.
- a second UAV is configured to dock with and deploy from the first UAV.
- the second UAV includes a light for illuminating a target being captured as a first image by the camera on the first UAV.
- Positioning an illumination source on the second UAV enables the first UAV to capture images from a first distance while the subject or scene is illuminated by the second UAV from a second distance, which may be selected or controlled to achieve a desired or minimum level of illumination.
- the second UAV may separate from the first UAV and fly to a position closer to the subject or scene where proper or desired illumination can be achieved.
- Multiple illumination UAVs may be implemented in the aerial imaging system, enabling lighting from different directions.
- UAV refers to one of various types of unmanned aerial vehicles.
- a UAV may include an onboard computing device configured to fly and/or operate the UAV without remote operating instructions (i.e., autonomously), and/or with some remote operating instructions or updates to instructions stored in a memory, such as from a human operator or remote computing device (i.e., semi-autonomously).
- UAVs may be propelled for flight in any of a number of known ways.
- a plurality of propulsion units each including one or more rotors, may provide propulsion or lifting forces for the UAV and any payload carried by the UAV.
- UAVs may include wheels, tank-treads, or other non-aerial movement mechanisms to enable movement on the ground, on or in water, and combinations thereof.
- the UAV may be powered by one or more types of power source, such as electrical, chemical, electro-chemical, or other power reserve, which may power the propulsion units, the onboard computing device, and/or other onboard components.
- computing device is used herein to refer to an electronic device equipped with at least a processor.
- Examples of computing devices may include UAV flight control and/or mission management computers that are onboard the UAV, as well as remote computing devices communicating with the UAV that are configured to perform operations of the various embodiments.
- Remote computing devices may include wireless communication devices (e.g., cellular telephones, wearable devices, smart-phones, web-pads, tablet computers, Internet enabled cellular telephones, Wi-Fi® enabled electronic devices, personal digital assistants (PDAs), laptop computers, etc.), personal computers, and servers.
- computing devices may be configured with memory and/or storage as well as wireless communication capabilities, such as network transceiver(s) and antenna(s) configured to establish a wide area network (WAN) connection (e.g., a cellular network connection, etc.) and/or a local area network (LAN) connection (e.g., a wireless connection to the Internet via a Wi-Fi® router, etc.).
- FIG. 1 illustrates an example of an aerial imaging system 100 with a first UAV 101 and a second UAV 102 according to various embodiments.
- the first UAV 101 includes a camera 110 and is configured to receive input or instructions from an operator 5 using a wireless communication device 800 via an operator communication link 130.
- the operator 5 may initiate and control the flight of the first UAV 101, and may control the camera 110, such as for photographing a target 50.
- the second UAV 102 includes a light 120 for illuminating the target 50 being photographed by the first UAV 101.
- the first UAV 101 may include the light 120 and the second UAV 102 may include the camera 110.
- the first UAV 101 and the second UAV 102 may each include a light 120 and a camera 110.
- the second UAV 102 is configured to dock with and deploy from the first UAV 101 so that the first UAV 101 may carry the second UAV 102 to a photography location.
- the first UAV 101 may be configured to dock with and deploy from the second UAV 102 so that the second UAV 102 may carry the first UAV 101 to a photography location.
- the first UAV 101 and the second UAV 102 may be configured to dock with and deploy from a third UAV (see for example FIG. 3A ) so that the third UAV may carry the first UAV 101 and second UAV 102 to a photography location.
- the second UAV 102 may be synchronized with the first UAV 101.
- the first UAV 101 may control the navigation of the second UAV 102 and keep the first and second UAVs 101, 102 synchronized.
- Some features of the first UAV 101 and second UAV 102 are omitted from the figures, such as wiring, frame structure, power source, landing columns/gear, or other features that would be known to one of skill in the art.
- While the UAVs are illustrated as quadcopters with four rotors, the UAVs may include more or fewer than four rotors.
- the first UAV 101 and second UAV 102 may have similar or different configurations, numbers of rotors, and/or other aspects.
- the camera 110 may focus light reflected or emitted from virtually anything within a field of view 112 onto an internal light-sensitive surface for capturing one or more images. In this way, the camera 110 captures images for still and/or video photography.
- the camera 110 may be pivotally mounted on the first UAV 101 or otherwise adjustable to provide 3-axis pointing control.
- the field of view 112 includes the extent of the observable world seen by the camera 110, which extends outwardly away from the camera 110 toward infinity.
- the camera 110 may be focused on the target 50 when the target 50 is within the field of view 112 and, if applicable, within the focal limits of the camera 110.
- the field of view 112 may include nearby objects 55 and intervening objects, such as the second UAV 102.
- the target 50 may include one or more creatures and/or objects that is/are the focus of the photography, such as the three individuals shown in FIG. 1 .
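- Whether a target such as the target 50 lies within the field of view 112 can be checked with simple angular geometry. The following Python sketch is illustrative only (a 2-D horizontal check with hypothetical names), not part of the patent.

```python
import math

def in_horizontal_fov(camera_xy, heading_rad, target_xy, hfov_rad):
    """True if target_xy lies within the horizontal field of view of a
    camera at camera_xy pointing along heading_rad."""
    bearing = math.atan2(target_xy[1] - camera_xy[1],
                         target_xy[0] - camera_xy[0])
    # smallest signed angle between the camera axis and the target bearing
    off_axis = math.atan2(math.sin(bearing - heading_rad),
                          math.cos(bearing - heading_rad))
    return abs(off_axis) <= hfov_rad / 2

print(in_horizontal_fov((0, 0), 0.0, (10, 2), math.radians(60)))  # True
print(in_horizontal_fov((0, 0), 0.0, (0, 10), math.radians(60)))  # False
```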
- the light 120 on the second UAV 102 provides a light source and/or a supplemental light source to enable the source of illumination for photography (i.e., still and/or video photography) to be positioned closer to the target 50 than the camera 110 and/or to provide illumination from an angle different from that of the perspective of the camera 110.
- the second UAV 102, along with the light 120, may fly separate from the first UAV 101 and be positioned to project light from a position removed from that of the first UAV 101.
- various embodiments may use the light 120 on the second UAV 102 to create a lighted region 122.
- the lighted region 122 may enhance the illumination in part of the field of view 112 of the camera 110, thus projecting additional illumination on the target 50.
- the nearby objects 55 within the field of view 112 may not receive the same level of illumination as the target 50 if such objects are disposed outside the lighted region 122.
- the lighted region 122 projects away from the second UAV 102 (i.e., toward the target 50)
- the second UAV 102 itself, or parts thereof may not be included in the lighted region 122.
- a relative position of the target 50 is further illustrated using a first frame of reference 115, a second frame of reference 117, and a third frame of reference 125.
- the first frame of reference 115 corresponds to an imaginary planar extent bounded by the field of view 112, perpendicular to a direction in which the camera 110 is facing, and includes a focal point on the target 50.
- the second frame of reference 117 also corresponds to an imaginary planar extent bounded by the field of view 112 and perpendicular to the direction in which the camera 110 is facing, but corresponds to a location of the second UAV 102 and includes a point from which the light 120 emanates.
- the third frame of reference 125 corresponds to an imaginary planar extent that is both bounded by the lighted region 122 and within the first frame of reference 115.
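- The sizes of the planar extents corresponding to the frames of reference above follow from the camera's field-of-view angles. The following Python sketch is illustrative only; the field-of-view values are assumptions.

```python
import math

def frame_extent(distance_m: float, hfov_deg: float, vfov_deg: float):
    """Width and height (m) of the imaginary planar extent bounded by the
    field of view at distance_m from the camera: 2 * d * tan(fov / 2)."""
    width = 2 * distance_m * math.tan(math.radians(hfov_deg) / 2)
    height = 2 * distance_m * math.tan(math.radians(vfov_deg) / 2)
    return width, height

# The extent at the target (cf. frame of reference 115) is larger than
# the extent at the closer illumination UAV (cf. frame of reference 117).
print(frame_extent(20.0, 90.0, 60.0))  # ~ (40.0, 23.1)
print(frame_extent(8.0, 90.0, 60.0))   # ~ (16.0, 9.2)
```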
- the first UAV 101 and the second UAV 102 may be configured to dock with one another.
- the second UAV 102 may couple to the first UAV 101 via a coupling controlled by either of the two UAVs, or both UAVs 101, 102.
- a coupling may provide sufficient structural rigidity to secure the second UAV 102 while the first UAV 101 transits and performs various aerial maneuvers.
- the second UAV 102 may piggyback on an upper region of the first UAV 101.
- the second UAV 102 may dock on another portion of the first UAV 101, such as attached to a lateral or lower side thereof.
- Lateral and/or lower side docking arrangements may require further consideration of the aerodynamic interaction between the first UAV 101 and the second UAV 102.
- Turbulence generated by the rotors of the first UAV 101 may interfere with the deployment and/or re-docking of the second UAV 102.
- An area above the first UAV 101, while in flight, tends to be a lower region of turbulence.
- the second UAV 102 may try to remain in the region above the flight path of the first UAV 101 in order to avoid turbulence and loss of control.
- the second UAV 102 may be partially or fully held inside the first UAV 101 while in the docked configuration.
- the first UAV 101 may be configured to fly while supporting the second UAV 102 when operating in the docked configuration.
- the aerial imaging system 100 may use the camera 110 and the light 120 either while the first UAV 101 and the second UAV 102 are in the docked configuration or after the two UAVs have separated.
- the first and second UAVs 101, 102 may remain in the docked configuration.
- the first UAV 101 may take one or more photographs either by using the camera 110 without using the light 120 (e.g., ambient light is sufficient) or using the camera 110 and the light 120 together while the second UAV 102 remains docked with the first UAV 101.
- the second UAV 102 may deploy from the first UAV 101 and fly to a location closer to the target 50 where the lighted region 122 provides sufficient illumination of the target.
- the second UAV 102 may fly to a designated aerial position relative to the first UAV 101 and/or the target 50 without navigational instructions from the operator 5.
- the designated aerial position may be determined for providing enhanced illumination to the target 50 being photographed.
- the first UAV 101, the second UAV 102, or both UAVs may automatically determine the designated aerial position for illuminating the target under the current lighting conditions and desired photographic effects.
- the designated aerial position for the second UAV 102 relative to the first UAV 101 may be a predetermined relative position, such as a default standoff position (e.g., five meters away from the first UAV 101 at a set angle and direction).
- Navigation of the second UAV 102 to the designated aerial position may be fully controlled by the first UAV 101, controlled by the first UAV 101 based on information/feedback received from the second UAV 102, or controlled independently by the second UAV 102.
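- A default standoff position such as the one described above (a set range, angle, and direction from the first UAV) can be computed as a simple offset vector. This Python sketch is illustrative only; the coordinate convention and default values are assumptions.

```python
import math

def standoff_position(base_xyz, range_m=5.0, bearing_deg=0.0, elev_deg=30.0):
    """Designated aerial position for the second UAV, offset from the
    first UAV at base_xyz by range_m at a set bearing and elevation
    (x east, y north, z up; all angles in degrees)."""
    b, e = math.radians(bearing_deg), math.radians(elev_deg)
    horiz = range_m * math.cos(e)
    return (base_xyz[0] + horiz * math.cos(b),
            base_xyz[1] + horiz * math.sin(b),
            base_xyz[2] + range_m * math.sin(e))

# Five meters out along the default bearing, 30 degrees above the carrier.
print(standoff_position((0.0, 0.0, 10.0)))
```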
- FIGS. 2A-2B are front elevation views of the aerial imaging system 100 according to various embodiments.
- the aerial imaging system 100 may include the first UAV 101 supporting the second UAV 102 docked thereon (e.g., as shown in FIG. 2A ).
- One or both of the first UAV 101 and the second UAV 102 may include an automatic latching mechanism for securely holding the two UAVs 101, 102 together to maintain a docked configuration.
- the automatic latching mechanism may disengage.
- the second UAV 102 (e.g., as shown in FIG. 2B ) may be deployed and fly separate from the first UAV 101.
- an ejection mechanism may be included that quickly separates the two UAVs 101, 102.
- the second UAV 102 may be carried on an underside of the first UAV 101, which would enable the second UAV 102 to make use of gravity to quickly separate from the first UAV 101 by free-falling for a few seconds when released. Due to the significant turbulence generated from the downdraft of the first UAV 101, in-flight re-docking to the underside of the first UAV 101 may be infeasible. Therefore, additional procedures may be necessary to re-dock the second UAV 102 to the underside of the first UAV 101, such as manual re-docking performed by the operator 5 after landing.
- FIG. 3A is a top view of an aerial imaging system 300 in accordance with various embodiments.
- the aerial imaging system 300 may include a first UAV 301 plus a second UAV 302, a third UAV 303, a fourth UAV 304, and a fifth UAV 305 docked on the first UAV 301.
- the aerial imaging system 300 may include more or fewer UAVs in various embodiments.
- the first UAV 301 may include the camera (e.g., 110) and is illustrated in a quadcopter configuration with four rotors 315, although the first UAV 301 may include more or fewer rotors.
- Each of the second, third, fourth, and fifth UAVs 302, 303, 304, 305 may be configured to dock with and deploy from the first UAV 301.
- the first UAV 301 may be configured to fly while supporting some or all of the second UAV 302, the third UAV 303, the fourth UAV 304, and/or the fifth UAV 305.
- one or more of the second UAV 302, the third UAV 303, the fourth UAV 304, and/or the fifth UAV 305 may include the camera 110 in addition to or instead of the first UAV 301.
- the other ones of the first UAV 301, second UAV 302, the third UAV 303, the fourth UAV 304, and/or the fifth UAV 305 may include the light (e.g., 120 in FIG. 1 ) for providing illumination for the camera 110.
- FIG. 3B is a perspective relief view of the second UAV 302 in FIG. 3A .
- the second UAV 302 (which may be similar to the second UAV 102) may include the light 120 and/or additional sensors, such as a microphone.
- one or more of the third, fourth, and fifth UAVs 303, 304, 305 may include a light (e.g., 120) and/or additional sensors and/or microphones.
- the second, third, fourth, and fifth UAVs 302, 303, 304, 305 are illustrated as quadcopters with four rotors 325; however, any of the second, third, fourth, and fifth UAVs 302, 303, 304, 305 may be configured with more or fewer rotors.
- FIG. 4A is a top schematic view of the aerial imaging system 300 including the first UAV 301, the second UAV 302, the third UAV 303, the fourth UAV 304, and the fifth UAV 305 according to various embodiments.
- the second UAV 302 is shown flying separate from the first UAV 301, while the third UAV 303, the fourth UAV 304, and the fifth UAV 305 remain docked in a piggybacking configuration on the first UAV 301.
- the first UAV 301 is configured to receive input from the operator 5 via the operator communication link 130 to the wireless control unit 6.
- the operator 5 may, directly or indirectly, initiate and control flight of any of the second UAV 302, the third UAV 303, the fourth UAV 304, and/or the fifth UAV 305, such as by transmitting a deployment command.
- the deployment command may be part of a process that may be initiated manually by the operator or automatically by a processor (e.g., of the first UAV 101) when enhanced lighting is needed for a photograph to be taken by the first UAV 101.
- the inter-UAV communication link 135 may control the navigation of the second UAV 302 and keep the first and second UAVs 301, 302 synchronized.
- the processor may control, via the inter-UAV communication link 135, the activation of the light (e.g., 120) on the second UAV 302 that generates the lighted region 122.
- the second UAV 302 may be deployed toward a focal point 60 on or near the target 50. However, it may be desirable to avoid having the second UAV 302 fly to an aerial position that lies between the camera (e.g., 110) and the target 50, since the second UAV 302 would then block at least part of any photograph. Thus, the second UAV 302 may be directed to fly at a higher elevation than the target 50, either staying on the fringe of the field of view 112 or just outside thereof. Alternatively, the second UAV 302 may be directed to land on the ground or hover just off the ground in front of the target 50, also either on the fringe of the field of view 112 or just outside thereof.
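- The shot-blocking concern described above amounts to checking whether a candidate aerial position lies near the camera-target sight line. The following Python sketch is illustrative only; the function name and clearance value are assumptions.

```python
import math

def blocks_shot(camera, target, candidate, clearance_m=2.0):
    """True if candidate lies between the camera and the target and
    within clearance_m of the sight line (i.e., it would likely appear
    in the photograph). Points are (x, y, z) tuples in meters."""
    axis = [t - c for t, c in zip(target, camera)]
    rel = [p - c for p, c in zip(candidate, camera)]
    t = (sum(a * b for a, b in zip(rel, axis))
         / sum(a * a for a in axis))  # projection onto the sight line
    if not 0.0 < t < 1.0:
        return False                  # not between camera and target
    closest = [c + t * a for c, a in zip(camera, axis)]
    return math.dist(closest, candidate) < clearance_m

print(blocks_shot((0, 0, 0), (20, 0, 0), (10, 0, 1)))  # True
print(blocks_shot((0, 0, 0), (20, 0, 0), (10, 0, 8)))  # False
```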
- the designated aerial position of the second UAV 302 may be selected based on a desired angle for the light (e.g., 120) to emit in generating the lighted region 122. For example, light emitted from above the target 50 may more naturally emulate sunlight.
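The geometry described above can be sketched as a small helper that places the lighting UAV on a sphere around the target, given a desired light angle. This is a minimal illustration, not from the patent; the coordinate frame, function name, and parameters are assumptions.

```python
import math

def light_position(target, elevation_deg, azimuth_deg, standoff_m):
    """Place the lighting UAV at a standoff distance from the target.

    target: (x, y, z) of the focal point in metres (hypothetical frame,
        z pointing up).
    elevation_deg: angle of the light above the horizon; higher values
        emulate overhead sunlight more closely, as the text notes.
    azimuth_deg: compass direction from which the light should come.
    standoff_m: distance between the lighting UAV and the target.
    """
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    x, y, z = target
    horiz = standoff_m * math.cos(el)          # horizontal offset from target
    return (x + horiz * math.cos(az),
            y + horiz * math.sin(az),
            z + standoff_m * math.sin(el))     # light sits above the target
```

For example, an elevation of 90 degrees puts the lighting UAV directly overhead, while 0 degrees places it at target height on the fringe of the scene.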
- FIG. 4B is a top schematic view of the aerial imaging system 300 including the first UAV 301, the second UAV 302, the third UAV 303, the fourth UAV 304, and the fifth UAV 305 according to various embodiments.
- the second UAV 302 and the third UAV 303 are shown flying separate from the first UAV 301, while the fourth UAV 304 and the fifth UAV 305 remain docked in a piggybacking configuration on the first UAV 301.
- When two of the second, third, fourth, and fifth UAVs 302, 303, 304, 305 are deployed, it may be desirable to select two oppositely docked UAVs for deployment (e.g., 302 and 303; or 304 and 305) in order to maintain symmetry in the piggybacking configuration of the remaining UAVs on the first UAV 301 for flight stability.
- Separate inter-UAV communication links 135, 136 may control the navigation of the second UAV 302 and the third UAV 303, respectively.
- the inter-UAV communication links 135, 136 may keep the first UAV 301 synchronized with each of the second UAV 302 and the third UAV 303.
- the inter-UAV communication links 135, 136 may control the activation of separate lights (e.g., 120) on each of the second UAV 302 and the third UAV 303 for generating separate lighted regions 422. Having the separate lighted regions 422 overlap but originate from opposite sides of the target 50 may avoid shadows on the target 50.
- FIG. 5 illustrates a configuration of the first UAV 301 that may be used for any of the UAVs in various embodiments.
- the first UAV 301 may include a control unit 150 that may house various circuits and devices used to power and control the operation of the first UAV 301, as well as any other UAVs controlled by the first UAV 301.
- the control unit 150 may include a processor 160, a power module 170, an input module 180, a camera 181, sensors 182, an output module 185, and a radio module 190 coupled to an antenna 191.
- the processor 160 may include or be coupled to memory 161 and a navigation unit 163.
- the processor 160 may be configured with processor-executable instructions to control flight and other operations of the first UAV 301, including operations of the various embodiments.
- the processor 160 may be coupled to one or more cameras 181 and sensors 182.
- the camera 181 may include one or more image capturing devices for photographing the target (e.g., 50). More than one image capturing device may be configured to contemporaneously capture two different images including the target. For example, a first image may include both the target and the second UAV (e.g., 102), while a second image may include the target but not the second UAV. Alternatively, the camera 181 may be configured to detect light in the infrared spectrum for thermal imaging. Such thermal imaging features may be enhanced if the light emitted from the second UAV extends to the infrared spectrum.
- the sensors 182 may be optical sensors (e.g., light meters for controlling exposure and determining whether additional illumination is required), radio sensors, a rotary encoder, pressure sensors (e.g., for detecting wind, lift, drag, or changes therein), or other sensors. Alternatively or additionally, the sensors 182 may be contact or pressure sensors that may provide a signal that indicates when the first UAV 301 has landed.
- the power module 170 may include one or more batteries that may provide power to various components, including the processor 160, the input module 180, the sensors 182, the output module 185, and the radio module 190.
- the power module 170 may include energy storage components, such as rechargeable batteries.
- the processor 160 may be configured with processor-executable instructions to control the charging of the power module 170, such as by executing a charging control algorithm using a charge control circuit.
- the power module 170 may be configured to manage its own charging.
- the processor 160 may be coupled to an output module 185, which may output control signals for managing the motors that drive the rotors 315 and other components.
- the first UAV 301 may be controlled in flight.
- the processor 160 may receive data from the navigation unit 163 and use such data in order to determine the present position and orientation of the first UAV 301, as well as the appropriate course towards the target (e.g., 50).
- the navigation unit 163 may include a global navigation satellite system (GNSS) receiver system (e.g., one or more Global Positioning System (GPS) receivers) enabling the first UAV 301 to navigate using GNSS signals.
- the navigation unit 163 may be equipped with radio navigation receivers for receiving signals from radio nodes, such as navigation beacons (e.g., very high frequency (VHF) Omni Directional Radio Range (VOR) beacons), Wi-Fi access points, cellular network sites, radio stations, remote computing devices, other UAVs, etc.
- the processor 160 and/or the navigation unit 163 may be configured to communicate with a server through a wireless connection (e.g., a cellular data network) to receive commands to use or stop using the extended flight protocol, receive data useful in navigation, provide real-time position and altitude reports, and assess data.
- An avionics module 167 coupled to the processor 160 and/or the navigation unit 163 may be configured to provide flight control-related information such as altitude, attitude, airspeed, heading and similar information that the navigation unit 163 may use for navigation purposes, such as dead reckoning between GNSS position updates.
- the avionics module 167 may include or receive data from a gyro/accelerometer unit 165 that provides data regarding the orientation and accelerations of the first UAV 301 that may be used in navigation and positioning calculations.
- the radio module 190 may be configured to receive signals via the antenna 191, such as command signals to initiate, continue, or discontinue the use of the light (e.g., 120) from the second UAV (e.g., 302), receive signals from aviation navigation facilities, etc., and provide such signals to the processor 160 and/or the navigation unit 163 to assist in operation of the first UAV 301.
- commands for controlling the first UAV 301 and/or the second UAV 302, or components thereof may be received via the radio module 190.
- the first UAV 301 may receive signals from a wireless control unit 6.
- the operator communication link 130 may include input from a knowledge base regarding current conditions, a current orientation of the first UAV 301 or elements thereof, predicted future conditions, requirements for particular UAV maneuvers or missions, aiming parameters of the camera or even information regarding a target of the photography.
- the radio module 190 may be configured to switch between a cellular connection and a Wi-Fi or other form of radio connection depending on the location and altitude of the first UAV 301. For example, while in flight at an altitude designated for UAV traffic, the radio module 190 may communicate with a cellular infrastructure in order to maintain communications with a server. In addition, communications with the wireless control unit 6 may be established using cellular telephone networks while the first UAV 301 is flying out of line-of-sight with the operator 5. Communication between the radio module 190 and the operator communication link 130 may transition to a short-range communication link (e.g., Wi-Fi or Bluetooth) when the first UAV 301 moves closer to the wireless control unit 6. Similarly, the first UAV 301 may include and employ other forms of radio communication, such as mesh connections with other UAVs or connections to other information sources (e.g., balloons or other stations for collecting and/or distributing weather or other data harvesting information).
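The link-switching behavior described above can be summarized as a small selection function. This is a sketch of the described policy only; the function name, the 120 m traffic altitude, and the 100 m short-range threshold are illustrative assumptions, not values from the patent.

```python
def select_link(altitude_m, distance_to_controller_m,
                traffic_altitude_m=120.0, short_range_m=100.0):
    """Pick a radio link the way the description suggests: cellular while
    at a designated UAV-traffic altitude or far from the wireless control
    unit, a short-range link (Wi-Fi or Bluetooth) once the UAV moves
    close to the controller. Thresholds are hypothetical."""
    if altitude_m >= traffic_altitude_m:
        return "cellular"      # maintain server communications in transit
    if distance_to_controller_m <= short_range_m:
        return "short-range"   # Wi-Fi/Bluetooth near the control unit
    return "cellular"          # out of line of sight at low altitude
```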
- the control unit 150 may be equipped with the input module 180, which may be used for a variety of applications.
- the input module 180 may receive and pre-process images or data from an onboard camera 181 or sensor 182, or may receive electronic signals from other components (e.g., a payload).
- the input module 180 may receive an activation signal for causing actuators on the first UAV 301 to deploy landing cushions or similar components for affecting an emergency landing.
- the output module 185 may be used to activate components (e.g., an energy cell, an actuator, an indicator, a circuit element, a sensor, and/or an energy-harvesting element).
- While the various components of the control unit 150 are illustrated in FIG. 5 as separate components, some or all of the components (e.g., the processor 160, the output module 185, the radio module 190, and other units) may be integrated together in a single device or module, such as a system-on-chip module.
- FIG. 6 illustrates an aerial imaging system 600 with a first UAV 601 (which, for example, may generally correspond to the first UAV 101, 301 in FIGS. 1-5 ) and a second UAV 602 (which, for example, may generally correspond to the second or other UAVs 102, 302, 303, 304, 305 in FIGS. 1-5 ) in accordance with various embodiments.
- the first UAV 601 includes a camera (e.g., 110) and is configured to receive input from the operator 5 via an operator communication link 130 to a wireless communication device 606 in the form of a tablet computer.
- the operator 5 may not only initiate and control the flight of the first UAV 601 but also may control the camera 110, such as for photographing a target 50.
- the second UAV 602 may be configured to dock with and deploy from the first UAV 601. Also, the second UAV 602 includes a light (e.g., 120) for illuminating the target 50 being photographed by the first UAV 601. In addition, the second UAV 602 may include a microphone 630 configured to record sounds 51 from an area in which the target is located. The second UAV 602 may be controlled to fly near the target 50 being photographed to pick up audio. The second UAV 602 may then wirelessly transmit audio to the first UAV 601 using the inter-UAV communication link 135. The first UAV 601 may synchronize the received audio with video images being captured.
- the first UAV 601 may independently and/or automatically control the aerial position of the second UAV 602 in a closed loop fashion using feedback from resources onboard the first UAV 601 and the inter-UAV communication link (e.g., 135, 136 in FIGS. 1, 4A, 4B, and 6).
- an onboard compass and/or GPS may provide information for determining a relative location of the first UAV 101 and/or the second UAV 102.
- Various embodiments may use the camera (e.g., 110) (and/or other sensor(s)) not only for photography but also to monitor, track and/or change the position of the second UAV 602.
- the processor (e.g., 160 in FIG. 5) in the first UAV 601 may control the second UAV 602 based on data from the camera, such as a first image 612 of the field of view 112 from the camera. If the second UAV 602 is maintained within the field of view 112, the processor in the first UAV 601 may determine an aerial position of the second UAV 602 based on the size and position of the second UAV 602 within the first image 612. Meanwhile, a second image 625 that cuts out the second UAV 602 may be used for more conventional image capture.
- the first and second images 612, 625 may be rendered from a single still image captured by the camera or rendered from the same stream of video images captured by the camera.
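Estimating the second UAV's range from its apparent size in the first image can be done with the standard pinhole-camera relation. This is a minimal sketch of that idea, not the patent's method; the function name and parameters are assumptions.

```python
def range_from_image(focal_px, real_width_m, width_px):
    """Pinhole-camera range estimate: an object of known physical width
    real_width_m that appears width_px pixels wide, imaged with a focal
    length of focal_px (expressed in pixels), lies approximately
    real_width_m * focal_px / width_px metres from the camera."""
    return real_width_m * focal_px / width_px
```

For instance, a 0.5 m wide UAV imaged at 50 pixels with a 1000-pixel focal length is roughly 10 m away; its pixel coordinates in the frame then give its bearing.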
- a focal distance may be determined between the camera (i.e., the first UAV 601) and the target 50.
- a processor in the first UAV 601 may calculate a first distance between the two UAVs 601, 602. Using this calculated first distance, the processor may determine a second distance between the second UAV 602 and the target 50.
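One plausible way to derive the second distance from the two known distances is the law of cosines, using the angle between the camera's sight lines to the target and to the second UAV. The patent does not specify this computation; the sketch below is an illustrative assumption.

```python
import math

def second_distance(d_cam_target, d_cam_uav2, angle_deg):
    """Law-of-cosines estimate of the distance from the second (lighting)
    UAV to the target, given the camera-to-target focal distance, the
    camera-to-second-UAV distance, and the angle between the two sight
    lines at the camera (hypothetical inputs, all in metres/degrees)."""
    a = math.radians(angle_deg)
    return math.sqrt(d_cam_target ** 2 + d_cam_uav2 ** 2
                     - 2.0 * d_cam_target * d_cam_uav2 * math.cos(a))
```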
- the second UAV 602 may hover a predetermined distance from the first UAV 601, unless the amount of light being received from the target 50 is more or less than desirable.
- the predetermined distance may be a default distance and/or relative aerial position.
- the predetermined position may be determined by the processor based on current conditions (e.g., ambient lighting).
- one or both of the first UAV 601 and the second UAV 602 may include a light sensor (e.g., sensor 182 in FIG. 5 ).
- if the amount of light received from the target 50 is more or less than desired, the second UAV 602 may be commanded to move further from or closer to the target 50, respectively.
- alternatively or additionally, the second UAV 602 may be commanded to change the brightness emitted by the light accordingly.
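One iteration of this closed-loop illumination adjustment could look like the following. The step sizes, tolerance, and the idea of adjusting distance and brightness together are illustrative assumptions layered on the behavior described above, not the patent's algorithm.

```python
def adjust_lighting(measured_lux, target_lux, distance_m, brightness,
                    tolerance=0.1, step_m=0.5, step_b=0.1):
    """One feedback step: over-lit -> back away and dim; under-lit ->
    approach and brighten; within tolerance -> leave unchanged.
    Returns the new (distance_m, brightness) command pair."""
    error = (measured_lux - target_lux) / target_lux
    if abs(error) <= tolerance:
        return distance_m, brightness            # lighting is acceptable
    if error > 0:                                # too much light on target
        return distance_m + step_m, max(0.0, brightness - step_b)
    # too little light: close in, raise brightness (clamped to [0, 1])
    return max(0.5, distance_m - step_m), min(1.0, brightness + step_b)
```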
- the second UAV 602 may independently and/or automatically control its own aerial position relative to the first UAV 601 and/or the target 50.
- the second UAV 602 may use an onboard proximity sensor to maintain a determined aerial position relative to the first UAV 601 and/or the target 50.
- the second UAV 602 may include its own camera and processor for capturing and analyzing images of the first UAV 601 and/or the target 50. After receiving information about the target 50 from the first UAV 601, the processor onboard the second UAV 602 may be configured to recognize the target 50. Using target recognition, the second UAV 602 may maintain a fixed position relative to the target 50.
- the first and second UAVs 601, 602 may cooperate, exchanging information, to automatically control the aerial position of the second UAV 602.
- the second UAV 602 may collect information about its aerial position, using communication links, GPS, compass, and/or images from a camera onboard the second UAV 602.
- the second UAV 602 may transmit the collected information to the first UAV 601 for processing and further controlling the second UAV 602.
- the first UAV (e.g., 101, 301, 601) may have generally the same components as the second UAV (e.g., 102, 302, 602) and/or additional UAVs (e.g., third, fourth, and fifth UAVs 303, 304, 305).
- the first UAV may have different components than the second and/or additional UAVs.
- the second UAV may lack a GPS receiver and/or transceiver for establishing a WAN connection, since the second UAV may not need this information and/or may obtain such information from the first UAV (e.g., via an inter-UAV communication link 135).
- various embodiments may include additional UAVs (e.g., third UAV 303, fourth UAV 304, fifth UAV 305) in which one or more of the second UAV and additional UAVs may be different and/or have different components from one another.
- FIG. 7 illustrates a method 700 of aerial imaging according to various embodiments.
- operations of the method 700 may be performed by a UAV control unit (e.g., 150 in FIG. 5 ) or another computing device (e.g., wireless control unit 6 in FIGS. 1 , 4A , 4B , 5 and/or wireless communication device 606 in FIG. 6 ) in communication with the first UAV (e.g., 101, 301, 601 in FIGS. 1-6 ).
- the processor of the first UAV may receive an input indicating the second UAV and/or at least one additional UAV (e.g., the third, fourth, and/or fifth UAV 303, 304, 305) should be deployed.
- the input may be a manual input from the operator (e.g., 5 in FIG. 1), an input from another process (e.g., an image processing analysis determining that lighting needs adjustment), a knowledge base (e.g., an onboard and/or remote database), or systems controlling the operation of the first UAV.
- the processor of the first UAV may deploy the second UAV and/or at least one additional UAV, which may involve releasing a latching mechanism and/or using an ejection mechanism to ensure the second and/or additional UAV quickly moves away from the first UAV.
- Prior to deploying the second and/or additional UAV, the processor may activate or otherwise cause to activate propellers on the second and/or additional UAV to prepare the second and/or additional UAV for flight.
- the second and/or additional UAV may automatically proceed to a designated aerial position relative to the first UAV, which may include more than one aerial position if multiple UAVs are deployed.
- the designated aerial position(s) may be predetermined (such as a default or preprogrammed position) or determined based on ambient or other conditions.
- the process may provide the second and/or additional UAV with instructions for the second and/or additional UAV to reach the designated aerial position(s).
- the processor of the first UAV may activate the light(s) on the second and/or additional UAV (if not already illuminated) to illuminate the target.
- the processor may transmit an activation signal via the inter-UAV communication link (e.g., 135 and/or 136 in FIGS. 4A and 4B ) for activating the light(s).
- the processor may determine whether lighting for taking one or more images (or video) needs to change. For instance, the processor may determine whether lighting in one or more images taken by the camera needs to change. The processor may assess temporary images captured by the camera to determine whether the current lighting is too low, too high, or just right.
- the processor may transmit one or more instructions to the second and/or additional UAV in block 742.
- the instruction(s) may indicate the second and/or additional UAV should change its aerial position, orientation of the light, and/or a lighting parameter (e.g., brightness) of the light on the second UAV. Accordingly, the second and/or additional UAV may make adjustments based on one or more of the instruction(s).
- the processor may also deploy one or more additional UAVs in block 720. For example, if supplemental and/or different lighting is needed and one or more additional UAVs are available, the one or more additional UAV may be deployed to implement needed lighting changes.
- the processor may wait until a message is received, indicating that the transmitted instructions have been implemented, before again determining whether the lighting needs to be changed in determination block 740.
- the message may be received from the second and/or additional UAV.
- the processor may allow a designated waiting time to pass to enable the changes from the transmitted instructions in block 742 to be implemented before determining whether the lighting needs to be changed further in determination block 740.
- the designated waiting time may act as a time-out period in case no message is received from the second and/or additional UAV.
- the processor may activate the camera to photograph the target in block 750.
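The overall flow of method 700 can be summarized in pseudocode-like Python. The object interfaces below (`deploy`, `lighting_ok`, `send_adjustment`, etc.) are hypothetical names invented for this sketch; only the block structure mirrors the description above.

```python
def method_700(first_uav, second_uav, max_rounds=10):
    """Control-flow sketch of the aerial imaging method: deploy the
    lighting UAV, fly it to its designated position, activate its light,
    iterate the lighting check/adjust loop, then photograph the target.
    first_uav and second_uav are hypothetical controller objects."""
    first_uav.deploy(second_uav)                      # deploy (block 720)
    second_uav.fly_to(second_uav.designated_position())
    first_uav.activate_light(second_uav)              # illuminate (block 730)
    for _ in range(max_rounds):                       # lighting loop (block 740)
        if first_uav.lighting_ok():
            break
        first_uav.send_adjustment(second_uav)         # instructions (block 742)
        second_uav.apply_adjustment()
    return first_uav.photograph_target()              # capture (block 750)
```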
- the wireless communication device 800 may include a processor 802 coupled with the various systems of the wireless communication device 800 for communication with and control thereof.
- the processor 802 may be coupled to a touch screen controller 804, radio communication elements, speakers and microphones, and an internal memory 806.
- the processor 802 may be one or more multi-core integrated circuits designated for general or specific processing tasks.
- the internal memory 806 may be volatile or non-volatile memory, and may also be secure and/or encrypted memory, or unsecure and/or unencrypted memory, or any combination thereof.
- the wireless communication device 800 may also be coupled to an external memory, such as an external hard drive.
- the touch screen controller 804 and the processor 802 may also be coupled to a touch screen panel 812, such as a resistive-sensing touch screen, capacitive-sensing touch screen, infrared sensing touch screen, etc. Additionally, the display of the wireless communication device 800 need not have touch screen capability.
- the wireless communication device 800 may have one or more radio signal transceivers 808 (e.g., Peanut, Bluetooth, Bluetooth LE, ZigBee, Wi-Fi®, radio frequency (RF) radio, etc.) and antennae, the wireless communication device antenna 810, for sending and receiving communications, coupled to each other and/or to the processor 802.
- the radio signal transceivers 808 and the wireless communication device antenna 810 may be used with the above-mentioned circuitry to implement the various wireless transmission protocol stacks and interfaces.
- the wireless communication device 800 may include a cellular network wireless modem chip 816 coupled to the processor that enables communication via a cellular network.
- the wireless communication device 800 may include a peripheral device connection interface 818 coupled to the processor 802.
- the peripheral device connection interface 818 may be singularly configured to accept one type of connection, or may be configured to accept various types of physical and communication connections, common or proprietary, such as USB, FireWire, Thunderbolt, or PCIe.
- the peripheral device connection interface 818 may also be coupled to a similarly configured peripheral device connection port (not shown).
- the wireless communication device 800 may include one or more microphones 815.
- the wireless communication device may have microphones 815 that are conventional for receiving voice or other audio frequency energy from a user during a call.
- the wireless communication device 800 may also include speakers 814 for providing audio outputs.
- the wireless communication device 800 may also include a housing 820, constructed of a plastic, metal, or a combination of materials, for containing all or some of the components discussed herein.
- the wireless communication device 800 may include a power source 822 coupled to the processor 802, such as a disposable or rechargeable battery.
- the rechargeable battery may also be coupled to the peripheral device connection port to receive a charging current from a source external to the wireless communication device 800.
- the wireless communication device 800 may also include a physical button 824 for receiving user inputs.
- the wireless communication device 800 may also include a power button 826 for turning the wireless communication device 800 on and off.
- the wireless communication device 800 may further include an accelerometer 828, which senses movement, vibration, and other aspects of the device through the ability to detect multi-directional values of and changes in acceleration.
- the accelerometer 828 may be used to determine the x, y, and z positions of the wireless communication device 800. Using the information from the accelerometer, a pointing direction of the wireless communication device 800 may be detected.
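A common way to estimate a pointing direction from a static accelerometer reading is to recover pitch and roll from the gravity vector. This is a generic sketch of that technique (not the patent's implementation) and assumes the device is not otherwise accelerating.

```python
import math

def pointing_angles(ax, ay, az):
    """Derive pitch and roll (in degrees) from a single accelerometer
    sample (ax, ay, az), assuming the only measured acceleration is
    gravity. Axis convention is hypothetical: z points out of the
    device's face when it lies flat."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

A device lying flat (only a z-axis gravity component) yields zero pitch and roll; tilting it shifts gravity into the x/y readings and the angles follow.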
- a general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
- a processor may also be implemented as a combination of receiver smart objects, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some operations or methods may be performed by circuitry that is specific to a given function.
- the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a non-transitory computer-readable storage medium or non-transitory processor-readable storage medium.
- the operations of a method or algorithm disclosed herein may be embodied in a processor-executable software module or processor-executable instructions, which may reside on a non-transitory computer-readable or processor-readable storage medium.
- Non-transitory computer-readable or processor-readable storage media may be any storage media that may be accessed by a computer or a processor.
- non-transitory computer-readable or processor-readable storage media may include RAM, ROM, EEPROM, FLASH memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage smart objects, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer.
- Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of non-transitory computer-readable and processor-readable media.
- the operations of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a non-transitory processor-readable storage medium and/or computer-readable storage medium, which may be incorporated into a computer program product.
Description
- Unmanned aerial vehicles (UAVs), also referred to as "drones," are used for aerial photography and/or video surveillance. In poor daylight or low/no ambient light conditions, UAV cameras often depend on a built-in flash or an onboard light as a primary light source. However, the illumination provided by such lighting declines with the distance between the target and the light source. While stationary remote lights may be used to illuminate a scene, such lights require advanced setup, and cannot be easily reconfigured or moved.
- Attention is drawn to US 2015/377405 A1, which describes a stabilisation system for an unmanned aerial vehicle (UAV) comprising a positional stabilizer. A UAV provided with a stabilisation system, a method of stabilising a UAV, and an inspection method are also provided.
- In accordance with the present invention, the scope is as set forth in the independent claims, respectively. Preferred embodiments of the invention are described in the dependent claims.
- Systems, devices, and methods of various embodiments include an aerial imaging system including a first unmanned aerial vehicle (UAV) and a second UAV in which the first UAV includes a camera and may be configured to receive input from an operator. The second UAV may be configured to dock with and deploy from the first UAV and may include a light configured to provide illumination for the camera.
- In various embodiments, the first UAV may be configured to fly while the second UAV is docked on the first UAV, and the second UAV may be configured to deploy and fly independently from the first UAV.
- The first UAV may include a processor configured to determine a position of the second UAV from a first image. The processor may be further configured to determine the position of the second UAV from camera images. The processor may be configured to determine an aerial position of the second UAV flying separate from the first UAV based on a comparison of camera images and images received from the second UAV.
- In various embodiments, the second UAV may be configured to fly separate from the first UAV to a predetermined aerial position relative to the first UAV without input from an operator of the first UAV. In some embodiments, the second UAV may include a microphone configured to record sounds. In some embodiments, the second UAV may be configured to use signals received from a proximity sensor to maintain a determined aerial position of the second UAV relative to a target of photography by the camera.
- In some embodiments, the second UAV may include a processor configured to recognize a target of photography by the camera in images obtained from an image capture device on the second UAV, and to maintain an aerial position relative to a target of photography by the camera. In some embodiments, the second UAV may be controlled to fly to a position relative to a target of photography by the camera in order to provide a determined amount of illumination of the target of photography. In some embodiments, the amount of illumination provided by the light on the second UAV may be adjustable by changing the aerial position of the second UAV or changing a level of light emitted from the second UAV. The light may emit in an infrared spectrum and the camera may be configured for thermal imaging.
- In some embodiments, a third UAV may be configured to dock with and deploy from the first UAV. In some embodiments, the first UAV may be configured to fly while supporting both the second UAV and the third UAV. In some embodiments, the camera on the first UAV and the first UAV may be configured to use camera images for controlling the second UAV. In some embodiments, the camera may be configured to contemporaneously capture two or more different images. In some embodiments, the camera may be configured so that the two or more different images overlap.
- Some embodiments may include deploying, from a first unmanned aerial vehicle (UAV) including a camera, a second UAV to fly separate from the first UAV, activating a light on the second UAV to illuminate a target of photography by the camera, and activating the camera to photograph the target of photography illuminated by the light. Some embodiments may include flying the second UAV separate from the first UAV without input from a remotely controlling operator of the first UAV. Some embodiments may include flying the first UAV while the second UAV is docked on the first UAV. Some embodiments may include activating the camera to photograph the target of photography to contemporaneously capture two or more different images, which may overlap.
- Some embodiments may include re-docking the second UAV with the first UAV after activating the camera to photograph the target of photography. Some embodiments may include activating a microphone on the second UAV for recording sounds emanating from a target of a sound recording.
- Some embodiments may include deploying a third UAV from the first UAV, and flying the first UAV while supporting both the second UAV and the third UAV. Some embodiments may include determining a position of the second UAV using camera images from the camera for controlling the second UAV. Some embodiments may include receiving by the first UAV remotely captured visual images taken by another camera on the second UAV. Some embodiments may include determining an aerial position of the second UAV flying separate from the first UAV based on comparing an onboard visual image captured by the camera on the first UAV and the remotely captured visual image taken by the other camera on the second UAV. Some embodiments may include transmitting from the first UAV to the second UAV a command for the second UAV to maintain a predetermined aerial position relative to the first UAV. Some embodiments may include transmitting from the first UAV to the second UAV a command for the second UAV to maintain a predetermined aerial position relative to the target of photography by the camera. Some embodiments may include receiving location information for determining an aerial position of the second UAV from the second UAV flying remote from the first UAV. Some embodiments may include determining a relative position of the second UAV relative to the first UAV. Some embodiments may include determining an amount of illumination provided by the light on the second UAV. Some embodiments may include determining an adjustment needed for the amount of illumination provided by the light on the second UAV and transmitting instructions to the second UAV for making the adjustment needed for the amount of illumination provided by the light on the second UAV.
- Further embodiments may include an aerial imaging system including a first UAV and a second UAV in which the first and second UAVs include means for performing functions of the methods summarized above. Further embodiments may include non-transitory processor-readable storage media having stored thereon processor-executable instructions configured to cause a processor of a first UAV to perform operations of the methods summarized above.
- The accompanying drawings, which are incorporated herein and constitute part of this specification, illustrate exemplary embodiments, and together with the general description given above and the detailed description given below, serve to explain the features of the various embodiments.
-
FIG. 1 is a schematic perspective view of an aerial imaging system including a first UAV capturing an image of targets illuminated by a second UAV according to various embodiments. -
FIG. 2A is a front elevation view of an aerial imaging system including a first UAV supporting a second UAV docked thereon according to various embodiments. -
FIG. 2B is a front elevation view of the aerial imaging system of FIG. 2A with the second UAV flying separate from the first UAV according to various embodiments. -
FIG. 3A is a top view of an aerial imaging system including a first UAV supporting a second UAV, a third UAV, a fourth UAV, and a fifth UAV piggybacking on the first UAV according to various embodiments. -
FIG. 3B is a perspective relief view of the second UAV of FIG. 3A according to various embodiments. -
FIG. 4A is a top schematic view of an aerial imaging system including a first UAV with a second UAV flying separate from the first UAV and a third UAV, a fourth UAV, and a fifth UAV piggybacking on the first UAV according to various embodiments. -
FIG. 4B is a top schematic view of the aerial imaging system of FIG. 4A with both the second UAV and the third UAV flying separate from the first UAV according to various embodiments. -
FIG. 5 is a component diagram of a control unit of a UAV suitable for use with various embodiments. -
FIG. 6 is a schematic perspective view of an aerial imaging system including a first UAV capturing an image of targets that a second UAV is illuminating while recording sounds from the targets according to various embodiments. -
FIG. 7 is a process flow diagram illustrating a method of aerial imaging according to various embodiments. -
FIG. 8 is a component diagram of a wireless communication device suitable for use with various embodiments. - Various embodiments will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to particular examples and implementations are for illustrative purposes, and are not intended to limit the scope of the claims.
- Various embodiments include an aerial imaging system that includes at least two UAVs. A first unmanned aerial vehicle (UAV) includes a camera and is configured to receive input from an operator. A second UAV is configured to dock with and deploy from the first UAV. The second UAV includes a light for illuminating a target being captured as a first image by the camera on the first UAV. Positioning an illumination source on the second UAV enables the aerial imaging system to capture images by the first UAV at a first distance from a subject or scene that is illuminated by the second UAV at a second distance from the subject or scene that may be selected or controlled to achieve a desired or minimum level of illumination. Thus, when the distance from the subject or scene selected for gathering images by the first UAV would attenuate light from the illumination source on the second UAV to an unacceptable level, the second UAV may separate from the first UAV and fly to a position closer to the subject or scene where proper or desired illumination can be achieved. Multiple illumination UAVs may be implemented in the aerial imaging system, enabling lighting from different directions.
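The deploy-when-attenuated behavior described above can be illustrated with a short sketch. This is not part of the claimed embodiments: the function names, the point-source inverse-square model (illuminance E = I / d²), and the threshold values are illustrative assumptions only.

```python
import math

def should_deploy_light_uav(light_intensity_cd: float,
                            camera_to_target_m: float,
                            min_lux: float) -> bool:
    """True when the light, if it stayed docked at the camera's position,
    could not illuminate the target to at least min_lux.

    Assumes a point source obeying the inverse-square law E = I / d^2.
    """
    illuminance_at_target = light_intensity_cd / camera_to_target_m ** 2
    return illuminance_at_target < min_lux

def required_standoff_m(light_intensity_cd: float, min_lux: float) -> float:
    """Farthest distance from the target at which the light still delivers
    min_lux; the deployed illumination UAV should fly inside this radius."""
    return math.sqrt(light_intensity_cd / min_lux)
```

Under this model, a 1000 cd source 10 m away delivers only 10 lux, so a 50 lux requirement would trigger deployment of the illumination UAV to within about 4.5 m of the subject.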
- As used herein, the term "UAV" refers to one of various types of unmanned aerial vehicles. A UAV may include an onboard computing device configured to fly and/or operate the UAV without remote operating instructions (i.e., autonomously), and/or with some remote operating instructions or updates to instructions stored in a memory, such as from a human operator or remote computing device (i.e., semi-autonomously). UAVs may be propelled for flight in any of a number of known ways. For example, a plurality of propulsion units, each including one or more rotors, may provide propulsion or lifting forces for the UAV and any payload carried by the UAV. In addition, UAVs may include wheels, tank-treads, or other non-aerial movement mechanisms to enable movement on the ground, on or in water, and combinations thereof. The UAV may be powered by one or more types of power source, such as electrical, chemical, electro-chemical, or other power reserve, which may power the propulsion units, the onboard computing device, and/or other onboard components.
- The term "computing device" is used herein to refer to an electronic device equipped with at least a processor. Examples of computing devices may include UAV flight control and/or mission management computers that are onboard the UAV, as well as remote computing devices communicating with the UAV and configured to perform operations of the various embodiments. Remote computing devices may include wireless communication devices (e.g., cellular telephones, wearable devices, smartphones, web-pads, tablet computers, Internet-enabled cellular telephones, Wi-Fi® enabled electronic devices, personal data assistants (PDAs), laptop computers, etc.), personal computers, and servers. In various embodiments, computing devices may be configured with memory and/or storage as well as wireless communication capabilities, such as network transceiver(s) and antenna(s) configured to establish a wide area network (WAN) connection (e.g., a cellular network connection, etc.) and/or a local area network (LAN) connection (e.g., a wireless connection to the Internet via a Wi-Fi® router, etc.).
-
FIG. 1 illustrates an example of an aerial imaging system 100 with a first UAV 101 and a second UAV 102 according to various embodiments. In the example illustrated in FIG. 1, the first UAV 101 includes a camera 110 and is configured to receive input or instructions from an operator 5 using a wireless communication device 800 via an operator communication link 130. The operator 5 may initiate and control the flight of the first UAV 101, and may control the camera 110, such as for photographing a target 50. The second UAV 102 includes a light 120 for illuminating the target 50 being photographed by the first UAV 101. Alternatively, the first UAV 101 may include the light 120 and the second UAV 102 may include the camera 110. As a further alternative, both the first UAV 101 and the second UAV 102 may each include a light 120 and a camera 110. - In some embodiments, the
second UAV 102 is configured to dock with and deploy from the first UAV 101 so that the first UAV 101 may carry the second UAV 102 to a photography location. In some embodiments, the first UAV 101 is configured to dock with and deploy from the second UAV 102 so that the second UAV 102 may carry the first UAV 101 to a photography location. In some embodiments, the first UAV 101 and the second UAV 102 may be configured to dock with and deploy from a third UAV (see for example FIG. 3A) so that the third UAV may carry the first UAV 101 and second UAV 102 to a photography location. - In order to coordinate the illumination output from the light 120 with the photography taken by the
camera 110, the second UAV 102 may be synchronized with the first UAV 101. Using an inter-UAV communication link 135, the first UAV 101 may control the navigation of the second UAV 102 and keep the first and second UAVs 101, 102 synchronized. - For ease of description and illustration, some detailed aspects of the
first UAV 101 and second UAV 102 are omitted, such as wiring, frame structure, power source, landing columns/gear, or other features that would be known to one of skill in the art. In addition, although in various embodiments the UAVs are illustrated as quadcopters with four rotors, the UAVs may include more or fewer than four rotors. Also, the first UAV 101 and second UAV 102 may have similar or different configurations, numbers of rotors, and/or other aspects. - The
camera 110 may focus light reflected or emitted from virtually anything within a field of view 112 onto an internal light-sensitive surface for capturing one or more images. In this way, the camera 110 captures images for still and/or video photography. The camera 110 may be pivotally mounted on the first UAV 101 or otherwise adjustable to provide 3-axis pointing control. The field of view 112 includes the extent of the observable world seen by the camera 110, which extends outwardly away from the camera 110 toward infinity. The camera 110 may be focused on the target 50 when the target 50 is within the field of view 112 and, if applicable, within the focal limits of the camera 110. The field of view 112 may include nearby objects 55 and intervening objects, such as the second UAV 102. The target 50 may include one or more creatures and/or objects that is/are the focus of the photography, such as the three individuals shown in FIG. 1. - In various embodiments, the light 120 on the
second UAV 102 provides a light source and/or a supplemental light source to enable the source of illumination for photography (i.e., still and/or video photography) to be positioned closer to the target 50 than the camera 110 and/or to provide illumination from an angle different from that of the perspective of the camera 110. The second UAV 102, along with the light 120, may fly separate from the first UAV 101 and be positioned to project light from a position removed from that of the first UAV 101. - When ambient light conditions are low or less than desirable, various embodiments may use the light 120 on the
second UAV 102 to create a lighted region 122. The lighted region 122 may enhance the illumination in part of the field of view 112 of the camera 110, thus projecting additional illumination on the target 50. When the lighted region 122 is much smaller than the field of view 112, the nearby objects 55 within the field of view 112 may not receive the same level of illumination as the target 50 if such objects are disposed outside the lighted region 122. Also, because the lighted region 122 projects away from the second UAV 102 (i.e., toward the target 50), the second UAV 102 itself, or parts thereof, may not be included in the lighted region 122. - A relative position of the
target 50 is further illustrated using a first frame of reference 115, a second frame of reference 117, and a third frame of reference 125. The first frame of reference 115 corresponds to an imaginary planar extent bounded by the field of view 112, perpendicular to a direction in which the camera 110 is facing, and includes a focal point on the target 50. The second frame of reference 117 also corresponds to an imaginary planar extent bounded by the field of view 112 and perpendicular to the direction in which the camera 110 is facing, but corresponds to a location of the second UAV 102 and includes a point from which the light 120 emanates. The third frame of reference 125 corresponds to an imaginary planar extent that is both bounded by the lighted region 122 and within the first frame of reference 115. - In various embodiments, the
first UAV 101 and the second UAV 102 may be configured to dock with one another. The second UAV 102 may couple to the first UAV 101 via a coupling controlled by either of the two UAVs, or both UAVs 101, 102. The docked configuration may secure the second UAV 102 while the first UAV 101 transits and performs various aerial maneuvers. In the docked configuration, the second UAV 102 may piggyback on an upper region of the first UAV 101. Alternatively, the second UAV 102 may dock on another portion of the first UAV 101, such as attached to a lateral or lower side thereof. Lateral and/or lower side docking arrangements may need to further consider the interaction of aerodynamic forces between the first UAV 101 and the second UAV 102. Turbulence generated by the rotors of the first UAV 101 may interfere with the deployment and/or re-docking of the second UAV 102. An area above the first UAV 101, while in flight, tends to be a region of lower turbulence. Thus, the second UAV 102 may try to remain in the region above the flight path of the first UAV 101 in order to avoid turbulence and loss of control. As a further alternative, the second UAV 102 may be partially or fully held inside the first UAV 101 while in the docked configuration. - The
first UAV 101 may be configured to fly while supporting the second UAV 102 when operating in the docked configuration. Thus, the aerial imaging system 100 may use the camera 110 and the light 120 either while the first UAV 101 and the second UAV 102 are in the docked configuration or after the two UAVs have separated. For example, when the target 50 being photographed is near the camera 110 or the light 120 is not needed due to the ambient lighting conditions, the first and second UAVs 101, 102 may remain docked. The first UAV 101 may take one or more photographs either by using the camera 110 without using the light 120 (e.g., ambient light is sufficient) or using the camera 110 and the light 120 together while the second UAV 102 remains docked with the first UAV 101. Once the target 50 is far enough away from the camera 110 that the light 120 cannot provide sufficient illumination while the first and second UAVs 101, 102 remain docked, the second UAV 102 may deploy from the first UAV 101 and fly to a location closer to the target 50 where the lighted region 122 provides sufficient illumination of the target. - Once deployed from the
first UAV 101, the second UAV 102 may fly to a designated aerial position relative to the first UAV 101 and/or the target 50 without navigational instructions from the operator 5. The designated aerial position may be determined for providing enhanced illumination to the target 50 being photographed. The first UAV 101, the second UAV 102, or both UAVs may automatically determine the designated aerial position for illuminating the target under the current lighting conditions and desired photographic effects. Alternatively, the designated aerial position for the second UAV 102 relative to the first UAV 101 may be a predetermined relative position, such as a default standoff position (e.g., five meters away from the first UAV 101 at a set angle and direction). Navigation of the second UAV 102 to the designated aerial position may be fully controlled by the first UAV 101, controlled by the first UAV 101 based on information/feedback received from the second UAV 102, or controlled independently by the second UAV 102. -
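A default standoff position such as the one described (e.g., five meters away at a set angle and direction) reduces to simple geometry. The following sketch is purely illustrative and not part of the embodiments; the (east, north, up) coordinate convention and all parameter names are assumptions.

```python
import math

def standoff_position(first_uav_pos, heading_deg, distance_m=5.0,
                      bearing_deg=45.0, climb_m=2.0):
    """Designated aerial position for the second UAV: distance_m away from
    the first UAV at a fixed bearing relative to the first UAV's heading,
    climb_m above it. Positions are (east, north, up) tuples in metres."""
    e, n, u = first_uav_pos
    theta = math.radians(heading_deg + bearing_deg)  # absolute bearing
    return (e + distance_m * math.sin(theta),
            n + distance_m * math.cos(theta),
            u + climb_m)
```

With a northward heading and a 90° relative bearing, the standoff point lies five metres due east of, and slightly above, the first UAV.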
FIGS. 2A-2B are front elevation views of the aerial imaging system 100 according to various embodiments. With reference to FIGS. 1-2A, the aerial imaging system 100 may include the first UAV 101 supporting the second UAV 102 docked thereon (e.g., as shown in FIG. 2A). One or both of the first UAV 101 and the second UAV 102 may include an automatic latching mechanism for securely holding the two UAVs 101, 102 together in the docked configuration. To deploy the second UAV 102, the automatic latching mechanism may disengage. - With reference to
FIGS. 1-2B, the second UAV 102 (e.g., as shown in FIG. 2B) may be deployed and fly separate from the first UAV 101. In some embodiments, an ejection mechanism may be included that quickly separates the two UAVs 101, 102. - In alternative embodiments (not shown), the
second UAV 102 may be carried on an underside of the first UAV 101, which would enable the second UAV 102 to make use of gravity to quickly separate from the first UAV 101 by free-falling for a few seconds when the second UAV 102 is released from the first UAV 101. Due to the significant turbulence generated from the downdraft of the first UAV 101, in-flight re-docking to the underside of the first UAV 101 may be infeasible. Therefore, additional procedures may be necessary to re-dock the second UAV 102 to the underside of the first UAV 101, such as manual re-docking performed by the operator 5 after landing. -
FIG. 3A is a top view of an aerial imaging system 300 in accordance with various embodiments. With reference to FIGS. 1-3A, the aerial imaging system 300 may include a first UAV 301 plus a second UAV 302, a third UAV 303, a fourth UAV 304, and a fifth UAV 305 docked on the first UAV 301. The aerial imaging system 300 may include more or fewer UAVs in various embodiments. The first UAV 301 may include the camera (e.g., 110) and is illustrated in a quadcopter configuration with four rotors 315, although the first UAV 301 may include more or fewer rotors. Each of the second, third, fourth, and fifth UAVs 302, 303, 304, 305 may be docked in a piggyback configuration on the first UAV 301. The first UAV 301 may be configured to fly while supporting some or all of the second UAV 302, the third UAV 303, the fourth UAV 304, and/or the fifth UAV 305. Alternatively, one or more of the second UAV 302, the third UAV 303, the fourth UAV 304, and/or the fifth UAV 305 may include the camera 110 in addition to or instead of the first UAV 301. Regardless of which UAV (e.g., 301, 302, 303, 304, 305) includes the camera, the other ones of the first UAV 301, second UAV 302, third UAV 303, fourth UAV 304, and/or fifth UAV 305 may include the light (e.g., 120 in FIG. 1) for providing illumination for the camera 110. -
FIG. 3B is a perspective relief view of the second UAV 302 in FIG. 3A. With reference to FIGS. 1-3B, the second UAV 302 (which may be similar to the second UAV 102) may include the light 120 and/or additional sensors, such as a microphone. Similarly, one or more of the third, fourth, and fifth UAVs 303, 304, 305 may include a light and/or additional sensors. The second, third, fourth, and fifth UAVs 302, 303, 304, 305 are illustrated with four rotors 325; however, any of the second, third, fourth, and fifth UAVs 302, 303, 304, 305 may include more or fewer rotors. -
FIG. 4A is a top schematic view of the aerial imaging system 300 including the first UAV 301, the second UAV 302, the third UAV 303, the fourth UAV 304, and the fifth UAV 305 according to various embodiments. With reference to FIGS. 1-4A, the second UAV 302 is shown flying separate from the first UAV 301, while the third UAV 303, the fourth UAV 304, and the fifth UAV 305 remain docked in a piggybacking configuration on the first UAV 301. The first UAV 301 is configured to receive input from the operator 5 via the operator communication link 130 to the wireless control unit 6. In addition to initiating and controlling flight of the first UAV 301, the operator 5 may, directly or indirectly, initiate and control flight of any of the second UAV 302, the third UAV 303, the fourth UAV 304, and/or the fifth UAV 305, such as by transmitting a deployment command. The deployment command may be part of a process that may be initiated manually by the operator or automatically by a processor (e.g., of the first UAV 301) when enhanced lighting is needed for a photograph to be taken by the first UAV 301. The inter-UAV communication link 135 may be used to control the navigation of the second UAV 302 and keep the first and second UAVs 301, 302 synchronized. The first UAV 301 may also control, via the inter-UAV communication link 135, the activation of the light (e.g., 120) on the second UAV 302 that generates the lighted region 122. - The
second UAV 302 may be deployed toward a focal point 60 on or near the target 50. However, it may be desirable to avoid having the second UAV 302 fly to an aerial position that lies between the camera (e.g., 110) and the target 50, since the second UAV 302 would thus block at least part of any photograph. Thus, the second UAV 302 may be directed to fly at a higher elevation than the target 50, either staying on the fringe of the field of view 112 or just outside thereof. Alternatively, the second UAV 302 may be directed to land on the ground or hover just off the ground in front of the target 50, which may also be either on the fringe of the field of view 112 or just outside thereof. The designated aerial position of the second UAV 302 may be selected based on a desired angle for the light (e.g., 120) to emit in generating the lighted region 122. For example, light emitted from above the target 50 may more naturally emulate sunlight. -
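Keeping the illuminating UAV on the fringe of the field of view 112, or just outside it, implies a geometric test. One minimal way to model this (an assumption for illustration only; the embodiments do not specify a frustum model) is a viewing cone around the camera's optical axis:

```python
import math

def inside_field_of_view(camera_pos, camera_dir, half_angle_deg, point):
    """True when `point` falls inside the camera's viewing cone, i.e. a
    UAV at that point would appear in (and may block) the photograph.

    camera_dir must be a unit vector; positions are 3-D tuples in metres.
    """
    v = tuple(p - c for p, c in zip(point, camera_pos))
    dist = math.sqrt(sum(x * x for x in v))
    if dist == 0:
        return True  # coincident with the camera
    # Compare the angle off the optical axis against the cone half-angle.
    cos_angle = sum(a * b for a, b in zip(v, camera_dir)) / dist
    return cos_angle >= math.cos(math.radians(half_angle_deg))
```

A flight controller could command the illumination UAV to climb or sidestep whenever this test returns True, keeping it out of the shot.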
FIG. 4B is a top schematic view of the aerial imaging system 300 including the first UAV 301, the second UAV 302, the third UAV 303, the fourth UAV 304, and the fifth UAV 305 according to various embodiments. With reference to FIGS. 1-4B, the second UAV 302 and the third UAV 303 are shown flying separate from the first UAV 301, while the fourth UAV 304 and the fifth UAV 305 remain docked in a piggybacking configuration on the first UAV 301. When two of the second, third, fourth, and fifth UAVs 302, 303, 304, 305 are deployed, the UAVs remaining docked may be positioned to balance the first UAV 301 for flight stability. - Separate
inter-UAV communication links 135, 136 may be maintained with the second UAV 302 and the third UAV 303, respectively. The inter-UAV communication links 135, 136 may keep the first UAV 301 synchronized with each of the second UAV 302 and the third UAV 303. In addition, the inter-UAV communication links 135, 136 may control the activation of the lights on the second UAV 302 and the third UAV 303 for generating separate lighted regions 422. Having the separate lighted regions 422 overlap but originate from opposite sides of the target 50 may avoid shadows on the target 50. In various embodiments there may be direct inter-UAV communication links between one or more of the second, third, fourth, and fifth UAVs 302, 303, 304, 305. -
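Placing two lighted regions so they overlap from opposite sides of the target amounts to choosing mirrored positions around it. The following is a hypothetical sketch (coordinates are (east, north, up) metres; the function name and parameters are assumptions, not part of the embodiments):

```python
import math

def opposed_light_positions(target_pos, standoff_m, bearing_deg, height_m):
    """Two aerial positions standoff_m from the target, at bearing_deg and
    bearing_deg + 180 degrees, height_m above it, so the two lighted
    regions overlap on the target from opposite sides (reducing shadows)."""
    e, n, u = target_pos
    positions = []
    for b in (bearing_deg, bearing_deg + 180.0):
        t = math.radians(b)
        positions.append((e + standoff_m * math.sin(t),
                          n + standoff_m * math.cos(t),
                          u + height_m))
    return positions
```

The first UAV could transmit these two positions to the second and third UAVs over the inter-UAV communication links.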
FIG. 5 illustrates a configuration of the first UAV 301 that may be used for any of the UAVs in various embodiments. With reference to FIGS. 1-5, the first UAV 301 may include a control unit 150 that may house various circuits and devices used to power and control the operation of the first UAV 301, as well as any other UAVs controlled by the first UAV 301. The control unit 150 may include a processor 160, a power module 170, an input module 180, a camera 181, sensors 182, an output module 185, and a radio module 190 coupled to an antenna 191. The processor 160 may include or be coupled to memory 161 and a navigation unit 163. The processor 160 may be configured with processor-executable instructions to control flight and other operations of the first UAV 301, including operations of the various embodiments. The processor 160 may be coupled to one or more cameras 181 and sensors 182. - The
camera 181 may include one or more image capturing devices for photographing the target (e.g., 50). More than one image capturing device may be configured to contemporaneously capture two different images including the target. For example, a first image may include both the target and the second UAV (e.g., 102), while a second image may include the target but not the second UAV. Alternatively, the camera 181 may be configured to detect light in the infrared spectrum for thermal imaging. Such thermal imaging features may be enhanced if the light emitted from the second UAV extends to the infrared spectrum. - The
sensors 182 may be optical sensors (e.g., light meters for controlling exposure and determining whether additional illumination is required), radio sensors, a rotary encoder, pressure sensors (i.e., for detecting wind, lift, drag, or changes therein), or other sensors. Alternatively or additionally, the sensors 182 may be contact or pressure sensors that may provide a signal indicating when the first UAV 301 has landed. - The
power module 170 may include one or more batteries that may provide power to various components, including the processor 160, the input module 180, the sensors 182, the output module 185, and the radio module 190. In addition, the power module 170 may include energy storage components, such as rechargeable batteries. The processor 160 may be configured with processor-executable instructions to control the charging of the power module 170, such as by executing a charging control algorithm using a charge control circuit. Alternatively or additionally, the power module 170 may be configured to manage its own charging. The processor 160 may be coupled to an output module 185, which may output control signals for managing the motors that drive the rotors 315 and other components. - Through control of the individual motors of the
rotors 315, the first UAV 301 may be controlled in flight. The processor 160 may receive data from the navigation unit 163 and use such data to determine the present position and orientation of the first UAV 301, as well as the appropriate course toward the target (e.g., 50). In various embodiments, the navigation unit 163 may include a global navigation satellite system (GNSS) receiver system (e.g., one or more Global Positioning System (GPS) receivers) enabling the first UAV 301 to navigate using GNSS signals. Alternatively or in addition, the navigation unit 163 may be equipped with radio navigation receivers for receiving navigation beacons or other signals from radio nodes, such as navigation beacons (e.g., very high frequency (VHF) omnidirectional range (VOR) beacons), Wi-Fi access points, cellular network sites, radio stations, remote computing devices, other UAVs, etc. - The
processor 160 and/or the navigation unit 163 may be configured to communicate with a server through a wireless connection (e.g., a cellular data network) to receive commands to use or stop using the extended flight protocol, receive data useful in navigation, provide real-time position and altitude reports, and assess data. An avionics module 167 coupled to the processor 160 and/or the navigation unit 163 may be configured to provide flight control-related information, such as altitude, attitude, airspeed, heading, and similar information that the navigation unit 163 may use for navigation purposes, such as dead reckoning between GNSS position updates. The avionics module 167 may include or receive data from a gyro/accelerometer unit 165 that provides data regarding the orientation and accelerations of the first UAV 301 that may be used in navigation and positioning calculations. - The
radio module 190 may be configured to receive signals via the antenna 191, such as command signals to initiate, continue, or discontinue the use of the light (e.g., 120) from the second UAV (e.g., 302), receive signals from aviation navigation facilities, etc., and provide such signals to the processor 160 and/or the navigation unit 163 to assist in operation of the first UAV 301. In some embodiments, commands for controlling the first UAV 301 and/or the second UAV 302, or components thereof, may be received via the radio module 190. In some embodiments, the first UAV 301 may receive signals from a wireless control unit 6. For example, the operator communication link 130 may include input from a knowledge base regarding current conditions, a current orientation of the first UAV 301 or elements thereof, predicted future conditions, requirements for particular UAV maneuvers or missions, aiming parameters of the camera, or even information regarding a target of the photography. - In various embodiments, the
radio module 190 may be configured to switch between a cellular connection and a Wi-Fi or other form of radio connection depending on the location and altitude of the first UAV 301. For example, while in flight at an altitude designated for UAV traffic, the radio module 190 may communicate with a cellular infrastructure in order to maintain communications with a server. In addition, communications with the wireless control unit 6 may be established using cellular telephone networks while the first UAV 301 is flying out of line-of-sight with the operator 5. Communication between the radio module 190 and the operator communication link 130 may transition to a short-range communication link (e.g., Wi-Fi or Bluetooth) when the first UAV 301 moves closer to the wireless control unit 6. Similarly, the first UAV 301 may include and employ other forms of radio communication, such as mesh connections with other UAVs or connections to other information sources (e.g., balloons or other stations for collecting and/or distributing weather or other data harvesting information). - In various embodiments, the
control unit 150 may be equipped with the input module 180, which may be used for a variety of applications. For example, the input module 180 may receive and pre-process images or data from an onboard camera 181 or sensor 182, or may receive electronic signals from other components (e.g., a payload). The input module 180 may receive an activation signal for causing actuators on the first UAV 301 to deploy landing cushions or similar components for effecting an emergency landing. In addition, the output module 185 may be used to activate components (e.g., an energy cell, an actuator, an indicator, a circuit element, a sensor, and/or an energy-harvesting element). - While the various components of the
control unit 150 are illustrated in FIG. 5 as separate components, some or all of the components (e.g., the processor 160, the output module 185, the radio module 190, and other units) may be integrated together in a single device or module, such as a system-on-chip module. -
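The link-switching behavior of the radio module 190 described above (cellular at transit altitude or beyond short range, a short-range link such as Wi-Fi or Bluetooth near the operator) can be sketched as a simple selection rule. The thresholds, function name, and return values here are illustrative assumptions only:

```python
def select_radio_link(altitude_m: float, distance_to_operator_m: float,
                      short_range_limit_m: float = 100.0,
                      cruise_altitude_m: float = 60.0) -> str:
    """Pick a link type: short-range Wi-Fi near the operator at low
    altitude, cellular once the UAV is at transit altitude or out of
    short-range coverage (e.g., out of line-of-sight)."""
    if (distance_to_operator_m <= short_range_limit_m
            and altitude_m < cruise_altitude_m):
        return "wifi"
    return "cellular"
```

A real radio module would also hysterese around these thresholds to avoid flapping between links; that detail is omitted here.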
FIG. 6 illustrates an aerial imaging system 600 with a first UAV 601 (which, for example, may generally correspond to the first UAV 101, 301 described with reference to FIGS. 1-5) and a second UAV 602 (which, for example, may generally correspond to the second or other UAVs 102, 302, 303, 304, 305 described with reference to FIGS. 1-5) in accordance with various embodiments. With reference to FIGS. 1-6, the first UAV 601 includes a camera (e.g., 110) and is configured to receive input from the operator 5 via an operator communication link 130 to a wireless communication device 606 in the form of a tablet computer. The operator 5 may not only initiate and control the flight of the first UAV 601 but also may control the camera 110, such as for photographing a target 50. - The
second UAV 602 may be configured to dock with and deploy from the first UAV 601. Also, the second UAV 602 includes a light (e.g., 120) for illuminating the target 50 being photographed by the first UAV 601. In addition, the second UAV 602 may include a microphone 630 configured to record sounds 51 from an area in which the target is located. The second UAV 602 may be controlled to fly near the target 50 being photographed to pick up audio. The second UAV 602 may then wirelessly transmit the audio to the first UAV 601 using the inter-UAV communication link 135. The first UAV 601 may synchronize the received audio with video images being captured. - In various embodiments, the
first UAV 601 may independently and/or automatically control the aerial position of the second UAV 602 in a closed-loop fashion using feedback from resources onboard the first UAV 601. For example, the inter-UAV communication link (e.g., 135, 136 in FIGS. 1, 4A, 4B, and 6) may provide a distance and direction between the two UAVs 601, 602, which may be used to control the first UAV 601 and/or the second UAV 602. - Various embodiments may use the camera (e.g., 110) (and/or other sensor(s)) not only for photography but also to monitor, track, and/or change the position of the
second UAV 602. The processor (e.g., 160 in FIG. 5) in the first UAV 601 may control the second UAV 602 based on data from the camera, such as a first image 612 of the field of view 112 from the camera. If the second UAV 602 is maintained within the field of view 112, the processor in the first UAV 601 may determine an aerial position of the second UAV 602 based on the size and position of the second UAV 602 within the first image 612. Meanwhile, a second image 625 that cuts out the second UAV 602 may be used for more conventional image capture. Optionally, using the first and second images 612, 625 and focusing on a focus region 650 on the target 50 in the first image 612, a focal distance may be determined between the camera (i.e., the first UAV 601) and the target 50. Using the focal distance, a processor in the first UAV 601 may calculate a first distance between the two UAVs 601, 602 and a second distance between the second UAV 602 and the target 50. - Various embodiments may use the camera to optimize illumination, including holding illumination on moving targets. The
second UAV 602 may hover a predetermined distance from the first UAV 601, unless the amount of light being received from the target 50 is more or less than desirable. The predetermined distance may be a default distance and/or relative aerial position. Alternatively, the predetermined position may be determined by the processor based on current conditions (e.g., ambient lighting). For example, one or both of the first UAV 601 and the second UAV 602 may include a light sensor (e.g., sensor 182 in FIG. 5). In response to the processor of the first UAV 601 determining the amount of light is more or less than desirable, the second UAV 602 may be commanded to move further from or closer to the target 50, respectively. Alternatively, in response to the processor of the first UAV 601 determining the amount of light is more or less than desirable, the second UAV 602 may be commanded to change the brightness emitted by the light accordingly. - In various embodiments, the
second UAV 602 may independently and/or automatically control its own aerial position relative to the first UAV 601 and/or the target 50. For example, the second UAV 602 may use an onboard proximity sensor to maintain a determined aerial position relative to the first UAV 601 and/or the target 50. Alternatively, the second UAV 602 may include its own camera and processor for capturing and analyzing images of the first UAV 601 and/or the target 50. After receiving information about the target 50 from the first UAV 601, the processor onboard the second UAV 602 may be configured to recognize the target 50. Using target recognition, the second UAV 602 may maintain a fixed position relative to the target 50. - In various embodiments, the first and
second UAVs 601, 602 may cooperate in controlling the second UAV 602. For example, the second UAV 602 may collect information about its aerial position using communication links, GPS, compass, and/or images from a camera onboard the second UAV 602. The second UAV 602 may transmit the collected information to the first UAV 601 for processing and further controlling the second UAV 602. - In various embodiments, the first UAV (e.g., 101, 301, 601) may have generally the same components as the second UAV (e.g., 102, 302, 602) and/or additional UAVs (e.g., third, fourth, and
fifth UAVs 303, 304, 305) in which one or more of the second UAV and additional UAVs may be different and/or have different components from one another. -
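The image-based distance estimation and illumination feedback described in the embodiments above can be illustrated with a minimal sketch. It assumes a simple pinhole-camera model; all function names, sizes, and thresholds are illustrative assumptions, not taken from the patent itself.

```python
# Illustrative sketch only: a pinhole-camera range estimate and a simple
# illumination feedback rule. Names, sizes, and thresholds are assumptions.

def distance_from_image(focal_length_px: float,
                        real_size_m: float,
                        apparent_size_px: float) -> float:
    """Range to an object of known physical size, from its apparent
    size in pixels (pinhole model: range = f * size / pixels)."""
    return focal_length_px * real_size_m / apparent_size_px

def lighting_command(measured_lux: float,
                     desired_lux: tuple = (200.0, 400.0)) -> str:
    """Command for the lighting UAV when the measured light from the
    target falls outside a desired band: reposition the UAV (or,
    equivalently, adjust the emitted brightness)."""
    low, high = desired_lux
    if measured_lux < low:
        return "move_closer"    # or "increase_brightness"
    if measured_lux > high:
        return "move_further"   # or "decrease_brightness"
    return "hold_position"

# A 0.3 m wide second UAV appearing 25 px wide through a 1000 px focal
# length is roughly 12 m away:
uav_range = distance_from_image(1000.0, 0.3, 25.0)
```

Under a model like this, the first UAV could derive both the camera-to-target distance and the UAV-to-UAV separation from a single image, then steer the lighting UAV with commands such as those returned above.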
FIG. 7 illustrates a method 700 of aerial imaging according to various embodiments. With reference to FIGS. 1-7, operations of the method 700 may be performed by a UAV control unit (e.g., 150 in FIG. 5) or another computing device (e.g., wireless control unit 6 in FIGS. 1, 4A, 4B, 5 and/or wireless communication device 606 in FIG. 6) in communication with the first UAV (e.g., 101, 301, 601 in FIGS. 1-6). - In
block 710, the processor of the first UAV (e.g., the processor 160 in the control unit 150 or a processor in a remote device, such as the wireless communication device 606) may receive an input indicating the second UAV and/or at least one additional UAV (e.g., the third, fourth, and/or fifth UAVs 303, 304, 305) should be deployed. The input may be received from the operator (e.g., 5 in FIG. 1), from another process (e.g., an image processing analysis determining that lighting needs adjustment), from a knowledge base (e.g., in an onboard and/or remote database), or from systems controlling the operation of the first UAV. - In
block 720, the processor of the first UAV may deploy the second UAV and/or at least one additional UAV, which may involve releasing a latching mechanism and/or using an ejection mechanism to ensure the second and/or additional UAV quickly moves away from the first UAV. Prior to deploying the second and/or additional UAV, the processor may activate or otherwise cause to activate propellers on the second and/or additional UAV to prepare the second and/or additional UAV for flight. Once deployed, the second and/or additional UAV may automatically proceed to a designated aerial position relative to the first UAV, which may include more than one aerial position if multiple UAVs are deployed. The designated aerial position(s) may be predetermined (such as a default or preprogrammed position) or determined based on ambient or other conditions. Alternatively, the processor may provide the second and/or additional UAV with instructions for reaching the designated aerial position(s). - In
block 730, the processor of the first UAV may activate the light(s) on the second and/or additional UAV (if not already illuminated) to illuminate the target. The processor may transmit an activation signal via the inter-UAV communication link (e.g., 135 and/or 136 in FIGS. 4A and 4B) for activating the light(s). - In
determination block 740, the processor may determine whether lighting for taking one or more images (or video) needs to change. For instance, the processor may determine whether lighting in one or more images taken by the camera needs to change. The processor may assess temporary images captured by the camera to determine whether the current lighting is too low, too high, or just right. - In response to determining that the lighting needs to be changed (i.e., determination block 740 = "Yes"), the processor may transmit one or more instructions to the second and/or additional UAV in
block 742. The instruction(s) may indicate that the second and/or additional UAV should change its aerial position, the orientation of the light, and/or a lighting parameter (e.g., brightness) of the light on the second UAV. Accordingly, the second and/or additional UAV may make adjustments based on one or more of the instruction(s). - Optionally, in response to determining that the lighting needs to be changed (i.e., determination block 740 = "Yes"), the processor may also deploy one or more additional UAVs in
block 720. For example, if supplemental and/or different lighting is needed and one or more additional UAVs are available, the one or more additional UAVs may be deployed to implement the needed lighting changes. - In
block 744, the processor may wait until a message is received, indicating that the transmitted instructions have been implemented, before again determining whether the lighting needs to be changed in determination block 740. The message may be received from the second and/or additional UAV. Alternatively, the processor may allow a designated waiting time to pass to enable the changes from the transmitted instructions in block 742 to be implemented before determining whether the lighting needs to be changed further in determination block 740. Additionally, the designated waiting time may act as a time-out period in case no message is received from the second and/or additional UAV. - In response to determining that the lighting does not need to be changed (i.e., determination block 740 = "No"), the processor may activate the camera to photograph the target in
block 750. - In
determination block 760, the processor may determine whether the first UAV will continue photographing the target. The determination whether to continue may be based on input from the operator or a default setting. In response to determining to continue photography (i.e., determination block 760 = "Yes"), the processor may again activate the light(s), if the light(s) are not already illuminated, on the second and/or additional UAV in block 730. In response to determining not to continue photography (i.e., determination block 760 = "No"), the processor may initiate docking of the first and second UAVs in block 765. Once the first and second UAVs are docked, the process may await receipt of further input indicating the second UAV should be deployed in block 710. - Communications with the first UAV (e.g., 101, 301, 601 in
FIGS. 1-6) may be implemented using any of a variety of wireless communication devices (e.g., smartphones, tablets, smartwatches, etc.), an example of which is illustrated in FIG. 8. The wireless communication device 800 may include a processor 802 coupled with the various systems of the wireless communication device 800 for communication with and control thereof. For example, the processor 802 may be coupled to a touch screen controller 804, radio communication elements, speakers and microphones, and an internal memory 806. The processor 802 may be one or more multi-core integrated circuits designated for general or specific processing tasks. The internal memory 806 may be volatile or non-volatile memory, and may also be secure and/or encrypted memory, or unsecure and/or unencrypted memory, or any combination thereof. In another embodiment (not shown), the wireless communication device 800 may also be coupled to an external memory, such as an external hard drive. - The
touch screen controller 804 and the processor 802 may also be coupled to a touch screen panel 812, such as a resistive-sensing touch screen, capacitive-sensing touch screen, infrared-sensing touch screen, etc. Additionally, the display of the wireless communication device 800 need not have touch screen capability. The wireless communication device 800 may have one or more radio signal transceivers 808 (e.g., Peanut, Bluetooth, Bluetooth LE, ZigBee, Wi-Fi®, radio frequency (RF) radio, etc.) and an antenna, the wireless communication device antenna 810, for sending and receiving communications, coupled to each other and/or to the processor 802. The radio signal transceivers 808 and the wireless communication device antenna 810 may be used with the above-mentioned circuitry to implement the various wireless transmission protocol stacks and interfaces. The wireless communication device 800 may include a cellular network wireless modem chip 816 coupled to the processor that enables communication via a cellular network. - The
wireless communication device 800 may include a peripheral device connection interface 818 coupled to the processor 802. The peripheral device connection interface 818 may be singularly configured to accept one type of connection, or may be configured to accept various types of physical and communication connections, common or proprietary, such as USB, FireWire, Thunderbolt, or PCIe. The peripheral device connection interface 818 may also be coupled to a similarly configured peripheral device connection port (not shown). - In various embodiments, the
wireless communication device 800 may include one or more microphones 815. For example, the wireless communication device may have microphones 815 that are conventional for receiving voice or other audio frequency energy from a user during a call. - The
wireless communication device 800 may also include speakers 814 for providing audio outputs. The wireless communication device 800 may also include a housing 820, constructed of plastic, metal, or a combination of materials, for containing all or some of the components discussed herein. The wireless communication device 800 may include a power source 822 coupled to the processor 802, such as a disposable or rechargeable battery. The rechargeable battery may also be coupled to the peripheral device connection port to receive a charging current from a source external to the wireless communication device 800. The wireless communication device 800 may also include a physical button 824 for receiving user inputs. The wireless communication device 800 may also include a power button 826 for turning the wireless communication device 800 on and off. - In various embodiments, the
wireless communication device 800 may further include an accelerometer 828, which senses movement, vibration, and other aspects of the device through the ability to detect multi-directional values of and changes in acceleration. In the various embodiments, the accelerometer 828 may be used to determine the x, y, and z positions of the wireless communication device 800. Using the information from the accelerometer, a pointing direction of the wireless communication device 800 may be detected. - The various embodiments illustrated and described are provided merely as examples to illustrate various features of the claims. However, features shown and described with respect to any given embodiment are not necessarily limited to the associated embodiment and may be used or combined with other embodiments that are shown and described. Descriptions of various embodiments in terms of a lighting UAV docking with a camera-carrying UAV to enable one UAV to fly both vehicles to a photography location are provided merely as examples, because the roles of the UAVs may be reversed and the two (or more) UAVs may dock with a third UAV that ferries all UAVs to a photography site as described. Further, the claims are not intended to be limited by any one example embodiment.
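As a rough, non-authoritative summary, the control flow of method 700 (blocks 710-765, FIG. 7) might be condensed as follows. All callables are illustrative stand-ins for the patent's operations, not APIs from any real system.

```python
# Illustrative sketch of method 700's control flow (blocks 710-765).
# Each callable parameter stands in for one of the described operations.

def method_700(deploy, activate_light, lighting_ok, adjust_lighting,
               photograph, continue_photography, dock):
    deploy()                              # block 720: deploy second/additional UAV
    while True:
        activate_light()                  # block 730: illuminate the target
        while not lighting_ok():          # determination block 740
            adjust_lighting()             # block 742 (then wait, block 744)
        photograph()                      # block 750: activate the camera
        if not continue_photography():    # determination block 760
            dock()                        # block 765: re-dock the UAVs
            return                        # then await new input (block 710)
```

Running this with logging stand-ins shows the loop structure: lighting is adjusted until acceptable, a photograph is taken, and the UAVs dock when photography ends.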
- The foregoing method descriptions and the process flow diagrams are provided merely as illustrative examples and are not intended to require or imply that the steps of the various embodiments must be performed in the order presented. As will be appreciated by one of skill in the art, the order of operations in the foregoing embodiments may be performed in any order. Words such as "thereafter," "then," "next," etc. are not intended to limit the order of the operations; these words are used to guide the reader through the description of the methods. Further, any reference to claim elements in the singular, for example, using the articles "a," "an" or "the" is not to be construed as limiting the element to the singular.
- The various illustrative logical blocks, modules, circuits, and algorithm operations described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and operations have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the claims.
- The hardware used to implement the various illustrative logics, logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of receiver smart objects, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some operations or methods may be performed by circuitry that is specific to a given function.
- In one or more aspects, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a non-transitory computer-readable storage medium or non-transitory processor-readable storage medium. The operations of a method or algorithm disclosed herein may be embodied in a processor-executable software module or processor-executable instructions, which may reside on a non-transitory computer-readable or processor-readable storage medium. Non-transitory computer-readable or processor-readable storage media may be any storage media that may be accessed by a computer or a processor. By way of example but not limitation, such non-transitory computer-readable or processor-readable storage media may include RAM, ROM, EEPROM, FLASH memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage smart objects, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of non-transitory computer-readable and processor-readable media. Additionally, the operations of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a non-transitory processor-readable storage medium and/or computer-readable storage medium, which may be incorporated into a computer program product.
- The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the claims. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the scope of the claims. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the following claims and the principles and novel features disclosed herein.
Claims (15)
- An aerial imaging system, comprising: a first unmanned aerial vehicle (101), UAV, including a camera (110) and configured to receive input from an operator (5); and a second UAV (102) configured to dock with and deploy from the first UAV, the second UAV comprising a light (120) configured to provide illumination for the camera, characterised in that the second UAV is configured to fly to a position relative to a target (50) of photography by the camera in order to provide illumination of the target of photography by the camera.
- The aerial imaging system of claim 1, wherein the second UAV is configured to fly separate from the first UAV without the input from the operator.
- The aerial imaging system of claim 1, wherein the first UAV is configured to fly while the second UAV is docked on the first UAV.
- The aerial imaging system of claim 1, wherein the second UAV includes a microphone configured to record sounds.
- The aerial imaging system of claim 1, further comprising:
a third UAV configured to dock with and deploy from the first UAV, wherein the first UAV is configured to fly while supporting both the second UAV and the third UAV. - The aerial imaging system of claim 1, wherein the camera and the first UAV are configured to use camera images for controlling the second UAV; or wherein the camera is configured to contemporaneously capture two or more different images; or wherein the camera is configured so that the two or more different images overlap.
- The aerial imaging system of claim 1, wherein the first UAV includes a processor configured to determine a position of the second UAV from a first image from the camera; and wherein the processor is further configured to determine the position of the second UAV from camera images; or
wherein the first UAV includes a processor configured to determine an aerial position of the second UAV flying separate from the first UAV based on a comparison of camera images and images received from the second UAV. - The aerial imaging system of claim 1, wherein the second UAV is configured to fly separate from the first UAV to a predetermined aerial position relative to the first UAV; orwherein the second UAV is configured to use signals received from a proximity sensor to maintain a determined aerial position of the second UAV relative to the target of photography by the camera; orwherein the second UAV includes a processor configured to recognize the target of photography by the camera in images obtained from an image capture device on the second UAV; orwherein the second UAV includes a processor configured to independently control flight of the second UAV separate from the first UAV and maintain an aerial position of the second UAV relative to the target of photography by the camera.
- The aerial imaging system of claim 1, wherein an amount of illumination provided by the light on the second UAV is adjustable; or wherein the light emits in an infrared spectrum and the camera is configured for thermal imaging; or wherein the first UAV includes a processor configured to: determine an amount of illumination provided by the light on the second UAV; determine an adjustment for the illumination provided by the light on the second UAV; and transmit instructions to the second UAV for making the adjustment needed for the illumination provided by the light on the second UAV.
- A method (700) of capturing a visual image, comprising: deploying (720), from a first unmanned aerial vehicle (101), UAV, including a camera (110), a second UAV (102) to fly separate from the first UAV; characterised in that the method comprises flying the second UAV to a position relative to a target (50) of photography by the camera; activating (730) a light (120) on the second UAV to illuminate the target (50) of photography by the camera; and activating (750) the camera to photograph the target of photography by the camera illuminated by the light.
- The method of claim 10, further comprising:
re-docking the second UAV with the first UAV after activating the camera to photograph the target of photography. - The method of claim 10, further comprising flying the second UAV separate from the first UAV without input from a remotely controlling operator of the first UAV; or further comprising flying the first UAV while the second UAV is docked on the first UAV; or further comprising:
activating a microphone on the second UAV for recording sounds emanating from a target of a sound recording. - The method of claim 10, further comprising: deploying, from the first UAV, a third UAV configured to dock with and deploy from the first UAV; and flying the first UAV while supporting both the second UAV and the third UAV; or further comprising: determining a position of the second UAV using camera images from the camera for controlling the second UAV; or further comprising: determining an amount of illumination provided by the light on the second UAV; determining an adjustment needed for the illumination provided by the light on the second UAV; and transmitting instructions to the second UAV for making the adjustment needed for the illumination provided by the light on the second UAV.
- An aerial imaging system, comprising: a first unmanned aerial vehicle, UAV, wherein the first UAV comprises: means for capturing visual images; and means for receiving input from an operator for flying the first UAV and activating the means for capturing visual images; and a second UAV, wherein the second UAV comprises: means for emitting light to provide illumination for the means for capturing visual images; and means for docking with and deploying from the first UAV, characterised in that the second UAV is configured to fly to a position relative to a target of the means for capturing visual images in order for the means for emitting light to provide illumination of the target.
- A non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor of a first unmanned aerial vehicle, UAV, to perform operations comprising: deploying a second UAV from the first UAV and flying the second UAV separate from the first UAV; characterised in that the operations comprise activating a light on the second UAV to illuminate a target of photography by a camera on the first UAV; and activating the camera on the first UAV to photograph the target of photography illuminated by the light.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/144,105 US10005555B2 (en) | 2016-05-02 | 2016-05-02 | Imaging using multiple unmanned aerial vehicles |
PCT/US2017/018387 WO2017192198A1 (en) | 2016-05-02 | 2017-02-17 | Imaging using multiple unmanned aerial vehicles |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3452881A1 EP3452881A1 (en) | 2019-03-13 |
EP3452881B1 true EP3452881B1 (en) | 2020-04-22 |
Family
ID=58277316
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17710806.5A Active EP3452881B1 (en) | 2016-05-02 | 2017-02-17 | Imaging using multiple unmanned aerial vehicles |
Country Status (8)
Country | Link |
---|---|
US (1) | US10005555B2 (en) |
EP (1) | EP3452881B1 (en) |
JP (1) | JP6755966B2 (en) |
KR (1) | KR102251203B1 (en) |
CN (1) | CN109074101B (en) |
BR (1) | BR112018072581B1 (en) |
TW (1) | TW201740237A (en) |
WO (1) | WO2017192198A1 (en) |
Families Citing this family (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12007763B2 (en) | 2014-06-19 | 2024-06-11 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
US9678506B2 (en) | 2014-06-19 | 2017-06-13 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
US9798322B2 (en) | 2014-06-19 | 2017-10-24 | Skydio, Inc. | Virtual camera interface and other user interaction paradigms for a flying digital assistant |
EP3446190B1 (en) * | 2016-05-16 | 2022-06-22 | SZ DJI Technology Co., Ltd. | Systems and methods for coordinating device actions |
US10435176B2 (en) | 2016-05-25 | 2019-10-08 | Skydio, Inc. | Perimeter structure for unmanned aerial vehicle |
US10520943B2 (en) * | 2016-08-12 | 2019-12-31 | Skydio, Inc. | Unmanned aerial image capture platform |
WO2018052323A1 (en) * | 2016-09-16 | 2018-03-22 | Motorola Solutions, Inc. | System and method for fixed camera and unmanned mobile device collaboration to improve identification certainty of an object |
GB2567587B (en) * | 2016-09-16 | 2021-12-29 | Motorola Solutions Inc | System and method for fixed camera and unmanned mobile device collaboration to improve identification certainty of an object |
US11295458B2 (en) | 2016-12-01 | 2022-04-05 | Skydio, Inc. | Object tracking by an unmanned aerial vehicle using visual sensors |
KR102681582B1 (en) * | 2017-01-05 | 2024-07-05 | 삼성전자주식회사 | Electronic device and controlling method thereof |
USD816546S1 (en) * | 2017-04-24 | 2018-05-01 | Alex Wang | Drone |
AT16628U1 (en) * | 2017-05-23 | 2020-03-15 | Ars Electronica Linz Gmbh & Co Kg | System for controlling unmanned aircraft in a swarm to film a moving object with multiple cameras |
US11676299B2 (en) * | 2017-08-07 | 2023-06-13 | Ford Global Technologies, Llc | Locating a vehicle using a drone |
US10852723B2 (en) * | 2017-11-14 | 2020-12-01 | Intel IP Corporation | Unmanned aerial vehicle swarm photography |
WO2019095300A1 (en) * | 2017-11-17 | 2019-05-23 | SZ DJI Technology Co., Ltd. | Systems and methods for synchronizing multiple control devices with a movable object |
WO2019140699A1 (en) * | 2018-01-22 | 2019-07-25 | SZ DJI Technology Co., Ltd. | Methods and system for multi-target tracking |
CN207976680U (en) * | 2018-03-27 | 2018-10-16 | 深圳市大疆创新科技有限公司 | Light compensating lamp, holder and unmanned plane |
CN108513080A (en) * | 2018-04-19 | 2018-09-07 | 深圳臻迪信息技术有限公司 | A kind of control method and device of light filling |
JP6974247B2 (en) * | 2018-04-27 | 2021-12-01 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co., Ltd | Information processing equipment, information presentation instruction method, program, and recording medium |
CN108924473B (en) * | 2018-04-28 | 2023-07-25 | 广州亿航智能技术有限公司 | Reservation aerial photographing method and system based on unmanned aerial vehicle cruise mode |
US10884415B2 (en) * | 2018-12-28 | 2021-01-05 | Intel Corporation | Unmanned aerial vehicle light flash synchronization |
JP7274726B2 (en) * | 2019-01-31 | 2023-05-17 | 株式会社RedDotDroneJapan | Shooting method |
US11373397B2 (en) * | 2019-04-16 | 2022-06-28 | LGS Innovations LLC | Methods and systems for operating a moving platform to determine data associated with a target person or object |
EP3742248A1 (en) * | 2019-05-20 | 2020-11-25 | Sony Corporation | Controlling a group of drones for image capture |
WO2020262105A1 (en) * | 2019-06-27 | 2020-12-30 | 株式会社Nttドコモ | Information processing device |
US11631241B2 (en) * | 2020-04-08 | 2023-04-18 | Micron Technology, Inc. | Paired or grouped drones |
EP3896548B1 (en) * | 2020-04-17 | 2024-02-14 | Goodrich Lighting Systems GmbH & Co. KG | Helicopter lighting system, helicopter comprising the same, and method of illuminating an environment of a helicopter |
US20220081125A1 (en) * | 2020-09-17 | 2022-03-17 | Laura Leigh Donovan | Personal paparazzo drones |
CN113002776A (en) * | 2021-03-29 | 2021-06-22 | 万航星空科技发展有限公司 | Unmanned aerial vehicle communication device |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005024971A (en) * | 2003-07-03 | 2005-01-27 | Nikon Corp | Photographing system, illumination system and illumination condition setting method |
EP2055835A1 (en) | 2007-10-30 | 2009-05-06 | Saab Ab | Method and arrangement for determining position of vehicles relative each other |
CN105783594A (en) * | 2009-02-02 | 2016-07-20 | 威罗门飞行公司 | Multimode Unmanned Aerial Vehicle |
US8788119B2 (en) * | 2010-12-09 | 2014-07-22 | The Boeing Company | Unmanned vehicle and system |
FR2978425B1 (en) * | 2011-07-29 | 2015-12-04 | Eurocopter France | GIRAVION EQUIPPED WITH LIGHTING EQUIPMENT WITH SEVERAL PROJECTORS OPERATED FOR LANDING, WINCHING AND RESEARCH |
US9384668B2 (en) * | 2012-05-09 | 2016-07-05 | Singularity University | Transportation using network of unmanned aerial vehicles |
US9609284B2 (en) * | 2012-05-22 | 2017-03-28 | Otoy, Inc. | Portable mobile light stage |
US9696430B2 (en) * | 2013-08-27 | 2017-07-04 | Massachusetts Institute Of Technology | Method and apparatus for locating a target using an autonomous unmanned aerial vehicle |
US8825226B1 (en) | 2013-12-17 | 2014-09-02 | Amazon Technologies, Inc. | Deployment of mobile automated vehicles |
WO2015107558A1 (en) | 2014-01-14 | 2015-07-23 | Aero Sekur S.P.A. | Landing and/or docking system for aerial vehicles |
CN103941746B (en) * | 2014-03-29 | 2016-06-01 | 国家电网公司 | Image processing system and method is patrolled and examined without man-machine |
WO2014106814A2 (en) * | 2014-04-14 | 2014-07-10 | Wasfi Alshdaifat | A reporter drone |
EP2976687B1 (en) * | 2014-05-30 | 2017-09-06 | SZ DJI Technology Co., Ltd. | Systems and methods for uav docking |
US9678506B2 (en) * | 2014-06-19 | 2017-06-13 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
GB201411293D0 (en) * | 2014-06-25 | 2014-08-06 | Pearson Eng Ltd | Improvements in or relating to inspection systems |
CN108983802A (en) * | 2014-07-31 | 2018-12-11 | 深圳市大疆创新科技有限公司 | The virtual tours system and method realized using unmanned vehicle |
EP3175309B1 (en) | 2014-08-01 | 2020-06-17 | Signify Holding B.V. | System, device for creating an aerial image |
US9573701B2 (en) | 2014-08-06 | 2017-02-21 | Disney Enterprises, Inc. | Robust and autonomous docking and recharging of quadrotors |
CN104129502B (en) | 2014-08-25 | 2016-03-09 | 无锡同春新能源科技有限公司 | Two frame unmanned planes of haze eliminated by rear-mounted dilatory nano titanium dioxide photocatalysis net |
CN108137153B (en) * | 2015-01-18 | 2022-07-15 | 基础制造有限公司 | Apparatus, system and method for unmanned aerial vehicle |
US9471059B1 (en) * | 2015-02-17 | 2016-10-18 | Amazon Technologies, Inc. | Unmanned aerial vehicle assistant |
CN204527663U (en) * | 2015-04-20 | 2015-08-05 | 刘亚敏 | A kind of unmanned vehicle |
CN204904034U (en) * | 2015-09-02 | 2015-12-23 | 杨珊珊 | Urgent medical rescue system and first -aid centre and first aid unmanned aerial vehicle thereof |
CN105242685B (en) * | 2015-10-15 | 2018-08-07 | 杨珊珊 | A kind of accompanying flying unmanned plane system and method |
US9589448B1 (en) * | 2015-12-08 | 2017-03-07 | Micro Apps Group Inventions, LLC | Autonomous safety and security device on an unmanned platform under command and control of a cellular phone |
CN105469579B (en) * | 2015-12-31 | 2020-05-29 | 北京臻迪机器人有限公司 | Somatosensory remote controller, somatosensory remote control flight system and somatosensory remote control flight method |
US20170253330A1 (en) * | 2016-03-04 | 2017-09-07 | Michael Saigh | Uav policing, enforcement and deployment system |
- 2016
  - 2016-05-02 US US15/144,105 patent/US10005555B2/en active Active
- 2017
  - 2017-02-17 JP JP2018557126A patent/JP6755966B2/en active Active
  - 2017-02-17 BR BR112018072581-1A patent/BR112018072581B1/en active IP Right Grant
  - 2017-02-17 KR KR1020187031535A patent/KR102251203B1/en active IP Right Grant
  - 2017-02-17 WO PCT/US2017/018387 patent/WO2017192198A1/en unknown
  - 2017-02-17 EP EP17710806.5A patent/EP3452881B1/en active Active
  - 2017-02-17 CN CN201780027077.XA patent/CN109074101B/en active Active
  - 2017-02-21 TW TW106105688A patent/TW201740237A/en unknown
Non-Patent Citations (1)
Title |
---|
None * |
Also Published As
Publication number | Publication date |
---|---|
KR20190004276A (en) | 2019-01-11 |
JP2019516323A (en) | 2019-06-13 |
CN109074101B (en) | 2022-04-26 |
TW201740237A (en) | 2017-11-16 |
US10005555B2 (en) | 2018-06-26 |
EP3452881A1 (en) | 2019-03-13 |
US20170313416A1 (en) | 2017-11-02 |
BR112018072581A2 (en) | 2019-02-19 |
KR102251203B1 (en) | 2021-05-11 |
JP6755966B2 (en) | 2020-09-16 |
WO2017192198A1 (en) | 2017-11-09 |
BR112018072581B1 (en) | 2023-02-28 |
CN109074101A (en) | 2018-12-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3452881B1 (en) | | Imaging using multiple unmanned aerial vehicles |
US11233943B2 (en) | | Multi-gimbal assembly |
US10979615B2 (en) | | System and method for providing autonomous photography and videography |
CN108367806B (en) | | Unmanned aerial vehicle with adjustable aiming assembly |
US9977434B2 (en) | | Automatic tracking mode for controlling an unmanned aerial vehicle |
US20190144114A1 (en) | | Systems and methods for controlling movable object behavior |
CN106444843B (en) | | Unmanned aerial vehicle relative-bearing control method and device |
US11531340B2 (en) | | Flying body, living body detection system, living body detection method, program and recording medium |
CN205353774U (en) | | Companion unmanned aerial vehicle system for aerial photography |
US20230058405A1 (en) | | Unmanned aerial vehicle (UAV) swarm control |
CN110383712B (en) | | Communication relay method, relay flying object, program, and recording medium |
WO2018214071A1 (en) | | Method and device for controlling an unmanned aerial vehicle, and unmanned aerial vehicle system |
WO2018059398A1 (en) | | Method, apparatus, and system for controlling a multi-rotor aircraft |
KR102365931B1 (en) | | Aerial battery replacement method for a battery-replaceable drone, and device therefor |
EP3919374B1 (en) | | Image capturing method |
WO2022188151A1 (en) | | Image photographing method, control apparatus, movable platform, and computer storage medium |
KR101599149B1 (en) | | Imaging device with automatic tracking of an object |
JP6856670B2 (en) | | Aircraft, motion control methods, motion control systems, programs and recording media |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20180917 |
|
AK | Designated contracting states |
Kind code of ref document: A1
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) |
DAX | Request for extension of the european patent (deleted) |
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G03B 15/00 20060101ALI20190910BHEP
Ipc: G03B 15/05 20060101ALI20190910BHEP
Ipc: G05D 1/10 20060101AFI20190910BHEP
Ipc: G03B 15/03 20060101ALI20190910BHEP
Ipc: G05D 1/00 20060101ALI20190910BHEP |
|
INTG | Intention to grant announced |
Effective date: 20190927 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAJ | Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted |
Free format text: ORIGINAL CODE: EPIDOSDIGR1 |
|
GRAL | Information related to payment of fee for publishing/printing deleted |
Free format text: ORIGINAL CODE: EPIDOSDIGR3 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
INTC | Intention to grant announced (deleted) |
GRAR | Information related to intention to grant a patent recorded |
Free format text: ORIGINAL CODE: EPIDOSNIGR71 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
INTG | Intention to grant announced |
Effective date: 20200317 |
|
REG | Reference to a national code |
Ref country code: CH
Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: IE
Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE
Ref legal event code: R096
Ref document number: 602017015194
Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: AT
Ref legal event code: REF
Ref document number: 1260964
Country of ref document: AT
Kind code of ref document: T
Effective date: 20200515
Ref country code: CH
Ref legal event code: NV
Representative=s name: MAUCHER JENKINS PATENTANWAELTE AND RECHTSANWAE, DE |
|
REG | Reference to a national code |
Ref country code: NL
Ref legal event code: FP |
|
REG | Reference to a national code |
Ref country code: LT
Ref legal event code: MG4D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LT
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422
Ref country code: FI
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422
Ref country code: PT
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200824
Ref country code: GR
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200723
Ref country code: NO
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200722
Ref country code: IS
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200822
Ref country code: SE
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422 |
|
REG | Reference to a national code |
Ref country code: AT
Ref legal event code: MK05
Ref document number: 1260964
Country of ref document: AT
Kind code of ref document: T
Effective date: 20200422 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BG
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200722
Ref country code: RS
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422
Ref country code: HR
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422
Ref country code: LV
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AL
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422 |
|
REG | Reference to a national code |
Ref country code: DE
Ref legal event code: R097
Ref document number: 602017015194
Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AT
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422
Ref country code: CZ
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422
Ref country code: IT
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422
Ref country code: RO
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422
Ref country code: ES
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422
Ref country code: DK
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422
Ref country code: EE
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422
Ref country code: SM
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SK
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422
Ref country code: PL
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20210125 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422 |
|
REG | Reference to a national code |
Ref country code: BE
Ref legal event code: MM
Effective date: 20210228 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20210217 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20210217 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20210228 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO
Effective date: 20170217 |
|
REG | Reference to a national code |
Ref country code: DE
Ref legal event code: R079
Ref document number: 602017015194
Country of ref document: DE
Free format text: PREVIOUS MAIN CLASS: G05D0001100000
Ipc: G05D0001460000 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: NL
Payment date: 20240111
Year of fee payment: 8 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE
Payment date: 20240109
Year of fee payment: 8
Ref country code: GB
Payment date: 20240111
Year of fee payment: 8
Ref country code: CH
Payment date: 20240301
Year of fee payment: 8 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR
Payment date: 20240108
Year of fee payment: 8 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20200422 |