WO2016065625A1 - Systems and methods for walking pets - Google Patents

Systems and methods for walking pets

Info

Publication number
WO2016065625A1
Authority
WO
WIPO (PCT)
Prior art keywords
uav
target object
user
image
location
Prior art date
Application number
PCT/CN2014/090082
Other languages
French (fr)
Inventor
Yu Shen
Ang LIU
Guyue ZHOU
Original Assignee
SZ DJI Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co., Ltd.
Priority to CN202010667247.4A (CN111913494B)
Priority to CN201480079886.1A (CN106455523B)
Priority to JP2016553444A (JP6181321B2)
Priority to PCT/CN2014/090082 (WO2016065625A1)
Publication of WO2016065625A1
Priority to US15/214,076 (US9661827B1)
Priority to US15/493,072 (US9861075B2)
Priority to US15/827,787 (US10159218B2)
Priority to US16/228,190 (US10729103B2)
Priority to US16/984,037 (US11246289B2)
Priority to US17/651,062 (US20220159928A1)

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K 15/00 Devices for taming animals, e.g. nose-rings or hobbles; Devices for overturning animals in general; Training or exercising equipment; Covering boxes
    • A01K 15/02 Training or exercising equipment, e.g. mazes or labyrinths for animals; Electric shock devices; Toys specially adapted for animals
    • A01K 15/021 Electronic training devices specially adapted for dogs or cats
    • A01K 15/023 Anti-evasion devices
    • A01K 15/027 Exercising equipment, e.g. tread mills, carousels
    • A01K 15/04 Devices for impeding movement; Devices for impeding passage through fencing, e.g. hobbles or the like; Anti-kicking devices
    • A01K 27/00 Leads or collars, e.g. for dogs
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 11/00 Manipulators not otherwise provided for
    • B25J 11/008 Manipulators for service tasks
    • B25J 11/0085 Cleaning
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 10/00 Type of UAV
    • B64U 10/10 Rotorcrafts
    • B64U 2101/00 UAVs specially adapted for particular uses or applications
    • B64U 2101/30 UAVs specially adapted for imaging, photography or videography
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G01S 13/93 Radar or analogous systems specially adapted for anti-collision purposes
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0094 Control of position, course, altitude or attitude involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G05D 1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D 1/106 Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
    • G05D 1/12 Target-seeking control
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/103 Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/15 Conference systems
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10 TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10S TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S 901/00 Robots
    • Y10S 901/01 Mobile robot

Definitions

  • Aerial vehicles such as unmanned aerial vehicles (UAVs) can travel along defined routes.
  • a UAV can be configured to autonomously or semi-autonomously guide a pet or target object along a route in order to provide the target object with exercise or time outdoors.
  • the systems and methods further provide the ability to define a route or a region within which a UAV can guide a target object. Communication can occur between a user and the UAV in response to the defined route and/or region. Further communication can occur related to actions or behaviors exhibited by the target object.
  • the UAV can be configured to locate the target object and to recognize actions and behaviors exhibited by the target object.
  • the target object may be an animal such as a pet owned by a user.
  • a method of guiding a target object comprises: receiving a user input, through a user device, that defines a target area, the target area comprising (1) a permissible area for the target object to travel, or (2) an impermissible area where the target object is not permitted to travel; receiving, from a movable object that guides the target object, a signal indicative of a location of the movable object; receiving an indicator of the movable object exiting the permissible area for the target object to travel or an indicator of the movable object entering the impermissible area where the target object is not permitted to travel, said indicator generated based on the location of the movable object and the target area; and generating a movable object operation, in response to the indicator.
  • the target object can be an animal.
  • the movable object can be an unmanned aerial vehicle (UAV).
  • the movable object operation can include controlling flight of the UAV to control movement of the target object.
  • the movable object operation can include alerting the user that the UAV is exiting the permissible area or entering the impermissible area.
  • the user input can comprise global coordinates that can define the permissible area or the impermissible area.
  • the user input can comprise an image or outline on a map defining the boundaries of the permissible area or the impermissible area.
  • the method can comprise guiding the target object using the UAV, wherein the UAV can be physically attached to the target object.
  • the UAV can be attached to the target object by a leash that is attached to a collar of the target object.
  • the UAV can be a rotorcraft comprising a plurality of rotors that can permit the UAV to take off and/or land vertically.
  • the UAV can comprise a location device that transmits information about the UAV’s location.
  • the location device can be a GPS sensor.
  • the indicator of exiting the permissible area can be received when the target object exits the permissible area.
  • the indicator of exiting the permissible area can be received when the target object is within a predetermined threshold distance of a boundary of the permissible area and the target object is heading in the direction of the boundary.
  • the target object can be heading in the direction of the boundary at a speed exceeding a threshold speed.
  • the indicator of entering the impermissible area can be received when the target object enters the impermissible area.
  • the indicator of entering the impermissible area can be received when the target object is within a predetermined threshold distance of a boundary of the impermissible area and the target object is heading in the direction of the boundary.
  • the target object can be heading in the direction of the boundary at a speed exceeding a threshold speed.
  • the movable object operation can include playing the user’s voice to the target object when the indicator of exiting the permissible area or entering the impermissible area is received.
  • the method can further comprise transmitting the user’s voice from the user device to the UAV in real-time.
  • the user’s voice can be a pre-recording.
  • the movable object operation can include delivering an electric shock to the target object if the target object does not respond to the user’s voice within a predetermined period of time.
  • the user interface can be a screen of the UAV and the alert can be provided visually.
  • the user interface can be a speaker of the UAV and the alert can be provided audibly.
  • a system for guiding a target object can comprise: one or more processors, individually or collectively, configured to: (a) receive a signal indicative of a user input that defines a target area, said target area comprising (1) a permissible area for the target object to travel, or (2) an impermissible area where the target object is not permitted to travel; (b) receive a signal indicative of a location of a movable object that guides the target object; and (c) determine, based on the target area and the signal indicative of the location of the movable object, when the movable object is exiting the permissible area for the target object to travel or when the movable object is entering the impermissible area where the target object is not permitted to travel; and (d) determine a movable object operation, in response to the determination of whether the movable object is exiting the permissible area for the target object to travel or entering the impermissible area where the target object is not permitted to travel.
  • the target object can be an animal.
  • the movable object can be an unmanned aerial vehicle (UAV).
  • the movable object operation can include controlling flight of the UAV to control movement of the target object.
  • the movable object operation can include alerting the user that the UAV is exiting the permissible area or entering the impermissible area.
  • the UAV can be physically attached to the target object while the UAV is guiding the target object.
  • the UAV can be attached to the target object by a leash that is attached to a collar of the target object.
  • the UAV can be a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically.
  • the UAV can comprise a location device that transmits information of the UAV’s location.
  • the location device can be a GPS sensor.
  • the indicator of exiting the permissible area can be provided when the target object exits the permissible area.
  • the indicator of exiting the permissible area can be provided when the target object is within a predetermined threshold distance of a boundary of the permissible area and the target object is heading in the direction of the boundary.
  • the target object can be heading in the direction of the boundary at a speed exceeding a threshold speed.
  • the one or more processors can be configured to determine the UAV is entering the impermissible area when the target object enters the impermissible area.
  • the one or more processors can be configured to determine that the UAV is entering the impermissible area when the target object is within a predetermined threshold distance of a boundary of the impermissible area and the target object is heading in the direction of the boundary.
  • the one or more processors can be configured to determine that the target object is heading in the direction of the boundary at a speed exceeding a threshold speed.
  • the movable object operation can include playing the user’s voice to the target object when the indicator of exiting the permissible area or entering the impermissible area is received, and the one or more processors are configured to effect the movable object operation.
  • the user’s voice can be transmitted from the user device to the UAV in real-time.
  • the user’s voice can be a pre-recording.
  • the movable object operation can include delivering an electric shock to the target object if the target object does not respond to the user’s voice within a predetermined period of time.
  • the user interface can be a screen of the UAV and the alert can be provided visually.
  • the user interface can be a speaker of the UAV and the alert can be provided audibly.
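To make the exit/entry indicators described in the bullets above concrete, the following sketch checks whether a movable object is leaving a permissible area using the threshold-distance, heading, and speed conditions. The circular geofence, the function name, and the numeric thresholds are illustrative assumptions only; the disclosure does not limit the target area to any particular shape.

```python
import math

def exiting_permissible_area(position, velocity, center, radius,
                             threshold_distance=5.0, threshold_speed=1.0):
    """Return True if an exit indicator should be generated.

    position, velocity, and center are (x, y) tuples in a local frame
    (metres and metres per second); radius is the permissible-area radius.
    An indicator is generated when the object is already outside the area,
    or when it is within threshold_distance of the boundary and heading
    toward it faster than threshold_speed.
    """
    dx, dy = position[0] - center[0], position[1] - center[1]
    dist_from_center = math.hypot(dx, dy)

    # Already outside the permissible area.
    if dist_from_center > radius:
        return True

    # Far from the boundary: no indicator.
    if radius - dist_from_center > threshold_distance:
        return False

    # Outward radial speed: positive means heading toward the boundary.
    if dist_from_center == 0:
        return False
    outward = (dx / dist_from_center, dy / dist_from_center)
    radial_speed = velocity[0] * outward[0] + velocity[1] * outward[1]
    return radial_speed > threshold_speed

# Example: 2 m inside a 50 m boundary, moving outward at 2 m/s -> indicator fires.
print(exiting_permissible_area((48.0, 0.0), (2.0, 0.0), (0.0, 0.0), 50.0))
```

The same test, mirrored, covers entering an impermissible area; the resulting indicator can then trigger a movable object operation such as alerting the user or steering the UAV back.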
  • a method of guiding a target object using a movable object can comprise: recognizing the target object wearing a collar, with aid of one or more vision sensors on board the movable object; automatically attaching, without human aid, the movable object to the collar of the target object using a leash when the target object is recognized; and flying the movable object while the target object is attached to the movable object via the leash.
  • the target object can be an animal.
  • the movable object can be an unmanned aerial vehicle (UAV), and the UAV can be flying while the target object is in locomotion.
  • the leash can be formed of a flexible or bendable material.
  • the method can further comprise extending or retracting the leash while the UAV is in flight.
  • the leash can attach to the collar of the target object using one or more magnetic connections.
  • the leash can attach to the collar of the target object with the aid of a robotic arm.
  • the robotic arm can comprise one or more extensions that guide the leash to the collar.
  • the method can further comprise capturing, using the one or more vision sensors, at least one image of the target object wearing the collar.
  • the method can further comprise recognizing, with aid of one or more processors, the target object from the image of the target object.
  • the movable object can further comprise one or more processors configured to recognize the target object from the image of the collar.
  • the movable object can be a UAV
  • the method can further comprise flying the UAV, subsequent to recognizing the target object, to a closer proximity of the target object in order to get into position to automatically attach the UAV to the collar of the target object. Flying the movable object can include guiding the target object by pulling on the leash.
  • the method can further comprise comparing a calculation of the target object motion and the movable object motion to determine one or more parameters with which the movable object pulls on the leash.
  • the method can further comprise collecting, using the movable object, an image of the target object while the target object is in locomotion and is attached to the movable object via the leash.
  • the method can further comprise displaying, on a map, the location of the movable object to the user.
  • the method can further comprise playing the user’s voice to the target object while the target object is in locomotion and is attached to the movable object via a leash.
  • the user’s voice can be transmitted from the user device to the movable object in real-time.
  • the user’s voice can be a pre-recording.
  • the user’s voice can be speaking a command to the target object.
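The comparison of target-object motion and movable-object motion mentioned above, used to determine how the leash is pulled, can be illustrated with a minimal sketch. The proportional-pull rule, the parameter names, and all numeric values are assumptions made only for illustration; the disclosure does not specify a particular control law.

```python
import math

def leash_pull_command(uav_pos, uav_vel, target_pos, target_vel,
                       leash_length=3.0, gain=0.5, max_pull=2.0):
    """Estimate a horizontal pull vector the UAV could apply via the leash.

    Positions and velocities are (x, y) tuples in a shared local frame.  If
    the separation approaches the leash length, a pull proportional to the
    velocity difference between UAV and target is suggested, capped at
    max_pull.  Returns (pull_x, pull_y).
    """
    sep_x, sep_y = uav_pos[0] - target_pos[0], uav_pos[1] - target_pos[1]
    separation = math.hypot(sep_x, sep_y)
    if separation < 0.9 * leash_length:
        return (0.0, 0.0)          # leash is slack, no pull needed

    # Velocity of the UAV relative to the target.
    rel_vx = uav_vel[0] - target_vel[0]
    rel_vy = uav_vel[1] - target_vel[1]

    pull_x, pull_y = gain * rel_vx, gain * rel_vy
    magnitude = math.hypot(pull_x, pull_y)
    if magnitude > max_pull:
        scale = max_pull / magnitude
        pull_x, pull_y = pull_x * scale, pull_y * scale
    return (pull_x, pull_y)

# Example: UAV 3 m ahead of the target and moving 1 m/s faster.
print(leash_pull_command((3.0, 0.0), (2.0, 0.0), (0.0, 0.0), (1.0, 0.0)))
```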
  • a UAV can be configured to guide a target object
  • the UAV can comprise: one or more vision sensors configured to capture an image of the target object wearing a collar; one or more processors configured to, individually or collectively, recognize the target object from the image of the target object wearing the collar; a leash attachment mechanism configured to automatically attach, without human aid, a leash to the collar of the target object when the target object is recognized; and one or more propulsion units configured to permit flight of the UAV while the target object is attached to the UAV via the leash.
  • the target object can be an animal.
  • the UAV can be flying while the target object is in locomotion.
  • the leash can be formed of a flexible or bendable material.
  • the leash can be extendible or retractable while the UAV is in flight.
  • the leash can be configured to attach to the collar of the target object using one or more magnetic connections.
  • the leash can be configured to attach to the collar of the target object with the aid of a robotic arm.
  • the robotic arm can comprise one or more extensions that guide the leash to the collar.
  • the one or more vision sensors can be configured to capture at least one image of the target object wearing the collar.
  • the UAV can further comprise one or more processors configured to recognize the target object from the image of the target object.
  • the UAV can further comprise one or more processors configured to recognize the target object from the image of the collar.
  • the one or more processors can be configured to, subsequent to recognizing the target object, generate a signal to the one or more propulsion units to effect flight of the UAV to a closer proximity of the target object in order to get into position to automatically attach the UAV to the collar of the target object.
  • the UAV can be configured to guide the target object by pulling on the leash.
  • the one or more processors can be configured to compare a calculation of the target object motion and the UAV motion to determine one or more parameters with which the UAV pulls on the leash.
  • the one or more vision sensors can be configured to collect an image of the target object while the target object is in locomotion and is attached to the UAV via the leash.
  • the one or more vision sensors can be configured to collect an image of the collar of the target object.
  • the UAV can further comprise one or more speakers configured to play the user’s voice to the target object while the target object is in locomotion and is attached to the UAV via the leash.
  • the user’s voice can be transmitted from the user device to the UAV in real-time.
  • the user’s voice can be a pre-recording.
  • the user’s voice can be speaking a command to the target object.
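One way the collar recognition described above could be performed by the on-board vision sensors is simple colour segmentation, assuming the collar carries a distinctly coloured band. The HSV range, minimum blob area, and the use of OpenCV are assumptions for this sketch only; the disclosure equally allows other visual patterns or recognisers.

```python
import cv2
import numpy as np

# Illustrative HSV range for a brightly coloured collar band (assumed values).
COLLAR_LOWER = np.array([5, 120, 120], dtype=np.uint8)
COLLAR_UPPER = np.array([20, 255, 255], dtype=np.uint8)

def locate_collar(frame_bgr, min_area=200):
    """Return the pixel centre (cx, cy) of the largest collar-coloured blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, COLLAR_LOWER, COLLAR_UPPER)
    found = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = found[0] if len(found) == 2 else found[1]  # OpenCV 3 vs 4 return values
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < min_area:
        return None
    x, y, w, h = cv2.boundingRect(largest)
    return (x + w // 2, y + h // 2)

# The returned centre could be used to fly the UAV into closer proximity of the
# collar before the leash attachment mechanism engages, as described above.
```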
  • a method of guiding a target object using a UAV can comprise: recognizing the target object, with aid of one or more vision sensors on board the UAV; automatically displaying, without human aid or intervention, an attractor to the target object when the target object is recognized; and flying the UAV while the target object is in locomotion and following the attractor.
  • the target object can be an animal.
  • the attractor can be an edible treat.
  • the method can further comprise emitting, using the attractor, a selected scent.
  • the UAV can display the attractor by dangling the attractor at or near a head level of the target object.
  • the attractor can comprise an image that is displayed on a screen carried by the UAV.
  • the image can be a static image.
  • the image can be an image of an owner of the target object.
  • the image can be a video.
  • the image can be a video of the owner of the target object.
  • the method can further comprise determining, using the one or more vision sensors, a location of the target object relative to the UAV and adjusting or maintaining the speed of the UAV flight to remain within a proximity of the target object that is sufficiently close for the target object to perceive the attractor.
  • the method can further comprise determining, using the one or more vision sensors, a trajectory of the locomotion of the target object relative to the UAV and adjusting or maintaining the direction of the UAV flight to remain within a proximity of the target object that is sufficiently close for the target object to perceive the attractor.
  • the method can further comprise capturing at least one image of the target object using the one or more vision sensors.
  • the UAV can further comprise one or more processors configured to recognize the target object from the image of the target object.
  • the target object can be wearing a collar.
  • the UAV can further comprise one or more processors configured to recognize the target object from the image of the collar.
  • the method can further comprise playing the user’s voice to the target object while the target object is in locomotion and is attached to the UAV via the leash.
  • the user’s voice can be transmitted from the user device to the UAV in real-time.
  • the user’s voice can be a pre-recording.
  • the user’s voice can be saying a command to the target object.
  • the target object can be an animal.
  • the attractor can be an edible treat.
  • the attractor can emit a selected scent.
  • the UAV can display the attractor by dangling the attractor at or near a head level of the target object.
  • the attractor can comprise an image that is displayed on a screen carried by the UAV.
  • the image can be a static image.
  • the image can be an image of an owner of the target object.
  • the image can be a video.
  • the image can be a video of the owner of the target object.
  • the UAV can be further configured to determine, using the one or more vision sensors, a location of the target object relative to the UAV and adjust or maintain the speed of the UAV flight to remain within a proximity of the target object that is sufficiently close for the target object to perceive the attractor.
  • the UAV can be further configured to determine, using the one or more vision sensors, a trajectory of the locomotion of the target object relative to the UAV and adjust or maintain the direction of the UAV flight to remain within a proximity of the target object that is sufficiently close for the target object to perceive the attractor.
  • the one or more vision sensors can capture at least one image of the target object.
  • the UAV can further comprise one or more processors configured to recognize the target object from the image of the target object.
  • the target object can be wearing a collar.
  • the UAV can further comprise a speaker configured to play the user’s voice to the target object while the target object is in locomotion and is attached to the UAV via the leash.
  • the user’s voice can be transmitted from the user device to the UAV in real-time.
  • the user’s voice can be a pre-recording.
  • the user’s voice can be saying a command to the target object.
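The speed-adjustment behaviour described above, which keeps the dangled attractor close enough for the target object to perceive it, could follow a simple proportional rule on the measured separation. The desired separation, gain, and speed limits below are assumed values for illustration, not values taken from the disclosure.

```python
def attractor_speed_command(separation, current_speed,
                            desired_separation=1.5, gain=0.8,
                            min_speed=0.0, max_speed=3.0):
    """Return an updated UAV ground speed (m/s).

    separation is the measured distance (m) between the attractor and the
    target object, e.g. estimated from the on-board vision sensors.  If the
    target falls behind, the UAV slows down; if the target closes in, the
    UAV speeds up to keep the walk moving.
    """
    error = separation - desired_separation
    command = current_speed - gain * error
    return max(min_speed, min(max_speed, command))

# Example: the pet has fallen 3 m behind the 1.5 m set-point, so the UAV slows.
print(attractor_speed_command(separation=3.0, current_speed=2.0))
```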
  • a method of guiding a target object may be provided.
  • the method may comprise: providing a UAV that guides the target object, wherein a location of the UAV is known; recognizing the target object, with aid of one or more vision sensors on board the UAV; recognizing waste generated by the target object, with aid of the one or more vision sensors on board the UAV; and alerting the user that the waste has been generated by the target object.
  • the target object can be an animal.
  • the animal can be a dog or a cat.
  • the method can further comprise providing information to the user about a location where the waste was generated.
  • the UAV can further comprise one or more processors configured to recognize the waste from the image of the waste.
  • the user can be alerted through a user device.
  • the user can be alerted through a user device comprising a display.
  • the user device can be a smartphone, tablet, or a personal computer.
  • the user device can display a map showing the location of where the waste was generated.
  • the user device can display an image of the waste generated by the target object.
  • the UAV can guide the target object by being physically attached to the target object.
  • the UAV can be attached to the target object by a leash that is attached to the collar of the target object.
  • the UAV can guide the target object by displaying an attractor to the target object.
  • the attractor can be an edible treat.
  • the user can be a target object waste removal professional.
  • the UAV can comprise a location device that transmits information about the UAV’s location.
  • the location device can be a GPS sensor.
  • a UAV configured to guide a target object can comprise: one or more vision sensors configured to capture an image of the target object and waste generated by the target object; one or more processors configured to, individually or collectively, (1) recognize the target object from the image of the target object, and (2) recognize the waste generated by the target object from the image of the waste generated by the target object; a communication unit configured to send a signal to a user device that alerts the user that the waste has been generated by the target object; and one or more propulsion units configured to permit flight of the UAV while guiding the target object.
  • the target object can be an animal.
  • the animal can be a dog or a cat.
  • the UAV can be further configured to provide information to the user about a location where the waste was generated.
  • the user device can comprise a display.
  • the user device can be a smartphone, tablet, or personal computer.
  • the user device can be configured to display a map showing the location of where the waste was generated.
  • the user device can be configured to display an image of the waste generated by the target object.
  • the UAV can be configured to guide the target object by being physically attached to the target object.
  • the UAV can be attached to the target object by a leash that is attached to the collar of the target object.
  • the UAV can be configured to guide the target object by displaying an attractor to the target object.
  • the attractor can be an edible treat.
  • the user can be a target object waste removal professional.
  • the UAV can be a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically.
  • the UAV can comprise a location device that transmits information about the UAV’s location.
  • the location device can be a GPS sensor.
  • a method of guiding a target object can comprise: providing a UAV that guides the target object, wherein a location of the UAV is known; recognizing the target object, with aid of one or more vision sensors on board the UAV; recognizing waste generated by the target object, with aid of the one or more vision sensors on board the UAV; and removing the waste in response to recognizing the waste, using the UAV.
  • the target object can be an animal.
  • the animal can be a dog or a cat.
  • the UAV can be further configured to provide information to the user about a location where the waste was generated.
  • the user device can comprise a display.
  • the user device can be a smartphone, tablet, or personal computer.
  • the user device can be configured to display a map showing the location of where the waste was generated.
  • the user device can be configured to display an image of the waste generated by the target object.
  • the UAV can be configured to guide the target object by being physically attached to the target object.
  • the UAV can be attached to the target object by a leash that is attached to a collar of the target object.
  • the UAV can be further configured to guide the target object by displaying an attractor to the target object.
  • the attractor can be an edible treat.
  • the user can be a target object waste removal professional.
  • the UAV can be a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically.
  • the UAV can comprise a location device that transmits information about the UAV’s location.
  • the location device can be a GPS sensor.
  • the method can further comprise removing the waste with a mechanical arm.
  • a UAV can be configured to guide a target object
  • the UAV can comprise: one or more vision sensors configured to capture an image of the target object and waste generated by the target object; one or more processors configured to, individually or collectively, (1) recognize the target object from the image of the target object, and (2) recognize the waste generated by the target object from the image of the waste generated by the target object; one or more waste removal units, configured to remove the waste in response to the recognition of the waste; and one or more propulsion units configured to permit flight of the UAV while guiding the target object.
  • the target object can be an animal.
  • the animal can be a dog or a cat.
  • the UAV can be further configured to provide information to the user about a location where the waste was generated.
  • the UAV can further comprise one or more processors configured to recognize the waste from the image of the waste.
  • the UAV can guide the target object by being physically attached to the target object.
  • the UAV can be attached to a leash that is attached to a collar of the target object.
  • the UAV can guide the target object by displaying an attractor to the target object.
  • the attractor can be an edible treat.
  • the UAV can be a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically.
  • the UAV can comprise a location device that transmits information about the UAV’s location.
  • the location device can be a GPS sensor.
  • the one or more waste removal units can include a mechanical arm that extends from the UAV to remove the waste.
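The alert described above, which notifies a user or a waste-removal professional that waste was generated and where, can be illustrated by a small message-construction sketch. The JSON schema, field names, and coordinates are assumptions for illustration; the disclosure does not prescribe a particular message format.

```python
import json
import time

def build_waste_alert(uav_id, latitude, longitude, image_path=None):
    """Assemble an alert payload a UAV communication unit might send to a user device."""
    alert = {
        "type": "waste_detected",
        "uav_id": uav_id,
        "timestamp": time.time(),
        "location": {"latitude": latitude, "longitude": longitude},
    }
    if image_path is not None:
        alert["image"] = image_path   # e.g. a captured image of the waste
    return json.dumps(alert)

# Example payload; a real system would transmit this via the UAV's communication
# unit so the user device can plot the waste location on a map.
print(build_waste_alert("uav-01", 22.5431, 114.0579))
```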
  • a method of guiding a target object can comprise: receiving a user input, through a user device, defining a travel route for a UAV to guide the target object; guiding the target object using the UAV by flying the UAV along the travel route while the target object is in locomotion, wherein a location of the UAV is known; receiving, through the user device while the UAV is guiding the target object along the travel route, a change to the travel route to provide an updated travel route; and flying the UAV along the updated travel route.
  • the user input can comprise global coordinates that define the travel route.
  • the user input can comprise global coordinates that define the updated travel route.
  • the user input can comprise an image or line on a map defining the travel route.
  • the user input can comprise an image or line on a map defining the updated travel route.
  • the UAV can guide the target object by being physically attached to the target object.
  • the UAV can be attached to a leash that is attached to a collar of the target object.
  • the target object can be an animal.
  • the animal can be a dog or a cat.
  • the UAV can be a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically.
  • the UAV can comprise a location device that transmits information about the UAV’s location.
  • the location device can be a GPS sensor.
  • the method can further comprise capturing, with aid of one or more vision sensors on board the UAV, an image of the target object.
  • the method can further comprise detecting, with aid of one or more processors, when the target object is deviating from the travel route or the updated travel route based on the image of the target object.
  • the method can further comprise playing the user’s voice to the target object when the target object is deviating from the travel route or the updated travel route.
  • the user’s voice can be transmitted from the user device to the UAV in real-time.
  • the user’s voice can be a pre-recording.
  • the method can further comprise delivering an electric shock to the target object when the target object deviates from the travel route beyond a predetermined distance.
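The mid-walk route update in the method above can be sketched as a small waypoint follower whose route list is replaced while the walk is in progress. The class and method names, the waypoint tolerance, and the use of local (x, y) waypoints are illustrative assumptions.

```python
import math

class RouteFollower:
    """Tracks progress along (x, y) waypoints and accepts mid-walk route updates."""

    def __init__(self, waypoints, tolerance=2.0):
        self.waypoints = list(waypoints)
        self.index = 0
        self.tolerance = tolerance

    def update_route(self, new_waypoints):
        """Replace the route, e.g. after the user edits the walk on a map."""
        self.waypoints = list(new_waypoints)
        self.index = 0

    def next_heading(self, position):
        """Return the heading (radians) toward the current waypoint, or None when done."""
        while self.index < len(self.waypoints):
            wx, wy = self.waypoints[self.index]
            dx, dy = wx - position[0], wy - position[1]
            if math.hypot(dx, dy) > self.tolerance:
                return math.atan2(dy, dx)
            self.index += 1          # waypoint reached, advance to the next one
        return None

follower = RouteFollower([(0, 50), (50, 50)])
print(follower.next_heading((0.0, 0.0)))        # head toward the first waypoint
follower.update_route([(0, 50), (-50, 50)])     # user changes the route mid-walk
```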
  • FIG. 1 shows an example of a system comprising a user, an unmanned aerial vehicle (UAV), and a target object where a UAV is configured to guide the target object while in communication with the user.
  • FIG. 2 shows a map that can be used to designate areas that are permissible or impermissible for travel of the target object.
  • FIG. 3 shows an example of how a user can define a permissible or impermissible area for a target object to travel on a user interface.
  • FIG. 4 shows a boundary and a threshold surrounding the boundary that can be approached and/or crossed by a target object.
  • FIG. 5 shows an example of travel routes that the UAV can guide the target object on.
  • FIG. 6 shows a target object wearing a collar that can be recognized by a UAV.
  • FIG. 7 shows a UAV guiding a target object while physically connected to the target object.
  • FIG. 8 shows a UAV displaying audio and/or visual stimuli from a user to a target object.
  • FIG. 9 shows a UAV guiding a target object without a physical connection to the target object.
  • FIG. 10 shows a UAV recognizing waste generated by a target object.
  • FIG. 11 shows a process in which a UAV may alert a user of the occurrence and location of waste generated by a target object.
  • FIG. 12 illustrates an unmanned aerial vehicle, in accordance with an embodiment of the invention.
  • FIG. 13 illustrates a movable object including a carrier and a payload, in accordance with an embodiment of the invention.
  • FIG. 14 is a schematic illustration by way of block diagram of a system for controlling a movable object, in accordance with an embodiment of the invention.
  • the systems, devices, and methods of the present invention provide mechanisms for guiding a target object by an unmanned aerial vehicle (UAV) along a predefined route, an instantaneously defined route, or an undefined route within a designated area or region.
  • the systems, devices, and methods of the present invention further provide responses to recognized actions and/or behaviors of the target object.
  • Description of the UAV may be applied to any other type of unmanned vehicle, or any other type of movable object.
  • a UAV can be provided to guide a target object.
  • a user can provide instructions to the UAV to guide the target object through a device that is in communication with the UAV.
  • the device may be directly in communication with the UAV or may communicate with the UAV over a network.
  • the user can provide the instructions before the UAV guides the target object or while the UAV is guiding the target object in real time.
  • the UAV can provide an interface to broadcast a visual and/or audio stream or recording of the user to the target object.
  • the UAV can be configured to remain within a specified distance from the target object.
  • the target object can be attached to the UAV through a physical attachment mechanism (e.g. a leash).
  • the UAV may exert force on the physical attachment mechanism to aid in guiding the target object.
  • the UAV can comprise one or more vision sensors.
  • the vision sensors can be in communication with a processor that is configured to recognize an image of the target object.
  • the UAV can remain within a specified distance of the target object without being physically attached to the target object, using the vision sensors and the one or more processors.
  • the UAV can provide an attractor to the target object when the target object refuses to follow or to remain within a specified distance of the UAV.
  • the UAV can be configured to lead or direct a target object or being.
  • a target object can be one or more animals.
  • a target object can be a pet.
  • a pet can be, for example, a dog, cat, lizard, horse, rabbit, ferret, pig, or any rodent that may be kept as a pet by a user.
  • the pet may be a mammal.
  • the pet may be a reptile.
  • the pet may be a land bound pet that may traverse a surface.
  • the pet may optionally be capable of being airborne (e.g. a bird).
  • the UAV can lead a target object along a pre-defined path, along an undefined path in a pre-defined area, or anywhere in accordance with certain travel parameters (e.g., length of route, amount of time, remaining outside of impermissible areas).
  • the UAV can receive instructions regarding the pre-defined path or area from one or more processors.
  • the processors can be on-board or off-board the UAV.
  • the one or more processors may be on an external device such as a server, user device, or may be provided on a cloud computing infrastructure.
  • the processors can additionally be in communication with at least one user through a communication interface.
  • a user can provide parameters to define a path or geographic region for the UAV to direct a target object along or within respectively.
  • a UAV can have a vision sensor.
  • the vision sensor can be configured to recognize the target object.
  • the UAV can continuously monitor the location of the target object. In some cases, the vision sensor can be configured to recognize an item attached to the target object, for example, a collar or harness.
  • the UAV can be configured to maintain a fixed distance from the target object.
  • the target object can be attached or tethered to the UAV, for example by a leash.
  • the leash can be a flexible object that attaches on one end to the UAV and on the other end to the target object.
  • a processor can be in communication with one or more locating sensors on-board a UAV.
  • a locating sensor can determine the position of a UAV in a relative or global coordinate system.
  • a global coordinate system may be an absolute coordinate system. In an example a global coordinate system can define the location of the UAV using longitude and latitude.
  • a relative coordinate system can determine the distance or location of a UAV from a reference point or landmark.
  • a relative coordinate system can be derived from a measurement of movement of a UAV from a known starting point or movement of a UAV in a known area.
  • a locating sensor configured to determine the absolute location of a UAV can be a GPS sensor. One or more locating sensors can be used to determine the relative location of a UAV.
  • relative angular velocity can be provided by a gyroscope
  • relative translational acceleration can be provided by an accelerometer
  • relative attitude information can be provided by a vision sensor
  • relative distance information can be provided by an ultrasonic sensor, lidar, or time-of-flight camera.
  • the relative and/or global location of the UAV can be communicated to the processor.
  • the processor can inform a user through a user interface of the local or global position of the UAV.
  • the global or local location of the UAV can correspond to the global or local location of the target object that may be in proximity of or tethered to the UAV.
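To make the distinction between global and relative coordinates above concrete, the sketch below converts a GPS fix into local east/north offsets from a reference point using a flat-earth approximation, which is adequate over the short distances of a walk. The function name and reference coordinates are assumptions for illustration.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def gps_to_local(lat, lon, ref_lat, ref_lon):
    """Approximate east/north offsets (metres) of (lat, lon) from a reference point.

    A global fix thus becomes a relative position that can be compared against a
    locally defined permissible area or travel route.
    """
    d_lat = math.radians(lat - ref_lat)
    d_lon = math.radians(lon - ref_lon)
    east = EARTH_RADIUS_M * d_lon * math.cos(math.radians(ref_lat))
    north = EARTH_RADIUS_M * d_lat
    return east, north

# Example: a fix roughly 111 m north of the reference point.
print(gps_to_local(22.5441, 114.0579, 22.5431, 114.0579))
```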
  • the systems and methods herein may permit a UAV to aid in taking a target object out on a walk without requiring significant human intervention.
  • a human may remain at home while the UAV guides the target object.
  • the human may be able to monitor the situation in real-time and intervene if needed.
  • the human may intervene remotely by communicating with the target object through the UAV, or may be informed of a location so the human can intervene in person if necessary.
  • FIG. 1 shows an example of a target object guidance system 100 including a user 101, one or more processors 102, and a UAV 103 guiding or leading a target object 104.
  • the UAV 103 can guide the target object 104 along a pre-defined or an undefined path.
  • the UAV 103 can guide the target object 104 for a specified duration of time. In some cases the target object can follow the UAV along a route. Alternatively the target object 104 can wander in a region while the UAV 103 follows the target object. In instances where the UAV 103 follows the target object 104, the UAV can prevent the target object from wandering into an impermissible region or out of a permissible region.
  • the UAV may or may not exert force on the target object while the target object is moving around.
  • the user 101 can be in a first location 105.
  • a first location may be a house, yard, room, building, vehicle, or another space or area.
  • a user 101 can communicate with one or more processors 102 through a user interface on an electronic device 106.
  • a user interface can be provided on an electronic device such as a desktop computer, laptop computer, smart phone, smart watch, smart glasses, tablet, or another device configured to communicate with the one or more processors.
  • the electronic device 106 may or may not be a mobile device.
  • the electronic device may or may not be a remote terminal capable of manually controlling flight of the UAV.
  • the electronic device can be in communication with the UAV directly through a wired or wireless connection 107.
  • the electronic device can further be in communication with a processor 102, through a wired or wireless connection 108, the processor 102 can additionally be in communication with the UAV through a wired or wireless connection 109.
  • the processor may be on-board the electronic device 106 and/or the UAV 103.
  • the UAV can have one or more on-board processors.
  • the one or more on-board processors can communicate with an external processor 102 and/or an electronic device 106 with a user interface.
  • the on-board processors may perform any functions of processors 102 described herein.
  • a UAV can communicate with the electronic device directly or through an intermediate device or processor.
  • the UAV can comprise a vision sensor 111.
  • a vision sensor 111 can be a camera.
  • the vision sensor 111 can be enclosed in the body of the UAV or carried by the UAV as an external payload. In a case in which the vision sensor 111 is carried externally as a payload, the UAV can orient the vision sensor below the body of the UAV.
  • the vision sensor can be attached to the UAV by one or more attachments, such as a carrier 112.
  • the carrier 112 can be configured such that the vision sensor can rotate and/or tilt independently of the UAV.
  • the carrier may permit the vision sensor to translate and/or rotate in three-dimensions.
  • the carrier can permit translation and/or rotation of the vision sensor independently of the movement of the UAV about an x, y, or z axis.
  • the vision sensor may be able to rotate about a pitch, roll, and/or yaw axis with respect to the UAV and/or a fixed reference frame. Similar rotation and translation can be achieved in any other three-dimensional coordinate system (e.g. spherical coordinates).
  • the carrier may permit rotation and/or translation of the vision sensor about only one or about only two axes.
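The independent pitch/roll/yaw rotation of the vision sensor relative to the UAV, as described above, can be expressed with standard rotation matrices. The Z-Y-X composition order and the example angle are illustrative conventions, not taken from the disclosure.

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Compose a 3x3 rotation (Z-Y-X convention) for pointing a gimballed camera."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return rz @ ry @ rx

# Tilting the camera 30 degrees in pitch relative to the UAV body frame.
forward = np.array([1.0, 0.0, 0.0])
print(rotation_matrix(0.0, np.radians(-30), 0.0) @ forward)
```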
  • a target object 104 can optionally have a wearable identifier 113.
  • the target object may have a collar.
  • the UAV vision sensor can detect a visual pattern on the wearable identifier in order to locate the target object.
  • the target object 104 can be tethered to the UAV 103 by a physical connection 114.
  • a physical connection 114 can be a flexible connector of a given length that is connected on one end to the target object 104 and on another end to the UAV 103.
  • the physical connection may or may not expand or contract (thus being able to vary its length).
  • the physical connection may have a limited maximum length (e.g., less than or equal to about 20 m, 15 m, 10 m, 7 m, 5 m, 4 m, 3 m, 2 m, or 1 m).
  • a user 101 can define a route or region in which the UAV 103 can guide or lead the target object 104.
  • a user can define the route or region through a user interface on an electronic device 106 or any other device that may or may not be in communication with the UAV.
  • a user can generate a defined area in which the user would like the target object to be led by the UAV 103.
  • a user 101 can define a specified route along which the user 101 would like the UAV 103 to guide the target object 104.
  • the user 101 can instruct the UAV 103 to guide the target object 104 within a geographic area.
  • the user may define or choose a defined area where the UAV is not to guide the target object.
  • the UAV 103 can be provided with an additional instruction from the user 101 to further constrain an act of guiding the target object 104 in the geographic area.
  • the additional instruction can be a duration of total time, end time, total cumulative distance, pace, or performance of an event or task by the target object 104.
  • a duration of total time may include the total amount of time to guide the target object (e.g., length of walk, such as a 30 minute walk).
  • the route or action of the UAV guiding the target object may be altered to comply with the duration of total time.
  • the end time may be preset (e.g., finish guiding the target object and return home by 2:00 pm).
  • the route or action of the UAV guiding the target object may be altered to comply with the end time (e.g., if the target object is moving slowly, a shortcut may be taken to get the target object home on time).
  • a total cumulative distance may enable a user to define the distance to be traveled by the target object (e.g., a user may specify a 1 mile walk).
  • the user may optionally set a pace for the guidance (e.g., have the target object move at a rate of at least 4 miles/hour).
  • a user may set an event or task to be completed by the target object and monitored by the UAV (e.g., walk uphill, walk downhill, sprints, fetch an object, etc.).
  • the additional instructions may include impermissible areas to keep the target object away from.
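The additional guidance instructions listed above (such as total duration, end time, and cumulative distance) can be represented as a small data structure checked during the walk. The field names, default values, and decision logic below are assumptions made only to illustrate how such parameters might constrain the guidance.

```python
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class WalkConstraints:
    max_duration_s: float = 30 * 60          # e.g. a 30 minute walk
    end_by_epoch_s: Optional[float] = None   # e.g. be home by 2:00 pm
    target_distance_m: float = 1609.0        # e.g. a 1 mile walk

def should_head_home(constraints, start_epoch_s, distance_so_far_m, now_s=None):
    """Return True when the walk should wrap up and the UAV should guide the pet home."""
    now_s = time.time() if now_s is None else now_s
    if now_s - start_epoch_s >= constraints.max_duration_s:
        return True
    if constraints.end_by_epoch_s is not None and now_s >= constraints.end_by_epoch_s:
        return True
    return distance_so_far_m >= constraints.target_distance_m

# Example: 20 minutes in and 800 m covered -> keep walking.
print(should_head_home(WalkConstraints(), start_epoch_s=0.0,
                       distance_so_far_m=800.0, now_s=20 * 60))
```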
  • the UAV 103 can have one or more sensors.
  • the UAV may comprise one or more vision sensors such as an image sensor.
  • an image sensor may be a monocular camera, stereo vision camera, radar, sonar, or an infrared camera.
  • the UAV may further comprise other sensors that may be used to determine a location of the UAV, such as global positioning system (GPS) sensors, inertial sensors which may be used as part of or separately from an inertial measurement unit (IMU) (e.g., accelerometers, gyroscopes, magnetometers), lidar, ultrasonic sensors, acoustic sensors, or WiFi sensors.
  • the UAV can have sensors on-board the UAV that collect information directly from an environment without contacting an additional component off-board the UAV for additional information or processing.
  • a sensor that collects data directly in an environment can be a vision or audio sensor.
  • the UAV can have sensors that are on-board the UAV but contact one or more components off-board the UAV to collect data about an environment.
  • a sensor that contacts a component off-board the UAV to collect data about an environment may be a GPS sensor or another sensor that relies on connection to another device, such as a satellite, tower, router, server, or other external device.
  • sensors may include, but are not limited to, location sensors (e.g., global positioning system (GPS) sensors, mobile device transmitters enabling location triangulation), vision sensors (e.g., imaging devices capable of detecting visible, infrared, or ultraviolet light, such as cameras), proximity or range sensors (e.g., ultrasonic sensors, lidar, time-of-flight or depth cameras), inertial sensors (e.g., accelerometers, gyroscopes, inertial measurement units (IMUs)), altitude sensors, attitude sensors (e.g., compasses), pressure sensors (e.g., barometers), audio sensors (e.g., microphones), or field sensors (e.g., magnetometers, electromagnetic sensors). Any suitable number and combination of sensors can be used, such as one, two, three, four, five, or more sensors.
  • the data can be received from sensors of different types (e.g., two, three, four, five, or more types).
  • Sensors of different types may measure different types of signals or information (e.g., position, orientation, velocity, acceleration, proximity, pressure, etc.) and/or utilize different types of measurement techniques to obtain data.
  • the sensors may include any suitable combination of active sensors (e.g., sensors that generate and measure energy from their own energy source) and passive sensors (e.g., sensors that detect available energy).
  • some sensors may generate absolute measurement data that is provided in terms of a global coordinate system (e.g., position data provided by a GPS sensor, attitude data provided by a compass or magnetometer), while other sensors may generate relative measurement data that is provided in terms of a local coordinate system (e.g., relative angular velocity provided by a gyroscope; relative translational acceleration provided by an accelerometer; relative attitude information provided by a vision sensor; relative distance information provided by an ultrasonic sensor, lidar, or time-of-flight camera).
  • the sensors onboard or off board the UAV may collect information such as location of the UAV, location of other objects, orientation of the UAV, or environmental information.
  • a single sensor may be able to collect a complete set of information in an environment or a group of sensors may work together to collect a complete set of information in an environment.
  • Sensors may be used for mapping of a location, navigation between locations, detection of obstacles, or detection of a target.
  • Sensors may be used for surveillance of an environment or a subject of interest.
  • Sensors may be used to recognize a target object, such as an animal.
  • the target object may be distinguished from other objects in the environment.
  • Sensors may be used to recognize an object worn or carried by the target object.
  • the worn or carried object may be distinguished from other objects in the environment.
  • any description herein of a UAV may apply to any type of movable object.
  • the description of a UAV may apply to any type of unmanned movable object (e.g., which may traverse the air, land, water, or space).
  • the UAV may be capable of responding to commands from a remote controller.
  • the remote controller need not be physically connected to the UAV; it may communicate with the UAV wirelessly from a distance.
  • the UAV may be capable of operating autonomously or semi-autonomously.
  • the UAV may be capable of following a set of pre-programmed instructions.
  • the UAV may operate semi-autonomously by responding to one or more commands from a remote controller while otherwise operating autonomously. For instance, one or more commands from a remote controller may initiate a sequence of autonomous or semi-autonomous actions by the UAV in accordance with one or more parameters.
  • the UAV may be an aerial vehicle.
  • the UAV may have one or more propulsion units that may permit the UAV to move about in the air.
  • the one or more propulsion units may enable the UAV to move about one or more, two or more, three or more, four or more, five or more, six or more degrees of freedom.
  • the UAV may be able to rotate about one, two, three or more axes of rotation.
  • the axes of rotation may be orthogonal to one another.
  • the axes of rotation may remain orthogonal to one another throughout the course of the UAV’s flight.
  • the axes of rotation may include a pitch axis, roll axis, and/or yaw axis.
  • the UAV may be able to move along one or more dimensions.
  • the UAV may be able to move upwards due to the lift generated by one or more rotors.
  • the UAV may be capable of moving along a Z axis (which may be up relative to the UAV orientation), an X axis, and/or a Y axis (which may be lateral).
  • the UAV may be capable of moving along one, two, or three axes that may be orthogonal to one another.
  • the UAV may be a rotorcraft.
  • the UAV may be a multi-rotor craft that may include a plurality of rotors.
  • the plurality of rotors may be capable of rotating to generate lift for the UAV.
  • the rotors may be propulsion units that may enable the UAV to move about freely through the air.
  • the rotors may rotate at the same rate and/or may generate the same amount of lift or thrust.
  • the rotors may optionally rotate at varying rates, which may generate different amounts of lift or thrust and/or permit the UAV to rotate.
  • one, two, three, four, five, six, seven, eight, nine, ten, or more rotors may be provided on a UAV.
  • the rotors may be arranged so that their axes of rotation are parallel to one another. In some instances, the rotors may have axes of rotation that are at any angle relative to one another, which may affect the motion of the UAV.
  • the UAV shown may have a plurality of rotors.
  • the rotors may connect to the body of the UAV which may comprise a control unit, one or more sensors, processor, and a power source.
  • the sensors may include vision sensors and/or other sensors that may collect information about the UAV environment. The information from the sensors may be used to determine a location of the UAV.
  • the rotors may be connected to the body via one or more arms or extensions that may branch from a central portion of the body. For example, one or more arms may extend radially from a central body of the UAV, and may have rotors at or near the ends of the arms.
  • a vertical position and/or velocity of the UAV may be controlled by maintaining and/or adjusting output to one or more propulsion units of the UAV. For example, increasing the speed of rotation of one or more rotors of the UAV may aid in causing the UAV to increase in altitude or increase in altitude at a faster rate. Increasing the speed of rotation of the one or more rotors may increase the thrust of the rotors. Decreasing the speed of rotation of one or more rotors of the UAV may aid in causing the UAV to decrease in altitude or decrease in altitude at a faster rate. Decreasing the speed of rotation of the one or more rotors may decrease the thrust of the one or more rotors.
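To make the relationship between rotor output and vertical motion concrete, the following is a minimal, hypothetical Python sketch of proportional altitude control: total rotor thrust is raised when the UAV is below the desired altitude and lowered when it is above. The mass, gain, and function name are illustrative assumptions, not part of the disclosed system.

```python
# Minimal sketch: proportional altitude control by adjusting total rotor thrust.
# All constants are illustrative; a real flight controller is far more involved.

MASS_KG = 1.2          # assumed UAV mass
GRAVITY = 9.81
KP_THRUST = 2.0        # proportional gain (N per metre of altitude error)

def thrust_command(target_altitude_m, current_altitude_m):
    """Return total rotor thrust (N): hover thrust plus a proportional correction."""
    hover_thrust = MASS_KG * GRAVITY
    error = target_altitude_m - current_altitude_m
    # Positive error (UAV too low) -> increase thrust; negative -> decrease thrust.
    return hover_thrust + KP_THRUST * error

print(thrust_command(10.0, 8.5))   # below target: thrust above hover
print(thrust_command(10.0, 11.0))  # above target: thrust below hover
```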
• when a UAV is taking off, the output provided to the propulsion units may be increased from its previous landed state. When the UAV is landing, the output provided to the propulsion units may be decreased from its previous flight state.
  • the UAV may be configured to take off and/or land in a substantially vertical manner.
  • a lateral position and/or velocity of the UAV may be controlled by maintaining and/or adjusting output to one or more propulsion units of the UAV.
  • the altitude of the UAV and the speed of rotation of one or more rotors of the UAV may affect the lateral movement of the UAV.
  • the UAV may be tilted in a particular direction to move in that direction and the speed of the rotors of the UAV may affect the speed of the lateral movement and/or trajectory of movement.
  • Lateral position and/or velocity of the UAV may be controlled by varying or maintaining the speed of rotation of one or more rotors of the UAV.
  • the UAV may be of small dimensions.
  • the UAV may be capable of being lifted and/or carried by a human.
  • the UAV may be capable of being carried by a human in one hand.
  • the UAV may have a greatest dimension (e.g., length, width, height, diagonal, diameter) of no more than 100 cm.
  • the greatest dimension may be less than or equal to 1 mm, 5 mm, 1 cm, 3 cm, 5 cm, 10 cm, 12 cm, 15 cm, 20 cm, 25 cm, 30 cm, 35 cm, 40 cm, 45 cm, 50 cm, 55 cm, 60 cm, 65 cm, 70 cm, 75 cm, 80 cm, 85 cm, 90 cm, 95 cm, 100 cm, 110 cm, 120 cm, 130 cm, 140 cm, 150 cm, 160 cm, 170 cm, 180 cm, 190 cm, 200 cm, 220 cm, 250 cm, or 300 cm.
  • the greatest dimension of the UAV may be greater than or equal to any of the values described herein.
  • the UAV may have a greatest dimension falling within a range between any two of the values described herein.
  • the UAV may be lightweight.
• the UAV may weigh less than or equal to 1 mg, 5 mg, 10 mg, 50 mg, 100 mg, 500 mg, 1 g, 2 g, 3 g, 5 g, 7 g, 10 g, 12 g, 15 g, 20 g, 25 g, 30 g, 35 g, 40 g, 45 g, 50 g, 60 g, 70 g, 80 g, 90 g, 100 g, 120 g, 150 g, 200 g, 250 g, 300 g, 350 g, 400 g, 450 g, 500 g, 600 g, 700 g, 800 g, 900 g, 1 kg, 1.1 kg, 1.2 kg, 1.3 kg, 1.4 kg, 1.5 kg, 1.7 kg, 2 kg, 2.2 kg, 2.5 kg, 3 kg, 3.5 kg, 4 kg, 4.5 kg, 5 kg, 5.5 kg, 6 kg, 6.5 kg, 7 kg, 7.5 kg, 8 kg, 8.5 kg, 9 kg, or 9.5 kg.
  • a user can define an area in which the UAV guides the target object.
  • the user can define the area using a user interface that is in communication with a processor on-board or off-board the UAV.
  • the one or more processors can be in communication with one or more memory storage units.
  • the memory storage unit can store past user defined areas or routes.
  • the memory storage device units can store geographic data, such as maps and may optionally be updated.
  • a user can define a unique area or route each time the UAV guides the target object or the user can choose from one or more stored routes or areas. Examples of possible areas 200 in which the UAV can guide the target object are shown in FIG. 2.
  • An area can be defined as a region in which the target object is permitted to travel, a boundary past which a target object is not permitted to travel, and/or a region in which the target object is not permitted to travel.
  • region 201 can be an area in which a target object is permitted to travel.
  • Region 201 can be enclosed by boundaries 202 past which the target object is not permitted to travel.
  • a region can enclose sub regions in which the target object is not permitted.
  • Region 203 is an enclosed region in which the target object is permitted.
  • Region 203 encloses region 204 in which the target object is not permitted to travel.
  • a region can be defined as the region enclosed by regions where the target object is not permitted.
  • a user can define a plurality of regions in which a target object is not permitted such that the pluralities of non-permitted regions enclose a region that is allowed.
  • region 205 can be a region in which a target object is permitted.
  • Region 205 can be surrounded by region 206 in which the target object is not permitted.
  • a UAV may be permitted to guide a pet within a park, such that the pet is permitted to remain within a lawn 203, while not being permitted to be guided on a road 206 or in a lake 204.
  • a region can be defined by a geographic radius.
  • a geographic radius can be a radial region centered at an initial location of a target object.
  • a geographic radius can be defined as a radial region centered at a location of a user.
  • a geographic radius can be a radial region with a center point defined by a user.
  • a user can define a geographic region using global coordinates.
  • a geographic region can be defined as a region within user defined boundaries, the boundaries can be defined using global coordinates.
• a user-defined geofence may be provided which may function as a boundary of a permissible region or an impermissible region for the target object. Any regular or irregular shape may be provided as a boundary.
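As a concrete illustration of how such a user-defined geofence might be evaluated in software, the minimal Python sketch below tests whether a reported position lies inside a polygonal boundary using a simple ray-casting test. The function names, coordinates, and choice of algorithm are assumptions for illustration; they are not prescribed by this disclosure.

```python
# Minimal sketch: ray-casting point-in-polygon test for a user-defined geofence.
# The polygon vertices and position are hypothetical (x, y) coordinates, e.g.
# projected from GPS latitude/longitude onto a local planar frame.

def point_in_polygon(point, polygon):
    """Return True if `point` lies inside the closed `polygon` (list of (x, y))."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending to the right of the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Example: a rectangular permissible lawn with a small impermissible pond inside it.
lawn = [(0, 0), (100, 0), (100, 60), (0, 60)]
pond = [(40, 20), (60, 20), (60, 35), (40, 35)]

position = (55, 30)
permitted = point_in_polygon(position, lawn) and not point_in_polygon(position, pond)
print("position permitted:", permitted)
```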
  • a geographic region can be bound by user defined obstacles.
  • a user can instruct a UAV to guide a target object in a region without crossing a physical boundary or feature.
  • a physical boundary or feature can be a fence, road, ditch, water way, or ground surface transition (e.g. grass to dirt or grass to pavement).
• the UAV can be configured to detect a physical boundary or feature, or the UAV can know the location of the physical boundary or feature a priori.
  • a user can provide a visual map to define permissible and impermissible regions for the target object to travel.
  • a visual map can be generated in a user interface on an electronic device.
  • the user interface can provide a map of a chosen or local space in which a target object can be led by the UAV.
  • a user can mark areas that are permissible or impermissible for the UAV and the target object to travel on the map provided by the user interface.
  • a user can mark areas on the map using a touch screen provided on the user interface.
• a user’s finger or a pointer (e.g., mouse pointer, trackball pointer, etc.) can be used to mark areas on the map.
  • the user can draw circles on the user interface to define an area.
  • the user can click on or touch points to define the coordinates of a region.
  • a user can provide an input to the user interface 300 to define a permissible or impermissible region for the UAV and the target object to travel.
  • the input provided by the user can be communicated to the user interface by any method that is acceptable to the electronic device comprising the user interface, for example a user may communicate with the user interface on the electronic device through a tactile or audio command.
  • a user can speak the name or coordinates of a permissible or impermissible area, for example the user can give the command “Dog Park permissible” or “lake impermissible” to designate the dog park as permissible and the lake as impermissible travel regions for the UAV and the target object.
  • a user can draw or trace a region on a map that is permissible or impermissible for the UAV and the target object to travel.
  • a user can draw or trace the region with their finger or a stylus.
• a user can define a set of coordinates (X1, Y1), (X2, Y2), (X3, Y3), (X4, Y4).
  • Line segments can be formed to connect the set of coordinates and to enclose a geographic region.
  • a user can define the enclosed geographic region as permissible or impermissible for travel of the target object.
• a user can define a first coordinate (X5, Y5) and trace a closed region that includes the first coordinate, (X5, Y5). The user can define this closed region as permissible or impermissible for travel of the target object.
• One or more processors can monitor the location of the UAV while it is guiding a target object by receiving a locating signal from one or more location sensors on-board the UAV.
  • the one or more processors can receive a user input signal that defines permissible areas for the target object to travel and/or impermissible areas for the target object to travel.
  • the one or more processors can compare a locating signal from a UAV guiding a target object to the user input signal that defines permissible areas for the target object to travel and/or impermissible areas for the target object to travel to determine if the UAV has guided the target object outside of the permissible area or into an impermissible area.
• a locating signal (e.g. GPS) from a UAV can be compared to a map of permissible and impermissible regions as defined by a user.
• when this comparison indicates that the UAV has guided the target object outside of the permissible area or into an impermissible area, the processor can initiate a response.
• the location of the target object can be approximated as the location of the UAV. Approximating the location of the target object as the location of the UAV can be appropriate in cases when the UAV is very close to the target object, for example, when the target object is attached to the UAV by a relatively short leash.
  • the location of the target object can be determined from a combination of the location of the UAV as determined by one or more location sensors and the location of the target object as determined from one or more vision sensors.
  • a location of a UAV can be known from a GPS sensor and the location of the target object relative to the UAV can be determined from one or more vision sensors configured to recognize the target object.
• One or more processors can combine the location of the target object relative to the UAV with the location of the UAV to determine the absolute location of the target object.
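One plausible way to combine an absolute UAV fix with a vision-derived relative offset is simple vector addition in a local frame, as in the hypothetical Python sketch below. The flat-earth conversion and all names are assumptions; a real system would also account for UAV attitude and sensor calibration.

```python
import math

# Minimal sketch: estimate the target's absolute position from the UAV's GPS fix
# plus a vision-derived offset (east/north metres from UAV to target).
# Conversion uses a local flat-earth approximation; all names are illustrative.

EARTH_RADIUS_M = 6_371_000

def target_lat_lon(uav_lat, uav_lon, offset_east_m, offset_north_m):
    """Shift the UAV's latitude/longitude by a metric offset to the target."""
    dlat = math.degrees(offset_north_m / EARTH_RADIUS_M)
    dlon = math.degrees(offset_east_m / (EARTH_RADIUS_M * math.cos(math.radians(uav_lat))))
    return uav_lat + dlat, uav_lon + dlon

# UAV GPS fix, with the vision sensor reporting the pet 3 m east and 1.5 m south.
print(target_lat_lon(22.5431, 113.9425, offset_east_m=3.0, offset_north_m=-1.5))
```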
  • the location of the target object can be known from a locating sensor on the target object, for example, a GPS sensor in a collar worn by the target object.
  • a locating sensor on the target object can communicate with a processor on or off-board the UAV.
  • a response can be informing a user that the target object has deviated from the permissible area or entered the impermissible area.
  • a user can be informed that the target object has deviated from the permissible area or entered the impermissible area through a user interface on an electronic device.
  • the electronic device can alert a user with an audio signal, vibration signal, text message, phone call, video message, visual image message, electronic notification, and/or email.
• a response can be a flight instruction to the UAV; the UAV can be instructed by the processor to re-enter the permissible area or exit the impermissible area.
  • the processor can automatically provide a flight instruction to the UAV when the UAV has deviated from the permissible area or entered the impermissible area.
  • the processor can provide a flight instruction to the UAV when the UAV has deviated from the permissible area or entered the impermissible area in response to a user input from an electronic device after the electronic device has alerted the user that the UAV has deviated from the permissible area or entered the impermissible area.
  • the flight instruction can be for the UAV to return to the permissible area or exit the impermissible area.
  • the flight instruction can be for the UAV to entice or direct the target object to control the movement of the target object such that the target object returns to the permissible area or exits the impermissible area.
  • the user can provide a specific flight instruction to the UAV.
  • the specific flight instruction can be for the UAV to fly a specific direction and a specified distance in that direction.
  • the flight instruction can also include a specified distance that should be maintained between the UAV and the target object while the UAV is moving the specified distance in the specified direction.
  • a user can initiate an automated or predetermined flight sequence to return the UAV and the target object to a permissible area.
  • the locating signal can indicate that the UAV has exited a permissible area for the target object to travel or that the UAV has entered an area that is impermissible for the target object to travel when the UAV crosses over a user defined boundary. In some cases, the locating signal can indicate that the UAV has exited a permissible area for the target object to travel or that the UAV has entered an area that is impermissible for the target object to travel when the UAV approaches and is within a predetermined threshold distance from a user defined boundary. The locating signal can indicate exiting a permissible area when a UAV is detected within a threshold distance from a user defined boundary regardless of the direction that the target object and the UAV are heading.
• the locating signal can indicate exiting a permissible area when a UAV is detected within a threshold distance from a user defined boundary and the direction that the target object and the UAV are heading is towards the boundary.
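As an illustration of this threshold-and-heading logic, the sketch below flags a potential boundary crossing only when the position is within a threshold distance of a straight boundary segment and the current velocity points toward it. The geometry helpers, threshold value, and coordinates are hypothetical.

```python
import math

# Minimal sketch: warn when the target/UAV is within a threshold distance of a
# boundary segment AND is heading toward that boundary. Purely illustrative.

def distance_to_segment(p, a, b):
    """Shortest distance from point p to segment a-b (all 2-D tuples, metres)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def heading_toward(p, velocity, a, b):
    """True if the velocity vector reduces the distance to the segment."""
    step = (p[0] + velocity[0] * 0.1, p[1] + velocity[1] * 0.1)
    return distance_to_segment(step, a, b) < distance_to_segment(p, a, b)

def boundary_warning(p, velocity, a, b, threshold_m=3.0):
    near = distance_to_segment(p, a, b) <= threshold_m
    return near and heading_toward(p, velocity, a, b)

# Target 2 m from a fence line, walking toward it at 1 m/s.
print(boundary_warning(p=(2.0, 5.0), velocity=(-1.0, 0.0), a=(0.0, 0.0), b=(0.0, 10.0)))
```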
  • the speed of the target object can be determined.
  • the speed of the target object can be determined from a velocity sensor on-board the UAV.
  • the speed of the target object can be estimated as the speed of the UAV as determined by the velocity sensor.
  • the UAV can comprise a vision sensor to detect the location of the target object.
  • the UAV can determine the speed of the target object from the measurements taken by the vision sensor using a processor on or off-board the UAV.
• the target object can wear a locating sensor, for example, a locating sensor embedded in a collar worn by the target object.
  • the locating sensor can be a GPS sensor.
  • the locating sensor worn by the target object can be in communication with one or more processors on or off-board the UAV.
  • the one or more processors can determine the speed of the target object from information transmitted by the locating sensor.
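A speed estimate of the kind described in the preceding bullets could be derived from two successive timestamped position fixes, as in this minimal sketch. The fix format and function name are assumptions.

```python
import math

# Minimal sketch: estimate target speed from two timestamped position fixes
# (local x/y metres), e.g. reported by a GPS collar. Format is illustrative.

def estimate_speed(fix_a, fix_b):
    """fix = (t_seconds, x_m, y_m); returns speed in m/s (0 if timestamps coincide)."""
    ta, xa, ya = fix_a
    tb, xb, yb = fix_b
    dt = tb - ta
    if dt <= 0:
        return 0.0
    return math.hypot(xb - xa, yb - ya) / dt

print(estimate_speed((0.0, 0.0, 0.0), (2.0, 3.0, 4.0)))  # 2.5 m/s
```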
  • the speed of the target object can be a factor in the indication that the UAV has exited a permissible area for the target object to travel or that the UAV has entered an area that is impermissible for the target object to travel when the UAV crosses over a user defined boundary.
• an indication can be provided that the UAV has exited a permissible area for the target object to travel, or that the UAV has entered an area that is impermissible for the target object to travel, when the UAV crosses over a user defined boundary.
  • FIG. 4 shows an example of a target object 401 within the proximity of a boundary 402.
  • the boundary 402 can have either or both of a first threshold 403 and a second threshold 404.
  • the first 403 and second 404 thresholds can outline either edge of the boundary 402.
  • the distance between the boundary 402 and the first 403 and second 404 threshold can be defined by a user.
  • the distance between the threshold and the boundary can be at least 1 inch (in), 6 in, 1 foot (ft), 2 ft, 3 ft, 4 ft, 5 ft, 6 ft, 7 ft, 8 ft, 9 ft, 10 ft, 11 ft, 12 ft, 13 ft, 14 ft, 15 ft, 16 ft, 17 ft, 18 ft, 19 ft, or 20 ft.
  • the distance between the boundary and a threshold can be greater than 20 ft.
  • the distance between a boundary and threshold can fall between any of the listed values.
  • the distance between the boundary 402 and the first 403 and second 404 threshold can be uniform or the distance can vary.
  • the distance can vary along a boundary 402.
  • the distance can vary such that the distance between a first threshold 403 and the boundary 402 is different from the distance between the boundary 402 and the second 404 threshold.
  • a first boundary can have a first threshold distance and a second boundary can have a second threshold distance.
• when a first boundary indicates that a target object cannot enter a dangerous area (e.g. street, parking lot, or a region containing other aggressive animals), a distance between the first boundary and the threshold can be relatively large.
• when a second boundary indicates that a target object cannot enter a comparatively less dangerous area (e.g. lake, neighbor’s yard, or a dirt pile), a distance between the second boundary and the threshold can be relatively small.
• the direction that the target object is heading 405 can be a factor in determining whether an indication that the target object has crossed a boundary 402 or a threshold 403, 404 should be provided. For example, when a target object is heading in the direction of a boundary 402 or a threshold 403, 404, an indication that the target object is exiting a permissible area or entering an impermissible area can be provided.
  • a UAV can be physically connected or attached to the target object by a physical attachment mechanism.
  • a physical attachment mechanism can be a leash, rope, or chain that tethers the target object to the UAV.
  • the physical attachment mechanism can attach to a region on the body of the UAV on one end and the target object on the other end.
  • the physical attachment mechanism can attach to a collar that is worn around a neck of the target object.
• the physical attachment mechanism can attach to a harness that attaches to a body of the target object.
  • the UAV can provide a deterrent mechanism to the target object when the target object approaches a boundary or a threshold of a boundary such that the target object is exiting a permissible area or entering an impermissible area.
  • the UAV may or may not be configured to provide sufficient force to pull a target object away from a boundary enclosing an impermissible area. In some cases the UAV may require a deterrent mechanism to prevent a target object from travelling into an impermissible area or out of a permissible area.
  • the UAV can be configured only to provide one type of deterrent mechanism. Alternatively the UAV can be configured to provide a primary deterrent mechanism followed by at least one additional deterrent mechanism.
  • the additional deterrent mechanism can be provided when the target object fails to obey the primary deterrent mechanism within a specified time interval after the primary deterrent mechanism is provided.
  • the additional deterrent mechanism can be harsher than the primary deterrent mechanism.
  • the specified time interval between the primary and additional deterrent mechanism can be fixed or it can be dependent on the action of the target object. For example, if a target object is rapidly approaching a boundary of an impermissible region the specified time interval can be shorter than in cases where the target object is slowly approaching the boundary.
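The escalation policy described above, a primary deterrent followed by a harsher one if the target does not comply within an interval that shrinks as the target approaches the boundary faster, might be sketched as follows. The interval formula, constants, and deterrent labels are assumptions.

```python
# Minimal sketch: choose a wait interval before escalating from the primary
# deterrent (e.g. user's recorded voice) to a secondary one (e.g. shock collar).
# The interval shrinks as the target approaches the boundary faster.

BASE_INTERVAL_S = 10.0
MIN_INTERVAL_S = 1.0

def escalation_interval(approach_speed_mps):
    """Seconds to wait before the secondary deterrent; faster approach -> shorter wait."""
    if approach_speed_mps <= 0:
        return BASE_INTERVAL_S
    return max(MIN_INTERVAL_S, BASE_INTERVAL_S / approach_speed_mps)

def deterrent_plan(approach_speed_mps):
    return [("play_user_voice", 0.0),
            ("trigger_shock_collar", escalation_interval(approach_speed_mps))]

print(deterrent_plan(0.5))  # slow approach: long grace period
print(deterrent_plan(4.0))  # fast approach: short grace period
```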
  • the deterrent mechanism can be the user’s voice.
• the user’s voice can be a recording played through a speaker on-board the UAV.
  • the recording can be stored on a memory storage device on or off-board the UAV.
• a user can be alerted in real time through a user device that the target object is approaching a boundary or a threshold of a boundary such that the target object is exiting a permissible area or entering an impermissible area.
  • the user’s voice can be transmitted from the user device to the UAV in real time.
  • the recording of the user’s voice or the transmission of the user’s voice in real time can be provided to the target object through a user interface on-board the UAV.
• the user interface on board the UAV can comprise a speaker to emit an audio alert to the target object.
  • the audio alert can be a live stream or recording of a user’s voice, an unpleasant sound, a high pitched ring, or any other audio stimulus that commands attention and obedience of the target object.
  • a user can tell a target object to stop, sit, or come through a live stream or a recording.
  • the user interface can further comprise a screen such that the alert can be provided to the target visually.
  • the alert can be both audio and visual or only one of the two.
  • the visual alert can be a video recording or a live video of the user.
  • the deterrent mechanism can be an electric shock.
  • the electric shock can be provided by an electric shock collar worn by the target object.
  • the UAV can be in communication with the electric shock collar through a wired or wireless connection.
• the wired connection can be embedded in the physical attachment mechanism between the UAV and the target object.
  • the UAV can instruct the electric shock collar to provide an electric shock to the target object when the target object approaches a boundary or a threshold of a boundary such that the target object is exiting a permissible area or entering an impermissible area.
  • the electric shock can be a first response to the target object approaching a boundary or a threshold of a boundary such that the target object is exiting a permissible area or entering an impermissible area.
  • the electric shock can be a secondary response after playing a real time or recorded voice of the user.
  • An electric shock can be provided to the target object if the target object does not respond to the user’s voice within a predetermined period of time.
  • the predetermined period of time can be a fixed value, for example the predetermined period of time can be at least 1 second (sec), 5 sec, 10 sec, 15 sec, 20 sec, 25 sec, 30 sec, 35 sec, 40 sec, 45 sec, 50 sec, 55 sec, or 1 minute.
  • the period of time can be a function of the speed of the target object such that the time between a user’s voice and the electric shock is inversely proportional to the speed at which the target object is traveling.
  • An additional deterrent mechanism that can be used alone or in combination with the user’s voice and/or the electric shock can include emitting a noise (e.g. beep, buzz, or siren) that the target object recognizes or has been conditioned to recognize as a signal to stop moving in a direction.
  • Another deterrent mechanism that can be used alone or in combination with the user’s voice and/or the electric shock can be a spray of a liquid that has a smell that deters the target object, for example the liquid may be citronella.
  • a user can also define a specific route along which the UAV can lead or guide a target object.
  • a user can define a unique route or a user can pick from a plurality of routes that can be stored on a storage memory device on or off-board the UAV.
  • the stored routes can originate from previous routes that a user has used. In some cases the stored routes can come from other users in the area that also use a UAV to guide their target object through a route sharing network.
  • FIG. 5 shows a map 500 with possible routes that a UAV 501 can travel to guide a target object.
  • a UAV can start a route with a target object at a home 502.
• the UAV can lead the target object along route R0.
• the UAV can return to the home 502 along route R1 or route R2.
• a user can specify which route, R1 or R2, should be taken by the UAV to return home 502. In some cases the user can specify the choice of R1 or R2 in real time while the UAV is guiding the target object.
  • a route can be changed in real time.
• a UAV can begin guiding a target object along a route R0 with the initial plan of following from R0 to R1.
• a user can, in real time, update the route such that the UAV guides the target object from R0 to R2 instead of from R0 to R1.
  • a user may provide an input that alters a route from a pre-defined route while the UAV is in flight and traveling along the route. This may provide flexibility if an event comes up while the UAV is away.
  • the user may change the route while the target object is out with the UAV to bring the target object home more directly or quickly.
  • the user may advantageously alter the route while the UAV is out with the target object.
  • the UAV can guide the target object along a travel route.
  • the UAV can be in flight while guiding the target object.
  • the UAV can achieve flight with one or more propulsion units, for example a propulsion unit can comprise one or more rotors.
  • the target object can be in locomotion while the UAV is guiding the target object.
  • a UAV can receive a travel route that describes an area or path along which the target object should be guided.
  • the travel route can be a user input to a user device that is in communication with the UAV.
  • the user device can be a computer, tablet, smart phone, smart watch, or smart glasses.
  • the UAV can guide the target object along the route by flying along the travel route while the target object is in motion.
  • the target object can be in motion close to the location of the UAV.
  • the UAV can be suspended in flight and the target object can be on the ground directly below the UAV.
  • the target object can be on the ground below the UAV and off set to the right, left, back or front of the UAV.
  • the target object can be physically attached to the UAV through a physical attachment mechanism, for example, a leash.
  • the leash can be attached on one end to the UAV and on the other end to the target object.
  • the end of the leash attached to the target object can be attached to a collar or harness worn by the target object.
  • the travel route can be updated in real time.
  • a UAV can begin guiding a target object from a starting location.
  • the UAV can guide the target object along a first travel route.
  • the UAV can receive a route update from a user device that is in communication with the UAV.
  • the route update can provide a change from a first travel route to a second travel route.
  • the change can be provided to make the travel route longer, shorter, to avoid a location, or to include a location that was not part of the first travel route.
  • Once the UAV receives the updated route the UAV can continue guiding the target object along the updated route.
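An in-flight route update of the kind described here amounts to replacing the not-yet-travelled portion of the waypoint list; the hypothetical sketch below shows one way this could be structured. The class and method names, and the waypoints, are illustrative.

```python
# Minimal sketch: a route holder that can be replaced mid-flight, e.g. swapping
# the planned return leg R1 for the alternative leg R2. Names are illustrative.

class GuidedRoute:
    def __init__(self, waypoints):
        self.waypoints = list(waypoints)  # list of (x, y) positions to visit
        self.index = 0                    # index of the next waypoint

    def next_waypoint(self):
        return self.waypoints[self.index] if self.index < len(self.waypoints) else None

    def advance(self):
        self.index += 1

    def update(self, new_remaining_waypoints):
        """Replace the not-yet-visited portion of the route with an updated one."""
        self.waypoints = self.waypoints[:self.index] + list(new_remaining_waypoints)

route = GuidedRoute([(0, 0), (50, 0), (50, 40)])   # initial plan: R0 then R1
route.advance()                                     # first waypoint reached
route.update([(20, 60), (0, 0)])                    # user switches to R2, then home
print(route.next_waypoint())                        # (20, 60)
```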
  • the route can be updated at least once while the UAV is guiding the target object. In some cases the route can be updated at least once, twice, three times, four times, five times, six times, seven times, eight times, nine times, or ten times.
  • a user can update the travel route with a user device that is in communication with the UAV.
  • the user device can be in communication with the UAV through a wired or wireless connection.
  • the user can define the first travel route using a global location identifier, for example global coordinates.
  • the user can define the first travel route using an image or a line on a map.
  • the map can be provided on a user interface on the user device.
  • the image or line can be interpreted by a processor to define a travel route in global or local coordinates.
  • a user can provide an updated route using coordinates, a map image, or a line on a map.
  • the location of the UAV can be determined by one or more locating sensors on-board the UAV.
  • a locating sensor can be a GPS sensor.
  • the location of the UAV determined by the one or more locating sensors can be transmitted to a user device and/or a processor off-board the UAV.
• the location of the UAV can roughly define the location of the target object.
  • the UAV can comprise one or more vision sensors configured to capture an image of the target object.
  • the location of the target object can be determined by one or more processors from the location of the UAV determined by the one or more locating sensors on-board the UAV and the image of the target object.
  • the one or more processors can determine when the target object is deviating from the travel route.
  • the travel route can be a first travel route or an updated travel route.
  • an attractor or instruction can be provided to prevent the target object from continuing to divert from the travel route and/or to force or entice the target object to return to the travel route.
• when the target object is deviating from the travel route, the UAV can play a live stream or a recording of the user’s voice to the target object.
  • the live stream or recording can be an instruction from the user for the target object to come closer to the UAV or to stop traveling in a direction away from the travel route.
  • a user can be the target object’s owner.
  • the user can be an individual designated by the target object’s owner to monitor the target object.
  • the user’s voice can be transmitted through a user device to the UAV in real time. Alternatively, the user’s voice can be pre-recorded and stored on a memory storage device on or off-board the UAV.
  • another stimulus can be provided to the target object when the target object deviates from the travel route.
  • the stimulus can be provided in addition to or instead of the user’s voice.
  • the stimulus can be an attractor, for example an edible treat or an emission of a smell that is of interest to a target object.
  • the attractor can be provided to guide the target object back to the travel route.
  • the stimulus can be an electric shock.
• the electric shock can signal to the target object that the target object should stop moving. The electric shock can be provided when the target object deviates from the travel route by a predetermined distance.
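Deviation from a travel route can be quantified as the distance from the target’s estimated position to the nearest point on the route polyline, with the response escalating as that distance grows. The sketch below is a minimal, hypothetical illustration; the thresholds and response labels are assumptions.

```python
import math

# Minimal sketch: measure how far the target has strayed from a route polyline
# and pick a response. Thresholds and response labels are illustrative.

def distance_to_route(point, route):
    """Smallest distance from `point` to any segment of `route` (list of (x, y))."""
    def seg_dist(p, a, b):
        ax, ay = a; bx, by = b; px, py = p
        dx, dy = bx - ax, by - ay
        if dx == dy == 0:
            return math.hypot(px - ax, py - ay)
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))
    return min(seg_dist(point, route[i], route[i + 1]) for i in range(len(route) - 1))

def deviation_response(point, route, attract_at_m=2.0, stop_at_m=6.0):
    d = distance_to_route(point, route)
    if d >= stop_at_m:
        return "stop_signal"        # e.g. conditioned noise or shock collar
    if d >= attract_at_m:
        return "present_attractor"  # e.g. treat or user's voice
    return "on_route"

route = [(0, 0), (30, 0), (30, 30)]
print(deviation_response((10, 3), route))   # mildly off route -> present_attractor
print(deviation_response((10, 12), route))  # far off route -> stop_signal
```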
  • a UAV can be configured to recognize a target object.
  • a UAV can recognize the target object using an image recognition algorithm that can detect defining features of the target object. For example, the UAV may be able to discern target object size, gait, coloration/patterns, or proportions (e.g., limbs, torso, face).
  • the UAV can detect a collar worn by the target object using a vision sensor on-board the UAV.
  • a collar worn by a target object can have unique identifiers such that the UAV can distinguish the collar worn by the target object from another collar worn by an alternative target object.
  • the distinguishing features can be patterns, symbols, or a unique combination of numbers and letters.
  • FIG. 6 shows an example of a target object 601 wearing a collar 602 that can be recognized by a UAV.
  • Description of the collar can apply to any object that is wearable by the target object, for example, a harness, sweater, ankle band, hat, or paw bootie.
• the UAV can recognize the target object wearing the collar using one or more vision sensors on-board the UAV.
  • the collar 602 can comprise at least one of a pattern of symbols 603, letters 604, or numbers.
  • the pattern of symbols 603, letters 604, or numbers can be provided on a display screen on the collar.
  • the pattern of symbols 603, letters 604, or numbers can be a constant display or the display can change. In some cases a pattern displayed on the collar can communicate information to the UAV.
  • the collar can emit a signal that can be detected by the UAV for example an IR signal.
  • the collar can further comprise at least one component 605 configured to permit connection to a physical connection mechanism between a UAV and the collar.
  • the component 605 configured to permit connection to a physical connection mechanism between a UAV and the collar can be a magnet, hook, hole, rivet, snap, or other connection hardware.
  • a UAV can be configured to attach to the collar of the target object using a physical connection mechanism (e.g. leash) automatically without human intervention.
  • the UAV may require human intervention to attach to the collar of the target object.
  • the UAV can attach to the collar of the target object after confirming recognition of the target object using one or more vision sensors to detect the target object.
  • the leash can attach to the collar using a magnetic mechanism such that a magnet on the collar is attracted to a magnet on an end of the leash.
  • only one of either the leash end or the collar can comprise a magnet and the other component (e.g. leash end or collar) can comprise a metal that is attracted to the magnet.
  • the leash can be made from a flexible and/or bendable material, for example plastic, rubber, elastic, or another flexible material.
  • the UAV can attach a leash to the target object automatically without human aid or intervention.
• the UAV can attach a leash to the target object using a mechanical mechanism.
• the mechanical mechanism can be a hook, clamp, or robotic arm.
• when the mechanical mechanism is a robotic arm, the robotic arm can be on-board the UAV.
  • the robotic arm can extend and retract to guide a leash to a collar on a target object.
  • the robotic arm can extend and retract using a telescoping mechanism.
  • the UAV can hover directly above or to a side of the target object while the UAV is attaching a leash to the target object.
  • One or more vision sensors can detect the location of the target object while the UAV is attaching the leash to the target object.
  • the robotic arm can have a feature at its terminal end configured to attach the leash to a collar on the target object.
  • the leash can attach to the collar using any mechanical or electrical connection mechanism for example, a hook and loop, snap, magnetic, Velcro, or any other mechanical coupling mechanism.
  • the coupling mechanism between the leash and the collar on a target object can be generic or the coupling mechanism can have a size or shape that is unique to a specific leash and collar connection. The unique coupling mechanism can prevent a UAV from accidentally connecting to a wrong target object.
  • one or more vision sensors on-board the UAV can detect the target object while the UAV is attaching the leash to the target object to verify that the UAV is attaching the leash to the correct target object.
  • Other sensors may be used to verify that the leash is attached to the correct target object.
  • a collar or other wearable connection of the target object may interact with the leash to confirm the correct identity.
  • a signal may pass between the collar and leash upon contact or wirelessly. The signal may include an identifier of the collar which may be verified by the leash, UAV or any processors anywhere in the system.
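The identity check between the leash (or UAV) and the collar could be as simple as comparing a transmitted collar identifier against the identifier registered for the expected pet, as in this hypothetical sketch. The identifier strings, registry, and function name are assumptions.

```python
# Minimal sketch: verify that the collar the leash is about to attach to belongs
# to the expected target object. Identifiers and names are illustrative.

REGISTERED_COLLAR_IDS = {"DOG-7F3A": "Rex", "CAT-91B2": "Whiskers"}

def verify_collar(detected_collar_id, expected_pet_name):
    """Return True only if the detected collar maps to the expected pet."""
    return REGISTERED_COLLAR_IDS.get(detected_collar_id) == expected_pet_name

print(verify_collar("DOG-7F3A", "Rex"))   # True: attach the leash
print(verify_collar("DOG-0000", "Rex"))   # False: unknown collar, do not attach
```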
  • the physical connection mechanism between the UAV and the target object can have a fixed or adjustable length.
  • the length of the physical connection mechanism can determine a permitted distance between a UAV and a target object. In cases where the length of the physical connection mechanism is adjustable the physical connection mechanism can be retractable.
• the maximum extension of the physical connection can be fixed or the maximum extension of the physical connection can be determined by a location or a distance from a defined boundary. For example, when a UAV attached to a target object with a physical connection mechanism is relatively far from a defined boundary, the physical connection mechanism can be extended to a relatively long length. In comparison, when the UAV attached to a target object with a physical connection mechanism is relatively close to a defined boundary, the physical connection mechanism can be extended to a relatively short length.
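One way to realize a leash extension that depends on the distance to a defined boundary is to interpolate between a minimum and a maximum length, as in the hypothetical sketch below. The limits and scaling distance are assumptions.

```python
# Minimal sketch: choose a leash extension that shrinks near a user-defined
# boundary and grows far from it. Limits and scaling are illustrative.

MIN_LEASH_M = 1.0
MAX_LEASH_M = 5.0
FULL_LENGTH_AT_M = 20.0   # beyond this distance from the boundary, extend fully

def leash_length(distance_to_boundary_m):
    fraction = min(1.0, max(0.0, distance_to_boundary_m / FULL_LENGTH_AT_M))
    return MIN_LEASH_M + fraction * (MAX_LEASH_M - MIN_LEASH_M)

print(leash_length(2.0))    # close to the boundary: short leash
print(leash_length(50.0))   # far from the boundary: full length
```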
  • the physical connection mechanism can be extended and retracted while the UAV is in flight.
  • FIG. 7 shows an example of a UAV 701 and a target object 702.
  • the UAV 701 is connected to the target object 702 through a physical mechanism 703, for example a leash.
  • the UAV 701 can have at least one on-board vision sensor 704, for example a camera.
  • the vision sensor 704 can be on the body of the UAV or the vision sensor 704 can be extended from a surface of the UAV, for example the bottom surface, by a support structure 705.
  • the vision sensor 704 can be movable relative to the UAV.
  • the vision sensor 704 can be configured to rotate and/or translate independent of the position of the UAV.
  • the vision sensor can capture at least one image of a target object.
  • the vision sensor can be moved to track the movement of the target object.
  • the image can be stored on a memory storage device on or off-board the UAV.
  • the image can be analyzed to identify a target object.
  • the image may be of the target object or a collar 706 worn by the target object.
  • the UAV can be in communication with one or more processors on or off-board the UAV.
  • the processors can be configured to analyze an image from a vision sensor and recognize the target object from an image of the target object or an image of the collar 706.
  • the UAV 701 can approach the target object 702 and attach the physical mechanism 703 to the collar 706 worn by the target object.
  • the UAV can automatically attach the physical mechanism to the collar 706 worn by the target object 702 by joining a mating or coupling connection on a terminal end of the physical mechanism to a corresponding connection 707 on the collar.
  • the UAV can fly while the target object is in locomotion.
  • the UAV can guide a target object by pulling on the leash.
• the pulling force with which the UAV pulls on the leash can be calculated from the motion of the target object and the motion of the UAV.
  • the motion of the target object and the motion of the UAV can be compared to determine one or more parameters with which the UAV pulls on the leash.
  • parameters that can be determined may be the magnitude and/or the direction of the pulling force.
  • the magnitude of the pulling force can fall within a predefined range.
  • the predefined range of pulling forces can be determined by a user or calculated from a user input.
  • a user input can be the weight of the target object.
  • the UAV can be configured to provide sufficient force to control a target object having a weight of at least 1 kilogram (kg), 2 kg, 3 kg, 4 kg, 5 kg, 10 kg, 15 kg, 20 kg, 25 kg, 30 kg, 35 kg, 40 kg, 50 kg, 55 kg, 60 kg, 65 kg, 70 kg, 75 kg, 80 kg, 85 kg, 90 kg, 95 kg, 100 kg, 105 kg, 110 kg, 115 kg, 120 kg, 125 kg, 130 kg, 135 kg, 140 kg, 145 kg, or 150 kg.
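The pulling-force determination described in the preceding bullets, a magnitude and direction derived from the relative motion of the UAV and the target object and limited to a range derived from a user input such as the pet’s weight, might look like the following sketch. The gain, the per-kilogram limit, and the simple velocity-mismatch model are assumptions.

```python
import math

# Minimal sketch: compute a leash pulling force from the relative motion of the
# UAV and the target, clamped to a range scaled by the pet's weight (user input).
# The gain and per-kilogram limit are illustrative assumptions.

GAIN_N_PER_MPS = 2.0       # newtons of pull per m/s of velocity mismatch
MAX_FORCE_PER_KG = 1.5     # upper force limit scales with the pet's weight

def pull_force(uav_velocity, target_velocity, pet_weight_kg):
    """Return (magnitude_N, unit_direction) of the leash pull on the target."""
    dvx = uav_velocity[0] - target_velocity[0]
    dvy = uav_velocity[1] - target_velocity[1]
    mismatch = math.hypot(dvx, dvy)
    if mismatch == 0:
        return 0.0, (0.0, 0.0)
    magnitude = min(GAIN_N_PER_MPS * mismatch, MAX_FORCE_PER_KG * pet_weight_kg)
    return magnitude, (dvx / mismatch, dvy / mismatch)

# UAV moving 1.5 m/s east while a 20 kg dog stands still.
print(pull_force((1.5, 0.0), (0.0, 0.0), pet_weight_kg=20))
```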
  • the UAV can continuously collect images of the target object while the UAV is in flight and the target object is in locomotion or while the target object is stationary.
• the target object can be attached or tethered to the UAV via a leash while the UAV is collecting images of the target object.
  • the images of the target object can be saved on a memory storage device that can be on-board or off-board the UAV.
  • the UAV can collect images of the target object with a vision sensor.
  • the vision sensor can collect still images or video images of the target object.
  • the UAV can comprise at least one additional vision sensor configured to collect images of the environment. In some cases, the at least one additional vision sensor can track other objects in the environment.
  • the images can be displayed to a user through a user interface that is in communication with a processor on or off-board the UAV.
  • the user interface can also display the location of the UAV while it is attached to the target object. The location may be shown on a map in the user interface.
  • a UAV can play the user’s voice to the target object while the target object is attached to the UAV by a leash or other physical attachment mechanism.
  • the UAV can play the user’s voice to the target object while the target object is in locomotion and/or while the UAV is flying.
  • the user’s voice can be provided by the UAV through an audio or visual display on-board the UAV.
• the target object may be familiar with, and therefore more responsive to, the user’s voice as compared to a voice from a human that is not the user.
  • the user’s voice can be provided to the target object in real time.
  • the user’s voice can be transmitted from the user device to the UAV in real time.
  • the user’s voice can say a command to the target object.
  • a user can receive an image of a target object and/or a location of a UAV attached to a target object through a user interface on a user device.
  • the user may wish to speak to the target object in response to the image or location of the target object.
  • the user can speak to the target object to provide positive or negative feedback in response to the image or location of the target object.
  • the user can speak to the target object in real time by transmitting their voice through the user device to the UAV.
  • the user’s voice can be a pre-recording.
  • the pre-recording can be an audio or video recording of the user.
  • a processor on or off-board the UAV can be configured to recognize a behavior or action committed by the target object from an image of the target object or a location of a UAV attached to a target object.
  • the processor can instruct the UAV to provide a pre-recording of a user’s voice to provide a command or negative or positive feedback to a target object in response to a detected behavior or action committed by the target object.
  • a user can recognize a behavior or action committed by the target object from an image of the target object or a location of a UAV attached to a target object provided on a user device’s user interface.
• the user can transmit an instruction to the UAV to provide a pre-recording of a user’s voice to provide a command or negative or positive feedback to a target object in response to a detected behavior or action committed by the target object.
  • FIG. 8 shows an example of a UAV 801 and a target object 802 where the UAV is providing an audio and visual stimulus to the target object.
  • the visual stimulus can be provided to the target object through a screen 803 on-board, carried by, or attached to the UAV 801.
  • the screen can be permanently exposed or the screen can be folded or retracted into the UAV when it is not in use.
  • the audio stimulus can be provided through a microphone or speaker 804 on-board the UAV.
  • the audio stimulus can be a recording or a live stream of a user’s voice.
  • a user can be the owner of the target object or an individual designated to monitor the target object while it is being guided by the UAV.
• the audio unit can be bi-directional such that a user’s voice can be provided to the target object through the speaker and an audio response (e.g. barking, meowing, or whining) from the target object can be collected by the microphone and transmitted to a user through a user device.
  • the UAV 801 can further comprise one or more visual sensors 805.
  • the visual sensors can collect still images and/or video images of the target object.
  • the images can be analyzed by one or more processors on or off-board the UAV to recognize the target object.
  • the images can be further analyzed to determine the location of the target object relative to a known location of the UAV.
  • the UAV can be attached to the target object through a physical connection, for example, a leash 806.
  • the UAV may not be attached to the target object while the UAV is guiding the target object.
• a UAV can guide the target object by recognizing the target object and automatically, without human aid or intervention, displaying an attractor to the target object.
  • the UAV can fly while displaying the attractor and the target object can be in locomotion following the attractor.
  • the attractor can be a visual, auditory, or olfactory stimulus that is configured to attract the attention of the target object.
  • the attractor can be an edible treat, for example, a dog treat, bacon, peanut butter, or another edible product that is desirable to a target object.
  • the attractor can emit a scent.
  • the scent can be associated with an entity that is of interest to a target object, for example, a food item or another target object.
  • the attractor can emit the scent from the entity itself or from a chemical configured to have a scent typically associated with the entity.
  • a strip of bacon can be stored on-board the UAV and the scent of the bacon can be wafted towards the target object.
  • the UAV can have a chemical configured to smell like bacon stored on-board the UAV.
  • the UAV can emit a spray or mist of the chemical to attract the target object.
  • the attractor can be an image that is displayed on a screen carried by the UAV.
  • the image can be a static image or a video.
  • the image can depict an owner of the target object.
  • the image of the owner can be a static image or a video.
  • the image may be accompanied by an audio recording or a live audio stream of the owner.
  • the UAV can comprise an audio player (e.g. speaker or microphone) that can play the user’s voice to the target object while the target object is in locomotion and is attached to the UAV via a leash or while the target object is being guided by the UAV without a leash.
  • the user’s voice can be transmitted from a user device to the UAV in real time.
  • a user can be the owner of the target object.
  • the user’s voice can be prerecorded.
  • a combination of edible treats and images can be used in combination or consecutively to attract the target object.
  • the UAV can carry the attractor outside of the body of the UAV.
  • the attractor can be connected to the UAV by a support structure.
  • the attractor can be moved vertically and/or horizontally relative to the UAV. In some cases, the attractor can rotate relative to the UAV.
  • the UAV can display the attractor by dangling the attractor at or near a head level of the target object.
  • FIG. 9 shows an example of a UAV 901 guiding a target object 902 with an attractor 903.
  • the UAV 901 can comprise one or more on-board vision sensors.
• the vision sensors on-board the UAV 901 can be configured to determine, with the aid of one or more processors, the location of the target object 902 relative to the UAV 901.
  • the UAV can be instructed by the one or more processors to adjust or maintain the flight speed of the UAV such that the UAV remains within a proximity of the target object.
  • the proximity to the target object can be set to a distance that is sufficiently close for the target object to perceive the attractor 903.
• One or more vision sensors can be configured to determine the trajectory of the locomotion of the target object 902 relative to the UAV 901. The determined trajectory of the locomotion of the target object relative to the UAV can result in an instruction to the UAV to adjust or maintain the direction of the UAV flight to remain within a proximity of the target object.
  • the proximity to the target object can be set to a distance that is sufficiently close for the target object to perceive the attractor 903.
• the vision sensors can determine the location of the target object and/or the trajectory of locomotion of the target object and cause a movement of the attractor 903.
  • the attractor 903 can be moved to increase or decrease interaction of the target object with the attractor. For example, if a target object is jumping upward an attractor can be raised to avoid contact with the target object. In another example, a target object can move to a side of the UAV and the attractor may rotate relative to the UAV to remain in a line of sight with the target object.
  • the attractor can be an edible treat.
  • a target object can be initially attracted to an edible treat. After a period of time the target object can become frustrated or discouraged if the edible treat is not provided for consumption.
• the UAV can be configured to periodically provide the target object with at least a fraction of an edible treat that is being used as an attractor. The fraction of the edible treat can be provided as a reward for a positive behavior or action and/or to keep the attention of the target object while the target object is being guided by the UAV.
  • the UAV can provide the fraction of the edible treat to the target object at fixed intervals, at specified route locations, or whenever the one or more vision sensors detect that the target object appears to be losing interest in the attractor, where the attractor is an edible treat.
  • a vision sensor can detect that a target object is losing interest in the attractor, where the attractor is an edible treat when the target object suspends locomotion or wanders away a threshold distance from the location of the UAV.
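A simple interest-keeping policy consistent with this description would dispense a fraction of the treat at a fixed interval, or earlier whenever the target stops or wanders too far from the UAV. The thresholds, interval, and function name in the sketch below are assumptions.

```python
# Minimal sketch: decide when to dispense a fraction of the edible attractor.
# Interval, thresholds, and field names are illustrative.

TREAT_INTERVAL_S = 120.0       # routine reward interval
STALL_SPEED_MPS = 0.1          # below this the pet is treated as having stopped
WANDER_DISTANCE_M = 4.0        # distance from the UAV that suggests lost interest

def should_dispense_treat(seconds_since_last_treat, target_speed_mps,
                          distance_from_uav_m):
    if seconds_since_last_treat >= TREAT_INTERVAL_S:
        return True                               # periodic reward
    losing_interest = (target_speed_mps < STALL_SPEED_MPS
                       or distance_from_uav_m > WANDER_DISTANCE_M)
    return losing_interest

print(should_dispense_treat(30, target_speed_mps=0.0, distance_from_uav_m=1.0))  # True
print(should_dispense_treat(30, target_speed_mps=1.2, distance_from_uav_m=1.5))  # False
```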
  • a target object can generate waste while the target object is being guided by the UAV. In some locations it may be impermissible to leave waste generated by the target object in the location where it was generated.
  • a UAV can be configured to guide a target object and to recognize waste generated by the target object. In some cases the UAV can be configured to collect and dispose of waste generated by the target object.
• a UAV can recognize waste generated by a target object using one or more vision sensors. The vision sensors may be on-board the UAV. The vision sensors can be the same vision sensors used to recognize the target object or the vision sensors can be a second set of sensors that are not used to recognize the target object. The UAV can recognize waste generated by the target object and alert a user (e.g. the owner of the target object).
  • the UAV can provide an alert to a user that includes the location of the waste generated by the target object.
  • the vision sensors can be configured to capture an image of the target object and the waste generated by the target object.
  • the image can be a still photograph or a video image.
• the UAV can comprise one or more processors that are configured to recognize the target object from the image and also recognize waste generated by the target object from the image.
  • the one or more processors can be located on or off-board the UAV.
  • the UAV can further comprise a communication unit configured to send a signal to a user device that alerts the user that the waste has been generated by the target object.
  • FIG. 10 shows an example of a UAV 1001 that is guiding a target object 1002.
  • the UAV can comprise one or more vision sensors 1003.
  • the one or more vision sensors 1003 can be inside the body of the UAV 1001 or suspended from an outer surface of the UAV 1001 by a support structure 1004. In some cases the vision sensor can be configured to translate and/or rotate independently of the UAV 1001.
  • the target object 1002 can be attached to the UAV 1001 by a physical attachment. In some cases the target object 1002 may not be attached to the UAV 1001.
  • the vision sensor 1003 can be configured to recognize a target object 1002 and collect an image of the target object 1002.
  • the vision sensor 1003 can be further configured to recognize waste 1005 generated by a target object with the aid of one or more processors.
  • the one or more processors 1007 can be on-board the UAV.
• the vision sensors can capture an image of the waste.
  • the images captured by the vision sensor can be stored on a memory storage device 1006.
  • the memory storage device can be on or off-board the UAV.
  • the one or more processors 1007 can be configured to recognize waste generated by the target object from one or more images of the waste provided by the vision sensor.
• the UAV can further comprise a communication unit configured to send or transmit a signal to a user device to alert the user (e.g. owner of the target object) that the target object has generated waste.
  • the communication unit can send or transmit a signal or alert to a user device to alert a user that the target object has generated waste.
  • the user device can be a smartphone, tablet, personal computer, smart watch, smart glasses, or a wireless pager.
  • the user device can comprise a user interface.
  • the user interface can be interactive such that a user can control the UAV through the user interface.
  • the alert can be an audio, visual, or tactile (e.g. vibration) alert.
  • the alert can include a location where the waste was generated.
  • the location can be provided in global coordinates. In some cases the location can be displayed on a map provided on the user interface on the user device.
  • the user device can also provide an image of the waste generated by the target object.
  • a user can receive alerts about the location of the target object, behavior of the target object, and the location of waste generated by the target object.
  • a user can be the owner of the target object.
  • a user can be a waste removal professional.
  • a waste removal professional can be a friend of a user, an acquaintance of a user, a volunteer, or an employee hired by the user.
  • a waste removal professional can be any human that removes waste generated by the target object.
  • a communication unit on-board the UAV can provide alerts to a waste removal professional about the time and/or location of waste generation by the target object.
  • a waste removal professional can be contracted by an owner of the target object to dispose of waste generated by a target object.
  • a waste removal professional can be a volunteer.
  • the owner of the target object can provide an item of value (e.g. currency, credit, or commodity) in exchange for removal of the waste generated by the target object.
  • the waste removal professional can be compensated with a flat weekly, monthly, quarterly, bi-yearly or yearly rate. In some cases the waste removal professional can be compensated per waste disposal.
  • FIG. 11 shows an example of an alert from a UAV 1101 indicating that a target object 1102 has generated waste 1103.
  • the alert can include the exact location of the waste or a general region in which the target object generated the waste.
  • the alert can be transmitted from the UAV to a user device 1105 in communication with the UAV.
  • the user device can be a computer, smart phone, tablet, smart watch, or smart glasses.
  • the waste location or region of the waste location can be provided in relative or global coordinates.
  • the alert can be provided only to waste removal professionals within a specified radius of the waste generation location 1104.
  • a waste removal professional 1107 outside of the region 1104 may not receive an alert on their electronic device 1110.
  • the location can be provided on a map displayed on a user interface on a user device 1105.
  • the alert can be provided to either or both of an owner 1106 of a target object and a waste removal professional 1109 within a specified radius of the waste generation location 1104.
  • the owner of the target object or the waste removal professional can be set as a default to receive an alert to collect and/or dispose of the waste.
• when the owner of the target object is the default to collect and/or dispose of the waste, the owner can choose to divert the alert to a waste removal professional using their electronic device 1111.
  • the owner may choose to divert the alert to a waste removal professional when they do not want or are not able to leave their home 1108, office, store, school, or other location to collect and/or dispose of the waste.
  • the owner of the target object can control the alerts such that a waste removal professional is the default receiver of a waste alert during specified hours.
  • the owner can be the default recipient during morning and evening hours and a waste removal professional can be the default recipient during the middle of the day.
  • the recipient that receives the alert (e.g., the owner or the waste removal professional) can therefore be selected based on the user's settings, such as the time of day; the sketch below illustrates this routing.
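By way of illustration only, the radius-based routing described in the preceding items could be expressed as a short program. The following Python sketch is not part of the disclosure; the Recipient record, the haversine helper, the 500 m default radius, and the default_role_for_hour schedule are all assumptions introduced for the example.

    import math
    from dataclasses import dataclass

    EARTH_RADIUS_M = 6_371_000  # mean Earth radius, metres

    @dataclass
    class Recipient:
        name: str
        role: str          # "owner" or "waste_removal_professional"
        lat: float
        lon: float

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance between two lat/lon points, in metres."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

    def default_role_for_hour(hour):
        """Assumed schedule: owner in the morning/evening, professional midday."""
        return "owner" if hour < 10 or hour >= 17 else "waste_removal_professional"

    def recipients_to_alert(waste_lat, waste_lon, hour, recipients, radius_m=500.0):
        """Return recipients inside the alert radius whose role matches the default."""
        role = default_role_for_hour(hour)
        return [r for r in recipients
                if r.role == role
                and haversine_m(waste_lat, waste_lon, r.lat, r.lon) <= radius_m]

    # Example: one owner at home, one professional nearby, one professional far away.
    people = [Recipient("pet_owner", "owner", 22.5430, 114.0579),
              Recipient("pro_near", "waste_removal_professional", 22.5432, 114.0581),
              Recipient("pro_far", "waste_removal_professional", 22.6000, 114.2000)]
    print([r.name for r in recipients_to_alert(22.5431, 114.0580, 13, people)])

In such a sketch, an owner choosing to divert an alert would simply replace the selected recipient list before the alert is transmitted.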
  • the UAV can be configured to recognize and remove waste generated by the target object.
  • the UAV can capture one or more images of the target object and waste generated by the target object.
  • One or more processors on or off-board the UAV can be configured to recognize the target object from the one or more images of the target object and to recognize waste generated by the target object from the one or more images of the waste generated by the target object.
  • the UAV can comprise one or more waste removal units.
  • the waste removal units can be configured to remove waste in response to recognition of the waste generated by the target object.
  • the waste removal unit can include a mechanism configured to extend from the UAV to remove the waste, for example a mechanical arm.
  • the mechanism configured to extend from the UAV to remove the waste can be an extendible structure with a scoop, shovel, or disposable container.
  • the UAV can be configured to collect waste generated by the target object and store the waste until it can be disposed of in a disposal container (e.g. trashcan, landfill, dumpster, or compost collector).
  • the UAV can comprise one or more vision sensors that can capture images of the environment in the vicinity of the UAV. The images of the environment can be analyzed by one or more processors on-board or off-board the UAV that can be configured to recognize a disposal container. In response to locating the disposal container the UAV can dispose of the waste in the disposal container.
  • the UAV can continue to store the waste until a disposal container inside of a permissible area or outside of an impermissible area is located.
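As a rough outline of the recognition-and-removal behavior above, the following Python sketch loops over captured frames, collects waste when a placeholder detector fires, and holds the waste until a disposal container is detected. The capture_image, detect_waste, and detect_disposal_container functions and the WasteRemovalUnit class are hypothetical stand-ins; the disclosure does not specify particular detection algorithms or hardware interfaces.

    import random
    random.seed(0)

    # Hypothetical stand-ins for the on-board vision pipeline and hardware units.
    def capture_image(camera):
        return {"frame": random.random()}          # placeholder image

    def detect_waste(image):
        return image["frame"] > 0.7                # placeholder classifier decision

    def detect_disposal_container(image):
        return image["frame"] > 0.9                # placeholder detector

    class WasteRemovalUnit:
        """Extendible mechanism (e.g. a mechanical arm with a scoop) on the UAV."""
        def __init__(self):
            self.storing_waste = False
        def collect(self):
            self.storing_waste = True
            print("extending arm, collecting and storing waste")
        def dispose(self):
            self.storing_waste = False
            print("releasing stored waste into disposal container")

    def patrol_step(camera, unit):
        image = capture_image(camera)
        if not unit.storing_waste and detect_waste(image):
            unit.collect()                         # remove waste once it is recognized
        elif unit.storing_waste and detect_disposal_container(image):
            unit.dispose()                         # hold the waste until a container is found

    unit = WasteRemovalUnit()
    for _ in range(20):
        patrol_step(camera=None, unit=unit)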
  • any description herein of a UAV may apply to and be used for any movable object.
  • Any description herein of an aerial vehicle may apply specifically to UAVs.
  • a movable object of the present invention can be configured to move within any suitable environment, such as in air (e.g., a fixed-wing aircraft, a rotary-wing aircraft, or an aircraft having neither fixed wings nor rotary wings), in water (e.g., a ship or a submarine), on ground (e.g., a motor vehicle, such as a car, truck, bus, van, motorcycle, bicycle; a movable structure or frame such as a stick, fishing pole; or a train), under the ground (e.g., a subway), in space (e.g., a spaceplane, a satellite, or a probe), or any combination of these environments.
  • the movable object can be a vehicle, such as a vehicle described elsewhere herein.
  • the movable object can be carried by a living subject, or take off from a living subject, such as a human or an animal.
  • Suitable animals can include avians, canines, felines, equines, bovines, ovines, porcines, delphines, rodents, or insects.
  • the movable object may be capable of moving freely within the environment with respect to six degrees of freedom (e.g., three degrees of freedom in translation and three degrees of freedom in rotation). Alternatively, the movement of the movable object can be constrained with respect to one or more degrees of freedom, such as by a predetermined path, track, or orientation.
  • the movement can be actuated by any suitable actuation mechanism, such as an engine or a motor.
  • the actuation mechanism of the movable object can be powered by any suitable energy source, such as electrical energy, magnetic energy, solar energy, wind energy, gravitational energy, chemical energy, nuclear energy, or any suitable combination thereof.
  • the movable object may be self-propelled via a propulsion system, as described elsewhere herein.
  • the propulsion system may optionally run on an energy source, such as electrical energy, magnetic energy, solar energy, wind energy, gravitational energy, chemical energy, nuclear energy, or any suitable combination thereof.
  • the movable object may be carried by a living being.
  • the movable object can be an aerial vehicle.
  • aerial vehicles may be fixed-wing aircraft (e.g., airplane, gliders), rotary-wing aircraft (e.g., helicopters, rotorcraft), aircraft having both fixed wings and rotary wings, or aircraft having neither (e.g., blimps, hot air balloons).
  • An aerial vehicle can be self-propelled, such as self-propelled through the air.
  • a self-propelled aerial vehicle can utilize a propulsion system, such as a propulsion system including one or more engines, motors, wheels, axles, magnets, rotors, propellers, blades, nozzles, or any suitable combination thereof.
  • the propulsion system can be used to enable the movable object to take off from a surface, land on a surface, maintain its current position and/or orientation (e.g., hover), change orientation, and/or change position.
  • the movable object can be controlled remotely by a user or controlled locally by an occupant within or on the movable object.
  • the movable object may be controlled remotely by an occupant within a separate vehicle.
  • the movable object is an unmanned movable object, such as a UAV.
  • An unmanned movable object, such as a UAV may not have an occupant on-board the movable object.
  • the movable object can be controlled by a human or an autonomous control system (e.g., a computer control system), or any suitable combination thereof.
  • the movable object can be an autonomous or semi-autonomous robot, such as a robot configured with an artificial intelligence.
  • the movable object can have any suitable size and/or dimensions.
  • the movable object may be of a size and/or dimensions to have a human occupant within or on the vehicle.
  • the movable object may be of size and/or dimensions smaller than that capable of having a human occupant within or on the vehicle.
  • the movable object may be of a size and/or dimensions suitable for being lifted or carried by a human.
  • the movable object may be larger than a size and/or dimensions suitable for being lifted or carried by a human.
  • the movable object may have a maximum dimension (e.g., length, width, height, diameter, diagonal) of less than or equal to about: 2 cm, 5 cm, 10 cm, 50 cm, 1 m, 2 m, 5 m, or 10 m.
  • the maximum dimension may be greater than or equal to about: 2 cm, 5 cm, 10 cm, 50 cm, 1 m, 2 m, 5 m, or 10 m.
  • the distance between shafts of opposite rotors of the movable object may be less than or equal to about: 2 cm, 5 cm, 10 cm, 50 cm, 1 m, 2 m, 5 m, or 10 m.
  • the distance between shafts of opposite rotors may be greater than or equal to about: 2 cm, 5 cm, 10 cm, 50 cm, 1 m, 2 m, 5 m, or 10 m.
  • the movable object may have a volume of less than 100 cm x 100 cm x 100 cm, less than 50 cm x 50 cm x 30 cm, or less than 5 cm x 5 cm x 3 cm.
  • the total volume of the movable object may be less than or equal to about: 1 cm³, 2 cm³, 5 cm³, 10 cm³, 20 cm³, 30 cm³, 40 cm³, 50 cm³, 60 cm³, 70 cm³, 80 cm³, 90 cm³, 100 cm³, 150 cm³, 200 cm³, 300 cm³, 500 cm³, 750 cm³, 1000 cm³, 5000 cm³, 10,000 cm³, 100,000 cm³, 1 m³, or 10 m³.
  • the total volume of the movable object may be greater than or equal to about: 1 cm³, 2 cm³, 5 cm³, 10 cm³, 20 cm³, 30 cm³, 40 cm³, 50 cm³, 60 cm³, 70 cm³, 80 cm³, 90 cm³, 100 cm³, 150 cm³, 200 cm³, 300 cm³, 500 cm³, 750 cm³, 1000 cm³, 5000 cm³, 10,000 cm³, 100,000 cm³, 1 m³, or 10 m³.
  • the movable object may have a footprint (which may refer to the lateral cross-sectional area encompassed by the movable object) less than or equal to about: 32,000 cm², 20,000 cm², 10,000 cm², 1,000 cm², 500 cm², 100 cm², 50 cm², 10 cm², or 5 cm².
  • the footprint may be greater than or equal to about: 32,000 cm², 20,000 cm², 10,000 cm², 1,000 cm², 500 cm², 100 cm², 50 cm², 10 cm², or 5 cm².
  • the movable object may weigh no more than 1000 kg.
  • the weight of the movable object may be less than or equal to about: 1000 kg, 750 kg, 500 kg, 200 kg, 150 kg, 100 kg, 80 kg, 70 kg, 60 kg, 50 kg, 45 kg, 40 kg, 35 kg, 30 kg, 25 kg, 20 kg, 15 kg, 12 kg, 10 kg, 9 kg, 8 kg, 7 kg, 6 kg, 5 kg, 4 kg, 3 kg, 2 kg, 1 kg, 0.5 kg, 0.1 kg, 0.05 kg, or 0.01 kg.
  • the weight may be greater than or equal to about: 1000 kg, 750 kg, 500 kg, 200 kg, 150 kg, 100 kg, 80 kg, 70 kg, 60 kg, 50 kg, 45 kg, 40 kg, 35 kg, 30 kg, 25 kg, 20 kg, 15 kg, 12 kg, 10 kg, 9 kg, 8 kg, 7 kg, 6 kg, 5 kg, 4 kg, 3 kg, 2 kg, 1 kg, 0.5 kg, 0.1 kg, 0.05 kg, or 0.01 kg.
  • a movable object may be small relative to a load carried by the movable object.
  • the load may include a payload and/or a carrier, as described in further detail elsewhere herein.
  • a ratio of a movable object weight to a load weight may be greater than, less than, or equal to about 1:1.
  • a ratio of a carrier weight to a load weight may be greater than, less than, or equal to about 1:1.
  • the ratio of a movable object weight to a load weight may be less than or equal to: 1:2, 1:3, 1:4, 1:5, 1:10, or even less.
  • the ratio of a movable object weight to a load weight can also be greater than or equal to: 2:1, 3:1, 4:1, 5:1, 10:1, or even greater.
  • the movable object may have low energy consumption.
  • the movable object may use less than about: 5 W/h, 4 W/h, 3 W/h, 2 W/h, 1 W/h, or less.
  • a carrier of the movable object may have low energy consumption.
  • the carrier may use less than about: 5 W/h, 4 W/h, 3 W/h, 2 W/h, 1 W/h, or less.
  • a payload of the movable object may have low energy consumption, such as less than about: 5 W/h, 4 W/h, 3 W/h, 2 W/h, 1 W/h, or less.
  • FIG. 12 illustrates an unmanned aerial vehicle (UAV) 1200, in accordance with embodiments of the present invention.
  • the UAV may be an example of a movable object as described herein.
  • the UAV 1200 can include a propulsion system having four rotors 1202, 1204, 1206, and 1208. Any number of rotors may be provided (e.g., one, two, three, four, five, six, or more).
  • the rotors, rotor assemblies, or other propulsion systems of the unmanned aerial vehicle may enable the unmanned aerial vehicle to hover/maintain position, change orientation, and/or change location.
  • the distance between shafts of opposite rotors can be any suitable length 1210.
  • the length 1210 can be less than or equal to 2 m, or less than or equal to 5 m. In some embodiments, the length 1210 can be within a range from 40 cm to 1 m, from 10 cm to 2 m, or from 5 cm to 5 m. Any description herein of a UAV may apply to a movable object, such as a movable object of a different type, and vice versa. The UAV may use an assisted takeoff system or method as described herein.
  • the movable object can be configured to carry a load.
  • the load can include one or more of passengers, cargo, equipment, instruments, and the like.
  • the load can be provided within a housing.
  • the housing may be separate from a housing of the movable object, or be part of a housing for a movable object.
  • the load can be provided with a housing while the movable object does not have a housing.
  • portions of the load or the entire load can be provided without a housing.
  • the load can be rigidly fixed relative to the movable object.
  • the load can be movable relative to the movable object (e.g., translatable or rotatable relative to the movable object).
  • the load can include a payload and/or a carrier, as described elsewhere herein.
  • the movement of the movable object, carrier, and payload relative to a fixed reference frame (e.g., the surrounding environment) and/or to each other, can be controlled by a terminal.
  • the terminal can be a remote control device at a location distant from the movable object, carrier, and/or payload.
  • the terminal can be disposed on or affixed to a support platform.
  • the terminal can be a handheld or wearable device.
  • the terminal can include a smartphone, tablet, laptop, computer, glasses, gloves, helmet, microphone, or suitable combinations thereof.
  • the terminal can include a user interface, such as a keyboard, mouse, joystick, touchscreen, or display. Any suitable user input can be used to interact with the terminal, such as manually entered commands, voice control, gesture control, or position control (e.g., via a movement, location or tilt of the terminal).
  • the terminal can be used to control any suitable state of the movable object, carrier, and/or payload.
  • the terminal can be used to control the position and/or orientation of the movable object, carrier, and/or payload relative to a fixed reference frame and/or to each other.
  • the terminal can be used to control individual elements of the movable object, carrier, and/or payload, such as the actuation assembly of the carrier, a sensor of the payload, or an emitter of the payload.
  • the terminal can include a wireless communication device adapted to communicate with one or more of the movable object, carrier, or payload.
  • the terminal can include a suitable display unit for viewing information of the movable object, carrier, and/or payload.
  • the terminal can be configured to display information of the movable object, carrier, and/or payload with respect to position, translational velocity, translational acceleration, orientation, angular velocity, angular acceleration, or any suitable combinations thereof.
  • the terminal can display information provided by the payload, such as data provided by a functional payload (e.g., images recorded by a camera or other image capturing device).
  • the same terminal may both control the movable object, carrier, and/or payload, or a state of the movable object, carrier and/or payload, as well as receive and/or display information from the movable object, carrier and/or payload.
  • a terminal may control the positioning of the payload relative to an environment, while displaying image data captured by the payload, or information about the position of the payload.
  • different terminals may be used for different functions. For example, a first terminal may control movement or a state of the movable object, carrier, and/or payload while a second terminal may receive and/or display information from the movable object, carrier, and/or payload.
  • a first terminal may be used to control the positioning of the payload relative to an environment while a second terminal displays image data captured by the payload.
  • Various communication modes may be utilized between a movable object and an integrated terminal that both controls the movable object and receives data, or between the movable object and multiple terminals that both control the movable object and receive data.
  • at least two different communication modes may be formed between the movable object and the terminal that both controls the movable object and receives data from the movable object.
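As a minimal sketch of the one-way/two-way communication and multiple-mode ideas above, the Python fragment below models a terminal and a movable object that exchange control data on one link and telemetry on another. The Link, Terminal, and MovableObject classes and the message formats are invented for the example and are not taken from the disclosure.

    from dataclasses import dataclass, field
    from queue import Queue

    @dataclass
    class Link:
        """One communication mode (e.g. a direct radio link or a network relay)."""
        name: str
        channel: Queue = field(default_factory=Queue)
        def send(self, message):
            self.channel.put(message)
        def receive(self):
            return self.channel.get() if not self.channel.empty() else None

    class Terminal:
        def __init__(self, uplink, downlink):
            self.uplink = uplink          # terminal -> movable object (control data)
            self.downlink = downlink      # movable object -> terminal (telemetry, images)
        def command(self, payload):
            self.uplink.send(payload)
        def poll_telemetry(self):
            return self.downlink.receive()

    class MovableObject:
        def __init__(self, uplink, downlink):
            self.uplink, self.downlink = uplink, downlink
        def step(self):
            cmd = self.uplink.receive()
            if cmd:
                print("executing:", cmd)
            self.downlink.send({"position": (22.54, 114.06, 30.0), "battery": 0.82})

    up, down = Link("control"), Link("telemetry")
    terminal, uav = Terminal(up, down), MovableObject(up, down)
    terminal.command({"type": "hover"})
    uav.step()
    print("telemetry:", terminal.poll_telemetry())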
  • FIG. 13 illustrates a movable object 1300 including a carrier 1302 and a payload 1304, in accordance with embodiments.
  • Although the movable object 1300 is depicted as an aircraft, this depiction is not intended to be limiting, and any suitable type of movable object can be used, as previously described herein.
  • the payload 1304 may be provided on the movable object 1300 without requiring the carrier 1302.
  • the movable object 1300 may include propulsion mechanisms 1306, a sensing system 1308, and a communication system 1310.
  • the propulsion mechanisms 1306 can include one or more of rotors, propellers, blades, engines, motors, wheels, axles, magnets, or nozzles, as previously described.
  • the movable object may have one or more, two or more, three or more, or four or more propulsion mechanisms.
  • the propulsion mechanisms may all be of the same type. Alternatively, one or more propulsion mechanisms can be different types of propulsion mechanisms.
  • the propulsion mechanisms 1306 can be mounted on the movable object 1300 using any suitable means, such as a support element (e.g., a drive shaft) as described elsewhere herein.
  • the propulsion mechanisms 1306 can be mounted on any suitable portion of the movable object 1300, such as on the top, bottom, front, back, sides, or suitable combinations thereof.
  • the propulsion mechanisms 1306 can enable the movable object 1300 to take off vertically from a surface or land vertically on a surface without requiring any horizontal movement of the movable object 1300 (e.g., without traveling down a runway).
  • the propulsion mechanisms 1306 can be operable to permit the movable object 1300 to hover in the air at a specified position and/or orientation.
  • One or more of the propulsion mechanisms 1306 may be controlled independently of the other propulsion mechanisms.
  • the propulsion mechanisms 1306 can be configured to be controlled simultaneously.
  • the movable object 1300 can have multiple horizontally oriented rotors that can provide lift and/or thrust to the movable object.
  • the multiple horizontally oriented rotors can be actuated to provide vertical takeoff, vertical landing, and hovering capabilities to the movable object 1300.
  • one or more of the horizontally oriented rotors may spin in a clockwise direction, while one or more of the horizontally oriented rotors may spin in a counterclockwise direction.
  • the number of clockwise rotors may be equal to the number of counterclockwise rotors.
  • the rotation rate of each of the horizontally oriented rotors can be varied independently in order to control the lift and/or thrust produced by each rotor, and thereby adjust the spatial disposition, velocity, and/or acceleration of the movable object 1300 (e.g., with respect to up to three degrees of translation and up to three degrees of rotation).
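Independent control of each rotor's speed to produce lift, roll, pitch, and yaw is conventionally captured by a motor-mixing rule; the sketch below shows one such rule for a plus-configuration quadrotor. The gains, sign conventions, and clamping range are illustrative assumptions rather than values from this description.

    def mix_quadrotor(thrust, roll, pitch, yaw):
        """
        Map collective thrust and roll/pitch/yaw commands to four rotor speeds
        for a plus-configuration quadrotor (front, right, back, left).
        Opposite rotors spin in the same direction and adjacent rotors spin
        oppositely, so differential speed produces roll, pitch, and yaw torques.
        """
        front = thrust + pitch - yaw
        back  = thrust - pitch - yaw
        right = thrust - roll + yaw
        left  = thrust + roll + yaw
        # Clamp to a normalized command range.
        return [max(0.0, min(1.0, m)) for m in (front, right, back, left)]

    # Hover with a small nose-down pitch command.
    print(mix_quadrotor(thrust=0.5, roll=0.0, pitch=0.05, yaw=0.0))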
  • the sensing system 1308 can include one or more sensors that may sense the spatial disposition, velocity, and/or acceleration of the movable object 1300 (e.g., with respect to up to three degrees of translation and up to three degrees of rotation).
  • the one or more sensors can include global positioning system (GPS) sensors, motion sensors, inertial sensors, proximity sensors, or image sensors.
  • the sensing data provided by the sensing system 1308 can be used to control the spatial disposition, velocity, and/or orientation of the movable object 1300 (e.g., using a suitable processing unit and/or control module, as described below).
  • the sensing system 1308 can be used to provide data regarding the environment surrounding the movable object, such as weather conditions, proximity to potential obstacles, location of geographical features, location of manmade structures, and the like.
  • the communication system 1310 enables communication with terminal 1312 having a communication system 1314 via wireless signals 1316.
  • the communication systems 1310, 1314 may include any number of transmitters, receivers, and/or transceivers suitable for wireless communication.
  • the communication may be one-way communication, such that data can be transmitted in only one direction.
  • one-way communication may involve only the movable object 1300 transmitting data to the terminal 1312, or vice-versa.
  • the data may be transmitted from one or more transmitters of the communication system 1310 to one or more receivers of the communication system 1314, or vice-versa.
  • the communication may be two-way communication, such that data can be transmitted in both directions between the movable object 1300 and the terminal 1312.
  • the two-way communication can involve transmitting data from one or more transmitters of the communication system 1310 to one or more receivers of the communication system 1314, and vice-versa.
  • the terminal 1312 can provide control data to one or more of the movable object 1300, carrier 1302, and payload 1304 and receive information from one or more of the movable object 1300, carrier 1302, and payload 1304 (e.g., position and/or motion information of the movable object, carrier or payload; data sensed by the payload such as image data captured by a payload camera).
  • control data from the terminal may include instructions for relative positions, movements, actuations, or controls of the movable object, carrier and/or payload.
  • control data may result in a modification of the location and/or orientation of the movable object (e.g., via control of the propulsion mechanisms 1306), or a movement of the payload with respect to the movable object (e.g., via control of the carrier 1302).
  • the control data from the terminal may result in control of the payload, such as control of the operation of a camera or other image capturing device (e.g., taking still or moving pictures, zooming in or out, turning on or off, switching imaging modes, changing image resolution, changing focus, changing depth of field, changing exposure time, changing viewing angle or field of view).
  • the communications from the movable object, carrier and/or payload may include information from one or more sensors (e.g., of the sensing system 1308 or of the payload 1304).
  • the communications may include sensed information from one or more different types of sensors (e.g., GPS sensors, motion sensors, inertial sensor, proximity sensors, or image sensors). Such information may pertain to the position (e.g., location, orientation), movement, or acceleration of the movable object, carrier and/or payload.
  • Such information from a payload may include data captured by the payload or a sensed state of the payload.
  • the control data transmitted by the terminal 1312 can be configured to control a state of one or more of the movable object 1300, carrier 1302, or payload 1304.
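The kinds of control data discussed above (flight of the movable object, movement of the payload via the carrier, and operation of a camera payload) can be pictured as a small command dispatcher. The message schema and handler names in this Python sketch are hypothetical and introduced only for illustration.

    def handle_flight(cmd):
        print("adjusting position/orientation via propulsion:", cmd)

    def handle_carrier(cmd):
        print("moving payload relative to movable object via carrier:", cmd)

    def handle_camera(cmd):
        print("camera operation:", cmd)   # e.g. zoom, capture, change imaging mode

    HANDLERS = {"flight": handle_flight, "carrier": handle_carrier, "camera": handle_camera}

    def dispatch(control_data):
        """Route one control message from the terminal to the right subsystem."""
        handler = HANDLERS.get(control_data.get("target"))
        if handler is None:
            raise ValueError(f"unknown control target: {control_data!r}")
        handler(control_data)

    dispatch({"target": "flight", "velocity": (1.0, 0.0, 0.0)})
    dispatch({"target": "carrier", "pitch_deg": -15})
    dispatch({"target": "camera", "action": "zoom_in", "factor": 2})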
  • the carrier 1302 and payload 1304 can also each include a communication module configured to communicate with terminal 1312, such that the terminal can communicate with and control each of the movable object 1300, carrier 1302, and payload 1304 independently.
  • the movable object 1300 can be configured to communicate with another remote device in addition to the terminal 1312, or instead of the terminal 1312.
  • the terminal 1312 may also be configured to communicate with another remote device as well as the movable object 1300.
  • the movable object 1300 and/or terminal 1312 may communicate with another movable object, or a carrier or payload of another movable object.
  • the remote device may be a second terminal or other computing device (e.g., computer, laptop, tablet, smartphone, or other mobile device).
  • the remote device can be configured to transmit data to the movable object 1300, receive data from the movable object 1300, transmit data to the terminal 1312, and/or receive data from the terminal 1312.
  • the remote device can be connected to the Internet or other telecommunications network, such that data received from the movable object 1300 and/or terminal 1312 can be uploaded to a website or server.
  • FIG. 14 is a schematic illustration by way of block diagram of a system 1400 for controlling a movable object, in accordance with embodiments.
  • the system 1400 can be used in combination with any suitable embodiment of the systems, devices, and methods disclosed herein.
  • the system 1400 can include a sensing module 1402, processing unit 1404, non-transitory computer readable medium 1406, control module 1408, and communication module 1410.
  • the sensing module 1402 can utilize different types of sensors that collect information relating to the movable objects in different ways. Different types of sensors may sense different types of signals or signals from different sources.
  • the sensors can include inertial sensors, GPS sensors, proximity sensors (e.g., lidar), or vision/image sensors (e.g., a camera).
  • the sensing module 1402 can be operatively coupled to a processing unit 1404 having a plurality of processors.
  • the sensing module can be operatively coupled to a transmission module 1412 (e.g., a Wi-Fi image transmission module) configured to directly transmit sensing data to a suitable external device or system.
  • the transmission module 1412 can be used to transmit images captured by a camera of the sensing module 1402 to a remote terminal.
  • the processing unit 1404 can have one or more processors, such as a programmable processor (e.g., a central processing unit (CPU)).
  • the processing unit 1404 can be operatively coupled to a non-transitory computer readable medium 1406.
  • the non-transitory computer readable medium 1406 can store logic, code, and/or program instructions executable by the processing unit 1404 for performing one or more steps.
  • the non-transitory computer readable medium can include one or more memory units (e.g., removable media or external storage such as an SD card or random access memory (RAM)).
  • data from the sensing module 1402 can be directly conveyed to and stored within the memory units of the non-transitory computer readable medium 1406.
  • the memory units of the non-transitory computer readable medium 1406 can store logic, code and/or program instructions executable by the processing unit 1404 to perform any suitable embodiment of the methods described herein.
  • the processing unit 1404 can be configured to execute instructions causing one or more processors of the processing unit 1404 to analyze sensing data produced by the sensing module.
  • the memory units can store sensing data from the sensing module to be processed by the processing unit 1404.
  • the memory units of the non-transitory computer readable medium 1406 can be used to store the processing results produced by the processing unit 1404.
  • the processing unit 1404 can be operatively coupled to a control module 1408 configured to control a state of the movable object.
  • the control module 1408 can be configured to control the propulsion mechanisms of the movable object to adjust the spatial disposition, velocity, and/or acceleration of the movable object with respect to six degrees of freedom.
  • the control module 1408 can control one or more of a state of a carrier, payload, or sensing module.
  • the processing unit 1404 can be operatively coupled to a communication module 1410 configured to transmit and/or receive data from one or more external devices (e.g., a terminal, display device, or other remote controller). Any suitable means of communication can be used, such as wired communication or wireless communication.
  • the communication module 1410 can utilize one or more of local area networks (LAN), wide area networks (WAN), infrared, radio, WiFi, point-to-point (P2P) networks, telecommunication networks, cloud communication, and the like.
  • relay stations such as towers, satellites, or mobile stations, can be used.
  • Wireless communications can be proximity dependent or proximity independent. In some embodiments, line-of-sight may or may not be required for communications.
  • the communication module 1410 can transmit and/or receive one or more of sensing data from the sensing module 1402, processing results produced by the processing unit 1404, predetermined control data, user commands from a terminal or remote controller, and the like.
  • the components of the system 1400 can be arranged in any suitable configuration.
  • one or more of the components of the system 1400 can be located on the movable object, carrier, payload, terminal, sensing system, or an additional external device in communication with one or more of the above.
  • Although FIG. 14 depicts a single processing unit 1404 and a single non-transitory computer readable medium 1406, one of skill in the art would appreciate that this is not intended to be limiting, and that the system 1400 can include a plurality of processing units and/or non-transitory computer readable media.
  • one or more of the plurality of processing units and/or non-transitory computer readable media can be situated at different locations, such as on the movable object, carrier, payload, terminal, sensing module, additional external device in communication with one or more of the above, or suitable combinations thereof, such that any suitable aspect of the processing and/or memory functions performed by the system 1400 can occur at one or more of the aforementioned locations.
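To make the block-diagram description of system 1400 concrete, the sketch below wires stand-in classes for the sensing module, the non-transitory computer readable medium, the control module, the communication module, and the processing unit into a single control step. The class boundaries follow the description above, but every method name and data format is an assumption made for illustration.

    class SensingModule:
        """Collects data from GPS, inertial, proximity, and image sensors."""
        def read(self):
            return {"gps": (22.54, 114.06), "imu": (0.0, 0.01, 9.8), "image": None}

    class Memory:
        """Stands in for the non-transitory computer readable medium."""
        def __init__(self):
            self.records = []
        def store(self, item):
            self.records.append(item)

    class ControlModule:
        """Adjusts spatial disposition, velocity, and acceleration via propulsion."""
        def apply(self, command):
            print("propulsion command:", command)

    class CommunicationModule:
        """Exchanges data with a terminal, display device, or remote controller."""
        def transmit(self, data):
            print("transmitting:", data)

    class ProcessingUnit:
        def __init__(self, sensing, memory, control, comms):
            self.sensing, self.memory, self.control, self.comms = sensing, memory, control, comms
        def step(self):
            data = self.sensing.read()        # sensing data conveyed to the processors
            self.memory.store(data)           # and stored on the readable medium
            command = {"hold_position": True} # placeholder analysis result
            self.control.apply(command)
            self.comms.transmit({"state": data["gps"], "command": command})

    ProcessingUnit(SensingModule(), Memory(), ControlModule(), CommunicationModule()).step()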

Abstract

Systems and methods are provided for guiding a target object with an unmanned aerial vehicle (UAV) in an environment. The UAV may be able to recognize and locate the target object. The UAV can be configured to communicate the actions and behavior of the target object to a user through a user device in communication with the UAV. The UAV can provide positive and negative stimuli to the target object to encourage an action or behavior. The UAV can be configured to recognize and manage waste generated by the target object.

Description

SYSTEMS AND METHODS FOR WALKING PETS
BACKGROUND OF THE INVENTION
Aerial vehicles such as unmanned aerial vehicles (UAVs) can travel along defined routes.
Individuals can have pets that require regular exercise and time outside of a home or building. Individuals may not have time or desire to take their pets outside. Oftentimes individuals may hire human pet walkers, which can be costly.
SUMMARY OF THE INVENTION
A need exists to provide a method of guiding a target object by a movable object, such as an unmanned aerial vehicle (UAV), along a predefined route, instantaneously defined route, or an undefined route within a designated area or region. A UAV can be configured to autonomously or semi-autonomously guide a pet or target object along a route in order to provide the target object with exercise or time outdoors. Provided herein are systems and methods of guiding a target object by a UAV. The systems and methods further provide the ability to provide a defined route or a defined region where a UAV can guide a target object. Communication can occur between a user and the UAV in response to the defined route and/or region. Further communication can occur related to actions or behaviors exhibited by the target object. The UAV can be configured to locate the target object and to recognize actions and behaviors exhibited by the target object. The target object may be an animal such as a pet owned by a user.
In an aspect of the invention, a method of guiding a target object comprises: receiving a user input, through a user device, that defines a target area, the target area comprising (1) a permissible area for the target object to travel, or (2) an impermissible area where the target object is not permitted to travel; receiving, from a movable object that guides the target object, a signal indicative of a location of the movable object; receiving an indicator of the movable object exiting the permissible area for the target object to travel or an indicator of the movable object entering the impermissible area where the target object is not permitted to travel, said indicator generated based on the location of the movable object and the target area; and generating a movable object operation, in response to the indicator.
In some cases the target object can be an animal. The movable object can be an unmanned aerial vehicle (UAV). The movable object operation can include controlling flight of the UAV to control movement of the target object. The movable object operation can include alerting the user that the UAV is exiting the permissible area or entering the impermissible area. The user input can comprise global coordinates that can define the permissible area or the impermissible area.
The user input can comprise an image or outline on a map defining the boundaries of the permissible area or the impermissible area. In some instances the method can comprise guiding the target object using the UAV, wherein the UAV can be physically attached to the target object. The UAV can be attached to the target object by a leash that is attached to a collar of the target object.
In some cases, the UAV can be a rotorcraft comprising a plurality of rotors that can permit the UAV to take off and/or land vertically. The UAV can comprise a location device that transmits information about the UAV’s location. The location device can be a GPS sensor. The indicator of exiting the permissible area can be received when the target object exits the permissible area. The indicator of exiting the permissible area can be received when the target object is within a predetermined threshold distance of a boundary of the permissible area and the target object is heading in the direction of the boundary. The target object can be heading in the direction of the boundary at a speed exceeding a threshold speed. The indicator of entering the impermissible area can be received when the target object enters the impermissible area. The indicator of entering the impermissible area can be received when the target object is within a predetermined threshold distance of a boundary of the impermissible area and the target object is heading in the direction of the boundary. The target object can be heading in the direction of the boundary at a speed exceeding a threshold speed. The movable object operation can include playing the user’s voice to the target object when the indicator of exiting the permissible area or entering the impermissible area is received. The method can further comprise transmitting the user’s voice from the user device to the UAV in real-time. The user’s voice can be a pre-recording. The movable object operation can include delivering an electric shock to the target object if the target object does not respond to the user’s voice within a predetermined period of time. The user interface can be a screen of the UAV and the alert can be provided visually. The user interface can be a speaker of the UAV and the alert can be provided audibly.
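A minimal sketch of the boundary logic in this aspect, assuming a circular permissible area and planar coordinates: the indicator fires when the tracked position is outside the area, or is within a threshold distance of the boundary while heading toward it faster than a threshold speed, and the resulting operation escalates from playing the user's voice to a corrective stimulus. The numeric thresholds and function names below are illustrative assumptions, not values from the disclosure.

    import math

    def boundary_indicator(position, velocity, center, radius,
                           threshold_dist=5.0, threshold_speed=1.0):
        """
        True if the tracked position is outside a circular permissible area, or is
        within threshold_dist of its boundary while moving outward faster than
        threshold_speed. Positions are planar (x, y) metres; velocities are m/s.
        """
        dx, dy = position[0] - center[0], position[1] - center[1]
        dist_from_center = math.hypot(dx, dy)
        if dist_from_center >= radius:
            return True                                    # already outside the area
        outward_speed = 0.0
        if dist_from_center > 0:
            outward_speed = (velocity[0] * dx + velocity[1] * dy) / dist_from_center
        near_boundary = radius - dist_from_center <= threshold_dist
        return near_boundary and outward_speed > threshold_speed

    def movable_object_operation(indicator, seconds_since_voice, responded,
                                 grace_period=10.0):
        """Escalate from playing the user's voice to a corrective stimulus."""
        if not indicator:
            return "continue"
        if seconds_since_voice is None:
            return "play_user_voice"
        if not responded and seconds_since_voice > grace_period:
            return "apply_corrective_stimulus"
        return "wait"

    # 48 m from the center of a 50 m permissible area, moving outward at 2 m/s.
    fired = boundary_indicator((48.0, 0.0), (2.0, 0.0), center=(0.0, 0.0), radius=50.0)
    print(fired, movable_object_operation(fired, None, False))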
In another aspect of the invention, a system for guiding a target object can comprise: one or more processors, individually or collectively, configured to: (a) receive a signal indicative of a user input that defines a target area, said target area comprising (1) a permissible area for the target object to travel, or (2) an impermissible area where the target object is not permitted to travel; (b) receive a signal indicative of a location of a movable object that guides the target object; and (c) determine, based on the target area and the signal indicative of the location of the movable object, when the movable object is exiting the permissible area for the target object to travel or when the movable object is entering the impermissible area where the target object is not permitted to travel; and (d) determine a movable object operation, in response to the determination of whether the movable object is exiting the permissible area for the target object to travel or entering the impermissible area where the target object is not permitted to travel.
In some cases, the target object can be an animal. The movable object can be an unmanned aerial vehicle (UAV). The movable object operation can include controlling flight of the UAV to control movement of the target object. The movable object operation can include alerting the user that the UAV is exiting the permissible area or entering the impermissible area. The UAV can be physically attached to the target object while the UAV is guiding the target object. The UAV can be attached to the target object by a leash that is attached to a collar of the target object.
The UAV can be a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically. The UAV can comprise a location device that transmits information of the UAV’s location. The location device can be a GPS sensor.
In some cases, the indicator of exiting the permissible area can be provided when the target object exits the permissible area. The indicator of exiting the permissible area can be provided when the target object is within a predetermined threshold distance of a boundary of the permissible area and the target object is heading in the direction of the boundary. The target object can be heading in the direction of the boundary at a speed exceeding a threshold speed. The one or more processors can be configured to determine the UAV is entering the impermissible area when the target object enters the impermissible area. The one or more processors can be configured to determine that the UAV is entering the impermissible area when the target object is within a predetermined threshold distance of a boundary of the impermissible area and the target object is heading in the direction of the boundary. The one or more processors can be configured to determine that the target object is heading in the direction of the boundary at a speed exceeding a threshold speed.
The movable object operation can include playing the user’s voice to the target object when the indicator of exiting the permissible area or entering the impermissible area is received, and the one or more processors are configured to effect the movable object operation. The user’s voice can be transmitted from the user device to the UAV in real-time. The user’s voice can be a pre-recording. The movable object operation can include delivering an electric shock to the target object if the target object does not respond to the user’s voice within a predetermined period of time. The user interface can be a screen of the UAV and the alert can be provided visually. The user interface can be a speaker of the UAV and the alert can be provided audibly.
In an aspect of the invention, a method of guiding a target object using a movable object can comprise: recognizing the target object wearing a collar, with aid of one or more vision sensors on board the movable object; automatically attaching, without human aid, the movable object to the collar of the target object using a leash when the target object is recognized; and flying the movable object while the target object is attached to the movable object via the leash.
The target object can be an animal. The movable object can be an unmanned aerial vehicle (UAV), and the UAV can be flying while the target object is in locomotion. The leash can be formed of a flexible or bendable material. The method can further comprise extending or retracting the leash while the UAV is in flight. The leash can attach to the collar of the target object using one or more magnetic connections. The leash can attach to the collar of the target object with aid of a robotic arm. The robotic arm can comprise one or more extensions that guide the leash to the collar.
The method can further comprise capturing, using the one or more vision sensors, at least one image of the target object wearing the collar. The method can further comprise recognizing, with aid of one or more processors, the target object from the image of the target object. The movable object can further comprise one or more processors configured to recognize the target object from the image of the collar.
The movable object can be a UAV, and the method can further comprise flying the UAV, subsequent to recognizing the target object, to a closer proximity of the target object in order to get into position to automatically attach the UAV to the collar of the target object. Flying the movable object can include guiding the target object by pulling on the leash. The method can further comprise comparing a calculation of the target object motion and the movable object motion to determine one or more parameters with which the movable object pulls on the leash.
The method can further comprise collecting, using the movable object, an image of the target object while the target object is in locomotion and is attached to the movable object via the leash. The method can further comprise displaying, on a map, the location of the movable object to the user. The method can further comprise playing the user’s voice to the target object while the target object is in locomotion and is attached to the movable object via a leash. The user’s voice can be transmitted from the user device to the movable object in real-time. The user’s voice can be a pre-recording. The user’s voice can be speaking a command to the target object.
In an aspect of the invention, a UAV can be configured to guide a target object, the UAV can comprise: one or more vision sensors configured to capture an image of the target object wearing a collar; one or more processors configured to, individually or collectively, recognize the target object from the image of the target object wearing the collar; a leash attachment mechanism configured to automatically attach, without human aid, a leash to the collar of the target object when the target object is recognized; and one or more propulsion units configured to permit flight of the UAV while the target object is attached to the UAV via the leash.
The target object can be an animal. The UAV can be flying while the target object is in locomotion. The leash can be formed of a flexible or bendable material. The leash can be extendible or retractable while the UAV is in flight. The leash can be configured to attach to the collar of the target object using one or more magnetic connections. The leash can be configured to attach to the collar of the target object with aid of a robotic arm. The robotic arm can comprise one or more extensions that guide the leash to the collar. The one or more vision sensors can be configured to capture at least one image of the target object wearing the collar. The UAV can further comprise one or more processors configured to recognize the target object from the image of the target object. The UAV can further comprise one or more processors configured to recognize the target object from the image of the collar. The one or more processors can be configured to, subsequent to recognizing the target object, generate a signal to the one or more propulsion units to effect flight of the UAV to a closer proximity of the target object in order to get into position to automatically attach the UAV to the collar of the target object.
The UAV can be configured to guide the target object by pulling on the leash. The one or more processors can be configured to compare a calculation of the target object motion and the UAV motion to determine one or more parameters with which the UAV pulls on the leash. The one or more vision sensors can be configured to collect an image of the target object while the target object is in locomotion and is attached to the UAV via the leash. The one or more vision sensors can be configured to collect an image of the collar of the target object. The UAV can further comprise one or more speakers configured to play the user’s voice to the target object while the target object is in locomotion and is attached to the UAV via the leash. The user’s voice can be transmitted from the user device to the UAV in real-time. The user’s voice can be a pre-recording. The user’s voice can be speaking a command to the target object.
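One way to read the comparison of target-object motion and UAV motion used to set the leash-pull parameters is as a relative-motion rule that outputs a pull direction and magnitude. The proportional rule, gains, and separation limit in the Python sketch below are assumptions for illustration, not the disclosed method.

    import math

    def leash_pull(uav_pos, uav_vel, target_pos, target_vel,
                   max_separation=3.0, gain=0.5, max_pull=1.0):
        """
        Compare UAV and target-object motion; if the target is lagging or drifting,
        return a (direction, magnitude) pair describing how the UAV pulls the leash.
        Inputs are planar (x, y) positions in metres and velocities in m/s.
        """
        sep_x, sep_y = uav_pos[0] - target_pos[0], uav_pos[1] - target_pos[1]
        separation = math.hypot(sep_x, sep_y)
        rel_speed = math.hypot(uav_vel[0] - target_vel[0], uav_vel[1] - target_vel[1])
        if separation <= max_separation and rel_speed < 0.5:
            return (0.0, 0.0), 0.0                  # no pull needed
        direction = (sep_x / separation, sep_y / separation) if separation else (0.0, 0.0)
        magnitude = min(max_pull, gain * (separation - max_separation) + 0.1 * rel_speed)
        return direction, max(0.0, magnitude)

    # Target object lagging 5 m behind a UAV cruising forward at 2 m/s.
    print(leash_pull((5.0, 0.0), (2.0, 0.0), (0.0, 0.0), (0.5, 0.0)))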
In another aspect of the invention, a method of guiding a target object using a UAV can comprise: recognizing the target object, with aid of one or more vision sensors on board the UAV; automatically displaying, without human aid or intervention, an attractor to the target object when the target object is recognized; and flying the UAV while the target object is in locomotion and following the attractor.
The target object can be an animal. The attractor can be an edible treat. The method can further comprise emitting, using the attractor, a selected scent. The UAV can display the attractor by dangling the attractor at or near a head level of the target object. The attractor can comprise an image that is displayed on a screen carried by the UAV. The image can be a static image. The image can be an image of an owner of the target object. The image can be a video. The image can be a video of the owner of the target object.
The method can further comprise determining, using the one or more vision sensors, a location of the target object relative to the UAV and adjusting or maintaining the speed of the UAV flight to remain within a proximity of the target object that is sufficiently close for the target object to perceive the attractor. The method can further comprise determining, using the one or more vision sensors, a trajectory of the locomotion of the target object relative to the UAV and adjusting or maintaining the direction of the UAV flight to remain within a proximity of the target object that is sufficiently close for the target object to perceive the attractor. The method can further comprise capturing at least one image of the target object using the one or more vision sensors.
The UAV can further comprise one or more processors configured to recognize the target object from the image of the target object. The target object can be wearing a collar. The UAV can further comprise one or more processors configured to recognize the target object from the image of the collar.
The method can further comprise playing the user’s voice to the target object while the target object is in locomotion and is attached to the UAV via the leash. The user’s voice can be transmitted from the user device to the UAV in real-time. The user’s voice can be a pre-recording. The user’s voice can be saying a command to the target object.
In another aspect of the invention, a UAV configured to guide a target object can comprise: one or more vision sensors configured to capture an image of the target object wearing a collar; one or more processors configured to, individually or collectively, recognize the target object from the image of the target object; an attractor display mechanism configured to display, without human aid or intervention, an attractor to the target object when the target object is recognized; and one or more propulsion units configured to permit flight of the UAV while the attractor is displayed to the target object.
The target object can be an animal. The attractor can be an edible treat. The attractor can emit a selected scent. The UAV can display the attractor by dangling the attractor at or near a head level of the target object. The attractor can comprise an image that is displayed on a screen carried by the UAV. The image can be a static image. The image can be an image of an owner of the target object. The image can be a video. The image can be a video of the owner of the target object.
The UAV can be further configured to determine, using the one or more vision sensors, a location of the target object relative to the UAV and adjust or maintain the speed of the UAV flight to remain within a proximity of the target object that is sufficiently close for the target object to perceive the attractor. The UAV can be further configured to determine, using the one or more vision sensors, a trajectory of the locomotion of the target object relative to the UAV and adjust or maintain the direction of the UAV flight to remain within a proximity of the target object that is sufficiently close for the target object to perceive the attractor. The one or more vision sensors can capture at least one image of the target object. The UAV can further comprise one or more processors configured to recognize the target object from the image of the target object.
The target object can be wearing a collar. The UAV can further comprise a speaker configured to play the user’s voice to the target object while the target object is in locomotion and is attached to the UAV via the leash. The user’s voice can be transmitted from the user device to the UAV in real-time. The user’s voice can be a pre-recording. The user’s voice can be saying a command to the target object.
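The attractor-following behavior (adjust or maintain the UAV's speed and direction so it stays close enough for the target object to perceive the attractor) can be sketched as a simple proportional follower driven by the vision-based relative position. The perception range, gain, and speed limits below are assumed values for illustration only.

    import math

    def follow_adjustment(uav_pos, target_pos, uav_speed,
                          perception_range=2.0, speed_gain=0.3,
                          min_speed=0.0, max_speed=3.0):
        """
        Given the target object's position relative to the UAV (e.g. estimated from
        the on-board vision sensors), return an updated flight speed and heading
        that keep the UAV within the range at which the attractor can be perceived.
        """
        dx, dy = target_pos[0] - uav_pos[0], target_pos[1] - uav_pos[1]
        distance = math.hypot(dx, dy)
        heading = math.atan2(dy, dx)                 # point the flight path at the target
        error = distance - perception_range          # positive when too far away
        new_speed = max(min_speed, min(max_speed, uav_speed + speed_gain * error))
        return new_speed, heading

    # The target object is 4 m away, beyond the assumed 2 m perception range,
    # so the UAV turns toward it and increases speed slightly to close the gap.
    print(follow_adjustment(uav_pos=(10.0, 0.0), target_pos=(6.0, 0.0), uav_speed=2.0))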
In another aspect of the invention, a method of guiding a target object may be provided. The method may comprise: providing a UAV that guides the target object, wherein a location of the UAV is known; recognizing the target object, with aid of one or more vision sensors on board the UAV; recognizing waste generated by the target object, with aid of the one or more vision sensors on board the UAV; and alerting the user that the waste has been generated by the target object.
The target object can be an animal. The animal can be a dog or a cat. The method can further comprise providing information to the user about a location where the waste was generated. The UAV can further comprise one or more processors configured to recognize the waste from the image of the waste. The user can be alerted through a user device. The user can be alerted through a user device comprising a display. The user device can be a smartphone, tablet, or a personal computer. The user device can display a map showing the location of where the waste was generated. The user device can display an image of the waste generated by the target object. The UAV can guide the target object by being physically attached to the target object. The UAV can be attached to the target object by a leash that is attached to the collar of the target object. The UAV can guide the target object by displaying an attractor to the target object. The attractor can be an edible treat. The user can be a target object waste removal professional. The UAV can comprise a location device that transmits information about the UAV’s location. The location device can be a GPS sensor.
In another aspect of the invention, a UAV configured to guide a target object can comprise: one or more vision sensors configured to capture an image of the target object and waste generated by the target object; one or more processors configured to, individually or collectively, (1) recognize the target object from the image of the target object, and (2) recognize the waste generated by the target object from the image of the waste generated by the target object; a communication unit configured to send a signal to a user device that alerts the user that the waste has been generated by the target object; and one or more propulsion units configured to permit flight of the UAV while guiding the target object.
The target object can be an animal. The animal can be a dog or a cat. The UAV can be further configured to provide information to the user about a location where the waste was generated. The user device can comprise a display. The user device can be a smartphone, tablet, or personal computer. The user device can be configured to display a map showing the location of where the waste was generated. The user device can be configured to display an image of the waste generated by the target object.
The UAV can be configured to guide the target object by being physically attached to the target object. The UAV can be attached to the target object by a leash that is attached to the collar of the target object. The UAV can be configured to guide the target object by displaying an attractor to the target object. The attractor can be an edible treat. The user can be a target object waste removal professional.
The UAV can be a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically. The UAV can comprise a location device that transmits information about the UAV’s location. The location device can be a GPS sensor.
In another aspect of the invention, a method of guiding a target object can comprise: providing a UAV that guides the target object, wherein a location of the UAV is known; recognizing the target object, with aid of one or more vision sensors on board the UAV; recognizing waste generated by the target object, with aid of the one or more vision sensors on board the UAV; and removing the waste in response to recognizing the waste, using the UAV.
The target object can be an animal. The animal can be a dog or a cat. The UAV can be further configured to provide information to the user about a location where the waste was generated. The user device can comprise a display. The user device can be a smartphone, tablet, or personal computer. The user device can be configured to display a map showing the location of where the waste was generated. The user device can be configured to display an image of the waste generated by the target object.
The UAV can be configured to guide the target object by being physically attached to the target object. The UAV can be attached to the target object by a leash that is attached to a collar of the target object. The UAV can be further configured to guide the target object by displaying an attractor to the target object. The attractor can be an edible treat. The user can be a target object waste removal professional.
The UAV can be a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically. The UAV can comprise a location device that transmits information about the UAV’s location. The location device can be a GPS sensor.
The method can further comprise removing the waste with a mechanical arm.
In another aspect of the invention, a UAV can be configured to guide a target object, the UAV can comprise: one or more vision sensors configured to capture an image of the target object and waste generated by the target object; one or more processors configured to, individually or collectively, (1) recognize the target object from the image of the target object, and (2) recognize the waste generated by the target object from the image of the waste generated by the target object; one or more waste removal units, configured to remove the waste in response to the recognition of the waste; and one or more propulsion units configured to permit flight of the UAV while guiding the target object.
The target object can be an animal. The animal can be a dog or a cat. The UAV can be further configured to provide information to the user about a location where the waste was generated. The UAV can further comprise one or more processors configured to recognize the waste from the image of the waste. The UAV can guide the target object by being physically attached to the target object. The UAV can be attached to a leash that is attached to a collar of the target object. The UAV can guide the target object by displaying an attractor to the target object.
The attractor can be an edible treat. The UAV can be a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically. The UAV can comprise a location device that transmits information about the UAV’s location. The location device can be a GPS sensor. The one or more waste removal units can include a mechanical arm that extends from the UAV to remove the waste.
In another aspect of the invention, a method of guiding a target object can comprise: receiving a user input, through a user device, defining a travel route for a UAV to guide the target object; guiding the target object using the UAV by flying the UAV along the travel route while the target object is in locomotion, wherein a location of the UAV is known; receiving, through the user device while the UAV is guiding the target object along the travel route, a change to the travel route to provide an updated travel route; and flying the UAV along the updated travel route.
The user input can comprise global coordinates that define the travel route. The user input can comprise global coordinates that define the updated travel route. The user input can comprise an image or line on a map defining the travel route. The user input can comprise an image or line on a map defining the updated travel route.
The UAV can guide the target object by being physically attached to the target object. The UAV can be attached to a leash that is attached to a collar of the target object. The target object can be an animal. The animal can be a dog or a cat. The UAV can be a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically. The UAV can comprise a location device that transmits information about the UAV’s location. The location device can be a GPS sensor.
The method can further comprise capturing, with aid of one or more vision sensors on board the UAV, an image of the target object. The method can further comprise detecting, with aid of one or more processors, when the target object is deviating from the travel route or the updated travel route based on the image of the target object. The method can further comprise playing the user’s voice to the target object when the target object is deviating from the travel route or the updated travel route. The user’s voice can be transmitted from the user device to the UAV in real-time. The user’s voice can be a pre-recording. The method can further comprise delivering an electric shock to the target object when the target object deviates from the travel route beyond a predetermined distance.
Other objects and features of the present invention will become apparent by a review of the specification, claims, and appended figures.
INCORPORATION BY REFERENCE
All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.
BRIEF DESCRIPTION OF THE DRAWINGS
The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:
FIG. 1 shows an example of a system comprising a user, an unmanned aerial vehicle (UAV), and a target object where a UAV is configured to guide the target object while in communication with the user.
FIG. 2 shows a map that can be used to designate areas that are permissible for travel of the target object or impermissible for travel of the target object.
FIG. 3 shows an example of how a user can define a permissible or impermissible area for a target object to travel on a user interface.
FIG. 4 shows a boundary and a threshold surrounding the boundary that can be approached and/or crossed by a target object.
FIG. 5 shows an example of travel routes that the UAV can guide the target object on.
FIG. 6 shows a target object wearing a collar that can be recognized by a UAV.
FIG. 7 shows a UAV guiding a target object while physically connected to the target object.
FIG. 8 shows a UAV displaying audio and/or visual stimuli from a user to a target object.
FIG. 9 shows a UAV guiding a target object without a physical connection to the target object.
FIG. 10 shows a UAV recognizing waste generated by a target object.
FIG. 11 shows a process in which a UAV may alert a user of the occurrence and location of waste generated by a target object.
FIG. 12 illustrates an unmanned aerial vehicle, in accordance with an embodiment of the invention.
FIG. 13 illustrates a movable object including a carrier and a payload, in accordance with an embodiment of the invention.
FIG. 14 is a schematic illustration by way of block diagram of a system for controlling a movable object, in accordance with an embodiment of the invention.
DETAILED DESCRIPTION OF THE INVENTION
The systems, devices, and methods of the present invention provide mechanisms for guiding a target object by an unmanned aerial vehicle (UAV) along a predefined route, an instantaneously defined route, or an undefined route within a designated area or region. The systems, devices, and methods of the present invention further provide responses to recognized actions and/or behaviors of the target object. Description of the UAV may be applied to any other type of unmanned vehicle, or any other type of movable object.
A UAV can be provided to guide a target object. A user can provide instructions to the UAV to guide the target object through a device that is in communication with the UAV. The device may be directly in communication with the UAV or may communicate with the UAV over a network. The user can provide the instructions before the UAV guides the target object or while the UAV is guiding the target object in real time. In some cases the UAV can serve as an interface to broadcast a visual and/or audio stream or recording of the user to the target object.
The UAV can be configured to remain within a specified distance from the target object. In some cases, the target object can be attached to the UAV through a physical attachment mechanism (e.g. a leash). The UAV may exert force on the physical attachment mechanism to aid in guiding the target object. The UAV can comprise one or more vision sensors. The vision sensors can be in communication with a processor that is configured to recognize an image of the target object. The UAV can remain within a specified distance of the target object without being physically attached to the target object using the vision sensors and the one or more processors. In some cases, the UAV can provide an attractor to the target object when the target object refuses to follow or remain within a specified distance from the UAV.
The UAV can be configured to lead or direct a target object or being. In some cases a target object can be one or more animals. A target object can be a pet. A pet can be, for example, a dog, cat, lizard, horse, rabbit, ferret, pig, or any rodent that may be kept as a pet by a user. The pet may be a mammal. In some cases, the pet may be a reptile. The pet may be a land bound pet that may traverse a surface. The pet may optionally be capable of being airborne (e.g. a bird). The UAV can lead a target object along a pre-defined path, along an undefined path in a pre-defined area, or anywhere in accordance with certain travel parameters (e.g., length of route, amount of time, remaining outside of impermissible areas).
The UAV can receive instructions regarding the pre-defined path or area from one or more processors. The processors can be on-board or off-board the UAV. For instance, the one or more processors may be on an external device such as a server, user device, or may be provided on a cloud computing infrastructure. The processors can additionally be in communication with at least one user through a communication interface. A user can provide parameters to define a path or geographic region for the UAV to direct a target object along or within, respectively. A UAV can have a vision sensor. The vision sensor can be configured to recognize the target object. The UAV can continuously monitor the location of the target object. In some cases, the vision sensor can be configured to recognize an item attached to the target object, for example, a collar or harness. The UAV can be configured to maintain a fixed distance from the target object. In some cases the target object can be attached or tethered to the UAV, for example by a leash. The leash can be a flexible object that attaches on one end to the UAV and on the other end to the target object.
A processor can be in communication with one or more locating sensors on-board a UAV. A locating sensor can determine the position of a UAV in a relative or global coordinate system. A global coordinate system may be an absolute coordinate system. In an example a global coordinate system can define the location of the UAV using longitude and latitude. A relative coordinate system can determine the distance or location of a UAV from a reference point or landmark. A relative coordinate system can be derived from a measurement of movement of a UAV from a known starting point or movement of a UAV in a known area. A locating sensor configured to determine the absolute location of a UAV can be a GPS sensor. One or more locating sensors can be used to determine the relative location of a UAV. For example, relative angular velocity can be provided by a gyroscope; relative translational acceleration can be provided by an accelerometer; relative attitude information can be provided by a vision sensor; relative distance information can be provided by an ultrasonic sensor, lidar, or time-of-flight camera. The relative and/or global location of the UAV can be communicated to the processor. The processor can inform a user through a user interface of the local or global position of the UAV. The global or local location of the UAV can correspond to the global or local location of the target object that may be in proximity of or tethered to the UAV.
The systems and methods herein may permit a UAV to aid in taking a target object out on a walk without requiring significant human intervention. For instance, a human may remain at home while the UAV guides the target object. The human may be able to monitor the situation in real-time and intervene if needed. The human may intervene remotely by communicating with the target object through the UAV, or may be informed of a location so the human can intervene in person if necessary.
FIG. 1 shows an example of a target object guidance system 100 including a user 101, one or more processors 102, and a UAV 103 guiding or leading a target object 104. The UAV 103 can guide the target object 104 along a pre-defined or an undefined path. The UAV 103 can guide the target object 104 for a specified duration of time. In some cases the target object can follow the UAV along a route. Alternatively the target object 104 can wander in a region while the UAV 103 follows the target object. In instances where the UAV 103 follows the target object 104 the UAV can prevent the target object from wandering into an impermissible region or out of a permissible region. The UAV may or may not exert force on the target object while the target object is moving around.
The user 101 can be in a first location 105. A first location may be a house, yard, room, building, vehicle, or another space or area. A user 101 can communicate with one or more processors 102 through a user interface on an electronic device 106. In an example, a user interface can be on an electronic display such as a desktop computer, laptop computer, smart phone, smart watch, smart glasses, tablet, or another device configured to communicate with the one or more processors. The electronic device 106 may or may not be a mobile device. The electronic device may or may not be a remote terminal capable of manually controlling flight of the UAV. The electronic device can be in communication with the UAV directly through a wired or wireless connection 107. The electronic device can further be in communication with a processor 102 through a wired or wireless connection 108, and the processor 102 can additionally be in communication with the UAV through a wired or wireless connection 109. Alternatively, the processor may be on-board the electronic device 106 and/or the UAV 103. For instance, the UAV can have one or more on-board processors. The one or more on-board processors can communicate with an external processor 102 and/or an electronic device 106 with a user interface. The on-board processors may perform any functions of processors 102 described herein. Alternatively, a UAV can communicate directly with the electronic device rather than through an intermediate device or processor.
The UAV can comprise a vision sensor 111. In an example a vision sensor 111 can be a camera. The vision sensor 111 can be enclosed in the body of the UAV or carried by the UAV as an external payload. In a case in which the vision sensor 111 is carried externally as a payload the UAV can orient the vision sensor below the body of the UAV. The vision sensor can be attached to the UAV by one or more attachments, such as a carrier 112. The carrier 112 can be configured such that the vision sensor can rotate and/or tilt independently of the UAV. The carrier may permit the vision sensor to translate and/or rotate in three-dimensions. For example, in Cartesian coordinates the carrier can permit translation and/or rotation of the vision sensor independently of the movement of the UAV about an x, y, or z axis. The vision sensor (e.g., camera) may be able to rotate about a pitch, roll, and/or yaw axis with respect to the UAV and/or a fixed reference frame. Similar rotation and translation can be achieved in any other three-dimensional coordinate system (e.g. spherical coordinates). In some cases the carrier may permit rotation and/or translation of the vision sensor about only one or about only two axes.
A target object 104 can optionally have a wearable identifier 113. For example, the target object may have a collar. The UAV vision sensor can detect a visual pattern on the wearable identifier in order to locate the target object. In some cases the target object 104 can be tethered to the UAV 103 by a physical connection 114. A physical connection 114 can be a flexible connector of a given length that is connected on one end to the target object 104 and on another end to the UAV 103. The physical connection may or may not expand or contract (thus being able to vary its length). The physical connection may have a limited maximum length (e.g., less than or equal to about 20 m, 15 m, 10 m, 7 m, 5 m, 4 m, 3 m, 2 m, or 1 m).
A user 101 can define a route or region in which the UAV 103 can guide or lead the target object 104. A user can define the route or region through a user interface on an electronic device 106 or any other device that may or may not be in communication with the UAV. A user can generate a defined area in which the user would like the target object to be led by the UAV 103. In some cases, a user 101 can define a specified route along which the user 101 would like the UAV 103 to guide the target object 104. In other cases the user 101 can instruct the UAV 103 to guide the target object 104 within a geographic area. In some instances, the user may define or choose a defined area where the UAV is not to guide the target object. When guiding the target object in a defined geographic area, the UAV 103 can be provided with an additional instruction from the user 101 to further constrain an act of guiding the target object 104 in the geographic area. In an example, the additional instruction can be a duration of total time, end time, total cumulative distance, pace, or performance of an event or task by the target object 104. A duration of total time may include the total amount of time to guide the target object (e.g., length of walk, such as a 30 minute walk). As the target object’s pace may vary, the route or action of the UAV guiding the target object may be altered to comply with the duration of total time. In another example, the end time may be preset (e.g., finish guiding the target object and return home by 2:00 pm). Similarly, as the target object’s pace may vary, the route or action of the UAV guiding the target object may be altered to comply with the end time (e.g., if the target object is moving slowly, a shortcut may be taken to get the target object home on time). A total cumulative distance may enable a user to define the distance to be traveled by the target object (e.g., a user may specify a 1 mile walk). The user may optionally set a pace for the guidance (e.g., have the target object move at a rate of at least 4 miles/hour). A user may set an event or task to be completed by the target object and monitored by the UAV (e.g., walk uphill, walk downhill, sprints, fetch an object, etc.). In some instances, the additional instructions may include impermissible areas to keep the target object away from.
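As a non-limiting illustration, the guidance constraints described above (total duration, end time, cumulative distance, pace, tasks, and impermissible areas) could be collected into a single parameter set that the user device sends to the UAV or to the one or more processors. The sketch below is a minimal example in Python; all field names and values are assumptions introduced for illustration rather than part of any particular implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class GuidanceParameters:
    """Hypothetical container for user-specified guidance constraints."""
    total_duration_min: Optional[float] = None   # e.g., a 30 minute walk
    end_time: Optional[datetime] = None          # e.g., return home by 2:00 pm
    total_distance_km: Optional[float] = None    # e.g., a 1 mile (~1.6 km) walk
    min_pace_kmh: Optional[float] = None         # e.g., at least 4 miles/hour (~6.4 km/h)
    tasks: List[str] = field(default_factory=list)  # e.g., ["uphill", "fetch"]
    # Each impermissible area is a polygon of (latitude, longitude) vertices.
    impermissible_areas: List[List[Tuple[float, float]]] = field(default_factory=list)

# Example: a 30 minute walk at a minimum pace, ending by 2:00 pm local time.
params = GuidanceParameters(total_duration_min=30,
                            end_time=datetime(2014, 10, 31, 14, 0),
                            min_pace_kmh=6.4)
print(params)
```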
The UAV 103 can have one or more sensors. The UAV may comprise one or more vision sensors such as an image sensor. For example, an image sensor may be a monocular camera, stereo vision camera, radar, sonar, or an infrared camera. The UAV may further comprise other sensors that may be used to determine a location of the UAV, such as global positioning system (GPS) sensors, inertial sensors which may be used as part of or separately from an inertial measurement unit (IMU) (e.g., accelerometers, gyroscopes, magnetometers), lidar, ultrasonic sensors, acoustic sensors, WiFi sensors.
The UAV can have sensors on-board the UAV that collect information directly from an environment without contacting an additional component off-board the UAV for additional information or processing. For example, a sensor that collects data directly in an environment can be a vision or audio sensor. Alternatively, the UAV can have sensors that are on-board the UAV but contact one or more components off-board the UAV to collect data about an environment. For example, a sensor that contacts a component off-board the UAV to collect data about an environment may be a GPS sensor or another sensor that relies on connection to another device, such as a satellite, tower, router, server, or other external device.
Various examples of sensors may include, but are not limited to, location sensors (e.g., global positioning system (GPS) sensors, mobile device transmitters enabling location triangulation), vision sensors (e.g., imaging devices capable of detecting visible, infrared, or ultraviolet light, such as cameras), proximity or range sensors (e.g., ultrasonic sensors, lidar, time-of-flight or depth cameras), inertial sensors (e.g., accelerometers, gyroscopes, inertial measurement units (IMUs)), altitude sensors, attitude sensors (e.g., compasses), pressure sensors (e.g., barometers), audio sensors (e.g., microphones), or field sensors (e.g., magnetometers, electromagnetic sensors). Any suitable number and combination of sensors can be used, such as one, two, three, four, five, or more sensors.
Optionally, the data can be received from sensors of different types (e.g., two, three, four, five, or more types). Sensors of different types may measure different types of signals or information (e.g., position, orientation, velocity, acceleration, proximity, pressure, etc.) and/or utilize different types of measurement techniques to obtain data. For instance, the sensors may include any suitable combination of active sensors (e.g., sensors that generate and measure energy from their own energy source) and passive sensors (e.g., sensors that detect available energy). As another example, some sensors may generate absolute measurement data that is provided in terms of a global coordinate system (e.g., position data provided by a GPS sensor, attitude data provided by a compass or magnetometer), while other sensors may generate relative measurement data that is provided in terms of a local coordinate system (e.g., relative angular velocity provided by a gyroscope; relative translational acceleration provided by an accelerometer; relative attitude information provided by a vision sensor; relative distance information provided by an ultrasonic sensor, lidar, or time-of-flight camera). The sensors onboard or off board the UAV may collect information such as location of the UAV, location of other objects, orientation of the UAV, or environmental information. A single sensor may be able to collect a complete set of information in an environment or a group of sensors may work together to collect a complete set of information in an environment. Sensors may be used for mapping of a location, navigation between locations, detection of obstacles, or detection of a target. Sensors may be used for surveillance of an environment or a subject of interest. Sensors may be used to recognize a target object, such as an animal. The target object may be distinguished from other objects in the environment. Sensors may be used to recognize an object worn or carried by the target object. The worn or carried object may be distinguished from other objects in the environment.
Any description herein of a UAV may apply to any type of movable object. The description of a UAV may apply to any type of unmanned movable object (e.g., which may traverse the air, land, water, or space). The UAV may be capable of responding to commands from a remote controller. The remote controller need not be physically connected to the UAV; the remote controller may communicate with the UAV wirelessly from a distance. In some instances, the UAV may be capable of operating autonomously or semi-autonomously. The UAV may be capable of following a set of pre-programmed instructions. In some instances, the UAV may operate semi-autonomously by responding to one or more commands from a remote controller while otherwise operating autonomously. For instance, one or more commands from a remote controller may initiate a sequence of autonomous or semi-autonomous actions by the UAV in accordance with one or more parameters.
The UAV may be an aerial vehicle. The UAV may have one or more propulsion units that may permit the UAV to move about in the air. The one or more propulsion units may enable the UAV to move about one or more, two or more, three or more, four or more, five or more, six or more degrees of freedom. In some instances, the UAV may be able to rotate about one, two, three or more axes of rotation. The axes of rotation may be orthogonal to one another. The axes of rotation may remain orthogonal to one another throughout the course of the UAV’s flight. The axes of rotation may include a pitch axis, roll axis, and/or yaw axis. The UAV may be able to move along one or more dimensions. For example, the UAV may be able to move upwards due to the lift generated by one or more rotors. In some instances, the UAV may be capable of moving along a Z axis (which may be up relative to the UAV orientation), an X axis, and/or a Y axis (which may be lateral). The UAV may be capable of moving along one, two, or three axes that may be orthogonal to one another.
The UAV may be a rotorcraft. In some instances, the UAV may be a multi-rotor craft that may include a plurality of rotors. The plurality of rotors may be capable of rotating to generate lift for the UAV. The rotors may be propulsion units that may enable the UAV to move about freely through the air. The rotors may rotate at the same rate and/or may generate the same amount of lift or thrust. The rotors may optionally rotate at varying rates, which may generate different amounts of lift or thrust and/or permit the UAV to rotate. In some instances, one, two, three, four, five, six, seven, eight, nine, ten, or more rotors may be provided on a UAV. The rotors may be arranged so that their axes of rotation are parallel to one another. In some instances, the rotors may have axes of rotation that are at any angle relative to one another, which may affect the motion of the UAV.
The UAV shown may have a plurality of rotors. The rotors may connect to the body of the UAV which may comprise a control unit, one or more sensors, processor, and a power source. The sensors may include vision sensors and/or other sensors that may collect information about the UAV environment. The information from the sensors may be used to determine a location of the UAV. The rotors may be connected to the body via one or more arms or extensions that may branch from a central portion of the body. For example, one or more arms may extend radially from a central body of the UAV, and may have rotors at or near the ends of the arms.
A vertical position and/or velocity of the UAV may be controlled by maintaining and/or adjusting output to one or more propulsion units of the UAV. For example, increasing the speed of rotation of one or more rotors of the UAV may aid in causing the UAV to increase in altitude or increase in altitude at a faster rate. Increasing the speed of rotation of the one or more rotors may increase the thrust of the rotors. Decreasing the speed of rotation of one or more rotors of the UAV may aid in causing the UAV to decrease in altitude or decrease in altitude at a faster rate. Decreasing the speed of rotation of the one or more rotors may decrease the thrust of the one or more rotors. When a UAV is taking off, the output provided to the propulsion units may be increased from its previous landed state. When the UAV is landing, the output provided to the propulsion units may be decreased from its previous flight state. The UAV may be configured to take off and/or land in a substantially vertical manner.
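As a hedged illustration of the relationship between rotor output and vertical position described above, a simple proportional law could raise the commanded rotor speed when the UAV is below its target altitude and lower it when the UAV is above. The function name, gain, and RPM values below are assumptions for illustration only; a real flight controller would also use velocity feedback and per-rotor mixing for attitude.

```python
def altitude_rotor_command(current_alt_m: float,
                           target_alt_m: float,
                           hover_rpm: float = 5000.0,
                           gain_rpm_per_m: float = 200.0,
                           max_rpm: float = 8000.0) -> float:
    """Return a rotor speed command: above hover RPM to climb, below it to descend."""
    error_m = target_alt_m - current_alt_m           # positive when the UAV must climb
    command = hover_rpm + gain_rpm_per_m * error_m   # more thrust to gain altitude
    return max(0.0, min(max_rpm, command))           # clamp to plausible motor limits

# Climbing from 2 m to 10 m yields a command above the hover RPM.
print(altitude_rotor_command(current_alt_m=2.0, target_alt_m=10.0))
```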
A lateral position and/or velocity of the UAV may be controlled by maintaining and/or adjusting output to one or more propulsion units of the UAV. The altitude of the UAV and the speed of rotation of one or more rotors of the UAV may affect the lateral movement of the UAV. For example, the UAV may be tilted in a particular direction to move in that direction and the speed of the rotors of the UAV may affect the speed of the lateral movement and/or trajectory of movement. Lateral position and/or velocity of the UAV may be controlled by varying or maintaining the speed of rotation of one or more rotors of the UAV.
The UAV may be of small dimensions. The UAV may be capable of being lifted and/or carried by a human. The UAV may be capable of being carried by a human in one hand.
The UAV may have a greatest dimension (e.g., length, width, height, diagonal, diameter) of no more than 100 cm. In some instances, the greatest dimension may be less than or equal to 1 mm, 5 mm, 1 cm, 3 cm, 5 cm, 10 cm, 12 cm, 15 cm, 20 cm, 25 cm, 30 cm, 35 cm, 40 cm, 45 cm, 50 cm, 55 cm, 60 cm, 65 cm, 70 cm, 75 cm, 80 cm, 85 cm, 90 cm, 95 cm, 100 cm, 110 cm, 120 cm, 130 cm, 140 cm, 150 cm, 160 cm, 170 cm, 180 cm, 190 cm, 200 cm, 220 cm, 250 cm, or 300 cm. Optionally, the greatest dimension of the UAV may be greater than or equal to any of the values described herein. The UAV may have a greatest dimension falling within a range between any two of the values described herein.
The UAV may be lightweight. For example, the UAV may weigh less than or equal to 1 mg, 5 mg, 10 mg, 50 mg, 100 mg, 500 mg, 1 g, 2 g, 3 g, 5 g, 7 g, 10 g, 12 g, 15 g, 20 g, 25 g, 30 g, 35 g, 40 g, 45 g, 50 g, 60 g, 70 g, 80 g, 90 g, 100 g, 120 g, 150 g, 200 g, 250 g, 300 g, 350 g, 400 g, 450 g, 500 g, 600 g, 700 g, 800 g, 900 g, 1 kg, 1.1 kg, 1.2 kg, 1.3 kg, 1.4 kg, 1.5 kg, 1.7 kg, 2 kg, 2.2 kg, 2.5 kg, 3 kg, 3.5 kg, 4 kg, 4.5 kg, 5 kg, 5.5 kg, 6 kg, 6.5 kg, 7 kg, 7.5 kg, 8 kg, 8.5 kg, 9 kg, 9.5 kg, 10 kg, 11 kg, 12 kg, 13 kg, 14 kg, 15 kg, 17 kg, or 20 kg. The UAV may have a weight greater than or equal to any of the values described herein. The UAV may have a weight falling within a range between any two of the values described herein.
A user can define an area in which the UAV guides the target object. The user can define the area using a user interface that is in communication with a processor on-board or off-board the UAV. The one or more processors can be in communication with one or more memory storage units. The memory storage units can store past user defined areas or routes. The memory storage units can store geographic data, such as maps, and may optionally be updated. A user can define a unique area or route each time the UAV guides the target object or the user can choose from one or more stored routes or areas. Examples of possible areas 200 in which the UAV can guide the target object are shown in FIG. 2. An area can be defined as a region in which the target object is permitted to travel, a boundary past which a target object is not permitted to travel, and/or a region in which the target object is not permitted to travel. In FIG. 2 region 201 can be an area in which a target object is permitted to travel. Region 201 can be enclosed by boundaries 202 past which the target object is not permitted to travel. In some cases, a region can enclose sub regions in which the target object is not permitted. Region 203 is an enclosed region in which the target object is permitted. Region 203 encloses region 204 in which the target object is not permitted to travel. In some cases, a region can be defined as the region enclosed by regions where the target object is not permitted. In this case a user can define a plurality of regions in which a target object is not permitted such that the pluralities of non-permitted regions enclose a region that is allowed. For example, region 205 can be a region in which a target object is permitted. Region 205 can be surrounded by region 206 in which the target object is not permitted. In one example, a UAV may be permitted to guide a pet within a park, such that the pet is permitted to remain within a lawn 203, while not being permitted to be guided on a road 206 or in a lake 204.
A region can be defined by a geographic radius. For example, a geographic radius can be a radial region centered at an initial location of a target object. In another example a geographic radius can be defined as a radial region centered at a location of a user. In some cases a geographic radius can be a radial region with a center point defined by a user. A user can define a geographic region using global coordinates. In some cases a geographic region can be defined as a region within user defined boundaries, and the boundaries can be defined using global coordinates. A user-defined geofence may be provided which may function as a boundary of a permissible region or an impermissible region for the target object to be. Any regular or irregular shape may be provided as a boundary. A geographic region can be bound by user defined obstacles. For example a user can instruct a UAV to guide a target object in a region without crossing a physical boundary or feature. A physical boundary or feature can be a fence, road, ditch, water way, or ground surface transition (e.g. grass to dirt or grass to pavement). The UAV can be configured to detect a physical boundary or feature or the UAV can know the location of the physical boundary or feature a priori.
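A radial permissible region of the kind described above can be monitored by comparing the great-circle distance between the UAV's GPS fix and the region's center point with the user-defined radius. The following is a minimal sketch assuming a haversine distance; the coordinates and radius are placeholders.

```python
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def inside_radial_region(uav_lat, uav_lon, center_lat, center_lon, radius_m):
    """True while the UAV (and, approximately, the target object) stays inside the geofence."""
    return haversine_m(uav_lat, uav_lon, center_lat, center_lon) <= radius_m

# Example: a 500 m radius centered at the target object's starting location.
print(inside_radial_region(22.5435, 113.9550, 22.5430, 113.9545, radius_m=500.0))
```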
In some cases a user can provide a visual map to define permissible and impermissible regions for the target object to travel. A visual map can be generated in a user interface on an electronic device. The user interface can provide a map of a chosen or local space in which a target object can be led by the UAV. A user can mark areas that are permissible or impermissible for the UAV and the target object to travel on the map provided by the user interface. In some cases, a user can mark areas on the map using a touch screen provided on the user interface. A user’s finger or a pointer (e.g., mouse pointer, trackball pointer, etc.) may be used to trace the outline of boundaries. The user can draw circles on the user interface to define an area. Alternatively, the user can click on or touch points to define the coordinates of a region. In the example shown in FIG. 3 a user can provide an input to the user interface 300 to define a permissible or impermissible region for the UAV and the target object to travel. The input provided by the user can be communicated to the user interface by any method that is acceptable to the electronic device comprising the user interface, for example a user may communicate with the user interface on the electronic device through a tactile or audio command. In an example a user can speak the name or coordinates of a permissible or impermissible area, for example the user can give the command “Dog Park permissible” or “lake impermissible” to designate the dog park as permissible and the lake as impermissible travel regions for the UAV and the target object. In another example a user can draw or trace a region on a map that is permissible or impermissible for the UAV and the target object to travel. A user can draw or trace the region with their finger or a stylus. For example a user can define a set of coordinates (X1,Y1), (X2,Y2), (X3,Y3), (X4,Y4). Line segments can be formed to connect the set of coordinates and to enclose a geographic region. A user can define the enclosed geographic region as permissible or impermissible for travel of the target object. Alternatively, a user can define a first coordinate (X5,Y5) and trace a closed region that includes the first coordinate, (X5,Y5). The user can define this closed region as permissible or impermissible for travel of the target object.
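The traced coordinates (X1,Y1) through (X4,Y4) described above can be closed into a polygon, after which a standard ray-casting test can decide whether a reported position lies inside the permissible (or impermissible) region. This is a minimal sketch under that assumption; the coordinates are placeholders.

```python
from typing import List, Tuple

def point_in_polygon(x: float, y: float, polygon: List[Tuple[float, float]]) -> bool:
    """Ray-casting test: True if (x, y) lies inside the closed polygon."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]   # wrap around to close the region
        crosses = (y1 > y) != (y2 > y)
        if crosses and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

# Region enclosed by four user-traced corner coordinates.
region = [(0.0, 0.0), (0.0, 10.0), (10.0, 10.0), (10.0, 0.0)]
print(point_in_polygon(5.0, 5.0, region))   # True: inside the traced area
print(point_in_polygon(15.0, 5.0, region))  # False: outside the traced area
```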
One or more processors can monitor the location of the UAV while it is guiding a target object by receiving a location signal from one or more location sensors on-board the UAV. The one or more processors can receive a user input signal that defines permissible areas for the target object to travel and/or impermissible areas for the target object to travel. The one or more processors can compare a locating signal from a UAV guiding a target object to the user input signal that defines permissible areas for the target object to travel and/or impermissible areas for the target object to travel to determine if the UAV has guided the target object outside of the permissible area or into an impermissible area. For example, a locating signal (e.g. GPS) from a UAV can be compared to a map of permissible and impermissible regions as defined by a user. When the UAV guiding the target object leaves the permissible areas for the target object to travel and/or enters the impermissible areas for the target object to travel the processor can initiate a response. The location of the target object can be approximated as the location of the UAV. Approximating the location of the target object as the location of the UAV can be appropriate in cases when the UAV is very close to the target object, for example, when the target object is attached to the UAV by a relatively short leash. In some cases the location of the target object can be determined from a combination of the location of the UAV as determined by one or more location sensors and the location of the target object as determined from one or more vision sensors. For example, a location of a UAV can be known from a GPS sensor and the location of the target object relative to the UAV can be determined from one or more vision sensors configured to recognize the target object. One or more processors can determine the location of the target object relative to the UAV to determine the absolute location of the target object. In another embodiment the location of the target object can be known from a locating sensor on the target object, for example, a GPS sensor in a collar worn by the target object. A locating sensor on the target object can communicate with a processor on or off-board the UAV.
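Where the target object's position is derived from the UAV's GPS fix combined with a vision-derived offset, the offset in meters can be converted into a small latitude/longitude displacement with a local flat-earth approximation. The sketch below is illustrative only and assumes offsets of no more than a few tens of meters (for example, a leash length).

```python
import math

EARTH_RADIUS_M = 6371000.0

def target_global_position(uav_lat: float, uav_lon: float,
                           offset_north_m: float, offset_east_m: float):
    """Approximate the target object's latitude/longitude from the UAV's GPS fix
    plus the north/east offset of the target relative to the UAV (from vision)."""
    dlat = math.degrees(offset_north_m / EARTH_RADIUS_M)
    dlon = math.degrees(offset_east_m / (EARTH_RADIUS_M * math.cos(math.radians(uav_lat))))
    return uav_lat + dlat, uav_lon + dlon

# Target detected 3 m north and 1 m east of the UAV by the vision sensor.
print(target_global_position(22.5435, 113.9550, offset_north_m=3.0, offset_east_m=1.0))
```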
A response can be informing a user that the target object has deviated from the permissible area or entered the impermissible area. A user can be informed that the target object has deviated from the permissible area or entered the impermissible area through a user interface on an electronic device. The electronic device can alert a user with an audio signal, vibration signal, text message, phone call, video message, visual image message, electronic notification, and/or email. In some cases a response can be a flight instruction to the UAV; the UAV can be instructed by the processor to re-enter the permissible area or exit the impermissible area. The processor can automatically provide a flight instruction to the UAV when the UAV has deviated from the permissible area or entered the impermissible area. Alternatively, the processor can provide a flight instruction to the UAV when the UAV has deviated from the permissible area or entered the impermissible area in response to a user input from an electronic device after the electronic device has alerted the user that the UAV has deviated from the permissible area or entered the impermissible area. The flight instruction can be for the UAV to return to the permissible area or exit the impermissible area. In some cases the flight instruction can be for the UAV to entice or direct the target object to control the movement of the target object such that the target object returns to the permissible area or exits the impermissible area. In an example, the user can provide a specific flight instruction to the UAV. The specific flight instruction can be for the UAV to fly in a specific direction and a specified distance in that direction. The flight instruction can also include a specified distance that should be maintained between the UAV and the target object while the UAV is moving the specified distance in the specified direction. Alternatively, a user can initiate an automated or predetermined flight sequence to return the UAV and the target object to a permissible area.
The locating signal can indicate that the UAV has exited a permissible area for the target object to travel or that the UAV has entered an area that is impermissible for the target object to travel when the UAV crosses over a user defined boundary. In some cases, the locating signal can indicate that the UAV has exited a permissible area for the target object to travel or that the UAV has entered an area that is impermissible for the target object to travel when the UAV approaches and is within a predetermined threshold distance from a user defined boundary. The locating signal can indicate exiting a permissible area when a UAV is detected within a threshold distance from a user defined boundary regardless of the direction that the target object and the UAV are heading. In some cases, the locating signal can indicate exiting a permissible area when a UAV is detected within a threshold distance from a user defined boundary and the direction that the target object and the UAV are heading is towards the boundary. The speed of the target object can be determined. The speed of the target object can be determined from a velocity sensor on-board the UAV. The speed of the target object can be estimated as the speed of the UAV as determined by the velocity sensor. In another embodiment the UAV can comprise a vision sensor to detect the location of the target object. The UAV can determine the speed of the target object from the measurements taken by the vision sensor using a processor on or off-board the UAV. In some cases the target object can wear a locating sensor, for example, a locating sensor imbedded in a collar worn by the target object. The locating sensor can be a GPS sensor. The locating sensor worn by the target object can be in communication with one or more processors on or off-board the UAV. The one or more processors can determine the speed of the target object from information transmitted by the locating sensor. The speed of the target object can be a factor in indicating, when the UAV crosses over a user defined boundary, that the UAV has exited a permissible area for the target object to travel or that the UAV has entered an area that is impermissible for the target object to travel. When the target object is detected heading in the direction of the boundary at a speed exceeding a threshold speed, an indication can be provided that the UAV has exited a permissible area for the target object to travel or that the UAV has entered an area that is impermissible for the target object to travel.
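The combined criteria described above (proximity to a boundary, heading toward it, and speed above a threshold) could be evaluated together before raising an indication. The threshold values and function name below are assumptions for illustration; they are not prescribed by this description.

```python
def boundary_alert(distance_to_boundary_m: float,
                   heading_deg: float,
                   bearing_to_boundary_deg: float,
                   speed_mps: float,
                   threshold_distance_m: float = 5.0,
                   heading_tolerance_deg: float = 45.0,
                   threshold_speed_mps: float = 1.0) -> bool:
    """Raise an exit/entry indication only when the UAV and target are within the
    threshold distance of the boundary, heading roughly toward it, and moving
    faster than the threshold speed."""
    heading_error = abs((heading_deg - bearing_to_boundary_deg + 180) % 360 - 180)
    near = distance_to_boundary_m <= threshold_distance_m
    toward = heading_error <= heading_tolerance_deg
    fast = speed_mps >= threshold_speed_mps
    return near and toward and fast

# Target 3 m from the boundary, heading almost straight at it at 2 m/s.
print(boundary_alert(3.0, heading_deg=92.0, bearing_to_boundary_deg=90.0, speed_mps=2.0))
```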
FIG. 4 shows an example of a target object 401 within the proximity of a boundary 402. The boundary 402 can have either or both of a first threshold 403 and a second threshold 404. The first 403 and second 404 thresholds can outline either edge of the boundary 402. The distance between the boundary 402 and the first 403 and second 404 thresholds can be defined by a user. In an example, the distance between the threshold and the boundary can be at least 1 inch (in), 6 in, 1 foot (ft), 2 ft, 3 ft, 4 ft, 5 ft, 6 ft, 7 ft, 8 ft, 9 ft, 10 ft, 11 ft, 12 ft, 13 ft, 14 ft, 15 ft, 16 ft, 17 ft, 18 ft, 19 ft, or 20 ft. In some cases, the distance between the boundary and a threshold can be greater than 20 ft. The distance between a boundary and threshold can fall between any of the listed values. The distance between the boundary 402 and the first 403 and second 404 thresholds can be uniform or the distance can vary. The distance can vary along a boundary 402. The distance can vary such that the distance between a first threshold 403 and the boundary 402 is different from the distance between the boundary 402 and the second threshold 404. In some cases a first boundary can have a first threshold distance and a second boundary can have a second threshold distance. For example, if a first boundary indicates that a target object cannot enter a dangerous area (e.g. street, parking lot, or a region containing other aggressive animals) a distance between the first boundary and the threshold can be relatively large. In another example, if a second boundary indicates that a target object cannot enter a comparatively less dangerous area (e.g. lake, neighbor’s yard, or a dirt pile) a distance between the second boundary and the threshold can be relatively small. The direction that the target object is heading 405 can be a factor in determining whether to provide an indication that the target object has crossed a boundary 402 or a threshold 403, 404. For example, when a target object is heading in the direction of a boundary 402 or a threshold 403, 404 an indication that the target object is exiting a permissible area or entering an impermissible area can be provided.
A UAV can be physically connected or attached to the target object by a physical attachment mechanism. A physical attachment mechanism can be a leash, rope, or chain that tethers the target object to the UAV. The physical attachment mechanism can attach to a region on the body of the UAV on one end and the target object on the other end. The physical attachment mechanism can attach to a collar that is worn around a neck of the target object. Alternatively the physical attachment mechanism can attach to a harness that attaches to a body of the target object.
The UAV can provide a deterrent mechanism to the target object when the target object approaches a boundary or a threshold of a boundary such that the target object is exiting a permissible area or entering an impermissible area. The UAV may or may not be configured to provide sufficient force to pull a target object away from a boundary enclosing an impermissible area. In some cases the UAV may require a deterrent mechanism to prevent a target object from travelling into an impermissible area or out of a permissible area. The UAV can be configured only to provide one type of deterrent mechanism. Alternatively the UAV can be configured to provide a primary deterrent mechanism followed by at least one additional deterrent mechanism. The additional deterrent mechanism can be provided when the target object fails to obey the primary deterrent mechanism within a specified time interval after the primary deterrent mechanism is provided. In some cases, the additional deterrent mechanism can be harsher than the primary deterrent mechanism. The specified time interval between the primary and additional deterrent mechanism can be fixed or it can be dependent on the action of the target object. For example, if a target object is rapidly approaching a boundary of an impermissible region the specified time interval can be shorter than in cases where the target object is slowly approaching the boundary.
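The primary/additional deterrent sequence described above, with a waiting interval that shortens when the target approaches a boundary faster, might be scheduled as in the sketch below. The deterrent names, base interval, and scaling are illustrative assumptions only.

```python
def deterrent_interval_s(approach_speed_mps: float,
                         base_interval_s: float = 10.0,
                         min_interval_s: float = 2.0) -> float:
    """Waiting time before escalation: shorter when the target approaches the boundary faster."""
    if approach_speed_mps <= 0:
        return base_interval_s
    return max(min_interval_s, base_interval_s / approach_speed_mps)

def run_deterrent_sequence(target_obeyed, approach_speed_mps: float) -> None:
    """Apply the primary deterrent, wait, then escalate if the target has not obeyed.

    `target_obeyed` is a callable returning True once the target stops approaching."""
    wait_s = deterrent_interval_s(approach_speed_mps)
    print(f"primary deterrent: play user's voice, then wait {wait_s:.1f} s")
    # In a real system the UAV would wait `wait_s` seconds here before re-checking.
    if not target_obeyed():
        print("additional deterrent: warning tone or collar stimulus")

# Example: a target that ignores the first deterrent while approaching at 2 m/s.
run_deterrent_sequence(target_obeyed=lambda: False, approach_speed_mps=2.0)
```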
The deterrent mechanism can be the user’s voice. The user’s voice can be a recording played through a speaker on-board the UAV. The recording can be stored on a memory storage device on or off-board the UAV. In some cases a user can be alerted in real time through a user device that the target object is approaching a boundary or a threshold of a boundary such that the target object is exiting a permissible area or entering an impermissible area. The user’s voice can be transmitted from the user device to the UAV in real time. The recording of the user’s voice or the transmission of the user’s voice in real time can be provided to the target object through a user interface on-board the UAV. The user interface on board the UAV can comprise a speaker to emit an audio alert to the target object. The audio alert can be a live stream or recording of a user’s voice, an unpleasant sound, a high pitched ring, or any other audio stimulus that commands attention and obedience of the target object. A user can tell a target object to stop, sit, or come through a live stream or a recording.
The user interface can further comprise a screen such that the alert can be provided to the target visually. The alert can be both audio and visual or only one of the two. The visual alert can be a video recording or a live video of the user.
In another example, the deterrent mechanism can be an electric shock. The electric shock can be provided by an electric shock collar worn by the target object. The UAV can be in communication with the electric shock collar through a wired or wireless connection. In the case of a wired connection, the wired connection can be imbedded in the physical attachment mechanism between the UAV and the target object. The UAV can instruct the electric shock collar to provide an electric shock to the target object when the target object approaches a boundary or a threshold of a boundary such that the target object is exiting a permissible area or entering an impermissible area. The electric shock can be a first response to the target object approaching a boundary or a threshold of a boundary such that the target object is exiting a permissible area or entering an impermissible area. Alternatively the electric shock can be a secondary response after playing a real time or recorded voice of the user. An electric shock can be provided to the target object if the target object does not respond to the user’s voice within a predetermined period of time. The predetermined period of time can be a fixed value, for example the predetermined period of time can be at least 1 second (sec), 5 sec, 10 sec, 15 sec, 20 sec, 25 sec, 30 sec, 35 sec, 40 sec, 45 sec, 50 sec, 55 sec, or 1 minute. In some cases the period of time can be a function of the speed of the target object such that the time between a user’s voice and the electric shock is inversely proportional to the speed at which the target object is traveling. An additional deterrent mechanism that can be used alone or in combination with the user’s voice and/or the electric shock can include emitting a noise (e.g. beep, buzz, or siren) that the target object recognizes or has been conditioned to recognize as a signal to stop moving in a direction. Another deterrent mechanism that can be used alone or in combination with the user’s voice and/or the electric shock can be a spray of a liquid that has a smell that deters the target object, for example the liquid may be citronella.
In addition to or instead of defining boundaries that generate areas that are permissible and/or impermissible for the target object to travel, a user can also define a specific route along which the UAV can lead or guide a target object. A user can define a unique route or a user can pick from a plurality of routes that can be stored on a storage memory device on or off-board the UAV. The stored routes can originate from previous routes that a user has used. In some cases the stored routes can come from other users in the area that also use a UAV to guide their target object through a route sharing network. FIG. 5 shows a map 500 with possible routes that a UAV 501 can travel to guide a target object. A UAV can start a route with a target object at a home 502. The UAV can lead the target object along route R0. When the UAV reaches the midpoint 503 of the route the UAV can return to the home 502 along route R1 or route R2. A user can specify which route, R1 or R2, should be taken by the UAV to return home 502. In some cases the user can specify the choice of R1 or R2 in real time while the UAV is guiding the target object.
Additionally, a route can be changed in real time. For example, a UAV can begin guiding a target object along a route R0 with the initial plan of following from R0 to R1. A user can, in real time, update the route such that the UAV guides the target object from R0 to R2 instead of from R0 to R1. Thus, a user may provide an input that alters a route from a pre-defined route while the UAV is in flight and traveling along the route. This may provide flexibility if an event comes up while the UAV is away. For example, if a user becomes aware of an event that requires the target object to be brought home quickly, the user may change the route while the target object is out with the UAV to bring the target object home more directly or quickly. Similarly, if the user becomes aware of construction or another event along an originally predefined route and wishes for the target object to avoid that region, the user may advantageously alter the route while the UAV is out with the target object.
The UAV can guide the target object along a travel route. The UAV can be in flight while guiding the target object. The UAV can achieve flight with one or more propulsion units, for example a propulsion unit can comprise one or more rotors. The target object can be in locomotion while the UAV is guiding the target object. A UAV can receive a travel route that describes an area or path along which the target object should be guided. The travel route can be a user input to a user device that is in communication with the UAV. The user device can be a computer, tablet, smart phone, smart watch, or smart glasses. The UAV can guide the target object along the route by flying along the travel route while the target object is in motion. The target object can be in motion close to the location of the UAV. For example, the UAV can be suspended in flight and the target object can be on the ground directly below the UAV. In some cases, the target object can be on the ground below the UAV and offset to the right, left, back or front of the UAV. The target object can be physically attached to the UAV through a physical attachment mechanism, for example, a leash. The leash can be attached on one end to the UAV and on the other end to the target object. The end of the leash attached to the target object can be attached to a collar or harness worn by the target object.
The travel route can be updated in real time. A UAV can begin guiding a target object from a starting location. The UAV can guide the target object along a first travel route. While the UAV is guiding the target object along the first travel route the UAV can receive a route update from a user device that is in communication with the UAV. The route update can provide a change from a first travel route to a second travel route. The change can be provided to make the travel route longer, shorter, to avoid a location, or to include a location that was not part of the first travel route. Once the UAV receives the updated route the UAV can continue guiding the target object along the updated route. The route can be updated at least once while the UAV is guiding the target object. In some cases the route can be updated at least once, twice, three times, four times, five times, six times, seven times, eight times, nine times, or ten times.
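A real-time route update of the kind described above could be applied by keeping the waypoints already passed and splicing in the remainder of the updated route. The waypoint lists and function name in the sketch below are hypothetical.

```python
from typing import List, Tuple

Waypoint = Tuple[float, float]  # (latitude, longitude)

def apply_route_update(original_route: List[Waypoint],
                       next_index: int,
                       updated_remainder: List[Waypoint]) -> List[Waypoint]:
    """Keep the waypoints already guided past and splice in the updated remainder.

    `next_index` is the index of the first waypoint not yet reached."""
    return original_route[:next_index] + updated_remainder

# The UAV has passed the first two waypoints of the first route when the user reroutes.
first_route = [(22.5400, 113.9500), (22.5410, 113.9510),
               (22.5420, 113.9520), (22.5430, 113.9530)]
updated_remainder = [(22.5415, 113.9540), (22.5400, 113.9500)]  # new path back home
print(apply_route_update(first_route, next_index=2, updated_remainder=updated_remainder))
```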
A user can update the travel route with a user device that is in communication with the UAV. The user device can be in communication with the UAV through a wired or wireless connection. The user can define the first travel route using a global location identifier, for example global coordinates. In some cases, the user can define the first travel route using an image or a line on a map. The map can be provided on a user interface on the user device. The image or line can be interpreted by a processor to define a travel route in global or local coordinates. A user can provide an updated route using coordinates, a map image, or a line on a map. The location of the UAV can be determined by one or more locating sensors on-board the UAV. In an example, a locating sensor can be a GPS sensor. The location of the UAV determined by the one or more locating sensors can be transmitted to a user device and/or a processor off-board the UAV. The location of the UAV can roughly define the location of the target object being guided by the UAV.
The UAV can comprise one or more vision sensors configured to capture an image of the target object. The location of the target object can be determined by one or more processors from the location of the UAV determined by the one or more locating sensors on-board the UAV and the image of the target object. The one or more processors can determine when the target object is deviating from the travel route. The travel route can be a first travel route or an updated travel route. When the processor detects that the target object has deviated from the travel route an attractor or instruction can be provided to prevent the target object from continuing to divert from the travel route and/or to force or entice the target object to return to the travel route. In an example, when the target object is deviating from the travel route the UAV can play a live stream or a recording of the user’s voice to the target object. The live stream or recording can be an instruction from the user for the target object to come closer to the UAV or to stop traveling in a direction away from the travel route. In some cases a user can be the target object’s owner. In another case, the user can be an individual designated by the target object’s owner to monitor the target object. The user’s voice can be transmitted through a user device to the UAV in real time. Alternatively, the user’s voice can be pre-recorded and stored on a memory storage device on or off-board the UAV. In some cases, another stimulus can be provided to the target object when the target object deviates from the travel route. The stimulus can be provided in addition to or instead of the user’s voice. In some cases the stimulus can be an attractor, for example an edible treat or an emission of a smell that is of interest to a target object. The attractor can be provided to guide the target object back to the travel route. In another example, the stimulus can be an electric shock. The electric shock can signal to the target object that the target object should stop moving. The electric shock can be provided when the target object deviates from the travel route by a predetermined distance.
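Deviation from the travel route can be quantified as the target object's distance to the nearest segment of the route polyline; a first tolerance might trigger the user's voice and a larger predetermined distance might trigger a stronger stimulus. The tolerances and planar-coordinate simplification below are assumptions for illustration.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]  # local planar coordinates in meters, for simplicity

def point_segment_distance(p: Point, a: Point, b: Point) -> float:
    """Shortest distance from point p to the line segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def deviation_from_route(target: Point, route: List[Point]) -> float:
    """Distance from the target object to the nearest point on the route polyline."""
    return min(point_segment_distance(target, route[i], route[i + 1])
               for i in range(len(route) - 1))

route = [(0.0, 0.0), (100.0, 0.0), (100.0, 100.0)]
d = deviation_from_route((50.0, 12.0), route)
if d > 20.0:      # predetermined distance for the strongest response
    print("deviation beyond limit:", d)
elif d > 5.0:     # first tolerance: play the user's voice
    print("play user's voice, deviation =", d)
```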
A UAV can be configured to recognize a target object. A UAV can recognize the target object using an image recognition algorithm that can detect defining features of the target object. For example, the UAV may be able to discern target object size, gait, coloration/patterns, or proportions (e.g., limbs, torso, face).
In some cases, the UAV can detect a collar worn by the target object using a vision sensor on-board the UAV. A collar worn by a target object can have unique identifiers such that the UAV can distinguish the collar worn by the target object from another collar worn by an alternative target object. The distinguishing features can be patterns, symbols, or a unique combination of numbers and letters.
FIG. 6 shows an example of a target object 601 wearing a collar 602 that can be recognized by a UAV. Description of the collar can apply to any object that is wearable by the target object, for example, a harness, sweater, ankle band, hat, or paw bootie. The UAV can recognize the target object wearing the collar using one or more vision sensors on-board the UAV. The collar 602 can comprise at least one of a pattern of symbols 603, letters 604, or numbers. The pattern of symbols 603, letters 604, or numbers can be provided on a display screen on the collar. The pattern of symbols 603, letters 604, or numbers can be a constant display or the display can change. In some cases a pattern displayed on the collar can communicate information to the UAV. The collar can emit a signal that can be detected by the UAV, for example an IR signal. The collar can further comprise at least one component 605 configured to permit connection to a physical connection mechanism between a UAV and the collar. The component 605 configured to permit connection to a physical connection mechanism between a UAV and the collar can be a magnet, hook, hole, rivet, snap, or other connection hardware.
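If the collar displays a unique combination of letters and numbers, the recognition step can reduce to decoding that string from the camera image and comparing it with the identifier registered for the target object. The decoder below is a stub standing in for an OCR or marker-decoding step; the identifier and function names are hypothetical.

```python
REGISTERED_COLLAR_ID = "PET-0042"  # hypothetical identifier registered for this target object

def decode_collar_id(image_bytes: bytes) -> str:
    """Placeholder for decoding the collar's letter/number pattern from a camera frame."""
    return "PET-0042"  # stubbed result for illustration only

def is_registered_target(image_bytes: bytes) -> bool:
    """Confirm the collar in view belongs to the target object before attaching the leash."""
    return decode_collar_id(image_bytes) == REGISTERED_COLLAR_ID

print(is_registered_target(b"\x00"))  # True with the stubbed decoder
```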
A UAV can be configured to attach to the collar of the target object using a physical connection mechanism (e.g. leash) automatically without human intervention. In some cases, the UAV may require human intervention to attach to the collar of the target object. The UAV can attach to the collar of the target object after confirming recognition of the target object using one or more vision sensors to detect the target object. The leash can attach to the collar using a magnetic mechanism such that a magnet on the collar is attracted to a magnet on an end of the leash. In some cases only one of either the leash end or the collar can comprise a magnet and the other component (e.g. leash end or collar) can comprise a metal that is attracted to the magnet. The leash can be made from a flexible and/or bendable material, for example plastic, rubber, elastic, or another flexible material.
The UAV can attach a leash to the target object automatically without human aid or intervention. The UAV can attach a leash to the target object using a mechanical mechanism. The mechanical mechanism can be a hook, clamp, or robotic arm. In cases in which the mechanical mechanism is a robotic arm, the robotic arm can be on-board the UAV. The robotic arm can extend and retract to guide a leash to a collar on a target object. In some cases, the robotic arm can extend and retract using a telescoping mechanism. The UAV can hover directly above or to a side of the target object while the UAV is attaching a leash to the target object. One or more vision sensors can detect the location of the target object while the UAV is attaching the leash to the target object. When the vision sensors detect movement of the target object, the UAV may move to stay in a location directly above or to the side of the target object. The robotic arm can have a feature at its terminal end configured to attach the leash to a collar on the target object. The leash can attach to the collar using any mechanical or electrical connection mechanism, for example, a hook and loop, snap, magnetic, Velcro, or any other mechanical coupling mechanism. The coupling mechanism between the leash and the collar on a target object can be generic or the coupling mechanism can have a size or shape that is unique to a specific leash and collar connection. The unique coupling mechanism can prevent a UAV from accidentally connecting to a wrong target object. Alternatively, one or more vision sensors on-board the UAV can detect the target object while the UAV is attaching the leash to the target object to verify that the UAV is attaching the leash to the correct target object. Other sensors may be used to verify that the leash is attached to the correct target object. For instance, a collar or other wearable connection of the target object may interact with the leash to confirm the correct identity. For instance, a signal may pass between the collar and leash upon contact or wirelessly. The signal may include an identifier of the collar which may be verified by the leash, UAV, or any processors anywhere in the system.
The physical connection mechanism between the UAV and the target object can have a fixed or adjustable length. The length of the physical connection mechanism can determine a permitted distance between a UAV and a target object. In cases where the length of the physical connection mechanism is adjustable, the physical connection mechanism can be retractable. The maximum extension of the physical connection can be fixed or the maximum extension of the physical connection can be determined by a location or a distance from a defined boundary. For example, when a UAV attached to a target object with a physical connection mechanism is relatively far from a defined boundary, the physical connection mechanism can be extended to a relatively long length. In comparison, when the UAV attached to a target object with a physical connection mechanism is relatively close to a defined boundary, the physical connection mechanism can be extended to a relatively short length. The physical connection mechanism can be extended and retracted while the UAV is in flight.
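The following sketch illustrates one way an adjustable leash extension could be chosen from the distance to a defined boundary, extending the leash when the UAV is far from the boundary and retracting it as the boundary is approached. The minimum, maximum, and standoff values are hypothetical.

```python
# Illustrative sketch: leash extension as a function of distance to a boundary.
def leash_extension(distance_to_boundary_m,
                    min_length_m=1.0, max_length_m=8.0, standoff_m=2.0):
    """Extend the leash when far from the boundary, retract it when close."""
    usable = max(0.0, distance_to_boundary_m - standoff_m)
    return max(min_length_m, min(max_length_m, usable))

for d in (1.0, 5.0, 25.0):
    print(d, "m from boundary ->", leash_extension(d), "m of leash")
```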
FIG. 7 shows an example of a UAV 701 and a target object 702. The UAV 701 is connected to the target object 702 through a physical mechanism 703, for example a leash. The UAV 701 can have at least one on-board vision sensor 704, for example a camera. The vision sensor 704 can be on the body of the UAV or the vision sensor 704 can be extended from a surface of the UAV, for example the bottom surface, by a support structure 705. The vision sensor 704 can be movable relative to the UAV. The vision sensor 704 can be configured to rotate and/or translate independent of the position of the UAV. The vision sensor can capture at least one image of a target object. The vision sensor can be moved to track the movement of the target object. The image can be stored on a memory storage device on or off-board the UAV. The image can be analyzed to identify a target object. The image may be of the target object or a collar 706 worn by the target object. The UAV can be in communication with one or more processors on or off-board the UAV. The processors can be configured to analyze an image from a vision sensor and recognize the target object from an image of the target object or an image of the collar 706. When a target object 702 is positively identified by the one or more processors the UAV 701 can approach the target object 702 and attach the physical mechanism 703 to the collar 706 worn by the target object. The UAV can automatically attach the physical mechanism to the collar 706 worn by the target object 702 by joining a mating or coupling connection on a terminal end of the physical mechanism to a corresponding connection 707 on the collar.
Once a target object has been positively identified and connected to a physical mechanism (e.g. leash) attached to a UAV, the UAV can fly while the target object is in locomotion. The UAV can guide a target object by pulling on the leash. The pulling force with which the UAV pulls on the leash can be calculated from the motion of the target object and the motion of the UAV. The motion of the target object and the motion of the UAV can be compared to determine one or more parameters with which the UAV pulls on the leash. In an example, parameters that can be determined may be the magnitude and/or the direction of the pulling force. The magnitude of the pulling force can fall within a predefined range. The predefined range of pulling forces can be determined by a user or calculated from a user input. A user input can be the weight of the target object. The UAV can be configured to provide sufficient force to control a target object having a weight of at least 1 kilogram (kg), 2 kg, 3 kg, 4 kg, 5 kg, 10 kg, 15 kg, 20 kg, 25 kg, 30 kg, 35 kg, 40 kg, 50 kg, 55 kg, 60 kg, 65 kg, 70 kg, 75 kg, 80 kg, 85 kg, 90 kg, 95 kg, 100 kg, 105 kg, 110 kg, 115 kg, 120 kg, 125 kg, 130 kg, 135 kg, 140 kg, 145 kg, or 150 kg.
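One of many possible ways to map the compared motions to a pulling-force command is sketched below; the gain and the permitted force range are hypothetical placeholders for values that could be derived from a user input such as the target object’s weight.

```python
# Illustrative sketch: deriving a leash pulling-force command from the motion of
# the UAV relative to the target object, clamped to a predefined range.
import math

def pulling_force(uav_velocity, target_velocity, gain=5.0,
                  min_force_n=0.0, max_force_n=40.0):
    """Return (magnitude in newtons, unit direction of the pull on the target)."""
    rel = (uav_velocity[0] - target_velocity[0], uav_velocity[1] - target_velocity[1])
    speed_mismatch = math.hypot(*rel)
    magnitude = max(min_force_n, min(max_force_n, gain * speed_mismatch))
    norm = speed_mismatch or 1.0
    return magnitude, (rel[0] / norm, rel[1] / norm)

print(pulling_force(uav_velocity=(2.0, 0.0), target_velocity=(0.5, -0.5)))
```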
The UAV can continuously collect images of the target object while the UAV is in flight and the target object is in locomotion or while the target object is stationary. The target object can be attached or tethered to the UAV via a leash while the UAV is collecting images of the target object. The images of the target object can be saved on a memory storage device that can be on-board or off-board the UAV. The UAV can collect images of the target object with a vision sensor. The vision sensor can collect still images or video images of the target object. The UAV can comprise at least one additional vision sensor configured to collect images of the environment. In some cases, the at least one additional vision sensor can track other objects in the environment. In some cases, the images can be displayed to a user through a user interface that is in communication with a processor on or off-board the UAV. The user interface can also display the location of the UAV while it is attached to the target object. The location may be shown on a map in the user interface.
A UAV can play the user’s voice to the target object while the target object is attached to the UAV by a leash or other physical attachment mechanism. The UAV can play the user’s voice to the target object while the target object is in locomotion and/or while the UAV is flying. The user’s voice can be provided by the UAV through an audio or visual display on-board the UAV. The target object may be familiar with the user’s voice and therefore more responsive to it as compared to a voice from a human that is not the user. In some cases the user’s voice can be provided to the target object in real time. The user’s voice can be transmitted from the user device to the UAV in real time. The user’s voice can convey a command to the target object. In an example, a user can receive an image of a target object and/or a location of a UAV attached to a target object through a user interface on a user device. The user may wish to speak to the target object in response to the image or location of the target object. For example, the user can speak to the target object to provide positive or negative feedback in response to the image or location of the target object. The user can speak to the target object in real time by transmitting their voice through the user device to the UAV. In some cases, the user’s voice can be a pre-recording. The pre-recording can be an audio or video recording of the user. A processor on or off-board the UAV can be configured to recognize a behavior or action committed by the target object from an image of the target object or a location of a UAV attached to a target object. The processor can instruct the UAV to provide a pre-recording of a user’s voice to provide a command or negative or positive feedback to a target object in response to a detected behavior or action committed by the target object. In some cases, a user can recognize a behavior or action committed by the target object from an image of the target object or a location of a UAV attached to a target object provided on a user device’s user interface. The user can transmit an instruction to the UAV to provide a pre-recording of a user’s voice to provide a command or negative or positive feedback to a target object in response to a detected behavior or action committed by the target object.
FIG. 8 shows an example of a UAV 801 and a target object 802 where the UAV is providing an audio and visual stimulus to the target object. The visual stimulus can be provided to the target object through a screen 803 on-board, carried by, or attached to the UAV 801. The screen can be permanently exposed or the screen can be folded or retracted into the UAV when it is not in use. The audio stimulus can be provided through a microphone or speaker 804 on-board the UAV. The audio stimulus can be a recording or a live stream of a user’s voice. A user can be the owner of the target object or an individual designated to monitor the target object while it is being guided by the UAV. In some cases the microphone can be bi-directional such that a user’s voice can be provided to the target object and an audio response (e.g. barking, meowing, or whining) from the target object can be collected and transmitted to a user through a user device. The UAV 801 can further comprise one or more visual sensors 805. The visual sensors can collect still images and/or video images of the target object. The images can be analyzed by one or more processors on or off-board the UAV to recognize the target object. The images can be further analyzed to determine the location of the target object relative to a known location of the UAV. The UAV can be attached to the target object through a physical connection, for example, a leash 806.
In some cases the UAV may not be attached to the target object while the UAV is guiding the target object. A UAV can guide the target object by recognizing the target object and automatically, without human aid or intervention, displaying an attractor to the target object. The UAV can fly while displaying the attractor and the target object can be in locomotion following the attractor. The attractor can be a visual, auditory, or olfactory stimulus that is configured to attract the attention of the target object. In some cases the attractor can be an edible treat, for example, a dog treat, bacon, peanut butter, or another edible product that is desirable to a target object. In some cases, the attractor can emit a scent. The scent can be associated with an entity that is of interest to a target object, for example, a food item or another target object. The attractor can emit the scent from the entity itself or from a chemical configured to have a scent typically associated with the entity. For example, a strip of bacon can be stored on-board the UAV and the scent of the bacon can be wafted towards the target object. Alternatively, the UAV can have a chemical configured to smell like bacon stored on-board the UAV. The UAV can emit a spray or mist of the chemical to attract the target object.
In some cases, the attractor can be an image that is displayed on a screen carried by the UAV. The image can be a static image or a video. The image can depict an owner of the target object. The image of the owner can be a static image or a video. The image may be accompanied by an audio recording or a live audio stream of the owner. The UAV can comprise an audio player (e.g. speaker or microphone) that can play the user’s voice to the target object while the target object is in locomotion and is attached to the UAV via a leash or while the target object is being guided by the UAV without a leash. The user’s voice can be transmitted from a user device to the UAV in real time. A user can be the owner of the target object. In some cases the user’s voice can be prerecorded. A combination of edible treats and images can be used in combination or consecutively to attract the target object.
The UAV can carry the attractor outside of the body of the UAV. The attractor can be connected to the UAV by a support structure. The attractor can be moved vertically and/or horizontally relative to the UAV. In some cases, the attractor can rotate relative to the UAV. The UAV can display the attractor by dangling the attractor at or near a head level of the target object. FIG. 9 shows an example of a UAV 901 guiding a target object 902 with an attractor 903. The UAV 901 can comprise one or more on-board vision sensors. The vision sensors on-board the UAV 901 can be configured to determine, with the aid of one or more processors, the location of the target object 902 relative to the UAV 901. The UAV can be instructed by the one or more processors to adjust or maintain the flight speed of the UAV such that the UAV remains within a proximity of the target object. The proximity to the target object can be set to a distance that is sufficiently close for the target object to perceive the attractor 903. One or more vision sensors can be configured to determine the trajectory of the locomotion of the target object 902 relative to the UAV 901. The determined trajectory of the locomotion of the target object relative to the UAV can result in an instruction to the UAV to adjust or maintain the direction of the UAV flight to remain within a proximity of the target object. The proximity to the target object can be set to a distance that is sufficiently close for the target object to perceive the attractor 903. In some cases the vision sensors can determine the location of a target object and/or the trajectory of locomotion of the target object and cause a movement of the attractor 903. The attractor 903 can be moved to increase or decrease interaction of the target object with the attractor. For example, if a target object is jumping upward, an attractor can be raised to avoid contact with the target object. In another example, a target object can move to a side of the UAV and the attractor may rotate relative to the UAV to remain in a line of sight with the target object.
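The sketch below illustrates, under hypothetical speeds, ranges, and angles, how the determined location and trajectory of the target object could translate into flight-speed adjustments and attractor movements of the kind described above.

```python
# Illustrative sketch: keeping the UAV within perception range of the attractor
# and moving the attractor in response to the target object's behaviour.
def adjust_flight_speed(distance_to_target_m, perception_range_m=2.5,
                        cruise_speed=1.5, max_speed=4.0):
    """Slow down when the target falls behind, lead on when it keeps up."""
    if distance_to_target_m > perception_range_m:
        return max(0.5, cruise_speed - 0.5)    # let the target object catch up
    return min(max_speed, cruise_speed + 0.5)  # continue leading along the route

def adjust_attractor(target_is_jumping, target_bearing_deg, attractor_bearing_deg):
    """Raise the attractor out of reach and rotate it to stay in the line of sight."""
    commands = []
    if target_is_jumping:
        commands.append("raise_attractor")
    if abs(target_bearing_deg - attractor_bearing_deg) > 15:
        commands.append("rotate_attractor_toward_target")
    return commands

print(adjust_flight_speed(3.4))            # target object is lagging -> slow down
print(adjust_attractor(True, 40.0, 0.0))   # raise and rotate the attractor
```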
In some cases the attractor can be an edible treat. A target object can be initially attracted to an edible treat. After a period of time the target object can become frustrated or discouraged if the edible treat is not provided for consumption. The UAV can be configured to periodically provide the target object with at least a fraction of an edible treat that is being used as an attractor. The fraction of the edible treat can be provided as a reward for a positive behavior or action and/or to keep the attention of the target object while the target object is being guided by the UAV. The UAV can provide the fraction of the edible treat to the target object at fixed intervals, at specified route locations, or whenever the one or more vision sensors detect that the target object appears to be losing interest in the edible-treat attractor. A vision sensor can detect that a target object is losing interest in the edible-treat attractor when, for example, the target object suspends locomotion or wanders away from the location of the UAV by more than a threshold distance.
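A dispensing decision of this kind could be expressed as in the hypothetical sketch below; the interval, stall time, and wander distance are illustrative values only.

```python
# Illustrative sketch: deciding when to dispense a fraction of the edible attractor.
def should_dispense(seconds_since_last, at_route_waypoint,
                    target_stationary_s, distance_from_uav_m,
                    interval_s=300, stall_s=20, wander_m=6.0):
    if seconds_since_last >= interval_s:
        return True   # fixed-interval reward
    if at_route_waypoint:
        return True   # reward at a specified route location
    if target_stationary_s >= stall_s or distance_from_uav_m >= wander_m:
        return True   # vision sensing suggests the target object is losing interest
    return False

print(should_dispense(120, False, 25, 2.0))  # target object has stalled -> dispense
```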
A target object can generate waste while the target object is being guided by the UAV. In some locations it may be impermissible to leave waste generated by the target object in the location where it was generated. A UAV can be configured to guide a target object and to recognize waste generated by the target object. In some cases the UAV can be configured to collect and dispose of waste generated by the target object. A UAV can recognize waste generated by a target object using one or more vision sensors. The vision sensors may be on-board the UAV. The vision sensors can be the same vision sensors used to recognize the target object or the vision sensors can be a second set of sensors that are not used to recognize the target object. The UAV can recognize waste generated by the target object and alert a user (e.g. owner of the target object) that waste has been generated by the target object. The UAV can provide an alert to a user that includes the location of the waste generated by the target object. The vision sensors can be configured to capture an image of the target object and the waste generated by the target object. The image can be a still photograph or a video image. The UAV can comprise one or more processors that are configured to recognize the target object from the image and also recognize waste generated by the target object from the image. The one or more processors can be located on or off-board the UAV. The UAV can further comprise a communication unit configured to send a signal to a user device that alerts the user that the waste has been generated by the target object.
FIG. 10 shows an example of a UAV 1001 that is guiding a target object 1002. The UAV can comprise one or more vision sensors 1003. The one or more vision sensors 1003 can be inside the body of the UAV 1001 or suspended from an outer surface of the UAV 1001 by a support structure 1004. In some cases the vision sensor can be configured to translate and/or rotate independently of the UAV 1001. The target object 1002 can be attached to the UAV 1001 by a physical attachment. In some cases the target object 1002 may not be attached to the UAV 1001. The vision sensor 1003 can be configured to recognize a target object 1002 and collect an image of the target object 1002. The vision sensor 1003 can be further configured to recognize waste 1005 generated by a target object with the aid of one or more processors. The one or more processors 1007 can be on-board the UAV. The vision sensors can capture an image of the waste. The images captured by the vision sensor can be stored on a memory storage device 1006. The memory storage device can be on or off-board the UAV. The one or more processors 1007 can be configured to recognize waste generated by the target object from one or more images of the waste provided by the vision sensor. The UAV can further comprise a communication unit configured to send or transmit a signal to a user device to alert the user (e.g. owner of the target object) that the target object has generated waste.
The communication unit can send or transmit a signal or alert to a user device to alert a user that the target object has generated waste. The user device can be a smartphone, tablet, personal computer, smart watch, smart glasses, or a wireless pager. The user device can comprise a user interface. The user interface can be interactive such that a user can control the UAV through the user interface. The alert can be an audio, visual, or tactile (e.g. vibration) alert. The alert can include a location where the waste was generated. The location can be provided in global coordinates. In some cases the location can be displayed on a map provided on the user interface on the user device. The user device can also provide an image of the waste generated by the target object.
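By way of illustration, a waste alert of the kind described above could be packaged as in the sketch below; the detection step is assumed to have already occurred, and the field names and the transmit call are hypothetical stand-ins for the communication unit.

```python
# Illustrative sketch: building and sending a waste alert to a user device.
import json
import time

def build_waste_alert(target_id, latitude, longitude, image_path):
    return {
        "type": "waste_generated",
        "target_object": target_id,
        "location": {"lat": latitude, "lon": longitude},  # global coordinates for a map view
        "image": image_path,                              # still photograph of the waste
        "timestamp": int(time.time()),
    }

def send_alert(alert, transmit=print):
    """'transmit' is a placeholder for the UAV's communication unit."""
    transmit(json.dumps(alert))

send_alert(build_waste_alert("ID-042", 22.5431, 114.0579, "/media/waste_0012.jpg"))
```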
A user can receive alerts about the location of the target object, behavior of the target object, and the location of waste generated by the target object. In some cases, a user can be the owner of the target object. Alternatively a user can be a waste removal professional. A waste removal professional can be a friend of a user, an acquaintance of a user, a volunteer, or an employee hired by the user. A waste removal professional can be any human that removes waste generated by the target object. A communication unit on-board the UAV can provide alerts to a waste removal professional about the time and/or location of waste generation by the target object. A waste removal professional can be contracted by an owner of the target object to dispose of waste generated by a target object. In some cases, a waste removal professional can be a volunteer. The owner of the target object can provide an item of value (e.g. currency, credit, or commodity) in exchange for removal of the waste generated by the target object. The waste removal professional can be compensated with a flat weekly, monthly, quarterly, bi-yearly or yearly rate. In some cases the waste removal professional can be compensated per waste disposal.
FIG. 11 shows an example of an alert from a UAV 1101 indicating that a target object 1102 has generated waste 1103. The alert can include the exact location of the waste or a general region in which the target object generated the waste. The alert can be transmitted from the UAV to a user device 1105 in communication with the UAV. The user device can be a computer, smart phone, tablet, smart watch, or smart glasses. The waste location or region of the waste location can be provided in relative or global coordinates. The alert can be provided only to waste removal professionals within a specified radius of the waste generation location 1104. A waste removal professional 1107 outside of the region 1104 may not receive an alert on their electronic device 1110. In some cases, the location can be provided on a map displayed on a user interface on a user device 1105. The alert can be provided to either or both of an owner 1106 of a target object and a waste removal professional 1109 within a specified radius of the waste generation location 1104. The owner of the target object or the waste removal professional can be set as a default to receive an alert to collect and/or dispose of the waste. In some cases where the owner of the target object is the default recipient, the owner can choose to divert the alert to a waste removal professional using their electronic device 1111. The owner may choose to divert the alert to a waste removal professional when they do not want or are not able to leave their home 1108, office, store, school, or other location to collect and/or dispose of the waste. In some cases the owner of the target object can control the alerts such that a waste removal professional is the default receiver of a waste alert during specified hours. For example, an owner can be the default recipient during morning and evening hours and a waste removal professional can be the default recipient during the middle of the day. The recipient that receives the alert (e.g. the owner or the waste removal professional) can travel to the location or the region in which the waste is generated and remove, collect, or dispose of the waste.
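Recipient selection of the kind described for FIG. 11 could look like the following hypothetical sketch, in which the owner is the default recipient in the morning and evening and nearby waste removal professionals are alerted during the middle of the day; the schedule, radius, and local coordinates are illustrative assumptions.

```python
# Illustrative sketch: routing a waste alert to the owner or to waste removal
# professionals within a specified radius of the waste location.
import math

def within_radius(recipient_xy, waste_xy, radius_m=500.0):
    return math.hypot(recipient_xy[0] - waste_xy[0],
                      recipient_xy[1] - waste_xy[1]) <= radius_m

def select_recipients(hour_of_day, professionals_xy, waste_xy):
    """Owner is the default outside working hours; nearby professionals otherwise."""
    if hour_of_day < 9 or hour_of_day >= 18:
        return ["owner"]
    nearby = [i for i, xy in enumerate(professionals_xy) if within_radius(xy, waste_xy)]
    return [f"professional_{i}" for i in nearby] or ["owner"]

print(select_recipients(13, [(120, 80), (2500, 900)], (100, 100)))  # ['professional_0']
```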
The UAV can be configured to recognize and remove waste generated by the target object. The UAV can capture one or more images of the target object and waste generated by the target object. One or more processors on or off-board the UAV can be configured to recognize the target object from the one or more images of the target object and to recognize waste generated by the target object from the one or more images of the waste generated by the target object. The UAV can comprise one or more waste removal units. The waste removal units can be configured to remove waste in response to recognition of the waste generated by the target object. The waste removal unit can include a mechanism configured to extend from the UAV to remove the waste, for example a mechanical arm. The mechanism configured to extend from the UAV to remove the waste can be an extendible structure with a scoop, shovel, or disposable container (e.g. plastic bag) at a terminal end configured to collect, remove, and/or dispose of waste generated by the target object. The UAV can be configured to collect waste generated by the target object and store the waste until it can be disposed of in a disposal container (e.g. trashcan, landfill, dumpster, or compost collector). The UAV can comprise one or more vision sensors that can capture images of the environment in the vicinity of the UAV. The images of the environment can be analyzed by one or more processors on-board or off-board the UAV that can be configured to recognize a disposal container. In response to locating the disposal container the UAV can dispose of the waste in the disposal container. If the disposal container is located outside of a permissible area of travel for the target object or inside of an impermissible area for travel of the target object the UAV can continue to store the waste until a disposal container inside of a permissible area or outside of an impermissible area is located.
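Whether a recognized disposal container lies inside a permissible area (or outside an impermissible area) can be tested with a standard point-in-polygon check, as in the sketch below; the area vertices and coordinates are hypothetical.

```python
# Illustrative sketch: ray-casting point-in-polygon test for a disposal container
# location against a permissible area before disposing of stored waste.
def inside_polygon(point, polygon):
    """Return True if point (x, y) lies inside the polygon given as a vertex list."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

permissible_area = [(0, 0), (100, 0), (100, 80), (0, 80)]
print(inside_polygon((40, 30), permissible_area))   # True  -> dispose of the waste here
print(inside_polygon((150, 30), permissible_area))  # False -> keep storing the waste
```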
The systems, devices, and methods described herein can be applied to a wide variety of movable objects. As previously mentioned, any description herein of a UAV may apply to and be used for any movable object. Any description herein of an aerial vehicle may apply specifically to UAVs. A movable object of the present invention can be configured to move within any suitable environment, such as in air (e.g., a fixed-wing aircraft, a rotary-wing aircraft, or an aircraft having neither fixed wings nor rotary wings), in water (e.g., a ship or a submarine), on ground (e.g., a motor vehicle, such as a car, truck, bus, van, motorcycle, bicycle; a movable structure or frame such as a stick, fishing pole; or a train), under the ground (e.g., a subway), in space (e.g., a spaceplane, a satellite, or a probe), or any combination of these environments. The movable object can be a vehicle, such as a vehicle described elsewhere herein. In some embodiments, the movable object can be carried by a living subject, or take off from a living subject, such as a human or an animal. Suitable animals can include avines, canines, felines, equines, bovines, ovines, porcines, delphines, rodents, or insects.
The movable object may be capable of moving freely within the environment with respect to six degrees of freedom (e.g., three degrees of freedom in translation and three degrees of freedom in rotation). Alternatively, the movement of the movable object can be constrained with respect to one or more degrees of freedom, such as by a predetermined path, track, or orientation. The movement can be actuated by any suitable actuation mechanism, such as an engine or a motor. The actuation mechanism of the movable object can be powered by any suitable energy source, such as electrical energy, magnetic energy, solar energy, wind energy, gravitational energy, chemical energy, nuclear energy, or any suitable combination thereof. The movable object may be self-propelled via a propulsion system, as described elsewhere herein. The propulsion system may optionally run on an energy source, such as electrical energy, magnetic energy, solar energy, wind energy, gravitational energy, chemical energy, nuclear energy, or any suitable combination thereof. Alternatively, the movable object may be carried by a living being.
In some instances, the movable object can be an aerial vehicle. For example, aerial vehicles may be fixed-wing aircraft (e.g., airplane, gliders), rotary-wing aircraft (e.g., helicopters, rotorcraft), aircraft having both fixed wings and rotary wings, or aircraft having neither (e.g., blimps, hot air balloons). An aerial vehicle can be self-propelled, such as self-propelled through the air. A self-propelled aerial vehicle can utilize a propulsion system, such as a propulsion system including one or more engines, motors, wheels, axles, magnets, rotors, propellers, blades, nozzles, or any suitable combination thereof. In some instances, the propulsion system can be used to enable the movable object to take off from a surface, land on a surface, maintain its current position and/or orientation (e.g., hover), change orientation, and/or change position.
The movable object can be controlled remotely by a user or controlled locally by an occupant within or on the movable object. The movable object may be controlled remotely via an occupant within a separate vehicle. In some embodiments, the movable object is an unmanned movable object, such as a UAV. An unmanned movable object, such as a UAV, may not have an occupant on-board the movable object. The movable object can be controlled by a human or an autonomous control system (e.g., a computer control system), or any suitable combination thereof. The movable object can be an autonomous or semi-autonomous robot, such as a robot configured with an artificial intelligence.
The movable object can have any suitable size and/or dimensions. In some embodiments, the movable object may be of a size and/or dimensions to have a human occupant within or on the vehicle. Alternatively, the movable object may be of size and/or dimensions smaller than that capable of having a human occupant within or on the vehicle. The movable object may be of a size and/or dimensions suitable for being lifted or carried by a human. Alternatively, the movable object may be larger than a size and/or dimensions suitable for being lifted or carried by a human. In some instances, the movable object may have a maximum dimension (e.g., length, width, height, diameter, diagonal) of less than or equal to about: 2 cm, 5 cm, 10 cm, 50 cm, 1 m, 2 m, 5 m, or 10 m. The maximum dimension may be greater than or equal to about: 2 cm, 5 cm, 10 cm, 50 cm, 1 m, 2 m, 5 m, or 10 m. For example, the distance between shafts of opposite rotors of the movable object may be less than or equal to about: 2 cm, 5 cm, 10 cm, 50 cm, 1 m, 2 m, 5 m, or 10 m. Alternatively, the distance between shafts of opposite rotors may be greater than or equal to about: 2 cm, 5 cm, 10 cm, 50 cm, 1 m, 2 m, 5 m, or 10 m.
In some embodiments, the movable object may have a volume of less than 100 cm x 100 cm x 100 cm, less than 50 cm x 50 cm x 30 cm, or less than 5 cm x 5 cm x 3 cm. The total volume of the movable object may be less than or equal to about: 1 cm3, 2 cm3, 5 cm3, 10 cm3, 20 cm3, 30 cm3, 40 cm3, 50 cm3, 60 cm3, 70 cm3, 80 cm3, 90 cm3, 100 cm3, 150 cm3, 200 cm3, 300 cm3, 500 cm3, 750 cm3, 1000 cm3, 5000 cm3, 10,000 cm3, 100,000 cm3, 1 m3, or 10 m3. Conversely, the total volume of the movable object may be greater than or equal to about: 1 cm3, 2 cm3, 5 cm3, 10 cm3, 20 cm3, 30 cm3, 40 cm3, 50 cm3, 60 cm3, 70 cm3, 80 cm3, 90 cm3, 100 cm3, 150 cm3, 200 cm3, 300 cm3, 500 cm3, 750 cm3, 1000 cm3, 5000 cm3, 10,000 cm3, 100,000 cm3, 1 m3, or 10 m3.
In some embodiments, the movable object may have a footprint (which may refer to the lateral cross-sectional area encompassed by the movable object) less than or equal to about: 32,000 cm2, 20,000 cm2, 10,000 cm2, 1,000 cm2, 500 cm2, 100 cm2, 50 cm2, 10 cm2, or 5 cm2. Conversely, the footprint may be greater than or equal to about: 32,000 cm2, 20,000 cm2, 10,000 cm2, 1,000 cm2, 500 cm2, 100 cm2, 50 cm2, 10 cm2, or 5 cm2.
In some instances, the movable object may weigh no more than 1000 kg. The weight of the movable object may be less than or equal to about: 1000 kg, 750 kg, 500 kg, 200 kg, 150 kg, 100 kg, 80 kg, 70 kg, 60 kg, 50 kg, 45 kg, 40 kg, 35 kg, 30 kg, 25 kg, 20 kg, 15 kg, 12 kg, 10 kg, 9 kg, 8 kg, 7 kg, 6 kg, 5 kg, 4 kg, 3 kg, 2 kg, 1 kg, 0.5 kg, 0.1 kg, 0.05 kg, or 0.01 kg. Conversely, the weight may be greater than or equal to about: 1000 kg, 750 kg, 500 kg, 200 kg, 150 kg, 100 kg, 80 kg, 70 kg, 60 kg, 50 kg, 45 kg, 40 kg, 35 kg, 30 kg, 25 kg, 20 kg, 15 kg, 12 kg, 10 kg, 9 kg, 8 kg, 7 kg, 6 kg, 5 kg, 4 kg, 3 kg, 2 kg, 1 kg, 0.5 kg, 0.1 kg, 0.05 kg, or 0.01 kg.
In some embodiments, a movable object may be small relative to a load carried by the movable object. The load may include a payload and/or a carrier, as described in further detail elsewhere herein. In some examples, a ratio of a movable object weight to a load weight may be greater than, less than, or equal to about 1:1. Optionally, a ratio of a carrier weight to a load weight may be greater than, less than, or equal to about 1:1. When desired, the ratio of a movable object weight to a load weight may be less than or equal to: 1:2, 1:3, 1:4, 1:5, 1:10, or even less. Conversely, the ratio of a movable object weight to a load weight can also be greater than or equal to: 2:1, 3:1, 4:1, 5:1, 10:1, or even greater.
In some embodiments, the movable object may have low energy consumption. For example, the movable object may use less than about: 5 W/h, 4 W/h, 3 W/h, 2 W/h, 1 W/h, or less. In some instances, a carrier of the movable object may have low energy consumption. For example, the carrier may use less than about: 5 W/h, 4 W/h, 3 W/h, 2 W/h, 1 W/h, or less. Optionally, a payload of the movable object may have low energy consumption, such as less than about: 5 W/h, 4 W/h, 3 W/h, 2 W/h, 1 W/h, or less.
FIG. 12 illustrates an unmanned aerial vehicle (UAV) 1200, in accordance with embodiments of the present invention. The UAV may be an example of a movable object as described herein. The UAV 1200 can include a propulsion system having four rotors 1202, 1204, 1206, and 1208. Any number of rotors may be provided (e.g., one, two, three, four, five, six, or more). The rotors, rotor assemblies, or other propulsion systems of the unmanned aerial vehicle may enable the unmanned aerial vehicle to hover/maintain position, change orientation, and/or change location. The distance between shafts of opposite rotors can be any suitable length 1210. For example, the length 1210 can be less than or equal to 2 m, or less than or equal to 5 m. In some embodiments, the length 1210 can be within a range from 40 cm to 1 m, from 10 cm to 2 m, or from 5 cm to 5 m. Any description herein of a UAV may apply to a movable object, such as a movable object of a different type, and vice versa. The UAV may use an assisted takeoff system or method as described herein.
In some embodiments, the movable object can be configured to carry a load. The load can include one or more of passengers, cargo, equipment, instruments, and the like. The load can be provided within a housing. The housing may be separate from a housing of the movable object, or be part of a housing for a movable object. Alternatively, the load can be provided with a housing while the movable object does not have a housing. Alternatively, portions of the load or the entire load can be provided without a housing. The load can be rigidly fixed relative to the movable object. Optionally, the load can be movable relative to the movable object (e.g., translatable or rotatable relative to the movable object). The load can include a payload and/or a carrier, as described elsewhere herein.
In some embodiments, the movement of the movable object, carrier, and payload relative to a fixed reference frame (e.g., the surrounding environment) and/or to each other, can be controlled by a terminal. The terminal can be a remote control device at a location distant from the movable object, carrier, and/or payload. The terminal can be disposed on or affixed to a support platform. Alternatively, the terminal can be a handheld or wearable device. For example, the terminal can include a smartphone, tablet, laptop, computer, glasses, gloves, helmet, microphone, or suitable combinations thereof. The terminal can include a user interface, such as a keyboard, mouse, joystick, touchscreen, or display. Any suitable user input can be used to interact with the terminal, such as manually entered commands, voice control, gesture control, or position control (e.g., via a movement, location or tilt of the terminal).
The terminal can be used to control any suitable state of the movable object, carrier, and/or payload. For example, the terminal can be used to control the position and/or orientation of the movable object, carrier, and/or payload relative to a fixed reference frame and/or to each other. In some embodiments, the terminal can be used to control individual elements of the movable object, carrier, and/or payload, such as the actuation assembly of the carrier, a sensor of the payload, or an emitter of the payload. The terminal can include a wireless communication device adapted to communicate with one or more of the movable object, carrier, or payload.
The terminal can include a suitable display unit for viewing information of the movable object, carrier, and/or payload. For example, the terminal can be configured to display information of the movable object, carrier, and/or payload with respect to position, translational velocity, translational acceleration, orientation, angular velocity, angular acceleration, or any suitable combinations thereof. In some embodiments, the terminal can display information provided by the payload, such as data provided by a functional payload (e.g., images recorded by a camera or other image capturing device).
Optionally, the same terminal may both control the movable object, carrier, and/or payload, or a state of the movable object, carrier and/or payload, as well as receive and/or display information from the movable object, carrier and/or payload. For example, a terminal may control the positioning of the payload relative to an environment, while displaying image data captured by the payload, or information about the position of the payload. Alternatively, different terminals may be used for different functions. For example, a first terminal may control movement or a state of the movable object, carrier, and/or payload while a second terminal may receive and/or display information from the movable object, carrier, and/or payload. For example, a first terminal may be used to control the positioning of the payload relative to an environment while a second terminal displays image data captured by the payload. Various communication modes may be utilized between a movable object and an integrated terminal that both controls the movable object and receives data, or between the movable object and multiple terminals that both control the movable object and receive data. For example, at least two different communication modes may be formed between the movable object and the terminal that both controls the movable object and receives data from the movable object.
FIG. 13 illustrates a movable object 1300 including a carrier 1302 and a payload 1304, in accordance with embodiments. Although the movable object 1300 is depicted as an aircraft, this depiction is not intended to be limiting, and any suitable type of movable object can be used, as previously described herein. One of skill in the art would appreciate that any of the embodiments described herein in the context of aircraft systems can be applied to any suitable movable object (e.g., a UAV). In some instances, the payload 1304 may be provided on the movable object 1300 without requiring the carrier 1302. The movable object 1300 may include propulsion mechanisms 1306, a sensing system 1308, and a communication system 1310.
The propulsion mechanisms 1306 can include one or more of rotors, propellers, blades, engines, motors, wheels, axles, magnets, or nozzles, as previously described. The movable object may have one or more, two or more, three or more, or four or more propulsion mechanisms. The propulsion mechanisms may all be of the same type. Alternatively, one or more propulsion mechanisms can be different types of propulsion mechanisms. The propulsion mechanisms 1306 can be mounted on the movable object 1300 using any suitable means, such as a support element (e.g., a drive shaft) as described elsewhere herein. The propulsion mechanisms 1306 can be mounted on any suitable portion of the movable object 1300, such as on the top, bottom, front, back, sides, or suitable combinations thereof.
In some embodiments, the propulsion mechanisms 1306 can enable the movable object 1300 to take off vertically from a surface or land vertically on a surface without requiring any horizontal movement of the movable object 1300 (e.g., without traveling down a runway). Optionally, the propulsion mechanisms 1306 can be operable to permit the movable object 1300 to hover in the air at a specified position and/or orientation. One or more of the propulsion mechanisms 1306 may be controlled independently of the other propulsion mechanisms. Alternatively, the propulsion mechanisms 1306 can be configured to be controlled simultaneously. For example, the movable object 1300 can have multiple horizontally oriented rotors that can provide lift and/or thrust to the movable object. The multiple horizontally oriented rotors can be actuated to provide vertical takeoff, vertical landing, and hovering capabilities to the movable object 1300. In some embodiments, one or more of the horizontally oriented rotors may spin in a clockwise direction, while one or more of the horizontally oriented rotors may spin in a counterclockwise direction. For example, the number of clockwise rotors may be equal to the number of counterclockwise rotors. The rotation rate of each of the horizontally oriented rotors can be varied independently in order to control the lift and/or thrust produced by each rotor, and thereby adjust the spatial disposition, velocity, and/or acceleration of the movable object 1300 (e.g., with respect to up to three degrees of translation and up to three degrees of rotation).
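For illustration, a conventional quadrotor mixer shows how independently varied rotor rates map thrust, roll, pitch, and yaw commands onto four rotors; the sign convention and limits below are hypothetical and are not specific to the UAV 1200 or the movable object 1300.

```python
# Illustrative sketch: quadrotor mixing. Diagonally opposite rotors spin in the same
# direction and adjacent rotors in opposite directions; varying individual rotor
# rates adjusts lift, attitude, and yaw torque.
def mix(thrust, roll, pitch, yaw):
    """Roll > 0 rolls right, pitch > 0 raises the nose, yaw > 0 yaws clockwise (seen from above)."""
    m_fl = thrust + roll - pitch - yaw   # front-left,  clockwise
    m_fr = thrust - roll - pitch + yaw   # front-right, counterclockwise
    m_rr = thrust - roll + pitch - yaw   # rear-right,  clockwise
    m_rl = thrust + roll + pitch + yaw   # rear-left,   counterclockwise
    return [max(0.0, min(1.0, m)) for m in (m_fl, m_fr, m_rr, m_rl)]

print(mix(thrust=0.5, roll=0.05, pitch=0.0, yaw=-0.02))  # slight right roll with a small yaw correction
```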
The sensing system 1308 can include one or more sensors that may sense the spatial disposition, velocity, and/or acceleration of the movable object 1300 (e.g., with respect to up to three degrees of translation and up to three degrees of rotation). The one or more sensors can include global positioning system (GPS) sensors, motion sensors, inertial sensors, proximity sensors, or image sensors. The sensing data provided by the sensing system 1308 can be used to control the spatial disposition, velocity, and/or orientation of the movable object 1300 (e.g., using a suitable processing unit and/or control module, as described below). Alternatively, the sensing system 1308 can be used to provide data regarding the environment surrounding the movable object, such as weather conditions, proximity to potential obstacles, location of geographical features, location of manmade structures, and the like.
The communication system 1310 enables communication with terminal 1312 having a communication system 1314 via wireless signals 1316. The communication systems 1310, 1314 may include any number of transmitters, receivers, and/or transceivers suitable for wireless communication. The communication may be one-way communication, such that data can be transmitted in only one direction. For example, one-way communication may involve only the movable object 1300 transmitting data to the terminal 1312, or vice-versa. The data may be transmitted from one or more transmitters of the communication system 1310 to one or more receivers of the communication system 1314, or vice-versa. Alternatively, the communication may be two-way communication, such that data can be transmitted in both directions between the movable object 1300 and the terminal 1312. The two-way communication can involve transmitting data from one or more transmitters of the communication system 1310 to one or more receivers of the communication system 1314, and vice-versa.
In some embodiments, the terminal 1312 can provide control data to one or more of the movable object 1300, carrier 1302, and payload 1304 and receive information from one or more of the movable object 1300, carrier 1302, and payload 1304 (e.g., position and/or motion information of the movable object, carrier or payload; data sensed by the payload such as image data captured by a payload camera). In some instances, control data from the terminal may include instructions for relative positions, movements, actuations, or controls of the movable object, carrier and/or payload. For example, the control data may result in a modification of the location and/or orientation of the movable object (e.g., via control of the propulsion mechanisms 1306), or a movement of the payload with respect to the movable object (e.g., via control of the carrier 1302). The control data from the terminal may result in control of the payload, such as control of the operation of a camera or other image capturing device (e.g., taking still or moving pictures, zooming in or out, turning on or off, switching imaging modes, changing image resolution, changing focus, changing depth of field, changing exposure time, changing viewing angle or field of view). In some instances, the communications from the movable object, carrier and/or payload may include information from one or more sensors (e.g., of the sensing system 1308 or of the payload 1304). The communications may include sensed information from one or more different types of sensors (e.g., GPS sensors, motion sensors, inertial sensor, proximity sensors, or image sensors). Such information may pertain to the position (e.g., location, orientation), movement, or acceleration of the movable object, carrier and/or payload. Such information from a payload may include data captured by the payload or a sensed state of the payload. The control data transmitted by the terminal 1312 can be configured to control a state of one or more of the movable object 1300, carrier 1302, or payload 1304. Alternatively or in combination, the carrier 1302 and payload 1304 can also each include a communication module configured to communicate with terminal 1312, such that the terminal can communicate with and control each of the movable object 1300, carrier 1302, and payload 1304 independently.
In some embodiments, the movable object 1300 can be configured to communicate with another remote device in addition to the terminal 1312, or instead of the terminal 1312. The terminal 1312 may also be configured to communicate with another remote device as well as the movable object 1300. For example, the movable object 1300 and/or terminal 1312 may communicate with another movable object, or a carrier or payload of another movable object. When desired, the remote device may be a second terminal or other computing device (e.g., computer, laptop, tablet, smartphone, or other mobile device). The remote device can be configured to transmit data to the movable object 1300, receive data from the movable object 1300, transmit data to the terminal 1312, and/or receive data from the terminal 1312. Optionally, the remote device can be connected to the Internet or other telecommunications network, such that data received from the movable object 1300 and/or terminal 1312 can be uploaded to a website or server.
FIG. 14 is a schematic illustration by way of block diagram of a system 1400 for controlling a movable object, in accordance with embodiments. The system 1400 can be used in combination with any suitable embodiment of the systems, devices, and methods disclosed herein. The system 1400 can include a sensing module 1402, processing unit 1404, non-transitory computer readable medium 1406, control module 1408, and communication module 1410.
The sensing module 1402 can utilize different types of sensors that collect information relating to the movable objects in different ways. Different types of sensors may sense different types of signals or signals from different sources. For example, the sensors can include inertial sensors, GPS sensors, proximity sensors (e.g., lidar), or vision/image sensors (e.g., a camera). The sensing module 1402 can be operatively coupled to a processing unit 1404 having a plurality of processors. In some embodiments, the sensing module can be operatively coupled to a transmission module 1412 (e.g., a Wi-Fi image transmission module) configured to directly transmit sensing data to a suitable external device or system. For example, the transmission module 1412 can be used to transmit images captured by a camera of the sensing module 1402 to a remote terminal.
The processing unit 1404 can have one or more processors, such as a programmable processor (e.g., a central processing unit (CPU)). The processing unit 1404 can be operatively coupled to a non-transitory computer readable medium 1406. The non-transitory computer readable medium 1406 can store logic, code, and/or program instructions executable by the processing unit 1404 for performing one or more steps. The non-transitory computer readable medium can include one or more memory units (e.g., removable media or external storage such as an SD card or random access memory (RAM)). In some embodiments, data from the sensing module 1402 can be directly conveyed to and stored within the memory units of the non-transitory computer readable medium 1406. The memory units of the non-transitory computer readable medium 1406 can store logic, code and/or program instructions executable by the processing unit 1404 to perform any suitable embodiment of the methods described herein. For example, the processing unit 1404 can be configured to execute instructions causing one or more processors of the processing unit 1404 to analyze sensing data produced by the sensing module. The memory units can store sensing data from the sensing module to be processed by the processing unit 1404. In some embodiments, the memory units of the non-transitory computer readable medium 1406 can be used to store the processing results produced by the processing unit 1404.
In some embodiments, the processing unit 1404 can be operatively coupled to a control module 1408 configured to control a state of the movable object. For example, the control module 1408 can be configured to control the propulsion mechanisms of the movable object to adjust the spatial disposition, velocity, and/or acceleration of the movable object with respect to six degrees of freedom. Alternatively or in combination, the control module 1408 can control one or more of a state of a carrier, payload, or sensing module.
The processing unit 1404 can be operatively coupled to a communication module 1410 configured to transmit and/or receive data from one or more external devices (e.g., a terminal, display device, or other remote controller). Any suitable means of communication can be used, such as wired communication or wireless communication. For example, the communication module 1410 can utilize one or more of local area networks (LAN), wide area networks (WAN), infrared, radio, WiFi, point-to-point (P2P) networks, telecommunication networks, cloud communication, and the like. Optionally, relay stations, such as towers, satellites, or mobile stations, can be used. Wireless communications can be proximity dependent or proximity independent. In some embodiments, line-of-sight may or may not be required for communications. The communication module 1410 can transmit and/or receive one or more of sensing data from the sensing module 1402, processing results produced by the processing unit 1404, predetermined control data, user commands from a terminal or remote controller, and the like.
The components of the system 1400 can be arranged in any suitable configuration. For example, one or more of the components of the system 1400 can be located on the movable object, carrier, payload, terminal, sensing system, or an additional external device in communication with one or more of the above. Additionally, although FIG. 14 depicts a single processing unit 1404 and a single non-transitory computer readable medium 1406, one of skill in the art would appreciate that this is not intended to be limiting, and that the system 1400 can include a plurality of processing units and/or non-transitory computer readable media. In some embodiments, one or more of the plurality of processing units and/or non-transitory computer readable media can be situated at different locations, such as on the movable object, carrier, payload, terminal, sensing module, additional external device in communication with one or more of the above, or suitable combinations thereof, such that any suitable aspect of the processing and/or memory functions performed by the system 1400 can occur at one or more of the aforementioned locations.
While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

Claims (205)

  1. A method of guiding a target object, said method comprising:
    receiving a user input, through a user device, that defines a target area, said target area comprising (1) a permissible area for the target object to travel, or (2) an impermissible area where the target object is not permitted to travel;
    receiving, from a movable object that guides the target object, a signal indicative of a location of the movable object;
    receiving an indicator of the movable object exiting the permissible area for the target object to travel or an indicator of the movable object entering the impermissible area where the target object is not permitted to travel, said indicator generated based on the location of the movable object and the target area; and
    generating a movable object operation, in response to the indicator.
  2. The method of claim 1, wherein the target object is an animal.
  3. The method of claim 1, wherein the movable object is an unmanned aerial vehicle (UAV).
  4. The method of claim 3, wherein the movable object operation includes controlling flight of the UAV to control movement of the target object.
  5. The method of claim 3, wherein the movable object operation includes alerting the user that the UAV is exiting the permissible area or entering the impermissible area.
  6. The method of claim 1, wherein the user input comprises global coordinates that define the permissible area or the impermissible area.
  7. The method of claim 1, wherein the user input comprises an image or outline on a map defining the boundaries of the permissible area or the impermissible area.
  8. The method of claim 3, further comprising guiding the target object using the UAV, wherein the UAV is physically attached to the target object.
  9. The method of claim 8, wherein the UAV is attached to the target object by a leash that is attached to a collar of the target object.
  10. The method of claim 3, wherein the UAV is a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically.
  11. The method of claim 3, wherein the UAV comprises a location device that transmits information about the UAV’s location.
  12. The method of claim 11, wherein the location device is a GPS sensor.
  13. The method of claim 1, wherein the indicator of exiting the permissible area is received when the target object exits the permissible area.
  14. The method of claim 1, wherein the indicator of exiting the permissible area is received when the target object is within a predetermined threshold distance of a boundary of the permissible area and the target object is heading in the direction of the boundary.
  15. The method of claim 14, wherein the target object is heading in the direction of the boundary at a speed exceeding a threshold speed.
  16. The method of claim 1, wherein the indicator of entering the impermissible area is received when the target object enters the impermissible area.
  17. The method of claim 1, wherein the indicator of entering the impermissible area is received when the target object is within a predetermined threshold distance of a boundary of the impermissible area and the target object is heading in the direction of the boundary.
  18. The method of claim 17, wherein the target object is heading in the direction of the boundary at a speed exceeding a threshold speed.
  19. The method of claim 1, wherein the movable object operation includes playing the user’s voice to the target object when the indicator of exiting the permissible area or entering the impermissible area is received.
  20. The method of claim 19, further comprising transmitting the user’s voice from the user device to the UAV in real-time.
  21. The method of claim 19, wherein the user’s voice is a pre-recording.
  22. The method of claim 19, wherein the movable object operation includes delivering an electric shock to the target object if the target object does not respond to the user’s voice within a predetermined period of time.
  23. The method of claim 3, wherein the user interface is a screen of the UAV and the alert is provided visually.
  24. The method of claim 3, wherein the user interface is a speaker of the UAV and the alert is provided audibly.
  25. A system for guiding a target object, said system comprising: one or more processors, individually or collectively, configured to:
    (a) receive a signal indicative of a user input that defines a target area, said target area comprising (1) a permissible area for the target object to travel, or (2) an impermissible area where the target object is not permitted to travel;
    (b) receive a signal indicative of a location of a movable object that guides the target object;
    (c) determine, based on the target area and the signal indicative of the location of the movable object, when the movable object is exiting the permissible area for the target object to travel or when the movable object is entering the impermissible area where the target object is not permitted to travel; and
    (d) determine a movable object operation, in response to the determination of whether the movable object is exiting the permissible area for the target object to travel or entering the impermissible area where the target object is not permitted to travel.
  26. The system of claim 25, wherein the target object is an animal.
  27. The system of claim 26, wherein the movable object is an unmanned aerial vehicle (UAV).
  28. The system of claim 27, wherein the movable object operation includes controlling flight of the UAV to control movement of the target object.
  29. The system of claim 27, wherein the movable object operation includes alerting the user that the UAV is exiting the permissible area or entering the impermissible area.
  30. The system of claim 25, wherein the user input comprises global coordinates that define the permissible area or the impermissible area.
  31. The system of claim 25, wherein the user input comprises an image or outline on a map defining the boundaries of the permissible area or the impermissible area.
  32. The system of claim 27, wherein the UAV is physically attached to the target object while the UAV is guiding the target object.
  33. The system of claim 32, wherein the UAV is attached to the target object by a leash that is attached to a collar of the target object.
  34. The system of claim 27, wherein the UAV is a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically.
  35. The system of claim 27, wherein the UAV comprises a location device that transmits information about the UAV’s location.
  36. The system of claim 35, wherein the location device is a GPS sensor.
  37. The system of claim 25, wherein the indicator of exiting the permissible area is provided when the target object exits the permissible area.
  38. The system of claim 25, wherein the indicator of exiting the permissible area is provided when the target object is within a predetermined threshold distance of a boundary of the permissible area and the target object is heading in the direction of the boundary.
  39. The system of claim 38, wherein the target object is heading in the direction of the boundary at a speed exceeding a threshold speed.
  40. The system of claim 25, wherein the one or more processors are configured to determine that the UAV is entering the impermissible area when the target object enters the impermissible area.
  41. The system of claim 25, wherein the one or more processors are configured to determine that the UAV is entering the impermissible area when the target object is within a predetermined threshold distance of a boundary of the impermissible area and the target object is heading in the direction of the boundary.
  42. The system of claim 41, wherein the one or more processors are configured to determine that the target object is heading in the direction of the boundary at a speed exceeding a threshold speed.
  43. The system of claim 25, wherein the movable object operation includes playing the user’s voice to the target object when the indicator of exiting the permissible area or entering the impermissible area is received, and the one or more processors are configured to effect the movable object operation.
  44. The system of claim 43, wherein the user’s voice is transmitted from the user device to the UAV in real-time.
  45. The system of claim 43, wherein the user’s voice is a pre-recording.
  46. The system of claim 43, wherein the movable object operation includes delivering an electric shock to the target object if the target object does not respond to the user’s voice within a predetermined period of time.
  47. The system of claim 27, wherein the user interface is a screen of the UAV and the alert is provided visually.
  48. The system of claim 27, wherein the user interface is a speaker of the UAV and the alert is provided audibly.
  49. A method of guiding a target object using a movable object, said method comprising:
    recognizing the target object wearing a collar, with aid of one or more vision sensors on board the movable object;
    automatically attaching, without human aid, the movable object to the collar of the target object using a leash when the target object is recognized; and
    flying the movable object while the target object is attached to the movable object via the leash.
  50. The method of claim 49, wherein the target object is an animal.
  51. The method of claim 49, wherein the movable object is an unmanned aerial vehicle (UAV), and the UAV is flying while the target object is in locomotion.
  52. The method of claim 49, wherein the leash is formed of a flexible or bendable material.
  53. The method of claim 49, further comprising extending or retracting the leash while the movable object is in flight.
  54. The method of claim 49, wherein the leash attaches to the collar of the target object using one or more magnetic connections.
  55. The method of claim 49, wherein the leash attaches to the collar of the target object with aid of a robotic arm.
  56. The method of claim 55, wherein the robotic arm comprises one or more extensions that guide the leash to the collar.
  57. The method of claim 49, further comprising capturing, using the one or more vision sensors, at least one image of the target object wearing the collar.
  58. The method of claim 57, further comprising recognizing, with aid of one or more processors, the target object from the image of the target object.
  59. The method of claim 57, wherein the movable object further comprises one or more processors configured to recognize the target object from the image of the collar.
  60. The method of claim 57, wherein the movable object is a UAV, and further comprising flying the UAV, subsequent to recognizing the target object, to a closer proximity of the target object in order to get into position to automatically attach the UAV to the collar of the target object.
  61. The method of claim 49, wherein flying the movable object includes guiding the target object by pulling on the leash.
  62. The method of claim 61, further comprising comparing a calculation of the target object motion and the movable object motion to determine one or more parameters with which the movable object pulls on the leash.
  63. The method of claim 49, further comprising collecting, using the movable object, an image of the target object while the target object is in locomotion and is attached to the movable object via the leash.
  64. The method of claim 49, further comprising displaying, on a map, the location of the movable object to the user.
  65. The method of claim 49, further comprising playing the user’s voice to the target object while the target object is in locomotion and is attached to the movable object via the leash.
  66. The method of claim 65, wherein the user’s voice is transmitted from the user device to the movable object in real-time.
  67. The method of claim 65, wherein the user’s voice is a pre-recording.
  68. The method of claim 65, wherein the user’s voice is speaking a command to the target object.
  69. A UAV configured to guide a target object, said UAV comprising:
    one or more vision sensors configured to capture an image of the target object wearing a collar;
    one or more processors configured to, individually or collectively, recognize the target object from the image of the target object wearing the collar;
    a leash attachment mechanism configured to automatically attach, without human aid, a leash to the collar of the target object when the target object is recognized; and
    one or more propulsion units configured to permit flight of the UAV while the target object is attached to the UAV via the leash.
  70. The UAV of claim 69, wherein the target object is an animal.
  71. The UAV of claim 69, wherein the UAV is flying while the target object is in locomotion.
  72. The UAV of claim 69, wherein the leash is formed of a flexible or bendable material.
  73. The UAV of claim 69, wherein the leash is extendible or retractable while the UAV is in flight.
  74. The UAV of claim 69, wherein the leash is configured to attach to the collar of the target object using one or more magnetic connections.
  75. The UAV of claim 69, wherein the leash is configured to attach to the collar of the target object with aid of a robotic arm.
  76. The UAV of claim 75, wherein the robotic arm comprises one or more extensions that guide the leash to the collar.
  77. The UAV of claim 69, wherein the one or more vision sensors are configured to capture at least one image of the target object wearing the collar.
  78. The UAV of claim 77, wherein the UAV further comprises one or more processors configured to recognize the target object from the image of the target object.
  79. The UAV of claim 77, wherein the UAV further comprises one or more processors configured to recognize the target object from the image of the collar.
  80. The UAV of claim 77, wherein the one or more processors are configured to, subsequent to recognizing the target object, generate a signal to the one or more propulsion units to effect flight of the UAV to a closer proximity of the target object in order to get into position to automatically attach the UAV to the collar of the target object.
  81. The UAV of claim 69, wherein the UAV is configured to guide the target object by pulling on the leash.
  82. The UAV of claim 81, wherein the one or more processors are configured to compare a calculation of the target object motion and the UAV motion to determine one or more parameters with which the UAV pulls on the leash.
  83. The UAV of claim 69, wherein the one or more vision sensors are configured to collect an image of the target object while the target object is in locomotion and is attached to the UAV via the leash.
  84. The UAV of claim 83, wherein the one or more vision sensors are configured to collect an image of the collar of the target object.
  85. The UAV of claim 69, further comprising one or more speakers configured to play the user’s voice to the target object while the target object is in locomotion and is attached to the UAV via the leash.
  86. The UAV of claim 85, wherein the user’s voice is transmitted from the user device to the UAV in real-time.
  87. The UAV of claim 85, wherein the user’s voice is a pre-recording.
  88. The UAV of claim 85, wherein the user’s voice is speaking a command to the target object.
  89. A method of guiding a target object using a UAV, said method comprising:
    recognizing the target object, with aid of one or more vision sensors on board the UAV;
    automatically displaying, without human aid or intervention, an attractor to the target object when the target object is recognized; and
    flying the UAV while the target object is in locomotion and following the attractor.
  90. The method of claim 89, wherein the target object is an animal.
  91. The method of claim 89, wherein the attractor is an edible treat.
  92. The method of claim 89, further comprising emitting, using the attractor, a selected scent.
  93. The method of claim 89, wherein the UAV displays the attractor by dangling the attractor at or near a head level of the target object.
  94. The method of claim 89, wherein the attractor comprises an image that is displayed on a screen carried by the UAV.
  95. The method of claim 89, wherein the image is a static image.
  96. The method of claim 95, wherein the image is an image of an owner of the target object.
  97. The method of claim 89, wherein the image is a video.
  98. The method of claim 97, wherein the image is a video of the owner of the target object.
  99. The method of claim 89, further comprising determining, using the one or more vision sensors, a location of the target object relative to the UAV and adjusting or maintaining the speed of the UAV flight to remain within a proximity of the target object that is sufficiently close for the target object to perceive the attractor.
  100. The method of claim 89, further comprising determining, using the one or more vision sensors, a trajectory of the locomotion of the target object relative to the UAV and adjusting or maintaining the direction of the UAV flight to remain within a proximity of the target object that is sufficiently close for the target object to perceive the attractor.
  101. The method of claim 89, further comprising capturing at least one image of the target object using the one or more vision sensors.
  102. The method of claim 101, wherein the UAV further comprises one or more processors configured to recognize the target object from the image of the target object.
  103. The method of claim 101, wherein the target object is wearing a collar.
  104. The method of claim 103, wherein the UAV further comprises one or more processors configured to recognize the target object from the image of the collar.
  105. The method of claim 89, further comprising playing the user’s voice to the target object while the target object is in locomotion and is attached to the UAV via the leash.
  106. The method of claim 105, wherein the user’s voice is transmitted from the user device to the UAV in real-time.
  107. The method of claim 105, wherein the user’s voice is a pre-recording.
  108. The method of claim 105, wherein the user’s voice is saying a command to the target object.
  109. A UAV configured to guide a target object, said UAV comprising:
    one or more vision sensors configured to capture an image of the target object wearing a collar;
    one or more processors configured to, individually or collectively, recognize the target object from the image of the target object;
    an attractor display mechanism configured to display, without human aid or intervention, an attractor to the target object when the target object is recognized; and
    one or more propulsion units configured to permit flight of the UAV while the attractor is displayed to the target object.
  110. The UAV of claim 109, wherein the target object is an animal.
  111. The UAV of claim 109, wherein the attractor is an edible treat.
  112. The UAV of claim 109, wherein the attractor emits a selected scent.
  113. The UAV of claim 109, wherein the UAV displays the attractor by dangling the attractor at or near a head level of the target object.
  114. The UAV of claim 109, wherein the attractor comprises an image that is displayed on a screen carried by the UAV.
  115. The UAV of claim 109, wherein the image is a static image.
  116. The UAV of claim 115, wherein the image is an image of an owner of the target object.
  117. The UAV of claim 109, wherein the image is a video.
  118. The UAV of claim 117, wherein the image is a video of the owner of the target object.
  119. The UAV of claim 109, wherein the UAV is further configured to determine, using the one or more vision sensors, a location of the target object relative to the UAV and adjust or maintain the speed of the UAV flight to remain within a proximity of the target object that is sufficiently close for the target object to perceive the attractor.
  120. The UAV of claim 109, wherein the UAV is further configured to determine, using the one or more vision sensors, a trajectory of the locomotion of the target object relative to the UAV and adjust or maintain the direction of the UAV flight to remain within a proximity of the target object that is sufficiently close for the target object to perceive the attractor.
  121. The UAV of claim 109, wherein the one or more vision sensors capture at least one image of the target object.
  122. The UAV of claim 121, wherein the UAV further comprises one or more processors configured to recognize the target object from the image of the target object.
  123. The UAV of claim 121, wherein the target object is wearing a collar.
  124. The UAV of claim 123, wherein the UAV further comprises one or more processors configured to recognize the target object from the image of the collar.
  125. The UAV of claim 109, wherein the UAV further comprises a speaker configured to play the user’s voice to the target object while the target object is in locomotion and is attached to the UAV via the leash.
  126. The UAV of claim 125, wherein the user’s voice is transmitted from the user device to the UAV in real-time.
  127. The UAV of claim 125, wherein the user’s voice is a pre-recording.
  128. The UAV of claim 125, wherein the user’s voice is saying a command to the target object.
  129. A method of guiding a target object, said method comprising:
    providing a UAV that guides the target object, wherein a location of the UAV is known;
    recognizing the target object, with aid of one or more vision sensors on board the UAV;
    recognizing waste generated by the target object, with aid of the one or more vision sensors on board the UAV; and
    alerting the user that the waste has been generated by the target object.
  130. The method of claim 129, wherein the target object is an animal.
  131. The method of claim 130, wherein the animal is a dog or a cat.
  132. The method of claim 129, further comprising providing information to the user about a location where the waste was generated.
  133. The method of claim 129, wherein the UAV further comprises one or more processors configured to recognize the waste from the image of the waste.
  134. The method of claim 129, wherein the user is alerted through a user device comprising a display.
  135. The method of claim 134, wherein the user device is a smartphone, tablet, or a personal computer.
  136. The method of claim 134, wherein the user device displays a map showing the location of where the waste was generated.
  137. The method of claim 134, wherein the user device displays an image of the waste generated by the target object.
  138. The method of claim 129, wherein the UAV guides the target object by being physically attached to the target object.
  139. The method of claim 138, wherein the UAV is attached to the target object by a leash that is attached to a collar of the target object.
  140. The method of claim 129, wherein the UAV guides the target object by displaying an attractor to the target object.
  141. The method of claim 140, wherein the attractor is an edible treat.
  142. The method of claim 129, wherein the user is a target object waste removal professional.
  143. The method of claim 129, wherein the UAV is a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically.
  144. The method of claim 129, wherein the UAV comprises a location device that transmits information about the UAV’s location.
  145. The method of claim 144, wherein the location device is a GPS sensor.
  146. A UAV configured to guide a target object, said UAV comprising:
    one or more vision sensors configured to capture an image of the target object and waste generated by the target object;
    one or more processors configured to, individually or collectively, (1) recognize the target object from the image of the target object, and (2) recognize the waste generated by the target object from the image of the waste generated by the target object;
    a communication unit configured to send a signal to a user device that alerts the user that the waste has been generated by the target object; and
    one or more propulsion units configured to permit flight of the UAV while guiding the target object.
  147. The UAV of claim 146, wherein the target object is an animal.
  148. The UAV of claim 147, wherein the animal is a dog or a cat.
  149. The UAV of claim 146, wherein the UAV is further configured to provide information to the user about a location where the waste was generated.
  150. The UAV of claim 146, wherein the user device comprises a display.
  151. The UAV of claim 150, wherein the user device is a smartphone, tablet, or a personal computer.
  152. The UAV of claim 150, wherein the user device is configured to display a map showing the location of where the waste was generated.
  153. The UAV of claim 150, wherein the user device is configured to display an image of the waste generated by the target object.
  154. The UAV of claim 146, wherein the UAV is configured to guide the target object by being physically attached to the target object.
  155. The UAV of claim 154, wherein the UAV is attached to the target object by a leash that is attached to a collar of the target object.
  156. The UAV of claim 146, wherein the UAV is configured to guide the target object by displaying an attractor to the target object.
  157. The UAV of claim 156, wherein the attractor is an edible treat.
  158. The UAV of claim 146, wherein the user is a target object waste removal professional.
  159. The UAV of claim 146, wherein the UAV is a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically.
  160. The UAV of claim 146, wherein the UAV comprises a location device that transmits information about the UAV’s location.
  161. The UAV of claim 160, wherein the location device is a GPS sensor.
  162. A method of guiding a target object, said method comprising:
    providing a UAV that guides the target object, wherein a location of the UAV is known;
    recognizing the target object, with aid of one or more vision sensors on board the UAV;
    recognizing waste generated by the target object, with aid of the one or more vision sensors on board the UAV; and
    removing the waste in response to recognizing the waste, using the UAV.
  163. The method of claim 162, wherein the target object is an animal.
  164. The method of claim 163, wherein the animal is a dog or a cat.
  165. The method of claim 162, further comprising providing information to the user about a location where the waste was generated.
  166. The method of claim 162, wherein the UAV further comprises one or more processors configured to recognize the waste from the image of the waste.
  167. The method of claim 162, wherein the UAV guides the target object by being physically attached to the target object.
  168. The method of claim 167, wherein the UAV is attached to the target object by a leash that is attached to a collar of the target object.
  169. The method of claim 162, wherein the UAV guides the target object by displaying an attractor to the target object.
  170. The method of claim 169, wherein the attractor is an edible treat.
  171. The method of claim 162, wherein the UAV is a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically.
  172. The method of claim 162, wherein the UAV comprises a location device that transmits information about the UAV’s location.
  173. The method of claim 172, wherein the location device is a GPS sensor.
  174. The method of claim 162, further comprising removing the waste with a mechanical arm.
  175. A UAV configured to guide a target object, said UAV comprising:
    one or more vision sensors configured to capture an image of the target object and waste generated by the target object;
    one or more processors configured to, individually or collectively, (1) recognize the target object from the image of the target object, and (2) recognize the waste generated by the target object from the image of the waste generated by the target object;
    one or more waste removal units, configured to remove the waste in response to the recognition of the waste; and
    one or more propulsion units configured to permit flight of the UAV while guiding the target object.
  176. The UAV of claim 175, wherein the target object is an animal.
  177. The UAV of claim 176, wherein the animal is a dog or a cat.
  178. The UAV of claim 175, wherein the UAV is configured to provide information to the user about a location where the waste was generated.
  179. The UAV of claim 175, wherein the UAV further comprises one or more processors configured to recognize the waste from the image of the waste.
  180. The UAV of claim 175, wherein the UAV guides the target object by being physically attached to the target object.
  181. The UAV of claim 175, wherein the UAV is attached to a leash that is attached to a collar of the target object.
  182. The UAV of claim 175, wherein the UAV guides the target object by displaying an attractor to the target object.
  183. The UAV of claim 182, wherein the attractor is an edible treat.
  184. The UAV of claim 175, wherein the UAV is a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically.
  185. The UAV of claim 175, wherein the UAV comprises a location device that transmits information about the UAV’s location.
  186. The UAV of claim 185, wherein the location device is a GPS sensor.
  187. The UAV of claim 175, wherein the one or more waste removal units include a mechanical arm that extends from the UAV to remove the waste.
  188. A method of guiding a target object, said method comprising:
    receiving a user input, through a user device, that defines a travel route for a UAV to guide the target object;
    guiding the target object using the UAV by flying the UAV along the travel route while the target object is in locomotion, wherein a location of the UAV is known;
    receiving, through the user device while the UAV is guiding the target object along the travel route, a change to the travel route to provide an updated travel route; and
    flying the UAV along the updated travel route.
  189. The method of claim 188, wherein the user input comprises global coordinates that define the travel route.
  190. The method of claim 188, wherein the user input comprises global coordinates that define the updated travel route.
  191. The method of claim 188, wherein the user input comprises an image or line on a map defining the travel route.
  192. The method of claim 188, wherein the user input comprises an image or line on a map defining the updated travel route.
  193. The method of claim 188, wherein the UAV guides the target object by being physically attached to the target object.
  194. The method of claim 193, wherein the UAV is attached to a leash that is attached to a collar of the target object.
  195. The method of claim 188, wherein the target object is an animal.
  196. The method of claim 195, wherein the animal is a dog or a cat.
  197. The method of claim 188, wherein the UAV is a rotorcraft comprising a plurality of rotors that permit the UAV to take off and/or land vertically.
  198. The method of claim 188, wherein the UAV comprises a location device that transmits information about the UAV’s location.
  199. The method of claim 198, wherein the location device is a GPS sensor.
  200. The method of claim 188, further comprising capturing, with aid of one or more vision sensors on board the UAV, an image of the target object.
  201. The method of claim 200, further comprising detecting, with aid of one or more processors, when the target object is deviating from the travel route or the updated travel route based on the image of the target object.
  202. The method of claim 201, further comprising playing the user’s voice to the target object when the target object is deviating from the travel route or the updated travel route.
  203. The method of claim 202, wherein the user’s voice is transmitted from the user device to the UAV in real-time.
  204. The method of claim 202, wherein the user’s voice is a pre-recording.
  205. The method of claim 201, further comprising delivering an electric shock to the target object when the target object deviates from the travel route beyond a predetermined distance.
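
The sketches that follow are editorial illustrations of behavior recited in the claims above; they are not claims and do not describe the applicant's actual implementation. Every function name, library, threshold, and coordinate convention in them is an assumption chosen for illustration. This first sketch relates to the recognition steps of claims 49, 57-59, 77-79 and 101-104 (recognizing the target object wearing a collar with aid of one or more vision sensors): a minimal color-segmentation routine, assuming the OpenCV library and a distinctively colored collar, stands in for whatever recognition method the specification actually uses.

import cv2
import numpy as np

def find_collar(frame_bgr, hsv_lo=(100, 120, 80), hsv_hi=(130, 255, 255), min_area=200):
    # Illustrative stand-in for "recognizing the target object wearing a collar":
    # threshold the camera frame in HSV space and keep the largest blob. The HSV
    # range is a placeholder for a blue collar and would be tuned to the real one.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo, dtype=np.uint8), np.array(hsv_hi, dtype=np.uint8))
    # OpenCV 4.x signature: findContours returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < min_area:
        return None
    m = cv2.moments(largest)
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # pixel centroid of the collar

A detection of this kind could gate the automatic leash attachment of claims 49 and 69 or the attractor display of claims 89 and 109, though the claims themselves do not limit recognition to color segmentation.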
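Claims 13-18 and 37-42 recite triggering the indicator when the tracked object has left the permissible area, or is within a threshold distance of its boundary and heading toward it above a threshold speed. The following is a minimal planar sketch of that test, assuming locally projected x/y coordinates (the claims speak of global coordinates, which would first be projected) and illustrative threshold values.

import math

def _point_in_polygon(pt, poly):
    # Ray-casting point-in-polygon test; poly is a list of (x, y) vertices.
    x, y = pt
    inside = False
    for i in range(len(poly)):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % len(poly)]
        if (y1 > y) != (y2 > y) and x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
            inside = not inside
    return inside

def _distance_to_boundary(pt, poly):
    # Minimum distance from pt to any edge of the polygon.
    def seg_dist(p, a, b):
        (px, py), (ax, ay), (bx, by) = p, a, b
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return math.hypot(px - ax, py - ay)
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))
    return min(seg_dist(pt, poly[i], poly[(i + 1) % len(poly)]) for i in range(len(poly)))

def should_trigger(location, velocity, permissible, threshold_distance=5.0, threshold_speed=1.0):
    # Trigger when outside the permissible area (claim 13), or inside but close to
    # the boundary and heading toward it faster than a threshold speed (claims 14-15).
    if not _point_in_polygon(location, permissible):
        return True
    d = _distance_to_boundary(location, permissible)
    if d > threshold_distance or math.hypot(*velocity) <= threshold_speed:
        return False
    # Heading test: a small step along the velocity reduces the distance to the
    # boundary only if the object is moving toward it.
    ahead = (location[0] + 0.001 * velocity[0], location[1] + 0.001 * velocity[1])
    return _distance_to_boundary(ahead, permissible) < d

The same test works for an impermissible area (claims 16-18) by inverting the containment check, i.e. triggering on approach to, or entry into, the forbidden polygon.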
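Claims 99-100 and 119-120 recite adjusting or maintaining the speed and direction of UAV flight so the UAV stays close enough for the target object to perceive the attractor. One simple way to express that, purely as a sketch, is a proportional velocity command toward the target object whenever it falls outside a perception radius; the gain, radius and speed limit below are illustrative only.

import math

def adjust_follow_velocity(uav_pos, pet_pos, perceive_radius=2.5, max_speed=4.0, gain=0.8):
    # Command a planar velocity that keeps the UAV (and the dangled attractor)
    # within a distance the target object can perceive.
    dx = pet_pos[0] - uav_pos[0]
    dy = pet_pos[1] - uav_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0 or dist <= perceive_radius:
        return (0.0, 0.0)                         # close enough: hold position
    speed = min(max_speed, gain * (dist - perceive_radius))
    return (speed * dx / dist, speed * dy / dist)  # head toward the target object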
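Claims 200-205 recite detecting, from imagery, when the target object deviates from the travel route (or updated route) and responding, first with the user's voice and, past a predetermined distance, with a stronger corrective action. The sketch below measures cross-track distance from the estimated target position to a polyline route; the thresholds and the returned action names are assumptions, not claim language.

import math

def _dist_to_segment(p, a, b):
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def deviation_action(pet_pos, route, voice_distance=3.0, corrective_distance=10.0):
    # route is a polyline of at least two (x, y) waypoints (the travel route or the
    # updated travel route). Pick a response based on how far the target has strayed.
    off_route = min(_dist_to_segment(pet_pos, route[i], route[i + 1])
                    for i in range(len(route) - 1))
    if off_route > corrective_distance:
        return "corrective_action"   # claim 205: beyond a predetermined distance
    if off_route > voice_distance:
        return "play_user_voice"     # claim 202: play the user's voice
    return "none"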
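Claims 129-137 and 146 recite alerting the user, via a communication unit and a user device, that waste has been generated, together with its location and an image. A hypothetical message payload for such an alert might look like the following; the field names and the JSON transport are assumptions rather than anything the claims require.

import json
import time
from dataclasses import dataclass, asdict

@dataclass
class WasteAlert:
    # Illustrative payload a UAV communication unit might send to the user device
    # app: where the waste was detected and which captured frame shows it.
    latitude: float
    longitude: float
    image_id: str
    timestamp: float = 0.0

    def to_json(self):
        payload = asdict(self)
        payload["timestamp"] = payload["timestamp"] or time.time()
        return json.dumps(payload)

A user-device app could then plot the latitude/longitude on a map (claim 136) and show the referenced image (claim 137); for example, WasteAlert(22.54, 113.95, "frame_000123.jpg").to_json() produces a string ready for whatever link the communication unit provides.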
PCT/CN2014/090082 2014-10-31 2014-10-31 Systems and methods for walking pets WO2016065625A1 (en)

Priority Applications (10)

Application Number Priority Date Filing Date Title
CN202010667247.4A CN111913494B (en) 2014-10-31 2014-10-31 System and method for walking pets
CN201480079886.1A CN106455523B (en) 2014-10-31 2014-10-31 System and method for walking pets
JP2016553444A JP6181321B2 (en) 2014-10-31 2014-10-31 Method, system and apparatus for guiding a target object
PCT/CN2014/090082 WO2016065625A1 (en) 2014-10-31 2014-10-31 Systems and methods for walking pets
US15/214,076 US9661827B1 (en) 2014-10-31 2016-07-19 Systems and methods for walking pets
US15/493,072 US9861075B2 (en) 2014-10-31 2017-04-20 Systems and methods for walking pets
US15/827,787 US10159218B2 (en) 2014-10-31 2017-11-30 Systems and methods for walking pets
US16/228,190 US10729103B2 (en) 2014-10-31 2018-12-20 Unmanned aerial vehicle (UAV) and method of using UAV to guide a target
US16/984,037 US11246289B2 (en) 2014-10-31 2020-08-03 Systems and methods for walking pets
US17/651,062 US20220159928A1 (en) 2014-10-31 2022-02-14 Systems and methods for guiding a target

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2014/090082 WO2016065625A1 (en) 2014-10-31 2014-10-31 Systems and methods for walking pets

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/214,076 Continuation US9661827B1 (en) 2014-10-31 2016-07-19 Systems and methods for walking pets

Publications (1)

Publication Number Publication Date
WO2016065625A1 true WO2016065625A1 (en) 2016-05-06

Family

ID=55856440

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/090082 WO2016065625A1 (en) 2014-10-31 2014-10-31 Systems and methods for walking pets

Country Status (4)

Country Link
US (6) US9661827B1 (en)
JP (1) JP6181321B2 (en)
CN (2) CN111913494B (en)
WO (1) WO2016065625A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105912021A (en) * 2016-05-31 2016-08-31 成都德善能科技有限公司 Dog walking unmanned aerial vehicle
CN106325290A (en) * 2016-09-30 2017-01-11 北京奇虎科技有限公司 Monitoring system and device based on unmanned aerial vehicle
CN107309872A (en) * 2017-05-08 2017-11-03 南京航空航天大学 A kind of flying robot and its control method with mechanical arm
US20180046177A1 (en) * 2015-03-03 2018-02-15 Guangzhou Ehang Intelligent Technology Co., Ltd. Motion Sensing Flight Control System Based on Smart Terminal and Terminal Equipment
WO2018058305A1 (en) 2016-09-27 2018-04-05 SZ DJI Technology Co., Ltd. System and method for controlling unmaned vehicle with presence of live object
CN109416546A (en) * 2017-01-19 2019-03-01 车荣天 Use the traction device of unmanned plane
CN113269946A (en) * 2021-03-22 2021-08-17 陇东学院 Security alarm device for community Internet of things rescue
KR20220020459A (en) * 2020-08-11 2022-02-21 울산과학기술원 Drone for for walking with pet
US20220104457A1 (en) * 2020-10-02 2022-04-07 Toyota Jidosha Kabushiki Kaisha Guidance vehicle
US11325703B2 (en) * 2018-07-09 2022-05-10 Panasonic Intellectual Property Management Co., Ltd. Control device, information processing method, and tethering device
KR20220113557A (en) * 2021-02-05 2022-08-16 동의대학교 산학협력단 Companion animal autonomous walking drone
BE1030608B1 (en) * 2022-06-10 2024-01-15 Somomo Bvba Enrichment method and system for zoo predators

Families Citing this family (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11425891B2 (en) 2012-09-19 2022-08-30 Botsitter, Llc Method and system for remote monitoring, care and maintenance of animals
US11249495B2 (en) 2012-09-19 2022-02-15 Botsitter, Llc Method and system for remote monitoring, care and maintenance of animals
US11278000B2 (en) * 2012-09-19 2022-03-22 Botsitter, Llc Method and system for remote monitoring, care and maintenance of animals
US10555498B2 (en) * 2012-09-19 2020-02-11 Botsitter, Llc Method and system for remote monitoring, care and maintenance of animals
US11297801B2 (en) * 2013-12-17 2022-04-12 Swift Paws, Inc. Lure chasing system
CN111913494B (en) 2014-10-31 2023-10-17 深圳市大疆创新科技有限公司 System and method for walking pets
CN105874397A (en) * 2014-11-28 2016-08-17 深圳市大疆创新科技有限公司 Unmanned aerial vehicle and water sample detection method thereof
WO2017020222A1 (en) * 2015-08-03 2017-02-09 北京艾肯拓科技有限公司 Method and device for controlling movement of external device
US20170039857A1 (en) * 2015-08-08 2017-02-09 Kenneth S. Kwan Systems and methods for interacting with aerial drones
KR20170022489A (en) * 2015-08-20 2017-03-02 엘지전자 주식회사 Unmanned air device and method of controlling the same
CA2947688C (en) * 2015-11-06 2023-08-01 David RANCOURT Tethered wing structures complex flight path
US10853756B2 (en) * 2016-03-02 2020-12-01 International Business Machines Corporation Vehicle identification and interception
KR101917195B1 (en) * 2016-07-22 2018-11-09 이창섭 Pet toilet training apparatus and method
US10170011B2 (en) 2016-07-26 2019-01-01 International Business Machines Corporation Guide drones for airplanes on the ground
US10820574B2 (en) * 2016-07-29 2020-11-03 International Business Machines Corporation Specialized contextual drones for virtual fences
US9987971B2 (en) 2016-07-29 2018-06-05 International Business Machines Corporation Drone-enhanced vehicle external lights
US11518510B1 (en) * 2016-10-06 2022-12-06 Gopro, Inc. Systems and methods for controlling an unmanned aerial vehicle
US11068837B2 (en) * 2016-11-21 2021-07-20 International Business Machines Corporation System and method of securely sending and receiving packages via drones
KR102680675B1 (en) * 2016-12-05 2024-07-03 삼성전자주식회사 Flight controlling method and electronic device supporting the same
CN107114263A (en) * 2017-04-12 2017-09-01 丁永胜 A kind of free-ranging sheep intelligent control method and system based on positional information
JP2018185167A (en) * 2017-04-24 2018-11-22 三菱電機株式会社 Flying control device and shape measurement device
CN107047366B (en) * 2017-04-28 2019-11-08 徐州网递智能科技有限公司 A kind of control method and its unmanned plane of automatic intelligent traction unmanned plane
US20180321681A1 (en) * 2017-05-05 2018-11-08 Pinnacle Vista, LLC Leading drone method
US10133281B1 (en) * 2017-05-05 2018-11-20 Pinnacle Vista, LLC Leading drone system
CN107263493A (en) * 2017-06-12 2017-10-20 苏州寅初信息科技有限公司 A kind of intelligence traction robot and its guard method with protection mechanism
CN107049719B (en) * 2017-06-12 2019-10-18 尚良仲毅(沈阳)高新科技有限公司 A kind of intelligent blind-guiding alarming method for power and its system based on unmanned plane
KR20190009103A (en) * 2017-07-18 2019-01-28 삼성전자주식회사 Electronic Device that is moved based on Distance to External Object and the Control Method
WO2019028528A1 (en) * 2017-08-11 2019-02-14 Bucher Municipal Pty Ltd A refuse collection system
CN109426708A (en) * 2017-08-22 2019-03-05 上海荆虹电子科技有限公司 A kind of pet management system and method
US10777008B2 (en) * 2017-08-31 2020-09-15 Disney Enterprises, Inc. Drones generating various air flow effects around a virtual reality or augmented reality user
US10909830B1 (en) 2017-11-07 2021-02-02 Pica Product Development, Llc Personal emergency alert system, method and device
US10798541B2 (en) 2017-11-07 2020-10-06 Pica Product Development, Llc Systems, methods and devices for remote trap monitoring
US10694338B2 (en) 2017-11-07 2020-06-23 Pica Product Development, Llc Cellular automated external defibrillator (AED) tracker
CN108012326B (en) * 2017-12-07 2019-06-11 珠海市一微半导体有限公司 The method and chip of robot monitoring pet based on grating map
CN108033006A (en) * 2017-12-16 2018-05-15 佛山市神风航空科技有限公司 A kind of intelligent bionic mechanical bird
US11048277B1 (en) 2018-01-24 2021-06-29 Skydio, Inc. Objective-based control of an autonomous unmanned aerial vehicle
CN108353805A (en) * 2018-02-01 2018-08-03 深圳市启智来科技有限公司 A kind of Intelligent flight device for pet care
US20190250601A1 (en) * 2018-02-13 2019-08-15 Skydio, Inc. Aircraft flight user interface
KR102111432B1 (en) * 2018-02-27 2020-05-15 (주)호모미미쿠스 Remote control system and method to support separate operation of an animal
WO2019168258A1 (en) * 2018-02-27 2019-09-06 (주)호모미미쿠스 Remote control system and method for supporting animal's independent task execution, and animal wearable multi-purpose modular platform system
EP3764779A4 (en) * 2018-03-14 2022-03-16 Protect Animals with Satellites, LLC Corrective collar utilizing geolocation technology
CN108617538A (en) * 2018-05-08 2018-10-09 黎弋凡 It is a kind of to drive necklace and positioning navigation method for what is herded
CN108719270A (en) * 2018-06-06 2018-11-02 杨育萱 A kind of bio-control system and method based on virtual electronic fence
CN108762311A (en) * 2018-06-22 2018-11-06 东汉太阳能无人机技术有限公司 The flight control method and device of aircraft
TWI662290B (en) * 2018-08-16 2019-06-11 National Formosa University Wearable system for aviation internet of things and captive animals
US11307584B2 (en) 2018-09-04 2022-04-19 Skydio, Inc. Applications and skills for an autonomous unmanned aerial vehicle
CN109159134A (en) * 2018-10-09 2019-01-08 上海思依暄机器人科技股份有限公司 A kind of robot control method and robot
JP7394322B2 (en) * 2018-11-28 2023-12-08 パナソニックIpマネジメント株式会社 Unmanned flying vehicle, control method and program
CN110547218A (en) * 2019-04-30 2019-12-10 内蒙古物通天下网络科技有限责任公司 Livestock searching system
CN110146070B (en) * 2019-05-13 2021-01-15 珠海市一微半导体有限公司 Laser navigation method suitable for pet attraction
CN110321796A (en) * 2019-05-30 2019-10-11 北京迈格威科技有限公司 A kind of dog only instructs and guides method, apparatus, system and storage medium
CA3144145A1 (en) * 2019-08-15 2021-01-18 Kenneth Scott EHRMAN Corrective collar utilizing geolocation technology
US11958183B2 (en) 2019-09-19 2024-04-16 The Research Foundation For The State University Of New York Negotiation-based human-robot collaboration via augmented reality
CN110677488B (en) * 2019-09-30 2022-06-14 青岛海尔科技有限公司 Event planning method and device for Internet of things system, storage medium and electronic device
CN110622919B (en) * 2019-10-31 2021-08-13 义乌市坤玥玩具有限公司 Pet walking protector
US20210261247A1 (en) * 2020-02-26 2021-08-26 Nxp B.V. Systems and methodology for voice and/or gesture communication with device having v2x capability
KR102319260B1 (en) 2020-03-30 2021-11-01 (주)한컴텔라딘 Drone for flying over the road and interlocking with traffic lights based on the flight path provided to follow the road and operating method thereof
CN111838005B (en) * 2020-06-22 2022-04-19 中国科学院深圳先进技术研究院 Observation device for observing animal activities
KR102214408B1 (en) * 2020-08-27 2021-02-09 임형순 Remote control device for pet
US11726475B2 (en) 2020-11-30 2023-08-15 At&T Intellectual Property I, L.P. Autonomous aerial vehicle airspace claiming and announcing
US11797896B2 (en) 2020-11-30 2023-10-24 At&T Intellectual Property I, L.P. Autonomous aerial vehicle assisted viewing location selection for event venue
US11443518B2 (en) 2020-11-30 2022-09-13 At&T Intellectual Property I, L.P. Uncrewed aerial vehicle shared environment privacy and security
JP7319244B2 (en) * 2020-12-07 2023-08-01 Hapsモバイル株式会社 Control device, program, system and method
KR102405763B1 (en) * 2021-01-29 2022-06-03 한국항공대학교산학협력단 Drone playing with pet animal, drone set playing with pet animal having the same and pet animal nosework nethod using the same
CN113184075B (en) * 2021-05-25 2022-08-26 重庆邮电大学 Wind-resistant vibration-resistant climbing robot imitating exendin
CN113892440A (en) * 2021-09-29 2022-01-07 湖南纳九物联科技有限公司 Device and method for monitoring lead rope during dog walking
CN114342826B (en) * 2021-12-09 2022-11-29 深圳先进技术研究院 Electric shock device and application
CN114403042A (en) * 2021-12-13 2022-04-29 深圳先进技术研究院 Animal model for anxiety of large animals
IT202200011069A1 (en) 2022-05-26 2023-11-26 Trade Company S R L REMOTE PILOT DEVICE FOR TRAINING AND HANDLING ANIMALS
US20240284873A1 (en) * 2023-02-24 2024-08-29 Zaid Almanssoori Robodayton Hybrid System
US20240324557A1 (en) * 2023-03-30 2024-10-03 Audrey Havican Method of communication with a pet

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102809969A (en) * 2011-06-03 2012-12-05 鸿富锦精密工业(深圳)有限公司 Unmanned aerial vehicle control system and method
CN103941750A (en) * 2014-04-30 2014-07-23 东北大学 Device and method for composition based on small quad-rotor unmanned aerial vehicle

Family Cites Families (74)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3716029A (en) * 1971-01-22 1973-02-13 C Pillsbury Animal exercising device
US4865328A (en) * 1984-03-16 1989-09-12 The United States Of America As Represented By The Secretary Of The Navy Low-cost, expendable, crushable target aircraft
JP3343364B2 (en) 1991-07-31 2002-11-11 東芝ライテック株式会社 Low pressure mercury vapor discharge lamp
JPH0536383U (en) * 1991-10-21 1993-05-18 株式会社ケンウツド Mobile auxiliary equipment
US5255629A (en) * 1992-03-09 1993-10-26 Jerry Paterson Rider remote-controlled cutting horse trainer
US5870973A (en) * 1996-05-30 1999-02-16 Invisible Fence Company, Inc. Electronic animal control system transmitter with variable phase control
JPH10319116A (en) * 1997-05-20 1998-12-04 Aisin Seiki Co Ltd Device for detecting animal utilizing electromagnetic wave
JP2000157089A (en) * 1998-11-25 2000-06-13 Taiji Nishimura Collar with telephone function
US6213056B1 (en) * 1999-06-21 2001-04-10 Martin Bergmann Automatic horse walker
US7937042B2 (en) * 2000-06-09 2011-05-03 Dot Holdings, Llc Animal training and tracking system using RF identification tags
US20020046713A1 (en) * 2000-09-08 2002-04-25 Otto James R. Method for remotely controlling movement of an animal
JP2003150717A (en) 2001-11-09 2003-05-23 Toto Ltd Lifestyle habit improvement support system
US6904868B2 (en) * 2002-07-12 2005-06-14 Robert S. Block Interactive mobile food dispenser
ITPR20030019A1 (en) * 2003-03-19 2004-09-20 Bice Srl ANIMAL LEASH APPLICABLE TO VEHICLE.
US6782847B1 (en) * 2003-06-18 2004-08-31 David Shemesh Automated surveillance monitor of non-humans in real time
JP2005289307A (en) 2004-04-05 2005-10-20 Seiko Epson Corp Flight guiding device, walk guiding method, and flight guiding program
US7156054B1 (en) * 2004-06-16 2007-01-02 Rick York Horse walker/animal conditioning system
US7409924B2 (en) * 2004-07-15 2008-08-12 Lawrence Kates Training, management, and/or entertainment system for canines, felines, or other animals
US7479884B1 (en) * 2004-08-31 2009-01-20 Cedar Ridge Research System and method for monitoring objects, people, animals or places
US8395484B2 (en) * 2004-08-31 2013-03-12 Cedar Ridge Research Llc System and method for monitoring objects, people, animals or places
JP4316477B2 (en) * 2004-11-18 2009-08-19 パナソニック株式会社 Tracking method of mobile robot
US7620493B2 (en) * 2005-06-10 2009-11-17 The Board Of Regents, The University Of Texas System System, method and apparatus for providing navigational assistance
JP2008061640A (en) * 2006-09-08 2008-03-21 Kofukuka Nakakawaji Relaxation equipment for dog
US20080262669A1 (en) * 2006-09-22 2008-10-23 Jadi, Inc. Autonomous vehicle controller
US8221290B2 (en) 2007-08-17 2012-07-17 Adidas International Marketing B.V. Sports electronic training system with electronic gaming features, and applications thereof
CN101278653B (en) * 2007-09-26 2011-08-24 深圳先进技术研究院 Intelligent robot for nursing household pet
CN101278654B (en) * 2007-09-26 2010-12-01 深圳先进技术研究院 Robot system for nursing pet
US8178825B2 (en) * 2007-10-29 2012-05-15 Honeywell International Inc. Guided delivery of small munitions from an unmanned aerial vehicle
US9026272B2 (en) * 2007-12-14 2015-05-05 The Boeing Company Methods for autonomous tracking and surveillance
US8392065B2 (en) 2008-09-11 2013-03-05 Deere & Company Leader-follower semi-autonomous vehicle with operator on side
US8989972B2 (en) * 2008-09-11 2015-03-24 Deere & Company Leader-follower fully-autonomous vehicle with operator on side
US8028662B2 (en) * 2008-10-31 2011-10-04 Raymond Laurel D Method of training a dog to chew acceptable objects through scent marking and chemical composition thereof
EP2386052A1 (en) * 2009-01-09 2011-11-16 Mbda Uk Limited Missile guidance system
IL201681A (en) * 2009-10-22 2014-06-30 Abraham Abershitz Uav system and method
FI122052B (en) * 2009-11-06 2011-08-15 Domuset Oy A method and arrangement for tracking the path of a pet pet at home
US8253572B2 (en) * 2010-01-22 2012-08-28 Lytle Jr Bradley D Electronic tether system and method with rate of change detection and vehicle braking features
WO2011146584A1 (en) * 2010-05-18 2011-11-24 Woodstream Corporation, Inc. Custom-shape wireless dog fence system and method
CA2841987A1 (en) * 2011-06-13 2012-12-20 Robert Jesurum Pet restraint system
US9582006B2 (en) * 2011-07-06 2017-02-28 Peloton Technology, Inc. Systems and methods for semi-autonomous convoying of vehicles
WO2013033954A1 (en) * 2011-09-09 2013-03-14 深圳市大疆创新科技有限公司 Gyroscopic dynamic auto-balancing ball head
CA2851154A1 (en) * 2011-10-05 2013-04-11 Radio Systems Corporation Image-based animal control systems and methods
US9329001B2 (en) * 2011-10-26 2016-05-03 Farrokh Mohamadi Remote detection, confirmation and detonation of buried improvised explosive devices
US8805008B1 (en) * 2011-11-02 2014-08-12 The Boeing Company Tracking closely spaced objects in images
US8922363B2 (en) * 2011-11-07 2014-12-30 Min Jae SO Animal training apparatus for locating collar transceiver using GPS and method of controlling the same
US9110168B2 (en) * 2011-11-18 2015-08-18 Farrokh Mohamadi Software-defined multi-mode ultra-wideband radar for autonomous vertical take-off and landing of small unmanned aerial systems
CN102530255A (en) * 2011-12-13 2012-07-04 江西洪都航空工业集团有限责任公司 Accurate parachute landing device for traction type unmanned plane and method
CN202464124U (en) * 2012-01-13 2012-10-03 安徽理工大学 Four-rotor aircraft
JP3175361U (en) 2012-02-19 2012-05-10 株式会社クラフト Mountain navigation system
US9030491B1 (en) * 2012-04-18 2015-05-12 The United States Of America As Represented By The Secretary Of The Navy System and method for displaying data from multiple devices on a single user interface
CN102722697B (en) * 2012-05-16 2015-06-03 北京理工大学 Unmanned aerial vehicle autonomous navigation landing visual target tracking method
US20140018979A1 (en) * 2012-07-13 2014-01-16 Honeywell International Inc. Autonomous airspace flight planning and virtual airspace containment system
AU2013299897A1 (en) * 2012-08-06 2015-02-26 Radio Systems Corporation Housebreaking reward system
US20150350614A1 (en) * 2012-08-31 2015-12-03 Brain Corporation Apparatus and methods for tracking using aerial video
US8707900B1 (en) * 2012-09-19 2014-04-29 Krystalka Ronette Womble Method and system for remote monitoring, care and maintenance of animals
US8800488B2 (en) * 2012-10-02 2014-08-12 Alex Jon Stone Internet controlled pet feeder
US20140100773A1 (en) * 2012-10-07 2014-04-10 Practical Intellect, Llc Method of Assistance for the Visually Impaired
CN203152217U (en) * 2012-11-15 2013-08-28 韩晴羽 Dog raising robot
JP6029446B2 (en) 2012-12-13 2016-11-24 セコム株式会社 Autonomous flying robot
US9776716B2 (en) * 2012-12-19 2017-10-03 Elwah LLC Unoccupied flying vehicle (UFV) inter-vehicle communication for hazard handling
US8930044B1 (en) * 2012-12-28 2015-01-06 Google Inc. Multi-part navigation process by an unmanned aerial vehicle for navigating to a medical situatiion
US9538725B2 (en) * 2013-03-08 2017-01-10 Eb Partners Mobile telephone dog training tool and method
JP6190750B2 (en) * 2013-04-16 2017-08-30 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Excrement detection system, excrement detection method, and excrement detection program
CN103345157A (en) * 2013-06-21 2013-10-09 南京航空航天大学 Unmanned aerial vehicle three freedom degree model building method
CN103490842B (en) * 2013-09-26 2016-09-28 深圳市大疆创新科技有限公司 Data transmission system and method
CN103576692A (en) * 2013-11-07 2014-02-12 哈尔滨工程大学 Method for achieving coordinated flight of multiple unmanned aerial vehicles
US20150175276A1 (en) * 2013-12-19 2015-06-25 Kenneth Lee Koster Delivery platform for unmanned aerial vehicles
CN103914076B (en) * 2014-03-28 2017-02-15 浙江吉利控股集团有限公司 Cargo transferring system and method based on unmanned aerial vehicle
US9087451B1 (en) * 2014-07-14 2015-07-21 John A. Jarrell Unmanned aerial vehicle communication, monitoring, and traffic management
US9170117B1 (en) * 2014-08-21 2015-10-27 International Business Machines Corporation Unmanned aerial vehicle navigation assistance
CN110174903B (en) * 2014-09-05 2023-05-09 深圳市大疆创新科技有限公司 System and method for controlling a movable object within an environment
US9429945B2 (en) * 2014-10-22 2016-08-30 Honeywell International Inc. Surveying areas using a radar system and an unmanned aerial vehicle
CN111913494B (en) 2014-10-31 2023-10-17 深圳市大疆创新科技有限公司 System and method for walking pets
US9530058B2 (en) * 2014-12-11 2016-12-27 Toyota Motor Engineering & Manufacturing North America, Inc. Visual-assist robots
US9637233B2 (en) 2015-09-21 2017-05-02 International Business Machines Corporation Unmanned aerial vehicle for interacting with a pet

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102809969A (en) * 2011-06-03 2012-12-05 鸿富锦精密工业(深圳)有限公司 Unmanned aerial vehicle control system and method
CN103941750A (en) * 2014-04-30 2014-07-23 东北大学 Device and method for composition based on small quad-rotor unmanned aerial vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"A UAV LEADS A PET DOG IN A PARK FOR A WALK, AND WILL BE AN ANTIFACT WALKING A DOG", 29 May 2014 (2014-05-29), Retrieved from the Internet <URL:http://www.chengshiw.com/mil/2014/369142.html> *

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180046177A1 (en) * 2015-03-03 2018-02-15 Guangzhou Ehang Intelligent Technology Co., Ltd. Motion Sensing Flight Control System Based on Smart Terminal and Terminal Equipment
CN105912021A (en) * 2016-05-31 2016-08-31 成都德善能科技有限公司 Dog walking unmanned aerial vehicle
WO2018058305A1 (en) 2016-09-27 2018-04-05 SZ DJI Technology Co., Ltd. System and method for controlling unmaned vehicle with presence of live object
CN108700877A (en) * 2016-09-27 2018-10-23 深圳市大疆创新科技有限公司 System and method for controlling unmanned apparatus of transport when there are mobiles
EP3377952A4 (en) * 2016-09-27 2018-12-19 SZ DJI Technology Co., Ltd. System and method for controlling unmaned vehicle with presence of live object
EP3761135A1 (en) * 2016-09-27 2021-01-06 SZ DJI Technology Co., Ltd. System and method for controlling an unmanned vehicle in presence of a live object
US11475682B2 (en) 2016-09-27 2022-10-18 SZ DJI Technology Co., Ltd. System and method for controlling an unmanned vehicle with presence of live object
CN106325290A (en) * 2016-09-30 2017-01-11 北京奇虎科技有限公司 Monitoring system and device based on unmanned aerial vehicle
CN109416546A (en) * 2017-01-19 2019-03-01 车荣天 Use the traction device of unmanned plane
CN107309872A (en) * 2017-05-08 2017-11-03 南京航空航天大学 A kind of flying robot and its control method with mechanical arm
CN107309872B (en) * 2017-05-08 2021-06-15 南京航空航天大学 Flying robot with mechanical arm and control method thereof
US11325703B2 (en) * 2018-07-09 2022-05-10 Panasonic Intellectual Property Management Co., Ltd. Control device, information processing method, and tethering device
KR102430073B1 (en) 2020-08-11 2022-08-08 울산과학기술원 Drone for for walking with pet
KR20220020459A (en) * 2020-08-11 2022-02-21 울산과학기술원 Drone for for walking with pet
US20220104457A1 (en) * 2020-10-02 2022-04-07 Toyota Jidosha Kabushiki Kaisha Guidance vehicle
US11718967B2 (en) * 2020-10-02 2023-08-08 Toyota Jidosha Kabushiki Kaisha Guidance vehicle
KR20220113557A (en) * 2021-02-05 2022-08-16 동의대학교 산학협력단 Companion animal autonomous walking drone
KR102440081B1 (en) 2021-02-05 2022-09-05 동의대학교 산학협력단 Companion animal autonomous walking drone
CN113269946B (en) * 2021-03-22 2022-08-02 陇东学院 Security alarm device for community Internet of things rescue
CN113269946A (en) * 2021-03-22 2021-08-17 陇东学院 Security alarm device for community Internet of things rescue
BE1030608B1 (en) * 2022-06-10 2024-01-15 Somomo Bvba Enrichment method and system for zoo predators

Also Published As

Publication number Publication date
US20170215381A1 (en) 2017-08-03
CN111913494B (en) 2023-10-17
JP2017509330A (en) 2017-04-06
US20200359600A1 (en) 2020-11-19
US10729103B2 (en) 2020-08-04
CN111913494A (en) 2020-11-10
CN106455523B (en) 2020-08-04
US20180077902A1 (en) 2018-03-22
US9661827B1 (en) 2017-05-30
US10159218B2 (en) 2018-12-25
US9861075B2 (en) 2018-01-09
US20190116758A1 (en) 2019-04-25
US11246289B2 (en) 2022-02-15
US20170127652A1 (en) 2017-05-11
JP6181321B2 (en) 2017-08-16
CN106455523A (en) 2017-02-22
US20220159928A1 (en) 2022-05-26

Similar Documents

Publication Publication Date Title
WO2016065625A1 (en) Systems and methods for walking pets
WO2016106746A1 (en) Vehicle altitude restrictions and control
US20220091607A1 (en) Systems and methods for target tracking
WO2016065623A1 (en) Systems and methods for surveillance with visual marker
WO2016049924A1 (en) Systems and methods for flight simulation
WO2016015232A1 (en) Systems and methods for payload stabilization
WO2017059581A1 (en) Salient feature based vehicle positioning
WO2016192024A1 (en) Spraying system having a liquid flow and rotating speed feedback
WO2017096548A1 (en) Systems and methods for auto-return
WO2016197307A1 (en) Methods and apparatus for image processing
JP2021144260A (en) Information processing device, information processing method, program, and information processing system
JP2017503226A5 (en)
JP2018007677A (en) Method for guiding target object and uav
CN112740226A (en) Operating system and method of movable object based on human body indication
EP3761135B1 (en) System and method for controlling an unmanned vehicle in presence of a live object
JP2018129063A (en) Method for controlling unmanned aircraft, unmanned aircraft, and system for controlling unmanned aircraft
Czygier et al. Autonomous searching robot with object recognition based on neural networks
JP2021025831A (en) Walking control device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14904980

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016553444

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14904980

Country of ref document: EP

Kind code of ref document: A1