WO2024105676A1 - Unmanned aerial vehicle for neutralizing insects - Google Patents

Unmanned aerial vehicle for neutralizing insects

Info

Publication number
WO2024105676A1
WO2024105676A1 (PCT/IL2023/051190; IL2023051190W)
Authority
WO
WIPO (PCT)
Prior art keywords
insect
drone
stationary
location
camera
Prior art date
Application number
PCT/IL2023/051190
Other languages
French (fr)
Inventor
Saar Wilf
Nadav BENEDEK
Zvi Friedman
Original Assignee
Bzigo Ltd
Priority date
Filing date
Publication date
Application filed by Bzigo Ltd
Publication of WO2024105676A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 10/00 Type of UAV
    • B64U 10/10 Rotorcrafts
    • B64U 10/13 Flying platforms
    • B64U 10/14 Flying platforms with four distinct rotor axes, e.g. quadcopters
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 1/00 Stationary means for catching or killing insects
    • A01M 1/06 Catching insects by using a suction effect
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 1/00 Stationary means for catching or killing insects
    • A01M 1/14 Catching by adhesive surfaces
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 5/00 Catching insects in fields, gardens, or forests by movable appliances
    • A01M 5/02 Portable appliances
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C 27/00 Rotorcraft; Rotors peculiar thereto
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C 39/00 Aircraft not otherwise provided for
    • B64C 39/02 Aircraft not otherwise provided for characterised by special use
    • B64C 39/024 Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 30/00 Means for producing lift; Empennages; Arrangements thereof
    • B64U 30/20 Rotors; Rotor supports
    • B64U 30/26 Ducted or shrouded rotors
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 30/00 Means for producing lift; Empennages; Arrangements thereof
    • B64U 30/20 Rotors; Rotor supports
    • B64U 30/29 Constructional aspects of rotors or rotor supports; Arrangements thereof
    • B64U 30/294 Rotors arranged in the UAV body
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 30/00 Means for producing lift; Empennages; Arrangements thereof
    • B64U 30/20 Rotors; Rotor supports
    • B64U 30/29 Constructional aspects of rotors or rotor supports; Arrangements thereof
    • B64U 30/299 Rotor guards
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2101/00 UAVs specially adapted for particular uses or applications
    • B64U 2101/40 UAVs specially adapted for particular uses or applications for agriculture or forestry operations
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2101/00 UAVs specially adapted for particular uses or applications
    • B64U 2101/70 UAVs specially adapted for particular uses or applications for use inside enclosed spaces, e.g. in buildings or in vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 70/00 Launching, take-off or landing arrangements
    • B64U 70/90 Launching from or landing on platforms
    • B64U 70/95 Means for guiding the landing UAV towards the platform, e.g. lighting means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 70/00 Launching, take-off or landing arrangements
    • B64U 70/90 Launching from or landing on platforms
    • B64U 70/97 Means for guiding the UAV to a specific location on the platform, e.g. platform structures preventing landing off-centre

Definitions

  • Referring to the method of Fig. 2, processor 102 calculates from an image a location of a stationary insect after landing on a surface (step 202). Drone 105 is then guided to the proximity of the stationary location of the insect on the surface (step 204), typically based on navigation instructions generated by processor 102. For example, the drone can be guided to a specific pixel in the image, e.g., to a pixel below or above the pixels associated with the stationary location of the insect, or to another pixel that is at a predetermined relation to a pixel associated with the stationary location of the insect.
  • the method may include sending a signal to a remote device such as a user’s end device (e.g., mobile phone) and/or to a warning device (as described above) after detection of the insect and/or after calculating the unchanging, stationary location of the insect (e.g., after step 202), to advise the user of the detection and/or location of the insect.
  • the method may further include receiving a signal or command from the user to launch the drone, prior to and/or as a prerequisite for guiding the drone to the location of the insect (e.g., prior to step 204).
  • Drone 105 is then controlled to cause the insect to move away from the stationary location, e.g., leave the surface (step 206), enabling the drone to neutralize the insect after leaving the surface (step 208).
  • Drone 105 may be controlled to hover at its location (e.g., close to the stationary location of the insect) for a predetermined period (e.g., 2-5 seconds) to ensure the neutralization of the insect before returning to its docking station. In other embodiments, drone 105 may be controlled to hover at its location until the insect is detected leaving the surface.
  • Controlling drone 105 to cause the insect to move away from its stationary location, e.g., to leave the surface, can include using an insect prodding component of the drone to cause the insect to leave the surface so it may be eliminated by an insect neutralizing system of drone 105.
  • An insect neutralizing system may include, for example, means for chemically, mechanically or electrically neutralizing an insect.
  • the drone’s propellers may be used to mechanically neutralize an insect (as described below).
  • other appropriate known insect neutralizing systems may be used, such as a system for spraying an insecticide at the insect, a system for electrocuting an insect, etc.
  • the insect prodding component may be configured to come in close proximity to the insect and/or to make physical contact with the insect, thereby prodding the insect to leave the surface.
  • the insect prodding component may include a device that can cause a change in the environment of the insect, which will cause the insect to leave the surface.
  • the insect prodding component may include one or more of: a light-emitting device, an air-moving device and an electromagnetic field emitter.
  • Drone 105 may neutralize the insect after leaving the surface by using an insect neutralizing system, e.g., a propulsion unit of the drone may be included in the insect neutralizing system and may be used to create suction to suck the insect into a propeller of the propulsion unit and neutralize the insect between the blades of the propeller.
  • Guiding drone 105 to a proximate location of a stationary insect 106 on a surface 104 may be done by using camera 103 to track drone 105 within the camera FOV 103’.
  • Reference is made to Fig. 3, which shows a representation 300 of FOV 103’ (e.g., an image captured by camera 103).
  • drone 105 may be visible in the FOV 103’ while it is being guided to the stationary location of insect 106 on surface 104.
  • One or more pixels determined to be associated with the stationary location of the insect are depicted by coordinate 303, which represents the stationary location of the insect on surface 104 within FOV 103’.
  • the pixels associated with drone 105 can also be detected by processor 102 by using similar or other appropriate image processing algorithms.
  • Coordinate 315 represents the location of the drone within the FOV 103’.
  • Processor 102 may then generate navigation instructions based on a location of the insect (as represented by coordinate 303) and a location of the drone within FOV 103’.
  • the navigation instructions to the stationary location of the insect on the surface may include a command to reach coordinate 303, which includes at least one pixel associated with the location of the insect.
  • the navigation instructions to the stationary location of the insect on the surface may include a command to reach a coordinate located in a predetermined location relative to the location of the insect, e.g., the navigation instructions may be to reach a pixel in the vicinity of a location of the insect, e.g., to reach coordinate 313 that represents a pixel located below coordinate 303.
  • a method for guiding a drone to neutralize an insect is schematically illustrated in Fig. 4.
  • Processor 102 detects within a FOV of a camera, a drone and a stationary insect on a surface (step 402).
  • a coordinate related to location of the drone e.g., coordinate 315) and a coordinate related to a location of the insect (e.g., coordinate 303 or 313) are determined (step 404).
  • If, in step 405, the coordinate related to the location of the drone is at a predetermined distance or within a predetermined range of distances from the coordinate related to the location of the stationary insect, then the drone is controlled to cause the insect to leave the surface (step 406), enabling the drone to neutralize the insect after it leaves the surface (step 408). If, in step 405, the coordinate related to the location of the drone is not at the predetermined distance or within the predetermined range of distances from the coordinate related to the location of the stationary insect, then a command is generated for the drone to move toward the coordinate related to the location of the insect (step 409), and the step of determining whether the drone is within the predetermined distance (or range of distances) is performed again. A minimal code sketch of this loop, together with the surface-contact check of Fig. 5, is given after this list.
  • the coordinate can include a pixel associated with the insect and/or a pixel proximal to a location of the insect, e.g., at a predetermined location, e.g., below, above or to the side of the location of the insect.
  • a method for navigating a drone to neutralize an insect is schematically illustrated in Fig. 5.
  • Processor 102 detects within a FOV of a camera a stationary insect on a surface and guides the drone to the location of the stationary insect (step 502), e.g., as described above. In step 503, it is determined whether the drone reached the surface.
  • If it is determined (in step 503) that the drone did not reach the surface, then a command is generated for the drone to move toward the surface (step 504). If it is determined (in step 503) that the drone has reached the surface, a navigation command is generated for the drone to move away from the surface (step 506) to provide maneuvering space, so that the drone may be situated at a location and angle that enable sucking the insect into the propulsion unit of the drone and neutralizing the insect (step 508).
  • the determination (in step 503) of whether the drone reached the surface can be done by using image processing algorithms and/or by receiving a signal from a sensor that is in connection with the drone, for example a proximity sensor such as detector 115, which may be in communication with processor 102 or possibly with a drone-onboard processor; processor 102 or a drone-onboard processor can then use the sensor signal to determine that the drone has reached the surface.
  • the processor 102 can determine a change of distance of the drone from the camera (e.g., based on a known location of the camera and using image processing to determine distance of the drone from the camera). A determination can be made that the drone has reached the surface based on the change of distance of the drone from the camera. For example, if navigation instructions are transmitted to the drone to move toward the surface, but there is no change in the distance of the drone from the camera, then it can be determined that the drone has reached the surface and is physically prevented from progressing further.
  • the processor 102 can receive acceleration data of the drone (e.g., by tracking the drone in images captured by camera 103 or by data received from, e.g., a drone-onboard accelerometer) and, based on a detected change in acceleration of the drone it can be determined that the drone has reached the surface. For example, based on a change in acceleration of the drone, the processor may detect a vibration of the drone that occurs when the drone comes in contact with the surface.
  • Referring to Fig. 6, drone 605 includes a frame 601, which supports propulsion units 611 (typically three or more) and a power source (not shown), such as a battery.
  • Each propulsion unit 611 includes a propeller having rotating blades 612.
  • at least one propulsion unit includes a ducted fan such that blades 612 rotate within a cylindrical housing 613.
  • drone 605 is configured to neutralize an insect by sucking the insect into one of propulsion units 611 of the drone.
  • the dimensions of propulsion units 611 and the velocity of rotation of blades 612 are designed to create suction that will suck in an insect (e.g., mosquito) after it has left the surface.
  • the propeller has a diameter of 30-40 mm.
  • the propeller operates at a speed that causes air to move at more than 2 meters per second.
  • drone 605 includes an insect prodding component 603 configured to cause the insect to leave the surface.
  • the insect prodding component 603 also serves as a structure to maintain a distance between propulsion units 611 of the drone and the surface.
  • the insect prodding component 603 may be, for example, a dome-shaped structure attached to frame 601 of drone 605, which can be used to prod the insect to leave the surface and also to maintain a distance between propulsion units 611 of the drone and the surface.
  • prodding component 603 may include, for example, a moveable, possibly extendible member, such as a telescopic arm attached to frame 601.
  • the member may be controlled by a processor (e.g., processor 102) to extend from frame 601 to come close to the insect on the surface and possibly make physical contact with the insect.
  • the insect prodding component 603 can cause a change in the environment of the insect, which will cause the insect to leave the surface.
  • the insect prodding component may include one or more of: a light-emitting device, an air-moving device and an electromagnetic field emitter.
  • several different types of prodding components can be used, possibly simultaneously, to cause the insect to leave the surface.
  • Drone 605 typically includes a receiver to receive navigation instructions, e.g., from processor 102, and may include proximity sensors and/or bumpers, e.g., attached to its frame 601 to avoid collision with the surface.
  • drone 605 may have a visible marking, e.g., on its frame, such as a colored design or light source, to enable easier detection of drone 605 in images.
  • Drone 605 may have onboard sensors (e.g., as described above) and/or processing capabilities.
  • FIG. 7 schematically illustrates a docking station for the drone, according to one embodiment of the invention.
  • Docking station 700 includes a surface 701 for drone 705 to rest on and a port for charging the drone (not shown).
  • a camera 703 is attached to the docking station such that the camera FOV 703’ includes drone 705 while it is being guided to a stationary insect on a surface and while the drone takes off and lands at the docking station 700, typically on surface 701.
  • docking station 700 includes an assisted-landing element 715 configured to allow passive alignment of drone 705 with docking station 700, specifically with surface 701.
  • Assisted-landing element 715 may include a rod or other construction that can enable drone 705 to slide down it by gravity alone. At least a distal end 715’ of assisted-landing element 715 is visible within FOV 703’. When drone 705 approaches and is close enough to docking station 700, both distal end 715’ and drone 705 can be detected within a single image.
  • a processor detects locations in the image of both distal end 715’ and of drone 705 (e.g., using techniques such as described herein) and can provide navigation instructions for guiding drone 705 to distal end 715’, e.g., based on the locations in the image.
  • the drone propulsion units may be throttled down or turned off and drone 705 can then passively slide down assisted-landing element 715 to surface 701 (as depicted by the dashed-line drone illustrations), moved by gravity alone. Such passive landing causes less turbulence, and the landing is more accurate.
  • Solutions provided by embodiments of the invention use images captured by a stationary camera and analyzed by a processor typically located off-drone, to autonomously guide a drone to a stationary insect, neutralize the insect and return to the drone docking station, while using computationally efficient methods, enabling the use of a simple and inexpensive drone.
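As referenced in the discussion of Figs. 4 and 5 above, the guidance loop can be restated in a few lines of code. This is only an illustrative sketch: the pixel threshold and the callbacks (get_drone_px, get_insect_px, move_toward, reached_surface, back_off, prod_and_neutralize) stand in for the image-based tracking and drone commands described in this document and are not actual APIs.

```python
# Illustrative restatement of the Fig. 4 loop combined with the Fig. 5
# surface-contact check.  All callbacks are placeholders for the image
# tracking and drone commands described in this document.
ARRIVE_PX = 10                       # "predetermined distance" in pixels

def dist_px(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def neutralize_loop(get_drone_px, get_insect_px, move_toward,
                    reached_surface, back_off, prod_and_neutralize):
    insect_px = get_insect_px()                  # stationary target (coordinate 303)
    while True:                                  # Fig. 4: close in on the insect
        drone_px = get_drone_px()                # drone location (coordinate 315)
        if dist_px(drone_px, insect_px) <= ARRIVE_PX:
            break                                # step 405: within range
        move_toward(insect_px)                   # step 409: keep moving toward it
    while not reached_surface():                 # Fig. 5, step 503: contact check
        move_toward(insect_px)                   # step 504: approach the surface
    back_off()                                   # step 506: regain maneuvering space
    prod_and_neutralize()                        # steps 406/408 and 508
```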

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Pest Control & Pesticides (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Insects & Arthropods (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Ecology (AREA)
  • Forests & Forestry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Remote Sensing (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Catching Or Destruction (AREA)

Abstract

A system and method for neutralizing an insect include detecting, in images, a moving object and tracking the moving object until determining that the moving object has become a stationary object. An image of the stationary object is then analyzed to determine that the stationary object is a target insect. Once it is determined that the object is a target insect, the location of the target insect is calculated, and navigation instructions are generated for a drone that is configured to neutralize the insect, guiding the drone from its docking station to the stationary location of the target insect. Once at the location of the stationary target insect, the drone is controlled to cause the insect to move away from its stationary location, allowing the drone to neutralize the insect, e.g., by sucking the insect into propellers of the drone.

Description

TITLE
UNMANNED AERIAL VEHICLE FOR NEUTRALIZING INSECTS
FIELD
[0001] The present invention is in the field of pest control, specifically, using an unmanned aerial vehicle for neutralizing pests, such as flying insects.
BACKGROUND
[0002] In homes and other urban spaces, pests, such as flying insects, which share the environment with humans, spread disease, spoil foodstuff and generally cause a nuisance. Control of these pests is usually attempted through exclusion, repulsion, physical removal or chemical means.
[0003] It has been suggested to use unmanned aerial vehicles (UAVs) to capture and eliminate flying insects, such as moths and mosquitoes, by identifying and tracking the insect in flight and using the UAV's propellers to capture and/or kill the flying insect. Identifying and tracking the insect in flight is done by using acoustic sensors and possibly cameras.
[0004] However, identifying and accurately estimating the location of a small moving object in images, so as to enable a UAV to intercept it, is computationally heavy, expensive, and not easily performed in real time. Additionally, an autonomously navigating UAV passing through indoor and outdoor environments, which may include fixed and moving obstacles, typically relies on collision avoidance algorithms, e.g., to compute evasion paths while considering the geometry of the environment. These algorithms are typically complex and difficult to develop, and their use adds to the already heavy computational burden of current solutions. Thus, current pest control solutions using UAVs to track and eliminate flying insects are not economically viable and are not prevalent in commercial use.
SUMMARY
[0005] Embodiments of the invention provide a system and method for neutralizing an insect, such as a mosquito or other flying insect, by detecting, in images, an insect landing on a surface and calculating or estimating a stationary location of the insect after landing on the surface and guiding a UAV to the stationary location of the insect. The UAV navigates to the stationary location of the insect autonomously, based on images in which there is an unobstructed line of sight to the insect, thus avoiding obstacles in the environment without the need to rely on collision avoidance algorithms.
[0006] Calculating a stationary, unchanging location of an insect and guiding a UAV to that location, without having to locate and track the insect accurately during flight, requires less computational effort and reduces algorithmic complexity, enabling low-cost and efficient systems and methods for neutralizing insects.
[0007] A system for neutralizing an insect, according to one embodiment of the invention, includes a camera to capture images of an area and a processor in communication with the camera to detect an insect landing on a surface in the area and to calculate a stationary location of the insect after landing on the surface. Responsive to detecting a stationary insect after landing, the processor generates navigation instructions for a UAV (also referred to herein as “drone”) to guide the drone to the stationary location of the insect. The drone is configured to navigate to the stationary insect on the surface and neutralize the insect.
[0008] The drone, according to some embodiments of the invention, includes an insect-prodding component configured to cause the insect to leave the surface. In one embodiment, the insect-prodding component is configured to be moved to an approximate unchanging, stationary location of the insect on the surface, possibly to make physical contact with the insect. In other embodiments, the insect-prodding component may cause the insect to leave the surface by causing changes in the environment of the insect, which will cause the insect to leave the surface, such as by emitting light, moving air, creating an electromagnetic field, etc.
[0009] Methods for neutralizing an insect, according to some embodiments of the invention, include calculating, from an image, a location of a stationary insect, e.g., after landing on a surface, and guiding a drone to the proximity of the stationary location. The method further includes controlling the drone to cause the insect to leave the surface, thereby enabling the drone to neutralize the insect after it leaves the surface, e.g., by sucking the insect into a propeller of the drone.
[0010] Solutions provided by embodiments of the invention can be applied in inexpensive, computationally light devices suitable for everyday use.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] The invention will now be described in relation to certain examples and embodiments with reference to the following illustrative drawing figures so that it may be more fully understood. In the drawings:
[0012] Figs. 1A and 1B are schematic illustrations of systems for neutralizing insects, according to embodiments of the invention;
[0013] Fig. 2 is a schematic illustration of a method for neutralizing an insect, according to embodiments of the invention;
[0014] Fig. 3 is a schematic representation of a camera FOV, according to an embodiment of the invention;
[0015] Fig. 4 is a schematic illustration of a method for guiding a drone to neutralize an insect, according to one embodiment of the invention;
[0016] Fig. 5 is a schematic illustration of a method for guiding a drone to neutralize an insect, according to another embodiment of the invention;
[0017] Fig. 6 schematically illustrates a drone according to embodiments of the invention; and
[0018] Fig. 7 schematically illustrates a docking station for a drone, according to embodiments of the invention.
DETAILED DESCRIPTION
[0019] Embodiments of the invention provide systems and methods for neutralizing an insect by using a camera to automatically detect in images an insect landing on a surface and autonomously guide a drone, based on the images, to the stationary location of the insect after landing, to neutralize the insect.
[0020] Examples described herein refer mainly to flying insects, specifically mosquitoes. However, embodiments of the invention may be used to locate and neutralize other pests as well.
[0021] In the following description, various aspects of the present invention will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the present invention. However, it will also be apparent to one skilled in the art that the present invention may be practiced without the specific details presented herein. Furthermore, well-known features may be omitted or simplified in order not to obscure the present invention.
[0022] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as “analyzing”, "processing", "computing", "calculating", "determining", “detecting”, “identifying”, “estimating”, “understanding” or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulates and/or transforms data represented as physical, such as electronic, quantities within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices.
[0023] A system for neutralizing an insect, according to one embodiment of the invention, includes a camera to capture images of an area and a processor in communication with the camera. The processor detects in the images a moving object and tracks the moving object until determining that the moving object has become a stationary object. The processor then determines that the stationary object is the insect (also referred to as “target insect”). For example, the processor may apply one or more image processing algorithms on an image of the stationary object to determine that the stationary object is the target insect. The processor then calculates the stationary location of the target insect, for example, on a surface in the imaged area, and generates navigation instructions for a drone that is configured to neutralize the insect, to be guided from a docking station of the drone to the stationary location of the target insect on the surface.
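To make the flow above concrete, the following is a minimal sketch of the "track a moving object until it becomes stationary" step, assuming simple frame differencing with OpenCV. The thresholds, frame counts and helper names are illustrative assumptions and are not taken from the patent.

```python
# Minimal sketch of the "detect, track until stationary" flow, using simple
# frame differencing.  Thresholds and counts are illustrative assumptions.
from dataclasses import dataclass

import cv2
import numpy as np

MOVE_THRESH_PX = 2.0   # centroid displacement below this counts as "no motion"
STILL_FRAMES = 15      # consecutive still frames before declaring "stationary"

@dataclass
class Track:
    centroid: tuple      # last known (x, y) of the candidate insect, in pixels
    still_count: int = 0 # consecutive frames without significant motion

def detect_moving_blobs(prev_gray, gray):
    """Return centroids of small moving blobs found by frame differencing."""
    diff = cv2.absdiff(gray, prev_gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if 2 < cv2.contourArea(c) < 500:          # keep only mosquito-sized motion
            x, y, w, h = cv2.boundingRect(c)
            centroids.append((x + w / 2.0, y + h / 2.0))
    return centroids

def update_track(track, centroids):
    """Follow the nearest blob and count how long it has stayed put."""
    if not centroids:                              # no motion near the track:
        track.still_count += 1                     # the object may have landed
        return track
    nearest = min(centroids,
                  key=lambda c: np.hypot(c[0] - track.centroid[0],
                                         c[1] - track.centroid[1]))
    moved = np.hypot(nearest[0] - track.centroid[0],
                     nearest[1] - track.centroid[1])
    track.centroid = nearest
    track.still_count = track.still_count + 1 if moved < MOVE_THRESH_PX else 0
    return track

def is_stationary(track):
    """True once the tracked object has not moved for STILL_FRAMES frames."""
    return track.still_count >= STILL_FRAMES
```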
[0024] In one embodiment, which is schematically illustrated in Fig. 1A, the system for neutralizing an insect includes a camera 103 having a field of view (FOV) 103’. The FOV 103’ covers an area which includes a surface 104. The area covered by FOV 103’ may include parts of a room or other indoor or outdoor location, and surface 104 may be an element or part of an element in the room or other location, such as a wall, window, ceiling, floor, furniture in the room, an outdoor construction, vegetation, etc.
[0025] The system also includes a processor 102 in communication with the camera 103. The processor 102 detects in images captured by the camera 103, a stationary insect 106. In one example, processor 102 detects in images captured by camera 103 a moving object that becomes stationary, e.g., an insect 106 flying and then landing on the surface 104. Once insect 106 has landed and is stationary, processor 102 calculates the stationary location of insect 106 on surface 104 and generates navigation instructions by which to guide a drone 105 to the stationary location of insect 106 on surface 104. Drone 105 is configured to be guided in accordance with the navigation instructions and to neutralize the insect.
[0026] In some embodiments, the navigation instructions generated by processor 102 are based on detecting the stationary insect 106 in a plurality of consecutive images captured by camera 103, thus ensuring a direct and continuous “line of sight” from camera 103 to the stationary insect 106, which indicates that there is an obstacle-free path between the camera 103 and the stationary insect 106 on surface 104.
[0027] Processor 102 typically detects a landing insect (such as a mosquito) by using machine vision techniques. In one example, processor 102 detects an alighting mosquito. Namely, the mosquito is detected flying and then settling down and becoming stationary on surface 104. Processor 102 then calculates the unchanging location of the mosquito on the surface 104 in one or more images of the mosquito after it has settled down and become stationary. The calculated location is used to generate navigation instructions for drone 105.
[0028] Processor 102 may use, for example, tracking algorithms to detect and track an object suspected of being an insect (e.g., a mosquito) in images while the object approaches and lands on surface 104. After landing, a subtraction image (e.g., created by subtracting a past image which contains surface 104 from a current image of surface 104) can be used to detect the object on surface 104. Machine vision algorithms can then be used to determine that the object is a specific or target insect (e.g., mosquito). For example, object detection algorithms and/or segmentation algorithms may be used to detect an object in an image (e.g., a subtraction image) and to determine which pixels are associated with the object. Additional algorithms such as size, shape and/or color detection algorithms may be used to analyze the subtraction image to determine if the object is a mosquito (or another insect). In some embodiments, a learning model may be applied on images which include the surface and stationary object on the surface, to determine that a detected object is a mosquito (or other insect). A learning model may be applied, for example, on the subtraction image to detect an object having a predetermined criterion and/or the model may be applied on a current image to determine if the object is a mosquito (or another insect). A learning model may be applied at other steps as well, such as integrating various inputs (e.g., color, transparency, size, movement pattern, etc.) into a single decision of determining whether the stationary object is a mosquito (or another insect).
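The subtraction-image step of paragraph [0028] can be sketched as follows. The blob-size and aspect-ratio thresholds and the fallback intensity test are placeholder assumptions, and `model` stands in for whatever learned classifier might be applied.

```python
# Sketch of the subtraction-image step: a pre-landing reference frame of the
# surface is subtracted from the current frame to expose the landed object,
# and simple size/shape tests (or a learned model) decide whether it is the
# target insect.  All thresholds are placeholders.
import cv2
import numpy as np

def locate_landed_object(reference_gray, current_gray):
    """Return (area, centroid, crop) of the most plausible landed object, or None."""
    diff = cv2.absdiff(current_gray, reference_gray)
    _, mask = cv2.threshold(diff, 20, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best = None
    for c in contours:
        area = cv2.contourArea(c)
        if not (3 < area < 400):                  # mosquito-sized blobs only
            continue
        x, y, w, h = cv2.boundingRect(c)
        aspect = max(w, h) / max(1.0, min(w, h))
        if aspect > 6:                            # reject hairs, shadows, edges
            continue
        if best is None or area > best[0]:
            crop = current_gray[y:y + h, x:x + w]
            best = (area, (x + w / 2.0, y + h / 2.0), crop)
    return best

def is_target_insect(crop, model=None):
    """Decide whether the cropped patch shows the target insect."""
    if model is not None:
        return model.predict(crop)                # learned classifier, if any
    return crop.mean() < 90                       # crude fallback: dark body
```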
[0029] Once it is determined that the stationary object on the surface is a specific insect (for example, a mosquito), the unchanging location of the insect is estimated, and a drone can be guided to the estimated location.
[0030] In some embodiments, the system includes a light source 108, such as an infrared (IR) illumination source which may include an LED or other illumination source emitting in a range of about 750-1050 nm. In one example, an IR illumination source illuminates at around 850 nm. The use of an IR illumination source can enable operating the system even in a dark space by providing illumination that is not visible and/or irritating to the human eye but which enables camera 103 to obtain meaningful images of surface 104.
[0031] Camera 103 typically includes an image sensor, such as an appropriate chip, e.g., a CCD or CMOS chip, and may be a 2D or 3D camera. The camera 103 may include lenses and/or other optics to enable obtaining images of the insect flying and landing on surface 104. Camera 103 may include a stereoscopic camera or another camera device with depth perception capability. In some embodiments, camera 103 includes an IR-sensitive sensor and/or may include lenses and/or filters to filter out specific wavelengths to reduce noise (e.g., from fluorescent lighting or displays), to enable obtaining images in noisy environments and/or in special illumination conditions, e.g., in low illumination environments, as discussed above.
[0032] Processor 102 may include, for example, one or more processors such as a central processing unit (CPU), a digital signal processor (DSP), a graphics processing unit (GPU), a microprocessor, a controller, a chip, a microchip, an integrated circuit (IC), or any other suitable multi-purpose or specific processor or controller. In some embodiments, processor 102 is in communication with one or more memory unit(s) 112. Memory unit(s) 112 may include, for example, a random-access memory (RAM), a dynamic RAM (DRAM), a flash memory, a volatile memory, a non-volatile memory, a cache memory, a buffer, a short-term memory unit, a long-term memory unit, or other suitable memory units or storage units.
[0033] At least some of the images obtained by camera 103 may be stored in memory 112. Memory 112 may further store executable instructions that, when executed by processor 102, facilitate methods as described herein.
[0034] Components of the system may be connected to each other via appropriate cabling or suitable ports such as USB. In some embodiments, components of the system are connected wirelessly, e.g., via wireless communication unit 111 using, for example, Bluetooth and Wi-Fi protocols or other suitable radio frequency methods.
[0035] Communication between processor 102 and drone 105 may be done via Bluetooth or Wi-Fi, or other suitable wireless communication protocols.
[0036] In some embodiments, camera 103 and processor 102 (and optionally additional components, such as light source 108) are located in a stationary device which is separate from drone 105. For example, camera 103 and processor 102 may be attached to or enclosed within a single housing 101 configured to be placed on the wall or other location within a room or other area. Housing 101, which may be made of materials that are practical and safe for use, such as plastic and/or metal, may include one or more pivoting elements, such as hinges, rotatable joints or ball joints, allowing for various movements of the housing 101. For example, housing 101 can be stationed at one location in a room but can enable several fields of view to camera 103, which is encased within housing 101, by rotating and/or tilting housing 101 or parts of housing 101. Housing 101 may also provide stability for camera 103 while obtaining images. Housing 101 may additionally be part of a docking station for drone 105 (as further described below).
[0037] In some embodiments, camera 103 is positioned such that its focal plane is facing surface 104. For example, a surface in a room may include the floor or ceiling of the room or a wall or surface of furniture in the room, etc.
[0038] In some embodiments, the system may include a warning device (not shown), e.g., a sound emitting device and/or a light source, such as a dedicated LED, and processor 102 may generate a warning signal, such as to cause a sound or light to be emitted, based on detection of an insect. The warning signal may be generated, for example, when an insect landing on a surface is detected (e.g., an insect’s movement is tracked until the insect becomes stationary) and/or when the stationary insect is determined to be the target insect and/or when the stationary location of the target insect is calculated. In some embodiments, responsive to any of the detections or determinations described above, processor 102 may generate a signal to be sent to a remote device (such as a user’s mobile phone) to advise a user that an insect has been detected.
[0039] In some embodiments, a safety feature may be implemented, where a user must send a command to launch drone 105 before drone 105 can be guided to the location of the insect. Thus, in some embodiments, a method for neutralizing an insect may include receiving a command from a user prior to and/or as a prerequisite for guiding a drone to the location of an insect in order to neutralize the insect.
[0040] Typically, drone 105 is visible in FOV 103’ (and in an image captured by camera 103) while it is being guided to the stationary insect 106 on surface 104, thus camera 103 and processor 102 can track drone 105 while it is within FOV 103’, and estimate locations of the drone in each image. Therefore, processor 102 may generate navigation instructions based on an estimated location of drone 105 and an estimated location of the stationary insect 106 in images captured by camera 103. Thus, navigation of the drone becomes a two-dimensional problem rather than having to find different three-dimensional ranges of the drone to the target (the stationary insect). Additionally, since the target is stationary at a known (e.g., estimated) unchanging, stationary location, only the drone needs to be tracked, thereby reducing the level of complexity of computing required to guide drone 105.
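A brief sketch of the two-dimensional guidance idea of paragraph [0040]: because drone 105 and the stationary insect are located in the same camera image, guidance reduces to closing a pixel-space error. The gain, the arrival radius and the send_velocity interface are assumptions for illustration, not part of the patent.

```python
# One control iteration of pixel-space guidance: drive the pixel error between
# the drone and the stationary target to zero.  Gain and radius are assumed.
import numpy as np

K_P = 0.004               # proportional gain: pixels -> normalized velocity
ARRIVAL_RADIUS_PX = 8     # "close enough" radius around the target pixel

def guidance_step(drone_px, target_px, send_velocity):
    """One control iteration; returns True once the drone is near the target."""
    err = np.asarray(target_px, dtype=float) - np.asarray(drone_px, dtype=float)
    if np.linalg.norm(err) <= ARRIVAL_RADIUS_PX:
        send_velocity(0.0, 0.0)                   # hold position near the insect
        return True
    v = np.clip(K_P * err, -0.3, 0.3)             # clamp to a gentle speed
    send_velocity(v[0], v[1])                     # camera-plane x/y velocity command
    return False
```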
[0041] In some embodiments, one example of which is schematically illustrated in Fig. 1B, processor 102 is configured to estimate the distance of drone 105 from the location of insect 106 on surface 104, and the navigation instructions are generated by processor 102 based on the estimated distance of the drone from the insect.
[0042] For example, drone 105 may include a detector 115 to estimate the distance of the drone from surface 104. Detector 115 may include a proximity sensor such as an ultrasonic sensor, an IR range detector or a physical contact sensor. Processor 102 (or a processor onboard the drone) may be in communication with detector 115 to estimate a distance of drone 105 from surface 104 based on input from detector 115.
[0043] In another example, drone 105 may include a device (such as a laser or other light source projector) that can project a pattern including at least two points onto surface 104. The pattern can be captured by camera 103, providing an indication of the location of surface 104 in the images captured by camera 103. Processor 102 can analyze images from camera 103 which include drone 105 and the pattern to determine the distance of drone 105 from surface 104. Changes in the distance between drone 105 and surface 104 may be detected based on a change in the distance between at least two points in the pattern projected onto surface 104.
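A hedged sketch of how the change in spot spacing might be converted into a distance estimate, assuming the two points are projected by beams that diverge from the drone at a fixed angle (so their separation on the surface grows linearly with the drone-to-surface distance) and assuming a one-time reference measurement; the helper names and numbers are illustrative, not taken from the disclosure.

```python
import math

def spot_separation(p1_px, p2_px):
    """Pixel distance between the two projected pattern points as seen by
    the stationary camera (the surface and camera do not move, so pixel
    separation scales with the physical separation of the spots)."""
    return math.hypot(p2_px[0] - p1_px[0], p2_px[1] - p1_px[1])

def relative_surface_distance(sep_now_px, sep_ref_px, dist_ref):
    """Estimate the current drone-to-surface distance from the change in
    spot separation, assuming beams diverging from the drone at a fixed
    angle, so the separation grows linearly with distance.

    dist_ref -- drone-to-surface distance when sep_ref_px was measured,
    e.g. from a one-time calibration hover (illustrative assumption).
    """
    return dist_ref * (sep_now_px / sep_ref_px)

# Example: at a calibrated 0.50 m the spots were 80 px apart; now 32 px.
sep_ref = spot_separation((200, 210), (280, 210))
sep_now = spot_separation((200, 210), (232, 210))
print(relative_surface_distance(sep_now, sep_ref, dist_ref=0.50))  # ~0.20 m
```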
[0044] In one embodiment, the system may include, e.g., as part of housing 101, a visual mark generator 114 (e.g., a laser or other light source projector) to generate a visual mark (e.g., a spot of light) on surface 104 at the stationary location of the insect. Processor 102 can control the visual mark generator 114 to project a laser or other light beam to the approximate location of the stationary insect 106 on the surface 104, forming a visual mark on the surface close to the insect.
[0045] In some embodiments, a dedicated optical sensor 113, which is mounted onboard drone 105, and possibly an onboard processor, may be used to increase the accuracy of guiding drone 105 to insect 106, e.g., by increasing the accuracy of the final stages of the drone’s navigation. For example, if a visual mark is formed close to the insect, the visual mark on surface 104, as imaged by optical sensor 113, may assist drone 105 to accurately reach insect 106.
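Purely as an illustration of how onboard optical sensor 113 might refine the final approach using the visual mark of paragraph [0044], the sketch below finds the brightest pixel in a grayscale frame and returns its offset from the sensor's image centre; treating the brightest pixel as the mark, and the function itself, are assumptions of this sketch rather than features described in the text.

```python
import numpy as np

def mark_offset(frame):
    """Locate the brightest pixel in a grayscale frame from the drone's
    onboard optical sensor -- taken here as a stand-in for the projected
    visual mark -- and return its (x, y) offset from the image centre,
    with positive x to the right and positive y downward.

    frame -- 2D numpy array of intensities. A real implementation would
    threshold, filter by spot size, etc.; this is only a sketch.
    """
    y, x = np.unravel_index(np.argmax(frame), frame.shape)
    cy, cx = frame.shape[0] / 2.0, frame.shape[1] / 2.0
    return (x - cx, y - cy)

# Example: synthetic 120x160 frame with a bright spot at (row 40, col 100).
frame = np.zeros((120, 160), dtype=np.uint8)
frame[40, 100] = 255
print(mark_offset(frame))   # (20.0, -20.0): steer right and up
```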
[0046] In some embodiments, the system includes a sensor to detect presence of the insect 106 on the surface while drone 105 is being guided to insect 106 on surface 104. The sensor may include the optical sensor 113, which may be attached to the drone, and which can improve the accuracy of guiding drone 105 to insect 106 on surface 104.
[0047] A method for neutralizing an insect, according to one embodiment of the invention, may include the steps of tracking a moving object in images until it is determined that the moving object has become a stationary object, and determining that the stationary object is the insect (e.g., a target insect) by applying one or more image processing algorithms on an image of the stationary object. The method includes calculating a stationary location of the insect and guiding a drone to proximity of the stationary location. Typically, the drone is guided from its docking station to the stationary location of the insect, upon determination that the stationary object is the insect and upon calculation of the stationary location of the insect. The drone is then controlled to cause the insect to move from the stationary location, thereby enabling the drone to neutralize the insect after the insect moves away from the stationary location.
[0048] As schematically illustrated in Fig. 2, in one embodiment, processor 102 calculates from an image a location of a stationary insect after it lands on a surface (step 202). Drone 105 is then guided to the proximity of the stationary location of the insect on the surface (step 204), typically based on navigation instructions generated by processor 102. For example, the drone can be guided to a specific pixel in the image, e.g., drone 105 can be guided to a pixel below or above pixels associated with the stationary location of the insect, or to another pixel that is at a predetermined relation to a pixel associated with the stationary location of the insect.
[0049] In some embodiments, the method may include sending a signal to a remote device such as a user’s end device (e.g., mobile phone) and/or to a warning device (as described above) after detection of the insect and/or after calculating the unchanging, stationary location of the insect (e.g., after step 202), to advise the user of the detection and/or location of the insect. The method may further include receiving a signal or command from the user to launch the drone, prior to and/or as a prerequisite for guiding the drone to the location of the insect (e.g., prior to step 204).
[0050] Drone 105 is then controlled to cause the insect to move away from the stationary location, e.g., leave the surface (step 206), enabling the drone to neutralize the insect after the insect leaves the surface (step 208). Drone 105 may be controlled to hover at its location (e.g., close to the stationary location of the insect) for a predetermined period (e.g., 2-5 seconds) to ensure the neutralization of the insect before returning to its docking station. In other embodiments, drone 105 may be controlled to hover at its location until the insect is detected leaving the surface.
[0051] Controlling drone 105 to cause the insect to move away from its stationary location, e.g., leave the surface, can include using an insect prodding component of the drone to cause the insect to leave the surface so it may be eliminated by an insect neutralizing system of drone 105. An insect neutralizing system may include, for example, means for chemically, mechanically or electrically neutralizing an insect. For example, the drone’s propellers may be used to mechanically neutralize an insect (as described below). In other examples other appropriate known insect neutralizing systems may be used, such as a system for spraying an insecticide at the insect, a system for electrocuting an insect, etc.
[0052] The insect prodding component (further detailed below) may be configured to come in close proximity to the insect and/or to make physical contact with the insect, thereby prodding the insect to leave the surface. In other embodiments, the insect prodding component may include a device that can cause a change in the environment of the insect, which will cause the insect to leave the surface. For example, the insect prodding component may include one or more of: a light-emitting device, an air-moving device and an electromagnetic field emitter.
[0053] Drone 105 may neutralize the insect after leaving the surface by using an insect neutralizing system, e.g., a propulsion unit of the drone may be included in the insect neutralizing system and may be used to create suction to suck the insect into a propeller of the propulsion unit and neutralize the insect between the blades of the propeller.
[0054] Guiding drone 105 to a location proximate to a stationary insect 106 on a surface 104 may be done by using camera 103 to track drone 105 within the camera FOV 103’. As schematically illustrated in Fig. 3, which shows a representation 300 of FOV 103’ (e.g., an image captured by camera 103), drone 105 may be visible in the FOV 103’ while it is being guided to the stationary location of insect 106 on surface 104. One or more pixels determined to be associated with the stationary location of the insect (e.g., by using segmentation and/or other image processing algorithms, as described above, to determine pixels associated with the insect and/or with a visual mark directed to the surface in the vicinity of the insect) are depicted by coordinate 303, which represents the stationary location of the insect on surface 104 within FOV 103’. The pixels associated with drone 105 can also be detected by processor 102 by using similar or other appropriate image processing algorithms. Coordinate 315 represents the location of the drone within the FOV 103’. Processor 102 may then generate navigation instructions based on a location of the insect (as represented by coordinate 303) and a location of the drone within FOV 103’.
[0055] In one example, the navigation instructions to the stationary location of the insect on the surface may include a command to reach coordinate 303, which includes at least one pixel associated with the location of the insect. In other examples, the navigation instructions may include a command to reach a coordinate located in a predetermined location relative to the location of the insect, e.g., to reach a pixel in the vicinity of the location of the insect, such as coordinate 313, which represents a pixel located below coordinate 303.
[0056] A method for guiding a drone to neutralize an insect, according to one embodiment of the invention, is schematically illustrated in Fig. 4. Processor 102 detects, within a FOV of a camera, a drone and a stationary insect on a surface (step 402). A coordinate related to the location of the drone (e.g., coordinate 315) and a coordinate related to the location of the insect (e.g., coordinate 303 or 313) are determined (step 404). If, in step 405, the coordinate related to the location of the drone is at a predetermined distance, or within a predetermined range of distances, from the coordinate related to the location of the stationary insect, then the drone is controlled to cause the insect to leave the surface (step 406), enabling the drone to neutralize the insect after the insect leaves the surface (step 408). If, in step 405, the coordinate related to the location of the drone is not at the predetermined distance or within the predetermined range of distances from the coordinate related to the location of the stationary insect, then a command is generated for the drone to move toward the coordinate related to the location of the insect (step 409), and the step of determining whether the drone is within the predetermined distance (or range of distances) is performed again. As described above, the coordinate can include a pixel associated with the insect and/or a pixel proximal to the location of the insect, e.g., at a predetermined location below, above or to the side of the location of the insect. A schematic code rendering of this loop is given below.
[0057] A method for navigating a drone to neutralize an insect, according to another embodiment of the invention, is schematically illustrated in Fig. 5. Processor 102 detects, within a FOV of a camera, a stationary insect on a surface and guides the drone to the location of the stationary insect (step 502), e.g., as described above. In step 503, it is determined whether the drone has reached the surface. If it is determined (in step 503) that the drone did not reach the surface, then a command is generated for the drone to move toward the surface (step 504). If it is determined (in step 503) that the drone has reached the surface, a navigation command is generated for the drone to move away from the surface (step 506), to provide maneuvering space so that the drone may be situated at a location and angle that enable sucking the insect into the propulsion unit of the drone and neutralizing the insect (step 508).
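The loop of Fig. 4 (steps 404-409) can be rendered schematically as follows; the callables stand in for the camera tracking, drone command and prodding interfaces described above and are hypothetical names, as are the arrival tolerance and the simulated drone used in the toy example. The target coordinate may be the insect pixel itself (coordinate 303) or an offset pixel such as coordinate 313.

```python
import math

def guide_and_trigger(get_drone_px, target_px, send_move, trigger_prodding,
                      arrival_px=6.0, max_iters=500):
    """Schematic rendering of the loop in Fig. 4 (steps 404-409).

    get_drone_px     -- callable returning the drone's current (x, y) pixel
                        coordinate as tracked by the stationary camera.
    target_px        -- coordinate related to the insect (e.g. a pixel just
                        below the insect, like coordinate 313).
    send_move        -- callable given a (dx, dy) pixel-space command.
    trigger_prodding -- callable invoked once the drone is within the
                        predetermined range of the target (step 406).
    All names are illustrative stand-ins for the camera/drone interfaces.
    """
    for _ in range(max_iters):
        px, py = get_drone_px()
        dx, dy = target_px[0] - px, target_px[1] - py
        dist = math.hypot(dx, dy)
        if dist <= arrival_px:          # step 405: within predetermined range
            trigger_prodding()          # step 406
            return True
        send_move((dx, dy))             # step 409: move toward the target
    return False

# Toy example: a simulated drone that covers 30% of the remaining error
# with each command (purely illustrative dynamics).
pos = [100.0, 400.0]
def _move(d):
    pos[0] += 0.3 * d[0]; pos[1] += 0.3 * d[1]
guide_and_trigger(lambda: tuple(pos), (320, 240), _move,
                  lambda: print("prodding near", [round(v) for v in pos]))
```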
[0058] The determination (in step 503) of whether the drone has reached the surface can be made by using image processing algorithms and/or by receiving a signal from a sensor associated with the drone. For example, a proximity sensor (such as detector 115) that estimates the proximity of the drone to the surface may be in communication with processor 102 (or possibly with a drone-onboard processor), and based on communication with the proximity sensor, processor 102 (or a drone-onboard processor) can determine when the drone reaches the surface.
[0059] In another embodiment, the processor 102 (or an onboard processor) can determine a change of distance of the drone from the camera (e.g., based on a known location of the camera and using image processing to determine distance of the drone from the camera). A determination can be made that the drone has reached the surface based on the change of distance of the drone from the camera. For example, if navigation instructions are transmitted to the drone to move toward the surface, but there is no change in the distance of the drone from the camera, then it can be determined that the drone has reached the surface and is physically prevented from progressing further.
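A minimal sketch of the "no change in distance from the camera" test of paragraph [0059], assuming a stream of per-command distance estimates is available; the window length and tolerance are illustrative values, not taken from the disclosure.

```python
from collections import deque

class SurfaceContactMonitor:
    """Flag surface contact when repeated 'move toward surface' commands
    produce no meaningful change in the drone's estimated distance from
    the camera. Thresholds are illustrative assumptions."""
    def __init__(self, window=5, min_change=0.01):
        self.window = window              # number of recent commands inspected
        self.min_change = min_change      # metres; below this counts as 'no change'
        self.history = deque(maxlen=window)

    def update(self, distance_from_camera):
        """Call once per navigation command with the latest distance estimate.
        Returns True when the drone appears to have reached the surface."""
        self.history.append(distance_from_camera)
        if len(self.history) < self.window:
            return False
        return max(self.history) - min(self.history) < self.min_change

# Example: the distance stops changing after the drone meets the surface.
monitor = SurfaceContactMonitor()
for d in [1.20, 1.10, 1.02, 1.00, 1.00, 1.00, 1.00, 1.00]:
    if monitor.update(d):
        print("surface reached at estimated distance", d)
        break
```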
[0060] In another embodiment, the processor 102 can receive acceleration data of the drone (e.g., by tracking the drone in images captured by camera 103 or by data received from, e.g., a drone-onboard accelerometer) and, based on a detected change in acceleration of the drone it can be determined that the drone has reached the surface. For example, based on a change in acceleration of the drone, the processor may detect a vibration of the drone that occurs when the drone comes in contact with the surface.
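Similarly, the acceleration-based contact detection of paragraph [0060] could be sketched as a simple threshold on the deviation from a hover baseline; the baseline, threshold and sample values are assumptions for illustration only.

```python
def detect_contact_vibration(accel_samples, baseline=9.8, threshold=3.0):
    """Report contact when the magnitude of acceleration deviates from the
    hover baseline by more than `threshold` (m/s^2).

    accel_samples -- iterable of acceleration magnitudes from the drone's
    onboard accelerometer or from image-based tracking; the values and the
    threshold here are illustrative only.
    """
    for i, a in enumerate(accel_samples):
        if abs(a - baseline) > threshold:
            return i          # index of the first sample suggesting contact
    return None

# Example: a jolt appears in the stream when the drone touches the surface.
print(detect_contact_vibration([9.7, 9.9, 9.8, 14.6, 9.8]))  # -> 3
```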
[0061] Fig. 6 schematically illustrates a drone according to embodiments of the invention. Drone 605 includes a frame 601, which supports propulsion units 611, typically three or more propulsion units, and a power source (not shown), such as a battery. Each propulsion unit 611 includes a propeller having rotating blades 612. In some embodiments, at least one propulsion unit includes a ducted fan such that blades 612 rotate within a cylindrical housing 613.
[0062] In one embodiment, drone 605 is configured to neutralize an insect by sucking the insect into one of propulsion units 611 of the drone. The dimensions of propulsion units 611 and the rotational velocity of blades 612 are designed to create suction that will suck in an insect (e.g., a mosquito) after it has left the surface. In one example, the propeller has a diameter of 30-40 mm. In another example, the propeller operates at a speed that causes the air to move at more than 2 meters per second.
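For a rough sense of scale only, the figures quoted above (a 30-40 mm propeller and air moved at more than 2 m/s) imply on the order of two litres of air per second drawn through a single ducted unit, assuming uniform axial flow across the full disc; the short calculation below is an illustration, not a performance specification from the disclosure.

```python
import math

def volumetric_flow(diameter_m, air_speed_m_s):
    """Approximate volumetric flow through a ducted propeller, assuming
    uniform axial flow across the full disc (a simplification)."""
    area = math.pi * (diameter_m / 2.0) ** 2
    return area * air_speed_m_s          # cubic metres per second

# Figures from paragraph [0062]: 35 mm propeller, air moving at 2 m/s.
q = volumetric_flow(0.035, 2.0)
print(f"{q * 1000:.2f} litres of air per second")   # roughly 1.9 L/s
```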
[0063] In one embodiment, drone 605 includes an insect prodding component 603 configured to cause the insect to leave the surface. In one embodiment, the insect prodding component 603 also serves as a structure to maintain a distance between propulsion units 611 of the drone and the surface. The insect prodding component 603 may be, for example, a dome-shaped structure attached to frame 601 of drone 605, which can be used to prod the insect to leave the surface and also to maintain a distance between propulsion units 611 of the drone and the surface.
[0064] In some embodiments, prodding component 603 may include, for example, a moveable, possibly extendible member, such as a telescopic arm attached to frame 601. The member may be controlled by a processor (e.g., processor 102) to extend from frame 601 to come close to the insect on the surface and possibly make physical contact with the insect.
[0065] In some embodiments, the insect prodding component 603 can cause a change in the environment of the insect, which will cause the insect to leave the surface. For example, the insect prodding component may include one or more of: a light-emitting device, an air-moving device and an electromagnetic field emitter.
[0066] In some embodiments, several different types of prodding components can be used, possibly simultaneously, to cause the insect to leave the surface.
[0067] Drone 605 typically includes a receiver to receive navigation instructions, e.g., from processor 102, and may include proximity sensors and/or bumpers, e.g., attached to its frame 601 to avoid collision with the surface. In some embodiments, drone 605 may have a visible marking, e.g., on its frame, such as a colored design or light source, to enable easier detection of drone 605 in images.
[0068] Drone 605 may have onboard sensors (e.g., as described above) and/or processing capabilities.
[0069] Fig. 7 schematically illustrates a docking station for the drone, according to one embodiment of the invention. Docking station 700 includes a surface 701 for drone 705 to rest on and a port for charging the drone (not shown). Typically, a camera 703 is attached to the docking station such that the camera FOV 703’ includes drone 705 while it is being guided to a stationary insect on a surface and while the drone takes off and lands at the docking station 700, typically on surface 701.
[0070] In one embodiment, docking station 700 includes an assisted-landing element 715 configured to allow passive alignment of drone 705 with docking station 700, specifically with surface 701. Assisted-landing element 715 may include a rod or other construction down which drone 705 can slide by gravity alone. At least a distal end 715’ of assisted-landing element 715 is visible within FOV 703’. When drone 705 approaches and is close enough to docking station 700, both distal end 715’ and drone 705 can be detected within a single image. A processor (such as processor 102) detects the locations in the image of both distal end 715’ and drone 705 (e.g., using techniques such as described herein) and can provide navigation instructions for guiding drone 705 to distal end 715’, e.g., based on those locations in the image. Once drone 705 and distal end 715’ are aligned (as can be determined from images captured by camera 703), the drone propulsion units may be throttled down or turned off, and drone 705 can then passively slide down assisted-landing element 715 to surface 701 (as depicted by the dashed-line drone illustrations), moved by gravity alone. Such passive landing causes less turbulence, and the landing is more accurate.
[0071] Solutions provided by embodiments of the invention use images captured by a stationary camera and analyzed by a processor typically located off-drone to autonomously guide a drone to a stationary insect, neutralize the insect and return to the drone docking station, while using computationally efficient methods, enabling the use of a simple and inexpensive drone.
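As a final illustration, the alignment check that precedes the passive slide down assisted-landing element 715 could be sketched as a pixel-distance test between the drone and the detected distal end 715’; the tolerance and the coordinates in the example are assumptions introduced here.

```python
import math

def aligned_for_passive_landing(drone_px, rod_tip_px, tolerance_px=4.0):
    """Check, from a single image captured by the docking-station camera,
    whether the drone is aligned with the distal end of the assisted-landing
    element closely enough to throttle the propulsion units down and let the
    drone slide to the landing surface by gravity. The tolerance is
    illustrative; a real system might also verify the vertical ordering."""
    return math.hypot(drone_px[0] - rod_tip_px[0],
                      drone_px[1] - rod_tip_px[1]) <= tolerance_px

# Example: drone detected at (311, 118), rod tip 715' detected at (310, 120).
if aligned_for_passive_landing((311, 118), (310, 120)):
    print("aligned: throttle down and slide to the docking surface")
```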

Claims

1. A system for neutralizing an insect, the system comprising: a camera having a field of view, the camera to capture images of an area; a processor in communication with the camera, the processor to: detect in the images a moving object, track the moving object until determining that the moving object has become a stationary object, apply one or more image processing algorithms on an image of the stationary object to determine that the stationary object is a target insect, calculate a stationary location of the target insect on a surface in the area, and generate navigation instructions for a drone that is configured to neutralize the insect, to be guided from a docking station of the drone to the stationary location of the target insect on the surface.
2. The system of claim 1 wherein the camera tracks the drone within the field of view of the camera.
3. The system of claim 1, wherein the navigation instructions are based on the stationary location of the insect and on a location of the drone within the field of view of the camera.
4. The system of claim 1 wherein the processor is configured to estimate a distance of the drone from the stationary location and wherein the navigation instructions are based on the estimated distance of the drone from the stationary location.
5. The system of claim 1 comprising a visual mark generator to generate a visual mark on the surface at about the stationary location of the target insect, to assist in guiding the drone to the stationary location of the target insect on the surface.
6. The system of claim 1 wherein the drone comprises an insect prodding component configured to cause the insect to leave the surface.
7. The system of claim 6 wherein the drone comprises a structure configured to maintain a distance between a propulsion unit of the drone and the surface and wherein the structure comprises the insect prodding component.
8. The system of claim 7 wherein the structure is dome-shaped.
9. The system of claim 8 wherein the insect prodding component is configured to come in close proximity to the insect.
10. The system of claim 6 wherein the insect prodding component is configured to make physical contact with the insect.
11. The system of claim 1 wherein the navigation instructions comprise a command to reach a coordinate related to the stationary location of the target insect.
12. The system of claim 1 wherein the drone is configured to stay in proximity of the stationary location of the target insect for a predetermined period.
13. The system of claim 1 comprising a sensor attached to the drone, the sensor to detect presence of the insect on the surface while the drone is being navigated to the stationary location of the target insect.
14. The system of claim 1 wherein the processor is configured to: determine when the drone reaches the surface; and responsive to a determination that the drone has reached the surface, generate a navigation command for the drone to move away from the surface.
15. The system of claim 14 wherein the processor is configured to determine when the drone reaches the surface based on one or more of: communication of the processor with a proximity sensor used to determine proximity of the drone to the surface; a change of distance of the drone from the camera; and a change in the acceleration of the drone.
16. The system of claim 1 comprising: a docking station for the drone; and an assisted-landing element configured to allow passive alignment of the drone with the docking station.
17. The system of claim 1 wherein the drone is configured to neutralize the insect by sucking the insect into a propeller of the drone.
18. A method for neutralizing an insect, the method comprising: tracking a moving object in images until it is determined that the moving object has become a stationary object; determining that the stationary object is the insect by applying one or more image processing algorithms on an image of the stationary object; calculating a stationary location of the insect; guiding a drone to proximity of the stationary location; and controlling the drone to cause the insect to move from the stationary location, thereby enabling the drone to neutralize the insect after the insect moves away from the stationary location.
19. The method of claim 18 comprising using an insect prodding component of the drone to cause the insect to move from the stationary location.
20. The method of claim 18 comprising causing suction by a propeller of the drone by which to suck the insect into the propeller after the insect moves away from the stationary location.
21. The method of claim 18 comprising, prior to guiding the drone to proximity of the stationary location, receiving a signal from a user to launch the drone.
PCT/IL2023/051190 2022-11-16 2023-11-16 Unmanned aerial vehicle for neutralizing insects WO2024105676A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL298319 2022-11-16
IL298319A IL298319A (en) 2022-11-16 2022-11-16 Unmanned aerial vehicle for neutralizing insects

Publications (1)

Publication Number Publication Date
WO2024105676A1 true WO2024105676A1 (en) 2024-05-23

Family

ID=91083921

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2023/051190 WO2024105676A1 (en) 2022-11-16 2023-11-16 Unmanned aerial vehicle for neutralizing insects

Country Status (2)

Country Link
IL (1) IL298319A (en)
WO (1) WO2024105676A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180077918A1 (en) * 2016-05-28 2018-03-22 Simon Siu-Chi Yu Multi Function Photo Electro Acoustic Ions Drone
US20180204320A1 (en) * 2011-07-05 2018-07-19 Bernard Fryshman Object image recognition and instant active response
US20200349819A1 (en) * 2016-07-07 2020-11-05 Sri International Passive optical detection method and system for vehicles
US20210251209A1 (en) * 2018-07-29 2021-08-19 Bzigo Ltd. System and method for locating and eliminating insects
US20210316857A1 (en) * 2017-03-12 2021-10-14 Nileworks Inc. Drone for capturing images of field crops

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6274430B2 (en) * 2014-06-03 2018-02-07 みこらった株式会社 Pest capture and storage device and pest insecticide device
US9693547B1 (en) * 2014-10-20 2017-07-04 Jean François Moitier UAV-enforced insect no-fly zone
NL2017984B1 (en) * 2016-12-13 2018-06-26 Univ Delft Tech Insect elimination system and use thereof
JP2023008878A (en) * 2021-06-30 2023-01-19 株式会社ダスキン collection drone

Also Published As

Publication number Publication date
IL298319A (en) 2024-06-01
