WO2020026230A1 - System and method for locating and eliminating insects - Google Patents


Info

Publication number
WO2020026230A1
Authority
WO
WIPO (PCT)
Prior art keywords
insect
image
space
location
images
Prior art date
Application number
PCT/IL2019/050839
Other languages
French (fr)
Inventor
Nadav BENEDEK
Saar Wilf
Original Assignee
Bzigo Ltd
Priority date
Filing date
Publication date
Application filed by Bzigo Ltd filed Critical Bzigo Ltd
Priority to EP19843552.1A (published as EP3830755A4)
Priority to CA3105655A (published as CA3105655A1)
Priority to AU2019313665A (published as AU2019313665A1)
Priority to KR1020217005289A (published as KR20210035252A)
Priority to BR112021001634-1A (published as BR112021001634A2)
Priority to JP2021504834A (published as JP2021531806A)
Priority to CN201980049892.5A (published as CN112513880A)
Priority to US17/259,205 (published as US12063920B2)
Publication of WO2020026230A1


Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00 Stationary means for catching or killing insects
    • A01M1/02 Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones attracting the insects
    • A01M1/026 Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones attracting the insects combined with devices for monitoring insect presence, e.g. termites
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00 Stationary means for catching or killing insects
    • A01M1/06 Catching insects by using a suction effect
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00 Stationary means for catching or killing insects
    • A01M1/14 Catching by adhesive surfaces
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00 Stationary means for catching or killing insects
    • A01M1/20 Poisoning, narcotising, or burning insects
    • A01M1/2022 Poisoning or narcotising insects by vaporising an insecticide
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00 Stationary means for catching or killing insects
    • A01M1/22 Killing insects by electric means
    • A01M1/223 Killing insects by electric means by using electrocution
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M5/00 Catching insects in fields, gardens, or forests by movable appliances
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Definitions

  • the present invention is in the field of pest control, specifically, using computer vision to detect, locate and eliminate pests, such as flying insects.
  • a system using an image sensor with a magnifying lens is used to detect pests, typically in an agricultural setting, where the image sensor is moved or items are moved in view of the image sensor, to enable surveillance of a large area.
  • Another system that uses an image sensor tracks flying insects in an area of interest defined by a camera and a retroreflective surface spaced apart from the camera.
  • the need to employ a retroreflective surface in addition to a camera renders this system obtrusive and cumbersome, and thus less likely to be widely installed in homes, offices and other urban spaces.
  • Embodiments of the invention provide a system and method for detecting and locating pests, such as flying insects, typically in an indoor, enclosed environment, to enable effortless and accurate action against the pests.
  • Systems according to embodiments of the invention include a camera and processor to detect and locate pests from images obtained by the camera.
  • the system may operate from a single housing, which includes the camera, and does not require additional elements separate from the single housing, to locate pests. Additionally, the camera of the system does not have to be attached to or embedded within a moveable platform in order to capture usable images.
  • the system may be easily set up and unobtrusively located in a space such as a room in a house or office or public space such as a theater, a museum etc.
  • Embodiments of the invention can distinguish an insect from noise and/or from non-insect objects.
  • the system can provide a mark visible to humans, to indicate a location of the insect in the room, for further action.
  • Embodiments of the invention provide a variety of types of solutions for acting against pests detected and located from images of the space.
  • FIG. 1A is a schematic illustration of a system for locating an insect in a space, according to an embodiment of the invention.
  • FIG. 1B is a schematic illustration of a method for detecting and locating an insect in a space, according to an embodiment of the invention.
  • FIGs. 2A and 2B are schematic illustrations of a system for locating an insect in a space, according to another embodiment of the invention.
  • FIG. 2C is a schematic illustration of a method for detecting and locating an insect in a space, according to another embodiment of the invention.
  • FIG. 3 is a schematic illustration of a system including a projector of a visual mark, according to an embodiment of the invention.
  • FIGs. 4A and 4B are schematic illustrations of systems including an auxiliary device for handling an insect, according to embodiments of the invention.
  • FIG. 4C is a schematic illustration of a method for controlling an auxiliary device for handling an insect, according to an embodiment of the invention.
  • FIG. 5 is a schematic illustration of an auxiliary device for handling an insect, according to an embodiment of the invention.
  • FIG. 6 is a schematic illustration of a method for detecting an insect in images of a space, according to an embodiment of the invention.
  • FIG. 7 is a schematic illustration of a method for determining if an object in an image is an insect, according to an embodiment of the invention.
  • FIG. 8 is a schematic illustration of a method for determining if an object in an image is an insect based on prior images, according to an embodiment of the invention.
  • Embodiments of the invention provide systems and methods for detecting a location of one or more insects in an enclosed space, such as a room, and indicating the detected location of the insect in the space.
  • Examples described herein refer mainly to insect pests, especially to flying insects such as mosquitoes; however, embodiments of the invention may be used to locate other pests as well.
  • a system 100 for detecting and locating an insect includes a camera 103 to obtain an image of a space, such as room 104 or a portion of the room 104.
  • An insect 105, such as one or more mosquitoes, may be in the room 104.
  • the camera 103, which includes an image sensor and suitable optics, is in communication with a processor 102.
  • Processor 102 receives an image of the room or portion of the room 104, obtained by camera 103, and detects the location of insect 105 in the image of the room. Based on the location of the insect 105 in the image, processor 102 generates a signal to enable creation of a location indicator, which is visible to a human eye, to indicate the location of the insect 105 in the room 104.
  • the processor 102 may determine the location of the insect 105 in a space (e.g., room 104) based on an image of the space and may control a projector device to direct a light source to create an indication visible to a human eye, in vicinity of the location of the insect in the space.
  • the location indicator is a visual mark 115 at the location of the insect 105 in the room 104.
  • the visual mark 115 is created, in one embodiment, via projector 108 that projects a laser or other beam to the vicinity of the insect 105, in the room 104, forming, in vicinity of the location of the insect in the room, a visual mark 115.
  • Some or all of the components of system 100 are attached to or enclosed within a housing 101.
  • camera 103 and processor 102 may both be included within a single housing 101.
  • some of the components of the system, e.g., processor 102, are remotely located.
  • Housing 101, which may be made of materials practical and safe for use, such as plastic and/or metal, may include one or more pivoting elements, such as hinges, rotatable joints or ball joints, allowing for various movements of the housing 101.
  • housing 101 can be stationed at one location in room 104 but can provide several fields of view (FOV) to camera 103, which is encased within the housing 101, by rotating and/or tilting the housing 101.
  • housing 101 typically provides stability for camera 103 such that the camera is not moved while obtaining images.
  • the camera 103 is positioned such that its focal plane is parallel to a surface in the room 104.
  • a surface in the room may include the floor or ceiling of the room, a wall, or a surface of furniture in the room, etc.
  • processor 102 detects the location of the insect 105 in the image on a surface in the room (e.g., on a wall, ceiling, surface of furniture in the room, etc.) and generates a signal to enable creating the visual mark 115 at the location of the insect 105 on the surface.
  • the processor 102 detects a stationary (e.g., not flying) insect in an image of the room and the visual mark 115 is formed or directed to the location of the stationary insect.
  • the processor 102 detects an alighting insect, e.g., the processor detects the insect flying and then settling down. The processor 102 then detects the location of the insect after alighting, e.g., after settling down, and the visual mark 115 is formed or directed to the location of the insect after alighting.
  • the camera 103 may include an image sensor, e.g., an appropriate chip such as a CCD or CMOS chip and may be a 2D or 3D camera.
  • the camera 103 may include lenses and/or other optics to enable obtaining an image of the room (or part of the room) 104.
  • camera 103 includes an infrared (IR) sensitive sensor and/or may include lenses and/or filters to filter out other wavelengths to eliminate noise, to enable obtaining images of room 104 in special illumination conditions.
  • system 100 may include an IR illumination source 106.
  • IR illumination source 106 may include an LED or other illumination source emitting in a range of about 750-950 nm. In one example, illumination source 106 illuminates at around 850 nm.
  • IR illumination source 106 can enable use of system 100 even in a dark room by providing illumination that is not visible and/or irritating to the human eye but which enables camera 103 to obtain meaningful images of a dark room.
  • Processor 102 may include, for example, one or more processors and may be a central processing unit (CPU), a digital signal processor (DSP), a microprocessor, a controller, a chip, a microchip, an integrated circuit (IC), or any other suitable multi-purpose or specific processor or controller.
  • system 100 may include a warning device, e.g., a sound emitting device and/or a light source, such as a dedicated LED, and processor 102 may generate a warning signal, such as to cause a sound or light to be emitted, based on detection of the location of the insect.
  • a warning device e.g., a sound emitting device and/or a light source, such as a dedicated LED
  • processor 102 may generate a warning signal, such as to cause a sound or light to be emitted, based on detection of the location of the insect.
  • processor 102 is in communication with one or more memory unit(s) 112.
  • Memory unit(s) 112 may include, for example, a random access memory (RAM), a dynamic RAM (DRAM), a flash memory, a volatile memory, a non-volatile memory, a cache memory, a buffer, a short term memory unit, a long term memory unit, or other suitable memory units or storage units.
  • Components of system 100 may be connected to each other wirelessly, e.g., via suitable network hubs, or via appropriate cabling or suitable ports such as USB.
  • Memory 112 may further store executable instructions that, when executed by the processor 102, facilitate methods as described herein.
  • the method for detecting and locating an insect in an enclosed space includes the steps of obtaining an image of the space (1001), for example room 104, and detecting a location of an insect in the image (1003).
  • the location of the insect in the image is translated to real-world coordinates (1005) and a location indicator is created to indicate the real-world coordinates (1007).
  • a signal is generated to notify a user.
  • the signal may be sent (e.g., via Bluetooth, radio, etc.) to a user’s mobile device (such as the user’s mobile phone or to a dedicated device).
  • the method includes detecting a stationary insect (e.g., an insect not flying and/or not changing locations in the space) in the image of the space and detecting the location of the stationary insect.
  • a location indicator is created to indicate real-world coordinates of the stationary insect.
  • the method includes detecting an alighting insect in images of the space and detecting the location of the insect after alighting.
  • a location indicator is created to indicate real-world coordinates of the insect after alighting.
  • the method includes projecting the location indicator (e.g., a beam of light visible to the human eye, such as, a visible light laser beam) to the location of the real-world coordinates in the space (1009) such that a visible mark is created at the location in space.
  • the beam of light is directed at the location on the surface such that a circle (or other shape) of light on the surface marks the location of the insect.
  • the location of the insect in the image can be translated to real-world coordinates (step 1005) by using projective geometry, for example, if the focal plane of the camera obtaining the image is parallel to a surface in the space on which the insect is located.
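For illustration only (the patent does not give an implementation), the following Python sketch shows one conventional way such a translation can be done under the stated assumption that the camera's focal plane is parallel to the surface. The intrinsic parameters (fx, fy, cx, cy) and the camera-to-surface distance Z are assumed to be known; all numeric values are hypothetical.

```python
# Minimal sketch (not from the patent text): mapping an insect's pixel location
# to coordinates on a surface parallel to the camera's focal plane, using the
# pinhole-camera model. Assumed inputs: intrinsics (fx, fy, cx, cy) in pixels
# and the perpendicular distance Z from the camera to the surface.

def pixel_to_surface(u, v, fx, fy, cx, cy, Z):
    """Return (x, y) on the surface, in the same units as Z (e.g., meters)."""
    x = (u - cx) * Z / fx
    y = (v - cy) * Z / fy
    return x, y

# Example: an insect detected at pixel (812, 300) in a 1280x960 image,
# camera 2.5 m from the ceiling, focal length of roughly 1000 px.
x, y = pixel_to_surface(812, 300, fx=1000.0, fy=1000.0, cx=640.0, cy=480.0, Z=2.5)
```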
  • a system which includes an imager (e.g., camera 103) and projector (e.g., projector 108) may be pre-calibrated.
  • the projector may be positioned in close proximity to the camera (for example see distance D described with reference to Fig. 3 below).
  • a ray visible to the camera may be projected from the projector to several locations within the space and may be imaged by the camera at those locations.
  • each location in the image (e.g., each pixel or group of pixels) can be correlated in real-time to an x,y coordinate in the space, such that the projector can be directed to locations in the space based on locations detected in the image.
  • using a ray visible to the camera can enable correcting the direction of the projector in real-time based on the visible indication.
  • the projector includes one or more rotors to enable projection of a location indicator at different angles.
  • each location in the image can be correlated to a, b coordinates of the rotor, based on pre-calibration.
  • rotors may include a step motor, such that the change in angle is known for each step.
  • One or more physical stops may be used such that the angles of the rotor, at the limits of its movement, are known.
  • each pixel can be correlated to a known angle.
  • the number of steps required to direct the rotor to each angle can be calculated. Since the projector is typically not located at the same location as the camera, the calculations may require adjustment for the distance between the projector and the camera.
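As a hedged illustration of such a pre-calibration (the patent does not specify an implementation), the sketch below interpolates rotor pan/tilt angles for an arbitrary detection pixel from a handful of recorded calibration samples and converts them to step-motor counts. The sample values, the step resolution, and the use of SciPy interpolation are all assumptions.

```python
# Illustrative sketch: during pre-calibration the projector paints a
# camera-visible spot at several rotor angles; each angle pair is recorded
# against the pixel where the camera saw the spot. At run time, the rotor
# angles for a detection pixel are interpolated from those samples.
import numpy as np
from scipy.interpolate import griddata

# (pixel_u, pixel_v) -> (pan_deg, tilt_deg) samples collected during calibration (made-up values)
calib_pixels = np.array([[100, 80], [600, 90], [1150, 100], [110, 700], [1140, 690]])
calib_angles = np.array([[-30.0, 20.0], [0.0, 21.0], [30.0, 22.0], [-29.0, -18.0], [29.5, -17.5]])

DEG_PER_STEP = 0.18  # assumed step-motor resolution

def pixel_to_steps(u, v):
    pan = griddata(calib_pixels, calib_angles[:, 0], (u, v), method="linear")
    tilt = griddata(calib_pixels, calib_angles[:, 1], (u, v), method="linear")
    return int(round(float(pan) / DEG_PER_STEP)), int(round(float(tilt) / DEG_PER_STEP))

pan_steps, tilt_steps = pixel_to_steps(812, 300)
```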
  • Other methods may be used to translate the location of the insect in the image to the real-world location.
  • system 200 detects an insect, e.g., as described herein, and creates a location indicator, which is visible in an image of the room.
  • processor 202 locates an insect 205 in an image 223 of the room and generates a signal to create a location indicator 225 in the image 223 at the location of the insect.
  • the image 223 of the room is displayed together with the location indicator 225, which may be an icon or other graphic indication superimposed on the image 223.
  • An example of an image 223 of a room is shown in Fig. 2B.
  • a location indicator 225 is superimposed on the image 223 to indicate to a user viewing image 223, the location of the insect on the ceiling 226.
  • images obtained by camera 203 can be stored locally (e.g., in memory unit 212) and/or remotely (e.g., the images may be transmitted over the internet or by using another suitable wireless communication, to remote storage, e.g., on the cloud).
  • the images may then be retrieved and displayed on a device 209, such as a personal and/or mobile device (e.g., smartphone, tablet, etc.) or on a dedicated, typically mobile, device.
  • the image 223 of the room is an image of the room in real-time and the location indicator 225 is superimposed on the same image in which the location of insect 205 is detected.
  • the image 223 of the room is manipulated such that certain details (such as personal, private and/or confidential information) are obscured or removed from the image.
  • a real-time image (the same image in which insect 205 is detected) can be displayed without compromising privacy and/or confidentiality.
  • the image 223 can be manipulated to protect privacy and/or confidentiality by processor 202 or by a different processor (e.g., a processor in device 209).
  • a set of images of the room is obtained by camera 203.
  • Camera 203 is not moved or repositioned while obtaining the set of images such that all the images capture the same field of view.
  • a first image may be an image of the room 204 only, with no occupants, whereas a second image of the room 204 may be a real-time image of the room (possibly with occupants) in which an insect 205 is detected.
  • only the first image is transmitted to device 209 to be displayed, and the location of the insect 205 in the second image is indicated and displayed on the first image, which is the image being displayed to the user.
  • the first image (which typically does not include personal information) may be an image chosen by a user from a set of images of the room.
  • the first image may be a modified or manipulated image of the room in which personal information is obscured by modifying the personal information in the image.
  • the first image may be a representative image, which enables a user to understand the layout of the space being imaged but is not necessarily a real image of the space.
  • a representative image may be created from a combination of several images of the space, typically obtained by camera 203.
  • the representative image may be an average of several images from a set of images of the space.
  • a representative image may include a graphic representation of the space but not the actually imaged components of the space.
  • using an average image (or other representative image) as a first image may be useful in case the camera (e.g., camera 203) is repositioned between images, such that the images are not all of exactly the same field of view.
  • a method for detecting and locating an insect includes visually marking a location of an insect in the space on an image of the space.
  • An exemplary method which is schematically illustrated in Fig. 2C, includes obtaining a first image of a space (2001) and storing the first image (2003).
  • the first image includes the space empty of occupants and/or in which personal information is obscured.
  • a second image of the space is obtained (2005).
  • the second image is of about the same field of view as the first image but is obtained at a later time than the first image.
  • the second image includes an insect in the space.
  • the location of the insect in the second image is determined (2007) and a location indicator (e.g., a graphic mark) is created to mark that location in an image of the space (2009).
  • the location indicator marks the location on the same image in which the insect was detected.
  • the location indicator marks the location on a different image of the room.
  • the different image of the room may be an image captured at an earlier time, e.g., the first image of the room.
  • the method includes accepting input from a user and determining which image to use as a first image (namely, which image to display together with the location indicator) based on the input from the user.
  • a user can choose an image to send to storage and/or display, which does not include information which the user regards as personal or private.
  • the method includes a step of creating a representative image of the space (e.g., an average image) and using the representative image as the first image.
  • the first image is retrieved from storage and displayed to a user, e.g., on the user’s personal mobile device or on a dedicated device, with the location indicator superimposed on it, at the same location as in the second image (2011).
  • a grid may be used on all the images of the space which are of the same field of view (or about the same field of view), such that a location of the insect in one image can be given x,y coordinates of the grid which are the same x,y coordinates in all the other images of the same field of view.
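As an illustration of this privacy-preserving display flow (not part of the patent text), the Python/OpenCV sketch below draws the indicator at the pixel coordinates found in the current (second) image onto a stored first image of the same field of view; the file names and marker style are assumptions.

```python
# Sketch: the insect is located in a freshly captured "second" image, but the
# indicator is drawn on a stored "first" image of the same field of view, so
# the live image itself is never shown to the user.
import cv2

def mark_on_reference(reference_path, insect_xy, out_path):
    reference = cv2.imread(reference_path)      # first image, stored earlier
    u, v = insect_xy                            # pixel coordinates found in the second image
    cv2.circle(reference, (u, v), 25, (0, 0, 255), 3)   # red circle as the location indicator
    cv2.imwrite(out_path, reference)            # image sent to the user's device for display

# insect detected at pixel (812, 300) in the current (second) image
mark_on_reference("room_reference.jpg", (812, 300), "room_marked.jpg")
```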
  • a projector 308 may be controlled by processor 302 to project or direct a location indicator to the location of the insect in the real-world space, e.g., room 104.
  • a projector 308 and a camera 303 are arranged in close proximity within housing 301.
  • the projector 308 includes an indicator source, e.g., a light source, such as laser 316 and an indicator directing device 312, such as an optical system, including lenses and/or mirrors or other optical components to direct light from the light source in a desired direction or angle.
  • the indicator directing device 312 includes rotating optical elements such as a mirror-bearing gimbal arranged to pivot about a single axis. A set of two or three such gimbals, one mounted on the other with orthogonal pivot axes, may be used to allow the light of laser 316 to be directed in any desired pitch, roll and yaw.
  • processor 302 controls indicator directing device 312 such that the indicator, e.g., laser 316, is directed to the real-world location of the insect. For example, control of the yaw and pitch of the gimbals of indicator directing device 312 enables directing an indicator, such as laser 316, to a real-world location.
  • camera 303 is located at a minimal distance D from the projector 308 (or from components of the projector such as the laser and/or indicator directing device) to enable accurate aim of the indicator.
  • camera 303 and laser 316 or indicator directing device 312 are located within 20 cm of each other. In another example, camera 303 and laser 316 or indicator directing device 312 are located within 10 cm of each other.
  • the laser 316 may include visible light such that the mark created by the laser at the detected location of the insect is visible and can be imaged by camera 303 and displayed to a user, for example on device 209.
  • a user may receive an image of a room with a visual indication of the location of the insect created by laser 316, in the image of the room.
  • the projector 308 is configured to eliminate or incapacitate the insect 305.
  • laser 316 may be a UV or IR or other light at high enough power such that when directed at an insect 305 on a surface in the room or at a stationary insect or at an insect after alighting, it may disable and/or kill insect 305.
  • projector 308 which includes an indicator source, e.g., a light source, such as laser 316 and an indicator directing device 312 controlled by a processor, may be used in fields other than pest control.
  • projector 308 may be used to produce visual effects, such as animation.
  • projector 308 may be part of a toy.
  • the processor controlling the directing device receives input from an image sensor and/or based on image processing and can be used in virtual reality games or other applications.
  • projector 308 may be used as a directing device, for example, to direct users to a specific point in an enclosed or other space.
  • Some embodiments of the invention provide devices for handling insects, such as eliminating or incapacitating the insects.
  • a device may also include an apparatus such as an additional camera and/or illumination source, to assist in confirming the insect, e.g., confirming the existence and/or type of insect in an image.
  • the devices, which are typically moveable, are controlled to approach a location of an insect in a space, such as an enclosed space, to handle the insect at close range, thereby limiting effects that may be hazardous to the surrounding space.
  • devices for handling insects are controlled by systems for locating insects according to embodiments of the invention; however, in some embodiments, the devices for handling insects may be controlled by other systems.
  • the systems as described above may include, in some embodiments, an auxiliary device to be used, together with the systems described herein, to eliminate and/or otherwise handle insects detected in images, according to embodiments of the invention.
  • a system for detecting a location of an insect in a room includes a housing 401 which encases a camera 403 used to obtain an image of a space (such as a room in a house, office space and other public or private indoor spaces).
  • Camera 403 is in communication with a processor 402 and memory 412, e.g., as described above.
  • the system further includes an auxiliary device in communication with processor 402.
  • the auxiliary device is an independently mobile device 415, which may be used to eliminate an insect or for other purposes, such as to remove, capture or analyze the insect, as further described in Fig. 5.
  • the system described in Fig. 4A may also include a port 413, typically on housing 401, such as a docking station or other terminal for powering and/or loading the independently mobile device 415.
  • the independently mobile device 415 is a flying device such as a drone.
  • Independently mobile device 415 may be remotely controlled by processor 402.
  • independently mobile device 415 may be in wireless communication (e.g., via Bluetooth, radio, etc.) with processor 402.
  • the system schematically illustrated in Fig. 4A includes a camera 403 to obtain images of a space and a mobile device 415 that is separately mobile from the camera 403.
  • the processor 402 may detect an insect in at least one of the images of the space obtained by camera 403 and may control the device 415 to move to vicinity of the insect, based on analysis of the images of the space.
  • processor 402 controls the mobile device 415 to move to the vicinity of the insect, based on analysis of an image of the space having the insect and the mobile device 415 within a single frame.
  • Processor 402 may control the mobile device 415 to move in a direct path from the camera 403 in the direction of the insect, wherein the direction to the insect can be estimated from the location of the image of the insect within the frame.
  • processor 402 further controls movement of mobile device 415, such that it stays in the vicinity of the insect in the image, while guiding it away from the camera and towards the insect.
  • processor 402 may periodically determine the angular distance of the mobile device 415 from the insect in the frame, which may be estimated using the distance, in pixels, between the two objects in the frame. If the determined angular distance is above a predetermined value, the processor 402 may calculate the distance and direction needed to move the mobile device 415 in order to bring it within the predetermined angular distance from the insect, and may cause the mobile device 415 to move the calculated distance in the calculated direction.
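The following sketch illustrates that kind of guidance loop under assumed conditions: the drone and the insect are both visible in the same frame, and the pixel distance between them is used as a proxy for angular distance. The pixel positions, allowed gap, and pixel-to-meter scale are hypothetical values.

```python
# Illustrative correction step for keeping the mobile device near the insect in the frame.
import math

MAX_PIXEL_GAP = 40        # assumed threshold corresponding to the allowed angular distance
PIXELS_PER_METER = 400    # assumed rough scale near the insect's surface

def correction_step(drone_px, insect_px):
    """Return (distance_m, heading_rad) to bring the drone back toward the insect,
    or None if it is already within the allowed angular distance."""
    du = insect_px[0] - drone_px[0]
    dv = insect_px[1] - drone_px[1]
    gap = math.hypot(du, dv)
    if gap <= MAX_PIXEL_GAP:
        return None
    return gap / PIXELS_PER_METER, math.atan2(dv, du)

step = correction_step(drone_px=(500, 420), insect_px=(812, 300))
```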
  • an elimination distance may be a distance from which the device can effectively handle the insect, for example, the distance from which an insecticide can be effectively sprayed on the insect.
  • once within a predetermined distance (e.g., an elimination distance) from the insect, device 415 and/or member 426 may be controlled to eliminate the insect, e.g., by using chemical, mechanical or electrical methods.
  • processor 402 estimates a direction of the insect from the camera 403 and controls the device to move approximately in that direction.
  • determining whether an elimination distance was reached can be done by utilizing an additional camera on the mobile device 415 to obtain an image of the insect.
  • the image of the insect may be analyzed (e.g. by comparing its size in the image to an expected size of this type of insect from the desired distance).
  • a processor e.g., processor 402 or another processor, which may be attached to mobile device 415) may be in communication with a rangefinder or similar system (which may be attached to the mobile device 415 or at another location within the system) to determine, based on input from the rangefinder, whether an elimination distance was reached.
  • determining whether an elimination distance was reached can be done by the mobile device 415 emitting light in a known direction, creating a point of light on a surface in the vicinity of the insect.
  • the location of the mobile device 415 relative to camera 403 is known (as described herein). Therefore the angle from the mobile device 415 to the location of the point of light is known.
  • the angle from camera 403 to the location of the point of light can be calculated by detecting the pixel (or group of pixels) of the point in the image.
  • the distance to the point of light can be triangulated, from which the distance of the mobile device 415 to the insect can be estimated, since the insect is often on the same surface as the point of light.
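A minimal sketch of that triangulation is given below, reduced to a planar triangle between the camera, the mobile device, and the point of light; the baseline and the angles used in the example are assumed values, not figures from the patent.

```python
# Sketch: law-of-sines triangulation of the distance from the mobile device to
# the point of light, given the known baseline to the camera, the device's
# emission angle, and the camera's angle to the spot (derived from its pixel).
import math

def distance_to_spot(baseline_m, angle_at_camera_rad, angle_at_device_rad):
    """Distance from the mobile device to the point of light."""
    angle_at_spot = math.pi - angle_at_camera_rad - angle_at_device_rad
    return baseline_m * math.sin(angle_at_camera_rad) / math.sin(angle_at_spot)

# e.g., 1.2 m baseline, camera sees the spot at 70 degrees, device emitted at 80 degrees
d = distance_to_spot(1.2, math.radians(70), math.radians(80))
```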
  • mobile device 415 may include a projector to project a beam of a form of energy to the vicinity of the insect, to create the point of light and/or to handle the insect. Additionally, mobile device 415 may include an additional camera (e.g., camera 503 in Fig. 5). The direction and/or distance of the mobile device 415 from an insect may be calculated (e.g., as described above) using the projector and/or additional camera of the mobile device 415. Once within the predetermined distance, mobile device 415 may use a member, possibly extendable from the device to the vicinity of the insect, e.g., to handle the insect, as described below.
  • the auxiliary device is attached to housing 401 at attachment point 411 and may be in communication with a power source and/or reservoir within housing 401, via attachment point 411.
  • the auxiliary device may include a handling tool, such as a moveable and typically extendible member 426, such as a telescopic arm. Member 426 may be controlled by processor 402 to extend from the housing 401 and move to the location of the insect to handle the insect at the location, for example, to capture or kill the insect, as described below.
  • member 426 is a telescopic and/or deformable arm or spring made of, for example, shape memory material that is usually in a folded or coiled form and can be extended and moved to interact with the insect at the location of the insect, upon a signal from processor 402.
  • Handling the insect may include using mechanical and/or chemical methods. In some cases, both mechanical and chemical means or methods are used to handle the insect.
  • member 426 serves as a conduit for instruments or agents used to handle the insect.
  • member 426 may include or may be in communication with a chamber containing a chemical substance (e.g., in the form of gas, liquid or powder) that can be sprayed at or dropped on the insect from a relatively close range, thereby limiting the effect of the chemical substance to the insect itself and not affecting the surrounding space.
  • the chamber may contain a pesticide.
  • the chamber may include a repellant such as citronella oil, which is a plant-based insect repellent.
  • housing 401 includes a reservoir of the chemical substance. In other embodiments housing 401 stores capsules (or other containers) of the chemical substance, which can be loaded into the member 426.
  • member 426 may include a nozzle attached to the distal end 427 of member 426.
  • the member 426, carrying a nozzle may be directed to the location of the insect and a pulse or spray of a chemical substance (e.g., as described above) may be directed at the insect at close range via the nozzle.
  • member 426 may include or may be in communication with a suction chamber to draw in and capture (and/or kill) the insect.
  • member 426 may include an electrifying element by which to electrocute the insect.
  • member 426 may include an adhesive element by which to capture (and/or kill) the insect.
  • Member 426 does not have human or other predator characteristics and is therefore typically not identified by insects (such as mosquitoes) as a human or predator, and can thus approach the insect and get within close range of it without scaring it off.
  • an auxiliary device may include, for example, a projector (e.g., in addition to projector 108) to project a beam of any form of energy harmful or lethal to the insect to the location of the insect.
  • a single projector e.g., projector 108 may be used to indicate a location of an insect and to project a beam to handle (e.g., incapacitate) the insect.
  • a projector may be controlled by a signal generated from processor 102 to project a beam of a form of energy such as light, heat, and the like, to the location of the insect, to handle the insect.
  • neural networks such as convolutional neural networks, or other computer vision software and algorithms are used to detect and identify details of the insect from an image or a plurality of images of the location.
  • shape and/or motion and/or color detection algorithms may be used to determine the shape and/or color and/or movement pattern and/or other details of the insect.
  • Movement pattern may include, for example, direction of movement, size of movement, velocity of movement, etc.
  • processor 102 controls the auxiliary device based on the determination of the type of insect.
  • a projector may be controlled to handle the insect only if it is a specific type of insect.
  • an auxiliary device may include, for example, a tool to enhance the image of the room at the location of the insect.
  • the system (e.g., system 100) may include a camera (e.g., in addition to camera 103) with optics to enable enhancing the image at the location of the insect, for example, to confirm the existence and/or type of insect at the location, based on an enlarged image of the location.
  • a long-focus lens (e.g., a telephoto lens) may be used to zoom in on the location of the insect to enable seeing the shape or other details of the insect in better detail and focus.
  • the additional camera may be directed and/or moved to the location of the suspected insect, for example, to confirm the existence and/or type of insect.
  • a camera with a long-focus lens may be attached to or located on indicator directing device 312, e.g., on a gimbal, such that the enlarging optics can be moved in parallel to the indicator directing device, automatically directing the optics at the location of a suspected insect.
  • differential analysis may be used to confirm a suspected insect and/or to detect an insect. For example, an area may be scanned at low resolution to detect a suspected insect, and the area of the suspected insect may then be analyzed at high resolution, e.g., to confirm the existence and/or type of insect. Using differential analysis of images reduces processing, thereby providing a cost-effective solution.
  • camera 103 may obtain a wide FOV image of the room and an auxiliary device, such as an additional camera that enables zooming-in, obtains a detailed image of a portion of the room.
  • Processor 102 can detect a location of a suspected insect in the wide FOV image of the room, direct the additional camera to the location of suspected insect (e.g., by controlling movement of the gimbals) and confirm the insect (e.g., confirm the existence and/or type of insect) in the detailed image of the portion of the room (the location of the suspected insect).
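One plausible realization of this coarse-to-fine flow is sketched below, under assumptions not taken from the patent: the wide-FOV frame is differenced against a background frame at reduced resolution, and only the candidate regions are passed on for full-resolution or zoomed confirmation. Thresholds, blob sizes, and the OpenCV-based approach are illustrative.

```python
# Sketch: coarse scan of the wide field-of-view image for candidate insects,
# returning bounding boxes in full-resolution coordinates for later confirmation
# (e.g., by a zoomed camera or a classifier).
import cv2

def find_candidates(wide_frame, background, downscale=2, min_area=2, max_area=80):
    small = cv2.resize(cv2.absdiff(wide_frame, background), None,
                       fx=1 / downscale, fy=1 / downscale)
    gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 25, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for c in contours:
        if min_area <= cv2.contourArea(c) <= max_area:   # assumed size range at low resolution
            x, y, w, h = cv2.boundingRect(c)
            boxes.append((x * downscale, y * downscale, w * downscale, h * downscale))
    return boxes  # each box is then examined at high resolution to confirm the insect
```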
  • a system for handling an insect may include an auxiliary illumination source to allow higher resolution imaging of a location of a suspected insect and to assist in confirming the insect.
  • an illumination source which may also be attached to the gimbal such that it is moved in parallel to the indicator directing device, may be used, e.g., to obtain a brighter image.
  • the illumination source may have a relatively short wavelength (e.g. blue light) so as to reduce the diffraction limit and allow higher resolution imaging of the suspected insect.
  • the illumination source and the location indicator are the same element.
  • processor 102 can control projector 108 to indicate the location of the confirmed insect and possibly control another auxiliary device to eliminate or otherwise handle the confirmed insect.
  • using an auxiliary device, such as an additional camera and/or additional illumination source, enables a less powerful CPU to be used with camera 103, thereby providing a cost-effective solution.
  • a single camera may be used to provide images from which to detect a location of an insect or suspected insect and to magnify or otherwise enhance the image at the detected location.
  • one optical element may be employed to image a large area (e.g., a room) and another optical element may be employed to image a small area within the large area (e.g., the detected location within the room).
  • differential analysis may be used to locally enhance regions within an image of a large area, for example, to assist in identifying an insect.
  • the tool to enhance the image of the room at the location of the insect may be controlled by processor 102.
  • a method for eliminating, incapacitating or otherwise handling an insect, includes obtaining an image of a space (4001) and detecting a location of an insect in the image (4003). The location of the insect in the image is translated to real-world coordinates (4005).
  • Processor 402 or another processor then controls an auxiliary device (such as independently mobile device 415 or member 426) based on the real-world coordinates. For example, the auxiliary device can be directed to the real-world coordinates (4007).
  • an auxiliary device is only employed to eliminate or otherwise handle an insect if it is determined that there are no other susceptible objects that can be harmed by the action of the auxiliary device.
  • Susceptible objects may include, for example, living beings (e.g., humans, pets, etc.) and/or other objects or materials, such as paper or fabric, or objects including such materials, that can be harmed by the action of the auxiliary device.
  • a method for eliminating an insect may include a step of determining if there is a living being (or object or material that may be harmed by the action of the auxiliary device) in the vicinity of the location of the insect and directing the auxiliary device at the real-world coordinates detected in step (4005) only if no living being (or object or material) is detected in vicinity of the insect.
  • Existence of a living being in the vicinity of the location of the insect may be determined, for example, by detecting motion in the space. Motion above a predetermined size may indicate a person or other living being in the space. In one embodiment, motion or the size of motion is determined by detecting changes over time in the images of the space.
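A minimal sketch of such a motion-size check, with assumed thresholds, is given below; it simply treats any frame-difference blob above a preset area as a possible person or pet, in which case handling would be suppressed.

```python
# Sketch: frame differencing over the space; motion larger than an assumed
# area is taken to indicate a living being near the insect's location.
import cv2

LIVING_BEING_MIN_AREA = 5000  # assumed area, in pixels, separating people/pets from insects

def living_being_present(prev_frame, curr_frame):
    diff = cv2.absdiff(cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY))
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return any(cv2.contourArea(c) > LIVING_BEING_MIN_AREA for c in contours)
```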
  • existence of a person or other living being (or specific object or material) in the space may be determined by using computer vision techniques, e.g., to detect from the image (e.g., an image obtained by camera 103 or an additional camera) a shape, color or other attribute of a person or object or material.
  • a system for eliminating an insect in a room includes a camera to obtain an image of the room and a processor to detect a location of the insect in the image of the room.
  • the processor detects, from the image of the room, an insect after alighting and/or an insect on a surface in a space.
  • the processor may then translate the location of the insect (e.g., the insect after alighting) in the image to real-world coordinates and control an auxiliary device based on the real-world coordinates to eliminate or otherwise handle the insect.
  • the processor may determine if there is a person (or other living being) or specific susceptible object or material, in vicinity of the insect and may control the auxiliary device to eliminate or otherwise handle the insect based on the determination.
  • the processor may confirm the existence and/or type of the insect at the location and may control the auxiliary device to eliminate or otherwise handle the insect based on the confirmation of the existence and/or type of the insect at the location.
  • the processor may control the camera or an additional camera to obtain an enlarged or more detailed image of the insect to confirm the existence and/or type of the insect at the location.
  • the control of the auxiliary device which may be via wireless communication, can be, for example, control of a propulsion mechanism of the auxiliary device and/or control of a handling tool of the auxiliary device.
  • An example of an auxiliary device, which is independently mobile, is schematically illustrated in Fig. 5.
  • device 515 is a flying device (e.g., drone) which includes a propulsion mechanism 525 to move the device without assistance and an insect handling tool 526, or, alternatively or in addition, including an attachment point configured to releasably receive and secure a handling tool to the device 515.
  • Handling tool 526 may apply mechanical and/or chemical and/or electrical methods by which to handle an insect. In some embodiments the handling tool 526 applies both mechanical and chemical means or methods by which to handle the insect.
  • handling tool 526 may include a suction chamber to draw in and capture (and/or kill) the insect.
  • handling tool 526 may include an electrifying element by which to electrocute the insect.
  • handling tool 526 may include an adhesive element by which to capture (and/or kill) the insect.
  • Other electrical and/or mechanical solutions may be employed by handling tool 526.
  • handling tool 526 may include, for example, a telescopic arm or deformable arm or spring made of, for example, shape memory material that can be in a folded or coiled form while device 515 is in transit and can be extended to interact with the insect upon a signal from processor 402.
  • handling tool 526 may include a chamber containing a chemical substance (e.g., as described above) that can be sprayed at or dropped on the insect from a relatively close range, thereby limiting the effect of the chemical substance to the insect itself and not affecting the surrounding space.
  • port 413 includes a reservoir of the chemical substance to enable the device 515 to dock at the port, recharge and stock the handling tool 526 with the chemical substance.
  • port 413 stores capsules (or other containers) of the chemical substance. A capsule can be loaded into the handling tool 526 while the device 515 is docking at port 413. A capsule may last several events of handling insects before being depleted, and may be replaced at port 413 when depleted.
  • device 515 may include a combination of different handling tools and may use a combination of methods (e.g., chemical and/or mechanical) for handling insects.
  • Device 515 does not have human or other predator characteristics and is therefore typically not identified by insects (such as mosquitoes) as a human or predator and can thus approach the insect and get within close range of the insect without scaring it off.
  • device 515 is an aerial drone and the propulsion mechanism 525 includes a propeller mechanism suitable for aerial flight.
  • Different types of independently mobile devices may have different types of propulsion mechanisms, or multiple types of propulsion mechanisms.
  • a terrestrial drone may have a propulsion mechanism that includes a motor, transmission, and wheels.
  • Device 515 typically includes a control circuit (not shown) in communication with a processor (e.g., processor 402) and is configured to receive input regarding location of an insect.
  • device 515 may further include one or more sensors such as an image sensor (e.g., camera 503) and/or a distance sensor (such as a rangefinder).
  • device 515 is controlled to handle a stationary insect or an insect after alighting (e.g., an insect on a surface in a space).
  • the device 515 or member 426 receives direction information (e.g., a vector) from processor 402, based on the detected location of the stationary insect and is propelled according to the received information.
  • a distance sensor in device 515 (or member 426) can detect the distance of the device 515 (or member 426) from the insect (and/or from the surface) and stop propelling at a predetermined distance from the insect.
  • device 515 may include a signal source (such as a light source or audio transmitter) to emit a signal that can be received and analyzed by processor 402 and may be used to estimate or calculate the distance of the device 515 or member 426 from the insect (and/or from the surface).
  • device 515 may include a projector to project a visible mark to the vicinity of the insect.
  • Processor 402 can then control the device 515 (e.g., to control handling tool 526) or member 426 based on the calculated distance.
  • a dedicated image sensor attached to or within housing 401 can be used to capture an image of the insect (and possibly of the visible mark projected from a projector of device 515), which may be used to direct the device 515 or member 426 to the insect.
  • the visual mark can be detected from an image obtained by camera 403 or by the dedicated camera, and device 515 or member 426 can thus be directed to the location of the visual mark as imaged.
  • Using a device and/or extendable member controlled by a processor, based on a location of an insect in an image, enables accurate and environmentally friendly action to remove or eliminate pests such as flying insects.
  • embodiments of the invention can distinguish an insect from noise, such as, electronic noise on the image sensor and/or ambient noise, such as dust particles in the space, variations in ambient illumination, reflections, etc. Additionally, a specific insect type (e.g., mosquito) can be differentiated from another insect type (e.g., fly).
  • a method for differentiating between a target insect and a non-target insect or object from images of a space.
  • a target insect may be an insect, as opposed to a non-insect object (e.g., noise or other object) and/or a specific type of insect, as opposed to a different type of insect.
  • the method which may be carried out by a system such as system 100, includes using multiple images to determine if an object in an image is a target insect.
  • processor 102 may detect an object by comparing two (or more) images of the space and may determine that the object is a target insect based on a characteristic of the object in an image of the space. In some embodiments, an object is detected if it fulfills a predetermined criterion.
  • camera 103 may capture an image (also named a "current image"), from which it is desirable to determine if an insect is present in the space.
  • Processor 102 may obtain a subtraction image by subtracting the current image of the space from a different, second, image of the space.
  • the subtraction image highlights changes in the space since objects that have not changed (e.g. have not moved or have not changed position) in between images, do not typically show up in the subtraction image.
  • Processor 102 may detect in the subtraction image an object having a predetermined criterion and determine that the object is a target insect.
  • a device may be controlled based on the determination that an object is a target insect.
  • two or more images of the space are compared, in order to detect an object which fulfills a predetermined criterion.
  • a current image may be compared to a second image that was previously captured, to detect an object that is present in the current image but not in the previous image.
  • the second image may include a representation of a plurality of images of the space.
  • the second image may be an average (or other suitable statistical representation) of multiple images of the space.
  • the second image may include a background image constructed using images of the space captured over time, by understanding constant and temporary elements in the images of the space, and constructing an image of the constant elements (e.g. walls and furniture, but not people and pets).
  • An example of this embodiment is schematically illustrated in Fig. 6.
  • Two images of a space are obtained (step 602).
  • the images are compared by subtraction, e.g., a current image is subtracted from another image of the space to obtain a subtraction image (step 604).
  • an object fulfilling a predetermined criterion is detected in the subtraction image.
  • a predetermined criterion may relate to one or more characteristics of the object.
  • a characteristic of the object may include size, shape, location in the subtraction image, color, transparency and other such attributes of the object in the subtraction image.
  • a predetermined criterion may be, for example, a size range (e.g., in pixels), a specific shape (e.g., as determined by a shape detection algorithm applied on the subtraction image), a specific location or range of locations of the object within the subtraction image, specific colors (e.g., as determined by applying a color detection algorithm on the subtraction image), etc.
  • Processor 102 determines if the object fulfilling the predetermined criterion is a target insect. For example, one or more characteristics of the object (such as movement pattern, shape, color or transparency) may be determined, and the object may be determined to be a target insect based on the determined characteristic. For example, mosquitoes are more transparent and lighter in color than some other common insects; thus, in one example in which the target insect is a mosquito, if the colors of the pixels associated with the object are typical of mosquitoes, the object would be determined to be a mosquito. In another embodiment, if an object is determined to have a certain level of transparency or to have a predetermined pattern of transparent areas, it may be determined to be a mosquito.
  • Transparency of an object may be determined, for example, based on a known color of background in the space. If an object is determined to have the color of the background (e.g., if the background color is not a color typical of the target insect), the object may be determined to be partially transparent. In another example, different insects have different shapes, thus a target insect may be determined based on its shape in the subtraction image.
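For illustration, the sketch below combines steps 602-606 with a simple brightness cue of the kind described above (lighter, more transparent objects scoring as mosquito-like); the thresholds, size range, and brightness criterion are assumptions, not values from the patent.

```python
# Sketch: subtract the current frame from a background frame, keep blobs whose
# pixel size falls in an assumed mosquito-like range, and check their mean
# brightness as a rough light-color / transparency cue.
import cv2

SIZE_RANGE = (3, 120)          # assumed blob area range, in pixels
MIN_MEAN_BRIGHTNESS = 90       # assumed cue: mosquitoes image lighter / more transparent

def detect_target_insects(current, background):
    gray_cur = cv2.cvtColor(current, cv2.COLOR_BGR2GRAY)
    gray_bg = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)
    sub = cv2.absdiff(gray_cur, gray_bg)                     # subtraction image (step 604)
    _, mask = cv2.threshold(sub, 20, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    hits = []
    for c in contours:
        area = cv2.contourArea(c)
        if not SIZE_RANGE[0] <= area <= SIZE_RANGE[1]:
            continue                                         # predetermined criterion: size
        x, y, w, h = cv2.boundingRect(c)
        patch = gray_cur[y:y + h, x:x + w]
        if patch.mean() >= MIN_MEAN_BRIGHTNESS:              # light-color / transparency cue
            hits.append((x + w // 2, y + h // 2))
    return hits                                              # candidate target-insect pixel locations
```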
  • an object may be detected from a plurality of images whereas detecting if the object fulfills a predetermined criterion and determining that the object is a target insect, are done from a single image.
  • a same characteristic of an object may be used to detect an object fulfilling a predetermined criterion, in a first image and to determine if the object is a target insect, in the same image or in a second image.
  • different characteristics are used to detect an object fulfilling a predetermined criterion in a first image and to determine if the object is a target insect in the same image or in a second image.
  • a subtraction image may include several objects but only two that are within a predetermined size range.
  • two objects are detected in the subtraction image.
  • One or more characteristic(s), other than size, may be determined for the two objects, e.g., the color and/or transparency and/or movement pattern of the two objects may be determined and the objects may be determined to be target insects or not, based on their color and/or transparency and/or movement pattern.
  • a high resolution image of the object may be obtained and the object can be determined to be a target insect based on the high resolution image.
  • an object may be detected in a first image, e.g., in a subtraction image, possibly, based on its size or other characteristic, and may then be determined to be a target insect (or not) from a second image which is of higher resolution than the first image.
  • characteristics such as color and/or movement may be spatially correlated. For example, if a number of pixels that are close to each other have properties indicative of a target insect, these pixels may be given more weight in determining the presence of a target insect than a number of pixels having the same properties, but which are not closely grouped.
  • several correlated characteristics or pixel properties e.g., same movement patterns and/or changes in illumination, detected in several locations in an image, may point to movement of a larger object and/or reflections, and may be assigned a lower weight in determining the presence of a target insect, than single and uncorrelated characteristics.
  • Different weights may be assigned to characteristics (or pixels representing these characteristics) based on the behavior of the characteristic in a plurality of images. For example, a characteristic persisting over time is less likely to be noise and may therefore be assigned a higher weight.
  • Machine vision techniques such as object detection algorithms, segmentation, etc., may be used to detect an object in images of the space (e.g., a subtraction image) and to determine the pixels associated with the object.
  • a learning model may be applied on images of the space to determine that the object is a target insect.
  • a learning model may be applied, for example, on the subtraction image to detect an object having a predetermined criterion and/or on a current image to determine if the object is a target insect.
  • a learning model may be applied at other steps as well, such as integrating the various inputs (color, transparency, size, movement pattern, etc.) into a single decision of determining whether the object is a target insect.
  • If the object is determined to be a target insect (step 608), processor 102 generates a signal to control a device (step 610). If the object is not determined to be a target insect, another current image is obtained and processed.
  • a device controlled based on the determination that an object is a target insect may include an auxiliary device, e.g., as described above.
  • a device, such as a projector of a light source, may create a location indicator visible to a human eye, e.g., visual mark 115.
  • a method may include determining a real-world location of the target insect from the images of the space and controlling a device to create a location indicator visible to a human eye and indicative of the real-world location of the target insect.
  • a device may be used to eliminate and/or otherwise handle target insects.
  • a method may include determining a real-world location of the target insect from the images of the space and controlling a device to eliminate (or otherwise handle) the target insect at the real-world location.
  • the device may include an auxiliary device for handling an insect, e.g., as described above.
  • the device may include a projector to project a form of energy at the real-world location of the target insect.
  • the device may include a remotely controlled independently mobile device and/or a telescopic arm and/or nozzle.
  • an object (e.g., the object detected in a subtraction image) may be tracked across multiple images of the space and across multiple locations in the space, and the object may be determined to be a target insect (or not) based on the tracking.
  • a movement pattern of an object is detected and the object is determined to be a target insect (or not) based on the movement pattern.
  • An object is detected in images of a space (step 702) and a movement pattern of the object is determined (step 704). If the movement pattern is similar to a predetermined pattern (step 706) then the object is determined to be a target insect (step 708). If the movement pattern is not similar to the predetermined movement pattern (step 706) then the object is not determined to be a target insect (step 710).
  • a predetermined movement pattern is typically a pattern consistent with the movement expected from the target insect.
  • a predetermined movement pattern can include an alighting pattern (e.g., flying and then settling down), which is typical of mosquitoes.
  • the predetermined movement pattern can include predominantly a non-repetitive movement, since a predominantly repetitive motion is characteristic of an unintended motion (such as movement of a fan, wind-blown objects and/or electronic noise).
  • a movement pattern can include a change in direction and a predetermined movement includes a change in direction at a specific angle or range of angles.
  • mosquitoes often change direction at an angle less sharp than flies.
  • a predetermined movement pattern may include a change of direction at an angle in a predetermined range.
  • mosquitoes move more slowly than flies, thus, a predetermined movement pattern can include a specific velocity (or range of velocities).
  • determining characteristics of objects, e.g., of a moving object such as an insect, may be more accurate when using multiple images and/or comparing images over time.
  • historical data may be used in determining if an object is a target insect. For example, determining if an object in a later captured image is a target insect, can be based on a weight assigned to pixels in an earlier captured image.
  • an object is detected at a location in a first image (e.g., first current image) of a space (step 802). If it is determined that the object is not a target insect (step 804), then a first weight is assigned to pixels at that location (step 806). If it is determined that the object is a target insect (step 804), then a second weight is assigned to pixels at that location (step 808).
  • An object is detected at a location in a second image (e.g., a second current image) (step 810) and the weights from steps 806 and 808 are assigned to the pixels of the second image based on their location in the second image.
  • the object in the second image may then be determined to be a target insect (or not) based on the weighted pixels associated with the object in the second image (step 812).
  • images of a space may include windows, a TV screen, a fan, reflections and more, which may create “noisy” areas in the images; a sketch of how such areas and past false detections may be down-weighted appears after this list.
  • Such noise may be detected, for example, by high variation in pixel values over time, by many false positives (e.g., falsely detected target insects), or by applying object detection algorithms to identify the objects likely to create noise (e.g., window, TV, etc.).
  • characteristics of objects (or pixels representing these characteristics) detected in relatively “noisy” areas of an image may be assigned less weight than characteristics (or pixels) of objects detected in other areas of the image.
  • characteristics (or pixels) of objects detected in an area of the image, in which a target insect was erroneously determined in past cases may be assigned less weight than characteristics (or pixels) detected in other areas of the image.
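The following is a minimal illustrative sketch (in Python, assuming grayscale frames supplied as NumPy arrays) of the subtraction step (step 604) and a size-criterion check as described in the list above; the function name, difference threshold and size range are assumptions for illustration only, not values from the disclosure.

    import numpy as np
    from scipy import ndimage

    def detect_candidates(current, reference, diff_thresh=25, min_px=4, max_px=400):
        # Subtract a reference image from the current image to obtain a subtraction image,
        # then keep connected components whose size (in pixels) is within a predetermined range.
        diff = np.abs(current.astype(np.int16) - reference.astype(np.int16))
        mask = diff > diff_thresh
        labels, count = ndimage.label(mask)
        candidates = []
        for i in range(1, count + 1):
            size = int(np.sum(labels == i))
            if min_px <= size <= max_px:
                cy, cx = ndimage.center_of_mass(mask, labels, i)
                candidates.append((int(cx), int(cy), size))
        return candidates

Each returned candidate can then be examined for further characteristics (color, transparency, movement pattern) as described above.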
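Similarly, the per-location weighting of Fig. 8 and the down-weighting of “noisy” areas may be sketched as follows; the weight factors, neighborhood radius and variance threshold are illustrative assumptions.

    import numpy as np

    class LocationWeights:
        # Per-pixel weights: lowered around locations of past false detections (step 806)
        # and slightly raised where target insects were confirmed (step 808).
        def __init__(self, shape, default=1.0):
            self.weights = np.full(shape, default, dtype=np.float32)

        def update(self, location, is_target_insect, radius=5):
            y, x = location
            ys = slice(max(0, y - radius), y + radius)
            xs = slice(max(0, x - radius), x + radius)
            self.weights[ys, xs] *= 1.05 if is_target_insect else 0.7
            np.clip(self.weights, 0.1, 2.0, out=self.weights)

    def noise_map(frames, var_thresh=200.0):
        # High pixel variance over a stack of frames marks "noisy" areas (e.g., a TV screen,
        # a fan, window reflections) whose detections should receive less weight.
        variance = np.var(np.stack(frames).astype(np.float32), axis=0)
        return variance > var_thresh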

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Pest Control & Pesticides (AREA)
  • Engineering & Computer Science (AREA)
  • Insects & Arthropods (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Ecology (AREA)
  • Forests & Forestry (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Health & Medical Sciences (AREA)
  • Catching Or Destruction (AREA)
  • Software Systems (AREA)
  • Burglar Alarm Systems (AREA)
  • Image Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)

Abstract

Systems and methods are provided for locating an insect in a space and for indicating to a user the location of the insect and/or for eliminating the insect. The system includes a camera to obtain an image of the space and a processor to detect an object by comparing at least two images of the space and determine that the object is an insect based on a characteristic of the object in an image of the space. In some embodiments an independently mobile device may be controlled to eliminate the insect at the location of the insect in the space.

Description

TITLE
SYSTEM AND METHOD FOR LOCATING AND ELIMINATING INSECTS
FIELD
[0001] The present invention is in the field of pest control, specifically, using computer vision to detect, locate and eliminate pests, such as flying insects.
BACKGROUND
[0002] In homes and other urban spaces, pests, such as flying insects, which share the environment with humans, spread disease, spoil foodstuff and generally cause a nuisance. Control of these pests is usually attempted through exclusion, repulsion, physical removal or chemical means.
[0003] A system using an image sensor with a magnifying lens is used to detect pests in a typically agricultural setting, where the image sensor is moved or items are moved in view of the image sensor, to enable surveillance of a large area.
[0004] Such a system, which requires a moving camera, is not suitable for in-door use, as people are not interested in a camera constantly moving in their living and/or working space.
[0005] Another system that uses an image sensor tracks flying insects in an area of interest defined by a camera and a retroreflective surface spaced apart from the camera. The need to employ a retroreflective surface in addition to a camera, renders this system obtrusive and cumbersome and thus, less likely to be widely installed in homes, offices and other urban spaces.
SUMMARY
[0006] Embodiments of the invention provide a system and method for detecting and locating pests, such as flying insects, typically in an in-door environment, to enable effortless and accurate action against pests, typically, in an enclosed environment.
[0007] Systems according to embodiments of the invention include a camera and processor to detect and locate pests from images obtained by the camera. The system may operate from a single housing, which includes the camera, and does not require additional elements separate from the single housing, to locate pests. Additionally, the camera of the system does not have to be attached to or embedded within a moveable platform in order to capture usable images. Thus, the system may be easily set up and unobtrusively located in a space such as a room in a house or office or public space such as a theater, a museum etc.
[0008] Embodiments of the invention can distinguish an insect from noise and/or from non insect objects.
[0009] In one embodiment the system can provide a mark visible to humans, to indicate a location of the insect in the room, for further action.
[0010] Embodiments of the invention provide a variety of types of solutions for acting against pests detected and located from images of the space.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] The invention will now be described in relation to certain examples and embodiments with reference to the following illustrative drawing figures so that it may be more fully understood. In the drawings:
[0012] Fig. 1A is a schematic illustration of a system for locating an insect in a space, according to an embodiment of the invention;
[0013] Fig. 1B is a schematic illustration of a method for detecting and locating an insect in a space, according to an embodiment of the invention;
[0014] Figs. 2A and 2B are schematic illustrations of a system for locating an insect in a space, according to another embodiment of the invention;
[0015] Fig. 2C is a schematic illustration of a method for detecting and locating an insect in a space, according to another embodiment of the invention;
[0016] Fig. 3 is a schematic illustration of a system including a projector of a visual mark, according to an embodiment of the invention;
[0017] Figs. 4A and 4B are schematic illustrations of systems including an auxiliary device for handling an insect, according to embodiments of the invention;
[0018] Fig. 4C is a schematic illustration of a method for controlling an auxiliary device for handling an insect, according to an embodiment of the invention;
[0019] Fig. 5 is a schematic illustration of an auxiliary device for handling an insect, according to an embodiment of the invention;
[0020] Fig. 6 is a schematic illustration of a method for detecting an insect in images of a space, according to an embodiment of the invention;
[0021] Fig. 7 is a schematic illustration of a method for determining if an object in an image is an insect, according to an embodiment of the invention; and
[0022] Fig. 8 is a schematic illustration of a method for determining if an object in an image is an insect based on prior images, according to an embodiment of the invention.
DETAILED DESCRIPTION
[0023] Embodiments of the invention provide systems and methods for detecting a location of one or more insects in an enclosed space, such as a room, and indicating the detected location of the insect in the space.
[0024] Examples described herein refer mainly to insect pests, especially to flying insects, such as mosquitoes, however, embodiments of the invention may be used to locate other pests as well.
[0025] In the following description, various aspects of the present invention will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the present invention. However, it will also be apparent to one skilled in the art that the present invention may be practiced without the specific details presented herein. Furthermore, well known features may be omitted or simplified in order not to obscure the present invention.
[0026] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as “analyzing”, "processing," "computing," "calculating," "determining," “detecting”, “identifying”, “estimating”, “understanding” or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulates and/or transforms data represented as physical, such as electronic, quantities within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices. [0027] In one embodiment, which is schematically illustrated in Fig. 1A, a system 100 for detecting and locating an insect includes a camera 103 to obtain an image of a space, such as, room 104 or portion of the room 104. An insect 105, such as one or more mosquitos, may be in the room 104.
[0028] The camera 103, which includes an image sensor and suitable optics, is in communication with a processor 102. Processor 102 receives an image of the room or portion of the room 104, obtained by camera 103, and detects the location of insect 105 in the image of the room. Based on the location of the insect 105 in the image, processor 102 generates a signal to enable creation of a location indicator, which is visible to a human eye, to indicate the location of the insect 105 in the room 104.
[0029] The processor 102 may determine the location of the insect 105 in a space (e.g., room 104) based on an image of the space and may control a projector device to direct a light source to create an indication visible to a human eye, in vicinity of the location of the insect in the space.
[0030] In the example illustrated in Fig. 1A, the location indicator is a visual mark 115 at the location of the insect 105 in the room 104. The visual mark 115 is created, in one embodiment, via projector 108 that projects a laser or other beam to the vicinity of the insect 105, in the room 104, forming, in vicinity of the location of the insect in the room, a visual mark 115.
[0031] Some or all of the components of system 100 are attached to or enclosed within a housing 101. Thus, for example, camera 103 and processor 102 may be both included within a single housing 101. In other embodiments some of the components of the system (e.g., processor 102) are remotely located.
[0032] Housing 101, which may be made of materials practical and safe for use, such as plastic and/or metal, may include one or more pivoting element such as hinges, rotatable joints or ball joints, allowing for various movements of the housing 101. For example, housing 101 can be stationed at one location in room 104 but can enable several fields of view (FOV) to camera 103, which is encased within the housing 101 , by rotating and/or tilting the housing 101. However, housing 101 typically provides stability for camera 103 such that the camera is not moved while obtaining images. [0033] In some embodiments, the camera 103 is positioned such that its focal plane is parallel to a surface in the room 104. For example, a surface in the room may include the floor or ceiling of the room or a wall or surface of a furniture in the room, etc.
[0034] In one embodiment processor 102 detects the location of the insect 105 in the image on a surface in the room (e.g., on a wall, ceiling, surface of a furniture in the room, etc.) and generates a signal to enable creating the visual mark 115 at the location of the insect 105 on the surface.
[0035] In some embodiments, the processor 102 detects a stationary (e.g., not flying) insect in an image of the room and the visual mark 115 is formed or directed to the location of the stationary insect.
[0036] In some embodiments, the processor 102 detects an alighting insect, e.g., the processor detects the insect flying and then settling down. The processor 102 then detects the location of the insect after alighting, e.g., after settling down, and the visual mark 115 is formed or directed to the location of the insect after alighting.
[0037] The camera 103 may include an image sensor, e.g., an appropriate chip such as a CCD or CMOS chip and may be a 2D or 3D camera. The camera 103 may include lenses and/or other optics to enable obtaining an image of the room (or part of the room) 104.
[0038] In some embodiments camera 103 includes an infrared (IR) sensitive sensor and/or may include lenses and/or filters to filter out other wavelengths to eliminate noise, to enable obtaining images of room 104 in special illumination conditions. For example, system 100 may include an IR illumination source 106. IR illumination source 106 may include an LED or other illumination source emitting in a range of about 750-950nm. In one example illumination source 106 illuminates at around 850nm. IR illumination source 106 can enable use of system 100 even in a dark room by providing illumination that is not visible and/or irritating to the human eye but which enables camera 103 to obtain meaningful images of a dark room.
[0039] Processor 102 may include, for example, one or more processors and may be a central processing unit (CPU), a digital signal processor (DSP), a microprocessor, a controller, a chip, a microchip, an integrated circuit (IC), or any other suitable multi-purpose or specific processor or controller. [0040] In some embodiments system 100 may include a warning device, e.g., a sound emitting device and/or a light source, such as a dedicated LED, and processor 102 may generate a warning signal, such as to cause a sound or light to be emitted, based on detection of the location of the insect.
[0041] In some embodiments, processor 102 is in communication with one or more memory unit(s) 112. Memory unit(s) 112 may include, for example, a random access memory (RAM), a dynamic RAM (DRAM), a flash memory, a volatile memory, a non-volatile memory, a cache memory, a buffer, a short term memory unit, a long term memory unit, or other suitable memory units or storage units.
[0042] Components of system 100 may be connected to each other wirelessly, e.g., via suitable network hubs, or via appropriate cabling or suitable ports such as USB.
[0043] According to some embodiments, at least some of the images obtained by camera 103 are stored in memory 112. Memory 112 may further store executable instructions that, when executed by the processor 102, facilitate methods as described herein.
[0044] One example of a method, some steps of which are carried out by processor 102, is schematically illustrated in Fig. 1B. The method, for detecting and locating an insect in an enclosed space, includes the steps of obtaining an image of the space (1001), for example, room 104, and detecting a location of an insect in the image (1003). The location of the insect in the image is translated to real-world coordinates (1005) and a location indicator is created to indicate the real-world coordinates (1007).
[0045] In some embodiments, once a location of an insect is detected, a signal is generated to notify a user. The signal may be sent (e.g., via Bluetooth, radio, etc.) to a user’s mobile device (such as the user’s mobile phone or to a dedicated device).
[0046] In one embodiment, the method includes detecting a stationary insect (e.g., an insect not flying and/or not changing locations in the space) in the image of the space and detecting the location of the stationary insect. A location indicator is created to indicate real-world coordinates of the stationary insect.
[0047] In another embodiment, the method includes detecting an alighting insect in images of the space and detecting the location of the insect after alighting. A location indicator is created to indicate real-world coordinates of the insect after alighting. [0048] In one embodiment the method includes projecting the location indicator (e.g., a beam of light visible to the human eye, such as, a visible light laser beam) to the location of the real-world coordinates in the space (1009) such that a visible mark is created at the location in space. For example, if an insect (e.g., a stationary insect and/or an insect after alighting) is detected at a location on a surface in the space, the beam of light is directed at the location on the surface such that a circle (or other shape) of light on the surface marks the location of the insect.
[0049] The location of the insect in the image can be translated to real-world coordinates (step 1005) by using projective geometry, for example, if the focal plane of the camera obtaining the image is parallel to a surface in the space on which the insect is located.
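As a hedged sketch of this translation, assuming a simple pinhole model with known focal lengths (fx, fy), principal point (cx, cy) and camera-to-surface distance (none of which are specified numerically in the disclosure):

    def pixel_to_surface_xy(u, v, fx, fy, cx, cy, distance_to_surface):
        # Pinhole-model projection onto a plane parallel to the focal plane at a known depth.
        x = (u - cx) * distance_to_surface / fx
        y = (v - cy) * distance_to_surface / fy
        return x, y  # offsets on the surface, in the same units as distance_to_surface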
[0050] In another embodiment a system, which includes an imager (e.g., camera 103) and projector (e.g., projector 108) may be pre-calibrated. For example, the projector may be positioned in close proximity to the camera (for example see distance D described with reference to Fig. 3 below). During calibration a ray visible to the camera may be projected from the projector to several locations within the space and may be imaged by the camera at those locations. This way, by using interpolation, each location in the image (e.g., each pixel or group of pixels) can be correlated in real-time to an x,y coordinate in the space such that the projector can be directed to locations in the space based on locations detected in the image. Alternatively or in addition, using a ray visible to the camera can enable correcting the direction of the projector in real-time based on the visible indication.
[0051] In one embodiment, the projector includes one or more rotor to enable projection of a location indicator at different angles. In this case, each location in the image can be correlated to a, b coordinates of the rotor, based on pre-calibration.
[0052] In one example, rotors may include a step motor, such that the change in angle is known for each step. One or more physical stops may be used such that the angles of the rotor, at the limits of its movement, are known. Given the camera's known optics, each pixel can be correlated to a known angle. Thus, the number of steps required to direct the rotor at each angle can be calculated. Since the projector is typically not located at the same location as the camera, the calculations may require adjustment for the distance between the projector and the camera.
[0053] Other methods may be used to translate the location of the insect in the image to the real-world location.
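One possible sketch of the pre-calibration and rotor-step calculation described in paragraphs [0050] and [0052], interpolating per axis with SciPy; the function names and the angle-per-step value are assumptions:

    import numpy as np
    from scipy.interpolate import griddata

    def pixel_to_angles(pixel, calib_pixels, calib_pan, calib_tilt):
        # calib_pixels: N x 2 array of (u, v) image locations at which the calibration ray was imaged;
        # calib_pan / calib_tilt: the rotor angles (in degrees) used at each calibration point.
        pan = griddata(calib_pixels, calib_pan, [pixel], method='linear')[0]
        tilt = griddata(calib_pixels, calib_tilt, [pixel], method='linear')[0]
        return pan, tilt

    def angle_to_steps(target_deg, current_deg, deg_per_step):
        # Signed number of step-motor steps needed to move the rotor to the target angle.
        return int(round((target_deg - current_deg) / deg_per_step))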
[0054] In another embodiment, which is schematically illustrated in Fig. 2A, system 200 detects an insect, e.g., as described herein, and creates a location indicator, which is visible in an image of the room. In this embodiment, processor 202 locates an insect 205 in an image 223 of the room and generates a signal to create a location indicator 225 in the image 223 at the location of the insect. In one example, the image 223 of the room is displayed together with the location indicator 225, which may be an icon or other graphic indication superimposed on the image 223.
[0055] An example of an image 223 of a room is shown in Fig. 2B. Image 223, which includes part of a room, shows a surface, namely ceiling 226 of the room, on which an insect is located. A location indicator 225 is superimposed on the image 223 to indicate to a user viewing image 223, the location of the insect on the ceiling 226.
[0056] In one embodiment, images obtained by camera 203 can be stored locally (e.g., in memory unit 212) and/or remotely (e.g., the images may be transmitted over the internet or by using another suitable wireless communication, to remote storage, e.g., on the cloud). The images may then be retrieved and displayed on a device 209, such as a personal and/or mobile device (e.g., smartphone, tablet, etc.) or on a dedicated, typically mobile, device.
[0057] In one embodiment the image 223 of the room is an image of the room in real-time and the location indicator 225 is superimposed on the same image in which the location of insect 205 is detected.
[0058] In some embodiments, the image 223 of the room is manipulated such that certain details (such as personal, private and/or confidential information) are obscured or removed from the image. Thus, a real-time image (the same image in which insect 205 is detected) can be displayed without compromising privacy and/or confidentiality. The image 223 can be manipulated to protect privacy and/or confidentiality by processor 202 or by a different processor (e.g., a processor in device 209).
[0059] In another embodiment, a set of images of the room is obtained by camera 203. Camera 203 is not moved or repositioned while obtaining the set of images such that all the images capture the same field of view. A first image may be an image of the room 204 only, with no occupants, whereas a second image of the room 204 may be a real-time image of the room (possibly with occupants) in which an insect 205 is detected. In some embodiments, in order to protect the privacy of the occupants, only the first image is transmitted to device 209 to be displayed and the location of the insect 205 in the second image, is indicated and displayed on the first image, which is the image being displayed to the user.
[0060] In some embodiments, the first image (which typically does not include personal information) may be an image chosen by a user from a set of images of the room. In other embodiments, the first image may be a modified or manipulated image of the room in which personal information is obscured by modifying the personal information in the image.
[0061] In some embodiments, the first image may be a representative image, which enables a user to understand the layout of the space being imaged but is not necessarily a real image of the space. For example, a representative image may be created from a combination of several images of the space, typically obtained by camera 203. For example, the representative image may be an average of several images from a set of images of the space. In another example, a representative image may include a graphic representation of the space but not the actually imaged components of the space. In addition to being useful in protecting personal information, using an average image (or other representative image) as a first image, may be useful in case the camera (e.g., camera 203) is repositioned between images, such that the images are not all of exactly the same field of view.
[0062] In one embodiment, a method for detecting and locating an insect, carried out by processor 202, includes visually marking a location of an insect in the space on an image of the space. An exemplary method, which is schematically illustrated in Fig. 2C, includes obtaining a first image of a space (2001) and storing the first image (2003). Typically, the first image includes the space empty of occupants and/or in which personal information is obscured.
[0063] A second image of the space is obtained (2005). The second image is of about the same field of view as the first image but is obtained at a later time than the first image. The second image includes an insect in the space. The location of the insect in the second image is determined (2007) and a location indicator (e.g., a graphic mark) is created to mark that location in an image of the space (2009).
[0064] In one embodiment, the location indicator marks the location on the same image in which the insect was detected. In other embodiments, the location indicator marks the location on a different image of the room. The different image of the room may be an image captured at an earlier time, e.g., the first image of the room.
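A minimal sketch of superimposing the location indicator on the stored first image (using OpenCV; the file names, marker size and color are placeholders):

    import cv2

    def mark_location_on_first_image(first_image_path, location, out_path="display.png"):
        # Draw a graphic location indicator at the (x, y) coordinates detected in the
        # second image, on the first image that is shown to the user.
        img = cv2.imread(first_image_path)
        x, y = location
        cv2.circle(img, (x, y), radius=12, color=(0, 0, 255), thickness=2)
        cv2.imwrite(out_path, img)
        return out_path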
[0065] In some embodiments the method includes accepting input from a user and determining which image to use as a first image (namely, which image to display together with the location indicator) based on the input from the user. Thus, a user can choose an image to send to storage and/or display, which does not include information which the user regards as personal or private.
[0066] In other or additional embodiments, the method includes a step of creating a representative image of the space (e.g., an average image) and using the representative image as the first image.
[0067] In some embodiments the first image is retrieved from storage and displayed to a user, e.g., on the user’s personal mobile device or on a dedicated device, with the location indicator superimposed on it, at the same location as in the second image (2011).
[0068] Thus, for example, a grid may be used on all the images of the space which are of the same field of view (or about the same field of view), such that a location of the insect in one image can be given x,y coordinates of the grid which are the same x,y coordinates in all the other images of the same field of view.
[0069] As discussed above, and as further exemplified in Fig. 3, a projector 308 may be controlled by processor 302 to project or direct a location indicator to the location of the insect in the real-world space, e.g., room 104.
[0070] In one embodiment, a projector 308 and a camera 303 are arranged in close proximity within housing 301. The projector 308 includes an indicator source, e.g., a light source, such as laser 316 and an indicator directing device 312, such as an optical system, including lenses and/or mirrors or other optical components to direct light from the light source in a desired direction or angle. In one embodiment, the indicator directing device 312 includes rotating optical elements such as a mirror-bearing gimbal arranged to pivot about a single axis. A set of two or three such gimbals, one mounted on the other with orthogonal pivot axes, may be used to allow the light of laser 316 to be directed in any desired pitch, roll and yaw. [0071] Based on the detected location of the insect 305 in an image obtained by camera 303, processor 302 controls indicator directing device 312 such that the indicator, e.g., laser 316, is directed to the real-world location of the insect. For example, control of the yaw, and pitch of the gimbals of indicator directing device 312 enables directing an indicator, such as laser 316, to a real-world location.
[0072] Typically, camera 303 is located at a minimal distance D from the projector 308 (or from components of the projector such as the laser and/or indicator directing device) to enable accurate aim of the indicator. In one example, camera 303 and laser 316 or indicator directing device 312 are located less than 20 cm from each other. In another example, camera 303 and laser 316 or indicator directing device 312 are located less than 10 cm from each other.
[0073] The laser 316 may include visible light such that the mark created by the laser at the detected location of the insect is visible and can be imaged by camera 303 and displayed to a user, for example on device 209. Thus, in one embodiment a user may receive an image of a room with a visual indication of the location of the insect created by laser 316, in the image of the room.
[0074] In one embodiment, the projector 308 is configured to eliminate or incapacitate the insect 305. For example, laser 316 may be a UV or IR or other light at high enough power such that when directed at an insect 305 on a surface in the room or at a stationary insect or at an insect after alighting, it may disable and/or kill insect 305.
[0075] In some embodiments, projector 308, which includes an indicator source, e.g., a light source, such as laser 316 and an indicator directing device 312 controlled by a processor, may be used in fields other than pest control. For example, projector 308 may be used to produce visual effects, such as animation. For example, projector 308 may be part of a toy. In some embodiments, the processor controlling the directing device receives input from an image sensor and/or based on image processing and can be used in virtual reality games or other applications.
[0076] In another embodiment, projector 308 may be used as a directing device, for example, to direct users to a specific point in an enclosed or other space. A few examples include:
[0077] - directing security forces to a location identified by security cameras;
[0078] - directing a user to a desired location in large spaces such as archives, stores or warehouses;
[0079] - directing construction or maintenance staff to a specific site where a problem is detected (possibly, the problem is detected via image processing); and
[0080] - operating a laser cutting machine based on image processing.
[0081] Some embodiments of the invention provide devices for handling insects, such as eliminating or incapacitating the insects. Such a device may also include an apparatus such as an additional camera and/or illumination source, to assist in confirming the insect, e.g., confirming the existence and/or type of insect in an image. The devices, which are typically moveable, are controlled to approach a location of an insect in a space, such as an enclosed space, to handle the insect at close range, thereby limiting effects that may be hazardous to the surrounding space.
[0082] Some examples of devices for handling insects, which are described below, are devices controlled by systems for locating insects according to embodiments of the invention, however, in some embodiments, the devices for handling insects may be controlled by other systems.
[0083] The systems as described above may include, in some embodiments, an auxiliary device to be used, together with the systems described herein, to eliminate and/or otherwise handle insects detected in images, according to embodiments of the invention.
[0084] In exemplary embodiments, which are schematically illustrated in Figs. 4A and 4B, a system for detecting a location of an insect in a room includes a housing 401 which encases a camera 403 used to obtain an image of a space (such as a room in a house, office space and other public or private indoor spaces). Camera 403 is in communication with a processor 402 and memory 412, e.g., as described above. The system further includes an auxiliary device in communication with processor 402.
[0085] In Fig. 4A, the auxiliary device is an independently mobile device 415, which may be used to eliminate an insect or for other purposes, such as to remove, capture or analyze the insect, as further described in Fig. 5.
[0086] The system described in Fig. 4A may also include a port 413, typically on housing 401, such as a docking station or other terminal for powering and/or loading the independently mobile device 415.
[0087] In one embodiment, the independently mobile device 415 is a flying device such as a drone.
[0088] Independently mobile device 415 may be remotely controlled by processor 402. For example, independently mobile device 415 may be in wireless communication (e.g., via Bluetooth, radio, etc.) with processor 402.
[0089] The system schematically illustrated in Fig. 4A includes a camera 403 to obtain images of a space and a mobile device 415 that is separately mobile from the camera 403. The processor 402 may detect an insect in at least one of the images of the space obtained by camera 403 and may control the device 415 to move to vicinity of the insect, based on analysis of the images of the space.
[0090] In one embodiment, processor 402 controls the mobile device 415 to move to the vicinity of the insect, based on analysis of an image of the space having the insect and the mobile device 415 within a single frame. Processor 402 may control the mobile device 415 to move in a direct path from the camera 403 in the direction of the insect, wherein the direction to the insect can be estimated from the location of the image of the insect within the frame. Once the insect and the mobile device 415 are within the same frame, processor 402 further controls movement of mobile device 415, such that it stays in the vicinity of the insect in the image, while guiding it away from the camera and towards the insect. For example, processor 402 may periodically determine the angular distance of the mobile device 415 from the insect in the frame, which may be estimated using the distance, in pixels, between the two objects in the frame. If the determined angular distance is above a predetermined value, the processor 402 may calculate the distance and direction needed to move the mobile device 415 in order to bring it within the predetermined angular distance from the insect, and may cause the mobile device 415 to move the calculated distance in the calculated direction.
[0091] This process may be repeated until the mobile device 415 is within a predetermined distance, e.g., an elimination distance, from the insect. For example, an elimination distance may be a distance from which the device can effectively handle the insect, for example, the distance from which an insecticide can be effectively sprayed on the insect. Once the predetermined distance (e.g. elimination distance) is reached, device 415 and/or member 426 (described below) may be controlled to eliminate the insect, e.g., by using chemical, mechanical or electrical methods.
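A hedged sketch of this guidance loop follows; the controller interface (get_frame_positions, move_towards, at_elimination_distance) and the numeric values are assumptions made for illustration.

    import numpy as np

    def guide_device(get_frame_positions, move_towards, at_elimination_distance,
                     max_pixel_offset=30, gain=0.01):
        # Keep the mobile device within a predetermined angular distance (estimated in pixels)
        # of the insect in the frame until the elimination distance is reached.
        while not at_elimination_distance():
            insect_px, device_px = get_frame_positions()
            offset = np.subtract(insect_px, device_px).astype(float)
            dist = float(np.hypot(*offset))
            if dist > max_pixel_offset:
                move_towards(direction=offset / dist, distance=gain * dist)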
[0092] Thus, processor 402 estimates a direction of the insect from the camera 403 and controls the device to move approximately in that direction.
[0093] In one embodiment, determining whether an elimination distance was reached, can be done by utilizing an additional camera on the mobile device 415 to obtain an image of the insect. The image of the insect may be analyzed (e.g. by comparing its size in the image to an expected size of this type of insect from the desired distance). In another embodiment, a processor (e.g., processor 402 or another processor, which may be attached to mobile device 415) may be in communication with a rangefinder or similar system (which may be attached to the mobile device 415 or at another location within the system) to determine, based on input from the rangefinder, whether an elimination distance was reached. In another embodiment, determining whether an elimination distance was reached can be done by the mobile device 415 emitting light in a known direction (e.g. using a laser pointer or other projector) to obtain a point of light and analyzing the location of the point of light in an image from camera 403 (e.g. a point on a wall or ceiling created by the laser pointer). The location of the mobile device 415 relative to camera 403 is known (as described herein). Therefore the angle from the mobile device 415 to the location of the point of light is known. The angle from camera 403 to the location of the point of light can be calculated by detecting the pixel (or group of pixels) of the point in the image. The distance to the point of light can be triangulated, from which the distance of the mobile device 415 to the insect can be estimated, since the insect is often on the same surface as the point of light.
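A two-dimensional sketch of the triangulation outlined above, with the camera at the origin; cam_dir is the direction toward the imaged point of light (derived from its pixel), and device_pos and device_dir are the known position of the mobile device and its known laser direction. All names are illustrative.

    import numpy as np

    def triangulate_light_point(cam_dir, device_pos, device_dir):
        # Intersect the camera ray t * cam_dir with the device ray device_pos + s * device_dir.
        cam_dir = np.asarray(cam_dir, dtype=float)
        A = np.column_stack((cam_dir, -np.asarray(device_dir, dtype=float)))
        t, s = np.linalg.solve(A, np.asarray(device_pos, dtype=float))
        return t * cam_dir

    # The device-to-insect distance can then be estimated as
    # np.linalg.norm(triangulate_light_point(...) - device_pos),
    # since the insect is often on the same surface as the point of light.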
[0094] In some embodiments, mobile device 415 may include a projector to project a beam of a form of energy to vicinity of the insect, to create the point of light and/or to handle the insect. Additionally, mobile device 415 may include an additional camera (e.g., camera 503 in Fig. 5). The direction and/or distance of the mobile device 415 from an insect may be calculated (e.g., as described above) using the projector and/or additional camera of the mobile device 415. [0095] Once within the predetermined distance, mobile device 415 may use a member, possibly extendable from the device to the vicinity of the insect, e.g., to handle the insect, as described below.
[0096] In Fig. 4B, the auxiliary device is attached to housing 401 at attachment point 411 and may be in communication with a power source and/or reservoir within housing 401, via attachment point 411. The auxiliary device may include a handling tool, such as a moveable and typically extendible member 426, such as a telescopic arm. Member 426 may be controlled by processor 402 to extend from the housing 401 and move to the location of the insect to handle the insect at the location, for example, to capture or kill the insect, as described below.
[0097] In some embodiments member 426 is a telescopic and/or deformable arm or spring made of, for example, shape memory material that is usually in a folded or coiled form and can be extended and moved to interact with the insect at the location of the insect, upon a signal from processor 402.
[0098] Handling the insect may include using mechanical and/or chemical methods. In some cases, both mechanical and chemical means or methods are used to handle the insect.
[0099] In some embodiments, member 426 serves as a conduit for instruments or agents used to handle the insect. For example, member 426 may include or may be in communication with a chamber containing a chemical substance (e.g., in the form of gas, liquid or powder) that can be sprayed at or dropped on the insect from a relatively close range, thereby limiting the effect of the chemical substance to the insect itself and not affecting the surrounding space. In one example, the chamber may contain a pesticide. In another example, the chamber may include a repellant such as citronella oil, which is a plant-based insect repellent.
[00100] In some embodiments, housing 401 includes a reservoir of the chemical substance. In other embodiments housing 401 stores capsules (or other containers) of the chemical substance, which can be loaded into the member 426.
[00101] In one embodiment, member 426 may include a nozzle attached to the distal end 427 of member 426. The member 426, carrying a nozzle, may be directed to the location of the insect and a pulse or spray of a chemical substance (e.g., as described above) may be directed at the insect at close range via the nozzle.
[00102] In one embodiment, member 426 may include or may be in communication with a suction chamber to draw in and capture (and/or kill) the insect.
[00103] In another embodiment, member 426 may include an electrifying element by which to electrocute the insect. In another embodiment member 426 may include an adhesive element by which to capture (and/or kill) the insect.
[00104] Other electrical and/or mechanical and/or chemical solutions may be employed via member 426.
[00105] Member 426 does not have human or other predator characteristics and is therefore typically not identified by insects (such as mosquitoes) as humans or predators and can thus approach the insect and get within close range of the insect without scaring it off.
[00106] In some embodiments, an auxiliary device may include, for example, a projector (e.g., in addition to projector 108) to project a beam of any form of energy harmful or lethal to the insect to the location of the insect. In some embodiments a single projector (e.g., projector 108) may be used to indicate a location of an insect and to project a beam to handle (e.g., incapacitate) the insect. Thus, a projector may be controlled by a signal generated from processor 102 to project a beam of a form of energy such as light, heat, and the like, to the location of the insect, to handle the insect.
[00107] In some embodiments, neural networks, such as convolutional neural networks, or other computer vision software and algorithms are used to detect and identify details of the insect from an image or a plurality of images of the location. For example, shape and/or motion and/or color detection algorithms may be used to determine the shape and/or color and/or movement pattern and/or other details of the insect. Movement pattern may include, for example, direction of movement, size of movement, velocity of movement, etc. These details of the insect may be used to determine a type of insect being imaged and/or differentiate between different insects and/or between an insect and non-insect objects, such as particles of dust or other noise that may be imaged.
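A hedged, feature-based sketch of such movement-pattern analysis (Python/NumPy); the thresholds separating mosquito-like from fly-like motion are assumptions chosen only to reflect the qualitative statements above about turn angle and velocity.

    import numpy as np

    def movement_features(track):
        # track: array of shape (N, 3) with columns x, y, t for successive detections.
        track = np.asarray(track, dtype=float)
        xy, t = track[:, :2], track[:, 2]
        steps = np.diff(xy, axis=0)
        speeds = np.linalg.norm(steps, axis=1) / np.diff(t)
        headings = np.arctan2(steps[:, 1], steps[:, 0])
        turns = np.degrees(np.abs(np.diff(np.unwrap(headings))))  # change of direction per step
        return float(speeds.mean()), float(turns.mean())

    def looks_like_mosquito(track, max_speed=1.5, max_turn_deg=60.0):
        # Mosquito-like motion: slower, with less sharp changes of direction than a fly.
        mean_speed, mean_turn = movement_features(track)
        return mean_speed < max_speed and mean_turn < max_turn_deg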
[00108] In some embodiments, processor 102 controls the auxiliary device based on the determination of the type of insect. For example, a projector may be controlled to handle the insect only if it is a specific type of insect.
[00109] In other embodiments, an auxiliary device may include, for example, a tool to enhance the image of the room at the location of the insect. For example, the system (e.g., 100) may include a camera (e.g., in addition to camera 103) with optics to enable enhancing the location of the insect, for example, to confirm the existence and/or type of insect at the location, based on an enlarged image of the location.
[00110] In one embodiment, a long focus lens (e.g., telephoto lens) may be used to zoom-in on the location of the insect to enable seeing the shape or other details of the insect in better detail and focus.
[00111] In one embodiment, once camera 103 detects a location of a suspected insect, the additional camera may be directed and/or moved to the location of the suspected insect, for example, to confirm the existence and/or type of insect. In one embodiment a camera with a long-focus lens (or other enlarging optics) may be attached to or located on indicator directing device 312, e.g., on a gimbal, such that the enlarging optics can be moved in parallel to the indicator directing device, automatically directing the optics at the location of a suspected insect.
[00112] In one embodiment, differential analysis may be used to confirm a suspected insect and/or to detect an insect. For example, an area may be scanned at low resolution to detect a suspected insect, and the area of the suspected insect may then be analyzed at high resolution, e.g., to confirm the existence and/or type of insect. Using differential analysis of images reduces processing, thereby providing a cost-effective solution.
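An illustrative coarse-to-fine sketch of this differential analysis (Python/OpenCV); detect_candidates refers to the illustrative detector sketched earlier, and the scale factor and region size are assumptions.

    import cv2

    def coarse_to_fine(full_res, reference_full_res, scale=0.25, roi=64):
        # Scan a downscaled frame for suspected insects, then crop the full-resolution region
        # around each suspect so it can be analyzed, or imaged, at higher resolution.
        small = cv2.resize(full_res, None, fx=scale, fy=scale)
        small_ref = cv2.resize(reference_full_res, None, fx=scale, fy=scale)
        patches = []
        for (x, y, _size) in detect_candidates(small, small_ref):
            X, Y = int(x / scale), int(y / scale)
            patches.append(full_res[max(0, Y - roi):Y + roi, max(0, X - roi):X + roi])
        return patches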
[00113] Thus, in one embodiment, camera 103 may obtain a wide FOV image of the room and an auxiliary device, such as an additional camera that enables zooming-in, obtains a detailed image of a portion of the room. Processor 102 can detect a location of a suspected insect in the wide FOV image of the room, direct the additional camera to the location of suspected insect (e.g., by controlling movement of the gimbals) and confirm the insect (e.g., confirm the existence and/or type of insect) in the detailed image of the portion of the room (the location of the suspected insect).
[00114] In one embodiment, a system for handling an insect, such as system 100, may include an auxiliary illumination source to allow higher resolution imaging of a location of a suspected insect and to assist in confirming the insect. Optionally, an illumination source, which may also be attached to the gimbal such that it is moved in parallel to the indicator directing device, may be used, e.g., to obtain a brighter image. The illumination source may have a relatively short wavelength (e.g. blue light) so as to reduce the diffraction limit and allow higher resolution imaging of the suspected insect. In some embodiments, the illumination source and the location indicator are the same element.
[00115] Once a suspected insect is confirmed, processor 102 can control projector 108 to indicate the location of the confirmed insect and possibly control another auxiliary device to eliminate or otherwise handle the confirmed insect.
[00116] Using an auxiliary device, such as an additional camera and/or additional illumination source, makes it possible to obtain an enhanced image via optics and/or illumination while relying less on power-consuming computer vision algorithms. Thus, a less powerful CPU may be used with camera 103, thereby providing a cost-effective solution.
[00117] In some embodiments a single camera (e.g., camera 103) may be used to provide images from which to detect a location of an insect or suspected insect and to magnify or otherwise enhance the image at the detected location. For example, one optical element may be employed to image a large area (e.g., a room) and another optical element may be employed to image a small area within the large area (e.g., the detected location within the room). Alternatively or in addition, differential analysis may be used to locally enhance regions within an image of a large area, for example, to assist in identifying an insect. The tool to enhance the image of the room at the location of the insect, may be controlled by processor 102.
[00118] In one embodiment, which is schematically illustrated in Fig. 4C, a method, some steps of which may be carried out by processor 402, for eliminating, incapacitating or otherwise handling an insect, includes obtaining an image of a space (4001) and detecting a location of an insect in the image (4003). The location of the insect in the image is translated to real-world coordinates (4005). Processor 402 (or another processor) then controls an auxiliary device (such as independently mobile device 415 or member 426) based on the real-world coordinates. For example, the auxiliary device can be directed to the real-world coordinates (4007).
[00119] In some embodiments, an auxiliary device is only employed to eliminate or otherwise handle an insect if it is determined that there are no other susceptible objects that can be harmed by the action of the auxiliary device. Susceptible objects may include, for example, living beings (e.g., humans, pets, etc.) and/or other objects or materials, such as paper or fabric or objects including such materials that can be harmed by the action of the auxiliary device.
[00120] Thus, a method for eliminating an insect may include a step of determining if there is a living being (or object or material that may be harmed by the action of the auxiliary device) in the vicinity of the location of the insect and directing the auxiliary device at the real-world coordinates detected in step (4005) only if no living being (or object or material) is detected in vicinity of the insect. Existence of a living being in the vicinity of the location of the insect may be determined, for example, by determining motion in the space. Motion above a predetermined size may indicate a person or other living being in the space. In one embodiment motion or a size of motion is determined by detecting changes over time in the images of the space.
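As a rough sketch of such a motion-size check (frame differencing in NumPy; the thresholds are assumptions):

    import numpy as np

    def space_appears_unoccupied(prev_frame, curr_frame, diff_thresh=25, max_moving_px=2000):
        # A large number of changed pixels between frames suggests a person or pet in the space,
        # in which case the auxiliary device should not be directed at the insect.
        diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
        moving_px = int(np.count_nonzero(diff > diff_thresh))
        return moving_px < max_moving_px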
[00121] In other embodiments, existence of a person or other living being (or specific object or material) in the space may be determined by using computer vision techniques, e.g., to detect from the image (e.g., an image obtained by camera 103 or an additional camera) a shape, color or other attribute of a person or object or material.
[00122] Thus, in some embodiments a system for eliminating an insect in a room includes a camera to obtain an image of the room and a processor to detect a location of the insect in the image of the room. For example, the processor detects, from the image of the room, an insect after alighting and/or an insect on a surface in a space. The processor may then translate the location of the insect (e.g., the insect after alighting) in the image to real-world coordinates and control an auxiliary device based on the real-world coordinates to eliminate or otherwise handle the insect.
[00123] Alternatively or in addition, the processor may determine if there is a person (or other living being) or specific susceptible object or material, in vicinity of the insect and may control the auxiliary device to eliminate or otherwise handle the insect based on the determination.
[00124] Alternatively or in addition, the processor may confirm the existence and/or type of the insect at the location and may control the auxiliary device to eliminate or otherwise handle the insect based on the confirmation of the existence and/or type of the insect at the location. In one example, the processor may control the camera or an additional camera to obtain an enlarged or more detailed image of the insect to confirm the existence and/or type of the insect at the location.
[00125] The control of the auxiliary device, which may be via wireless communication, can be, for example, control of a propulsion mechanism of the auxiliary device and/or control of a handling tool of the auxiliary device.
[00126] An example of an auxiliary device, which is independently mobile, is schematically illustrated in Fig. 5.
[00127] In one embodiment device 515 is a flying device (e.g., drone) which includes a propulsion mechanism 525 to move the device without assistance and an insect handling tool 526 or, alternatively or in addition, an attachment point configured to releasably receive and secure a handling tool to the device 515.
[00128] Handling tool 526 may apply mechanical and/or chemical and/or electrical methods by which to handle an insect. In some embodiments the handling tool 526 applies both mechanical and chemical means or methods by which to handle the insect.
[00129] In one embodiment handling tool 526 may include a suction chamber to draw in and capture (and/or kill) the insect. In another embodiment, handling tool 526 may include an electrifying element by which to electrocute the insect. In another embodiment handling tool 526 may include an adhesive element by which to capture (and/or kill) the insect. Other electrical and/or mechanical solutions may be employed by handling tool 526.
[00130] In one embodiment handling tool 526 may include, for example, a telescopic arm or deformable arm or spring made of, for example, shape memory material that can be in a folded or coiled form while device 515 is in transit and can be extended to interact with the insect upon a signal from processor 402.
[00131] In another embodiment handling tool 526 may include a chamber containing a chemical substance (e.g., as described above) that can be sprayed at or dropped on the insect from a relatively close range, thereby limiting the effect of the chemical substance to the insect itself and not affecting the surrounding space.
[00132] In some embodiments, port 413 includes a reservoir of the chemical substance to enable the device 515 to dock at the port, recharge and stock the handling tool 526 with the chemical substance. In other embodiments port 413 stores capsules (or other containers) of the chemical substance. A capsule can be loaded into the handling tool 526 while the device 515 is docking at port 413. A capsule may last several events of handling insects before being depleted, and may be replaced at port 413 when depleted.
[00133] In some embodiments, device 515 may include a combination of different handling tools and may use a combination of methods (e.g., chemical and/or mechanical) for handling insects.
[00134] Device 515 does not have human or other predator characteristics and is therefore typically not identified by insects (such as mosquitoes) as a human or predator and can thus approach the insect and get within close range of the insect without scaring it off.
[00135] In the example in Fig. 5, device 515 is an aerial drone and the propulsion mechanism 525 includes a propeller mechanism suitable for aerial flight. Different types of independently mobile devices may have different types of propulsion mechanisms, or multiple types of propulsion mechanisms. For example, a terrestrial drone may have a propulsion mechanism that includes a motor, transmission, and wheels.
[00136] Device 515 typically includes a control circuit (not shown) in communication with a processor (e.g., processor 402) and is configured to receive input regarding the location of an insect.
[00137] In some embodiments, device 515 (and/or member 426) may further include one or more sensors such as an image sensor (e.g., camera 503) and/or a distance sensor (such as a rangefinder).
[00138] In one embodiment device 515 (and/or member 426) is controlled to handle a stationary insect or an insect after alighting (e.g., an insect on a surface in a space). The device 515 or member 426 receives direction information (e.g., a vector) from processor 402, based on the detected location of the stationary insect and is propelled according to the received information. A distance sensor in device 515 (or member 426) can detect the distance of the device 515 (or member 426) from the insect (and/or from the surface) and stop propelling at a predetermined distance from the insect.
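A minimal sketch, under assumed hardware interfaces, of how an independently mobile device might advance along a direction received from the processor and stop at a predetermined distance reported by an onboard rangefinder. The stand-off distance, step length, and the `read_range_m`/`move_along` callables are hypothetical placeholders, not the disclosed device's actual interfaces.

```python
import time

STOP_DISTANCE_M = 0.3   # assumed predetermined stand-off distance
STEP_SECONDS = 0.1      # assumed control-loop step

def approach_insect(read_range_m, move_along, direction):
    """Advance along `direction` in small steps until the rangefinder reports
    the stand-off distance. `read_range_m()` returns the current distance in
    meters; `move_along(direction, duration_s)` commands the propulsion
    mechanism. Both are hypothetical interfaces."""
    while read_range_m() > STOP_DISTANCE_M:
        move_along(direction, duration_s=STEP_SECONDS)
        time.sleep(STEP_SECONDS)
    # device now holds position at the predetermined distance from the insect
```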
[00139] In one embodiment device 515 (and/or member 426) may include a signal source (such as a light source or audio transmitter) to emit a signal that can be received and analyzed by processor 402 and may be used to estimate or calculate the distance of the device 515 or member 426 from the insect (and/or from the surface). For example, device 515 may include a projector to project a visible mark to the vicinity of the insect. Processor 402 can then control the device 515 (e.g., to control handling tool 526) or member 426 based on the calculated distance.
[00140] In some embodiments a dedicated image sensor attached to or within housing 401 can be used to capture an image of the insect (and possibly of the visible mark projected from a projector of device 515), which may be used to direct the device 515 or member 426 to the insect. The visual mark can be detected in an image obtained by camera 403 or by the dedicated camera, and device 515 or member 426 can thus be directed to the location of the visual mark as imaged.
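As a non-limiting illustration, assuming an OpenCV-style pipeline, a projected light point could be located in a frame as the brightest blurred pixel; the blur kernel and brightness threshold below are illustrative assumptions rather than values disclosed in the system.

```python
import cv2

def locate_projected_mark(frame_bgr):
    """Return the (x, y) pixel of the brightest point in the frame, taken here
    as a rough proxy for a projected visible mark. Blurring suppresses
    single-pixel sensor noise. Parameter values are illustrative only."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (11, 11), 0)
    _, max_val, _, max_loc = cv2.minMaxLoc(blurred)
    return max_loc if max_val > 200 else None  # None if no bright mark found
```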
[00141] Using a device and/or extendable member controlled by a processor based on a location of an insect in an image, according to embodiments of the invention, enables accurate and environment friendly action to remove or eliminate pests such as flying insects.
[00142] As described above, embodiments of the invention can distinguish an insect from noise, such as, electronic noise on the image sensor and/or ambient noise, such as dust particles in the space, variations in ambient illumination, reflections, etc. Additionally, a specific insect type (e.g., mosquito) can be differentiated from another insect type (e.g., fly).
[00143] In one embodiment, a method is provided for differentiating between a target insect and a non-target insect object from images of a space. For example, a target insect may be an insect, as opposed to a non-insect object (e.g., noise or other object) and/or a specific type of insect, as opposed to a different type of insect.
[00144] The method, which may be carried out by a system such as system 100, includes using multiple images to determine if an object in an image is a target insect.
[00145] In one embodiment, processor 102 may detect an object by comparing two (or more) images of the space and may determine that the object is a target insect based on a characteristic of the object in an image of the space. In some embodiments, an object is detected if it fulfills a predetermined criterion.
[00146] In one embodiment, camera 103 may capture an image (also named a "current image") from which it is desirable to determine if an insect is present in the space. Processor 102 may obtain a subtraction image by subtracting the current image of the space from a different, second image of the space. The subtraction image highlights changes in the space, since objects that have not changed (e.g., have not moved or changed position) between images typically do not show up in the subtraction image.
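A minimal sketch, assuming OpenCV, of obtaining a subtraction image from a current frame and a second frame and thresholding it so that only changed regions remain; the difference threshold is an assumed value, not one specified by the disclosure.

```python
import cv2

def subtraction_image(current_gray, second_gray, diff_threshold=15):
    """Return a binary image in which only pixels that changed between the two
    frames are set; static background largely cancels out in the difference."""
    diff = cv2.absdiff(current_gray, second_gray)
    _, changed = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)
    return changed
```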
[00147] Processor 102 may detect in the subtraction image an object having a predetermined criterion and determine that the object is a target insect.
[00148] As described above, a device may be controlled based on the determination that an object is a target insect.
[00149] In an embodiment of the invention, two or more images of the space are compared, in order to detect an object which fulfills a predetermined criterion. For example, a current image may be compared to a second image that was previously captured, to detect an object that is present in the current image but not in the previous image. In some embodiments, the second image may include a representation of a plurality of images of the space. For example, the second image may be an average (or other suitable statistical representation) of multiple images of the space. In another example, the second image may include a background image constructed using images of the space captured over time, by understanding constant and temporary elements in the images of the space, and constructing an image of the constant elements (e.g. walls and furniture, but not people and pets).
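One possible way to build the second image as a statistical representation of many frames is a running average, sketched below with OpenCV; constant elements (walls, furniture) persist while temporary elements fade. The update rate `alpha` is an assumption for the example.

```python
import cv2
import numpy as np

class BackgroundModel:
    """Maintains a running-average background of the space. Transient objects
    (people, pets, insects) contribute little; constant elements dominate."""

    def __init__(self, first_gray, alpha=0.01):
        self.alpha = alpha                       # assumed learning rate
        self.acc = first_gray.astype(np.float32)

    def update(self, gray):
        cv2.accumulateWeighted(gray.astype(np.float32), self.acc, self.alpha)

    def image(self):
        return cv2.convertScaleAbs(self.acc)     # 8-bit background image
```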
[00150] An example of this embodiment is schematically illustrated in Fig. 6. Two images of a space are obtained (step 602). In one example, the images are compared by subtraction, e.g., a current image, is subtracted from another image of the space to obtain a subtraction image (step 604).
[00151] In step 606, an object fulfilling a predetermined criterion is detected in the subtraction image. A predetermined criterion may relate to one or more characteristics of the object. For example, a characteristic of the object may include size, shape, location in the subtraction image, color, transparency and other such attributes of the object in the subtraction image. Thus, a predetermined criterion may be, for example, a size range (e.g., in pixels), a specific shape (e.g., as determined by a shape detection algorithm applied on the subtraction image), a specific location or range of locations of the object within the subtraction image, specific colors (e.g., as determined by applying a color detection algorithm on the subtraction image), etc.
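A minimal sketch of applying one such predetermined criterion, a size range in pixels, to the subtraction image using connected-component analysis; the size bounds are illustrative assumptions.

```python
import cv2

MIN_AREA_PX, MAX_AREA_PX = 4, 120   # assumed size range for a small flying insect

def candidate_objects(subtraction_binary):
    """Return centroids of blobs in the binary subtraction image whose pixel
    area falls within the assumed size range."""
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(
        subtraction_binary, connectivity=8)
    candidates = []
    for i in range(1, n):                        # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        if MIN_AREA_PX <= area <= MAX_AREA_PX:
            candidates.append(tuple(centroids[i]))
    return candidates
```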
[00152] Processor 102 determines if the object fulfilling the predetermined criterion is a target insect. For example, one or more characteristics of the object (such as movement pattern, shape, color or transparency) may be determined and the object may be determined to be a target insect based on the determined characteristic. For example, mosquitoes are more transparent and of lighter color than some other common insects; thus, in one example in which the target insect is a mosquito, if the colors of the pixels associated with the object are typical of mosquitoes, the object would be determined to be a mosquito. In another embodiment, if an object is determined to have a certain level of transparency or to have a predetermined pattern of transparent areas, it may be determined to be a mosquito. Transparency of an object may be determined, for example, based on a known color of the background in the space. If an object is determined to have the color of the background (e.g., if the background color is not a color typical of the target insect), the object may be determined to be partially transparent. In another example, different insects have different shapes, thus a target insect may be determined based on its shape in the subtraction image.
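A minimal, hedged sketch of the color/transparency heuristic just described: the object's pixels are compared both to a color range assumed here to be typical of mosquitoes and to the known background color. All ranges, tolerances and fractions are illustrative assumptions, not disclosed values.

```python
import numpy as np

def looks_like_mosquito(object_pixels_bgr, background_bgr,
                        target_range=((60, 60, 60), (160, 160, 170)),
                        transparency_tol=25.0):
    """Heuristic only: True if the object's mean color falls in an assumed
    'mosquito-like' range, or if many object pixels resemble the known
    background color, suggesting partial transparency."""
    pixels = np.asarray(object_pixels_bgr, dtype=np.float32)
    mean = pixels.mean(axis=0)
    lo, hi = np.array(target_range[0]), np.array(target_range[1])
    color_ok = np.all(mean >= lo) and np.all(mean <= hi)
    near_bg = np.linalg.norm(
        pixels - np.asarray(background_bgr, np.float32), axis=1) < transparency_tol
    partially_transparent = near_bg.mean() > 0.3   # assumed fraction
    return bool(color_ok or partially_transparent)
```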
[00153] In some embodiments, an object may be detected from a plurality of images whereas detecting if the object fulfills a predetermined criterion and determining that the object is a target insect, are done from a single image. In one embodiment, a same characteristic of an object may be used to detect an object fulfilling a predetermined criterion, in a first image and to determine if the object is a target insect, in the same image or in a second image. In other embodiments, different characteristics are used to detect an object fulfilling a predetermined criterion in a first image and to determine if the object is a target insect in the same image or in a second image.
[00154] For example, a subtraction image may include several objects but only two that are within a predetermined size range. Thus, two objects are detected in the subtraction image. One or more characteristic(s), other than size, may be determined for the two objects, e.g., the color and/or transparency and/or movement pattern of the two objects may be determined and the objects may be determined to be target insects or not, based on their color and/or transparency and/or movement pattern.
[00155] In some embodiments, a high resolution image of the object may be obtained and the object can be determined to be a target insect based on the high resolution image. For example, an object may be detected in a first image, e.g., in a subtraction image, possibly, based on its size or other characteristic, and may then be determined to be a target insect (or not) from a second image which is of higher resolution than the first image.
[00156] In some embodiments, characteristics, such as color and/or movement may be spatially correlated. For example, if a number of pixels that are close to each other have properties indicative of a target insect, these pixels may be given more weight in determining the presence of a target insect, than a number of pixels having the same properties but which are not closely grouped. In another example, several correlated characteristics or pixel properties, e.g., the same movement patterns and/or changes in illumination, detected in several locations in an image, may point to movement of a larger object and/or reflections, and may be assigned a lower weight in determining the presence of a target insect than single and uncorrelated characteristics.
[00157] Different weights may be assigned to characteristics (or pixels representing these characteristics) based on the behavior of the characteristic in a plurality of images. For example, a characteristic persisting over time is less likely to be noise and may therefore be assigned a higher weight.
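A minimal sketch combining the two weighting ideas above: detections that are tightly grouped in space receive more weight, and a simple persistence counter raises the weight of a characteristic that recurs across frames, since transient noise rarely persists. The radii, caps and divisors are assumptions for illustration.

```python
import numpy as np

def spatial_weight(points, radius=10.0):
    """Weight each detected point by how many other points lie within `radius`
    pixels; tightly grouped evidence counts more than scattered pixels."""
    pts = np.asarray(points, dtype=np.float32)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    neighbours = (d < radius).sum(axis=1) - 1
    return 1.0 + neighbours            # an isolated point keeps weight 1.0

def persistence_weight(seen_in_last_frames, max_bonus=2.0):
    """Raise the weight of a characteristic the longer it persists over time."""
    return 1.0 + min(seen_in_last_frames / 5.0, max_bonus)
```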
[00158] Machine vision techniques, such as object detection algorithms, segmentation, etc., may be used to detect an object in images of the space (e.g., a subtraction image) and to determine the pixels associated with the object. In some embodiments, a learning model may be applied on images of the space to determine that the object is a target insect. A learning model may be applied, for example, on the subtraction image to detect an object having a predetermined criterion and/or on a current image to determine if the object is a target insect. A learning model may be applied at other steps as well, such as integrating the various inputs (color, transparency, size, movement pattern, etc.) into a single decision of whether the object is a target insect.

[00159] If the object is determined to be a target insect (step 608), processor 102 generates a signal to control a device (step 610). If the object is not determined to be a target insect, another current image is obtained and processed.
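For the integration step described in paragraph [00158], the following sketch fuses several per-object characteristics (size, color match, transparency, movement score) into a single decision. Fixed weights and a sigmoid stand in here for whatever learning model the system actually trains; the weights, bias and threshold are assumptions.

```python
import numpy as np

def is_target_insect(features, weights=(1.2, 1.5, 1.0, 2.0), bias=-2.5):
    """`features` = (size_score, color_score, transparency_score, movement_score),
    each already scaled to [0, 1]. A fixed linear model plus sigmoid is used as
    a placeholder for a trained learning model."""
    z = float(np.dot(weights, features)) + bias
    probability = 1.0 / (1.0 + np.exp(-z))
    return probability > 0.5, probability
```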
[00160] A device controlled based on the determination that an object is a target insect may include an auxiliary device, e.g., as described above. In one example, a device (such as a projector of a light source) may create a location indicator visible to a human eye (e.g., visual mark 115). Thus, a method may include determining a real-world location of the target insect from the images of the space and controlling a device to create a location indicator visible to a human eye and indicative of the real-world location of the target insect.
[00161] In another embodiment, a device may be used to eliminate and/or otherwise handle target insects. Thus, a method may include determining a real-world location of the target insect from the images of the space and controlling a device to eliminate (or otherwise handle) the target insect at the real-world location. The device may include an auxiliary device for handling an insect, e.g., as described above. For example, the device may include a projector to project a form of energy at the real-world location of the target insect. Alternatively or in addition, the device may include a remotely controlled independently mobile device and/or a telescopic arm and/or nozzle.
[00162] In one embodiment, an object (e.g., the object detected in a subtraction image) is tracked in multiple images of the space and to multiple locations in the space, and the object may be determined to be a target insect (or not) based on the tracking.
[00163] In one embodiment, which is schematically illustrated in Fig. 7, a movement pattern of an object is detected and the object is determined to be a target insect (or not) based on the movement pattern.
[00164] An object is detected in images of a space (step 702) and a movement pattern of the object is determined (step 704). If the movement pattern is similar to a predetermined pattern (step 706) then the object is determined to be a target insect (step 708). If the movement pattern is not similar to the predetermined movement pattern (step 706) then the object is not determined to be a target insect (step 710).
[00165] Typically, a predetermined movement pattern will be a pattern consistent with a pattern expected from the target insect. For example, a predetermined movement pattern can include an alighting pattern (e.g., flying and then settling down), which is typical of mosquitoes. In another example, the predetermined movement pattern can include predominantly a non-repetitive movement, since a predominantly repetitive motion is characteristic of an unintended motion (such as movement of a fan, wind-blown objects and/or electronic noise). In yet another example, a movement pattern can include a change in direction and a predetermined movement pattern includes a change in direction at a specific angle or range of angles. For example, mosquitoes often change direction at an angle less sharp than flies; thus, a predetermined movement pattern may include a change of direction at an angle in a predetermined range. In another example, mosquitoes move more slowly than flies; thus, a predetermined movement pattern can include a specific velocity (or range of velocities).
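A minimal sketch, under assumed thresholds, of scoring a tracked trajectory against the movement cues mentioned above: mean speed, mean turning angle, and whether the object settled (alighted) at the end of the track. The thresholds and frame rate are illustrative assumptions.

```python
import numpy as np

def matches_target_pattern(track_xy, fps=30.0,
                           max_speed_px=40.0, max_turn_deg=120.0, settle_px=2.0):
    """`track_xy` is an (N, 2) array of per-frame positions. Returns True if
    the track is slow enough, turns gently enough, and ends nearly stationary
    (a rough alighting cue). All thresholds are illustrative."""
    pts = np.asarray(track_xy, dtype=np.float32)
    if len(pts) < 4:
        return False
    steps = np.diff(pts, axis=0)
    speeds = np.linalg.norm(steps, axis=1) * fps          # pixels per second
    a, b = steps[:-1], steps[1:]                           # consecutive displacements
    cosang = (a * b).sum(axis=1) / (
        np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1) + 1e-6)
    turns = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    settled = np.linalg.norm(steps[-3:], axis=1).mean() * fps < settle_px
    return speeds.mean() < max_speed_px and turns.mean() < max_turn_deg and settled
```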
[00166] Additionally, determining characteristics of objects, such as color and transparency, may be more accurate when using multiple images and/or comparing images over time. In some cases, over time, a moving object (such as an insect) may pass over different backgrounds, assisting in determining the color and/or transparency of the object (as described above). For example, a completely opaque object would not change its color or intensity when passing over different backgrounds, while a translucent one would.
[00167] In some embodiments, historical data may be used in determining if an object is a target insect. For example, determining if an object in a later captured image is a target insect, can be based on a weight assigned to pixels in an earlier captured image.
[00168] In one example, which is schematically illustrated in Fig. 8, an object is detected at a location in a first image (e.g., first current image) of a space (step 802). If it is determined that the object is not a target insect (step 804), then a first weight is assigned to pixels at that location (step 806). If it is determined that the object is a target insect (step 804), then a second weight is assigned to pixels at that location (step 808).
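A minimal sketch of keeping a per-location weight map updated from earlier decisions, as in steps 806 and 808, so that it can be applied to pixels at the same location in a later image; the multiplicative weight values and radius are assumptions.

```python
import numpy as np

class LocationWeights:
    """Per-pixel weights built from earlier decisions: locations that produced
    confirmed target insects are weighted up, locations that produced false
    detections are weighted down. Values are illustrative only."""

    def __init__(self, height, width):
        self.weights = np.ones((height, width), dtype=np.float32)

    def record(self, x, y, was_target, radius=5):
        y0, x0 = max(0, y - radius), max(0, x - radius)
        self.weights[y0:y + radius, x0:x + radius] *= 1.2 if was_target else 0.8

    def weight_at(self, x, y):
        return float(self.weights[y, x])
```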
[00169] An object is detected at a location in a second image (e.g., a second current image) (step 810) and the weights from steps 806 and 808 are assigned to the pixels of the second image based on their location in the second image. The object in the second image may then be determined to be a target insect (or not) based on the weighted pixels associated with the object in the second image (step 812).

[00170] For example, images of a space (such as a room) may include windows, a TV screen, a fan, reflections and more, which may create "noisy" areas in the images. Such noise may be detected, for example, by high variation in pixel values over time, by many false positives (e.g., falsely detected target insects), or by applying object detection algorithms to identify the objects likely to create noise (e.g., a window, a TV, etc.). In some embodiments, characteristics of objects (or pixels representing these characteristics) detected in relatively "noisy" areas of an image may be assigned less weight than characteristics (or pixels) of objects detected in other areas of the image. In another example, characteristics (or pixels) of objects detected in an area of the image in which a target insect was erroneously determined in past cases may be assigned less weight than characteristics (or pixels) detected in other areas of the image.
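One way to find the "noisy" areas described in paragraph [00170] is to track per-pixel variance over recent frames and down-weight detections falling in high-variance regions. This sketch assumes a short rolling buffer of aligned grayscale frames and an arbitrary variance threshold.

```python
import numpy as np

def noisy_area_mask(recent_gray_frames, variance_threshold=400.0):
    """`recent_gray_frames` is a list of aligned grayscale frames. Pixels whose
    intensity variance over time exceeds the (assumed) threshold are marked as
    noisy, e.g., windows, TV screens, fans or reflections."""
    stack = np.stack([f.astype(np.float32) for f in recent_gray_frames])
    return stack.var(axis=0) > variance_threshold   # boolean mask, True = noisy
```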

Claims

CLAIMS
What is claimed is:
1. A method for detecting a target insect in a space, the method comprising:
detecting an object by comparing at least two images of the space;
determining that the object is a target insect based on a characteristic of the object in an image of the space; and
controlling a device based on a determination that the object is a target insect.
2. The method of claim 1 wherein the object fulfills a predetermined criterion.
3. The method of claim 1 wherein one of the at least two images comprises a representation of a plurality of images of the space.
4. The method of claim 1 wherein comparing the at least two images of the space comprises obtaining a subtraction image by subtracting a current image of the space from a second image of the space; and comprising
detecting in the subtraction image an object fulfilling a predetermined criterion.
5. The method of claim 4 wherein the second image comprises an image of the space captured prior to the current image.
6. The method of claim 2 wherein the predetermined criterion relates to one or more characteristic of the object, the characteristic comprising one of: size, shape, location in an image, color and transparency.
7. The method of claim 1 comprising:
determining one or more characteristic of the object, the characteristic comprising one of: movement pattern, shape, color and transparency; and
determining that the object is a target insect based on the determined characteristic.
8. The method of claim 1 comprising
tracking the object in images of the space; and
determining that the object is a target insect based on the tracking.
9. The method of claim 8 comprising:
detecting a movement pattern of the object, based on the tracking of the object; and determining that the object is a target insect if the movement pattern is similar to a predetermined movement pattern.
10. The method of claim 9 wherein the predetermined movement pattern comprises one or more of: an alighting pattern, predominantly a non-repetitive movement and a change in direction at an angle in a predetermined range.
11. The method of claim 1 comprising:
obtaining a high-resolution image of the object; and
determining that the object is a target insect based on the high-resolution image.
12. The method of claim 1 comprising:
detecting spatially correlated characteristics of the object; and
determining if the object is a target insect based on the spatially correlated characteristics.
13. The method of claim 1 comprising:
assigning a weight to pixels at a location of the object in a first image of the space based on a determination that the object is a target insect; and
determining that an object in a second image of the space is a target insect by assigning the weight to pixels at the location of the object in the second image.
14. The method of claim 1 comprising:
determining a real-world location of the target insect from images of the space; and
controlling the device to create a location indicator visible to a human eye and indicative of the real-world location of the target insect.
15. The method of claim 14 wherein the device comprises a projector of a light source.
16. The method of claim 1 comprising:
determining a real-world location of the target insect from images of the space; and
controlling a device to eliminate the target insect at the real-world location.
17. The method of claim 16 wherein the device comprises a projector to project a form of energy at the real-world location of the target insect.
18. The method of claim 16 wherein the device comprises a remotely controlled independently mobile device.
19. The method of claim 16 wherein the device comprises a telescopic arm.
20. The method of claim 16 wherein the device comprises a nozzle.
21. The method of claim 16 comprising:
determining from the images of the space if there is a living being in vicinity of the target insect; and
controlling the device to eliminate the target insect at the real-world location of the target insect based on the determination if there is a living being in vicinity of the target insect.
22. The method of claim 1, wherein the device is an autonomously mobile device, the method comprising:
determining a real-world location of the target insect from images of the space; and
controlling the device to move to vicinity of the real-world location of the target insect.
23. The method of claim 1 comprising applying a learning model on images of the space to determine that the object is a target insect.
24. A system for detecting a target insect in a space, the system comprising:
a camera to obtain images of the space; and
a processor in communication with the camera, the processor to detect an object by comparing at least two of the images of the space; and determine that the object is a target insect based on a characteristic of the object in an image of the space.
25. A system for handling an insect in a space, the system comprising:
a camera to obtain images of the space;
a device separately mobile from the camera; and
a processor to detect the insect in at least one of the images of the space and to control the device to move to vicinity of the insect, based on analysis of the images of the space.
26. The system of claim 25 wherein the processor controls the device to move to vicinity of the insect, based on analysis of an image of the space having the insect and the device within a same frame.
27. The system of claim 26 wherein the processor estimates a direction of the insect from the camera and wherein the processor controls the device to move approximately in the direction.
28. The system of claim 27 wherein the processor estimates a distance of the device from the insect and wherein the processor controls the device to move to a predetermined distance from the insect.
29. The system of claim 28 comprising a rangefinder in communication with the processor to estimate the distance of the device from the insect.
30. The system of claim 28 wherein the processor estimates the distance of the device from the insect by comparing a size of the insect from an image of the space to an expected size of the insect.
31. The system of claim 28 wherein the processor estimates the distance of the device from the insect by analyzing a location of a point of light in the frame, the point of light being projected from the device.
32. The system of claim 28 wherein the processor controls the device to eliminate the insect when the device is at the predetermined distance from the insect.
33. The system of claim 32 wherein the device comprises a member extendable from the device and the processor controls the device to eliminate the insect via the member.
34. The system of claim 25 wherein the device comprises an additional camera to obtain an image of the insect.
35. The system of claim 25 wherein the device comprises a projector to project a beam of a form of energy to vicinity of the insect.
36. The system of claim 25 comprising a docking station for powering and/or loading the device.
37. The system of claim 25 wherein the device is configured to eliminate the insect electrically, mechanically or chemically.
PCT/IL2019/050839 2018-07-29 2019-07-24 System and method for locating and eliminating insects WO2020026230A1 (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
EP19843552.1A EP3830755A4 (en) 2018-07-29 2019-07-24 System and method for locating and eliminating insects
CA3105655A CA3105655A1 (en) 2018-07-29 2019-07-24 System and method for locating and eliminating insects
AU2019313665A AU2019313665A1 (en) 2018-07-29 2019-07-24 System and method for locating and eliminating insects
KR1020217005289A KR20210035252A (en) 2018-07-29 2019-07-24 Systems and methods for locating and removing insects
BR112021001634-1A BR112021001634A2 (en) 2018-07-29 2019-07-24 system and method for locating and eliminating insects
JP2021504834A JP2021531806A (en) 2018-07-29 2019-07-24 Systems and methods for locating and eliminating insects
CN201980049892.5A CN112513880A (en) 2018-07-29 2019-07-24 System and method for locating and eliminating insects
US17/259,205 US12063920B2 (en) 2018-07-29 2019-07-24 System and method for locating and eliminating insects

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
IL260844 2018-07-29
IL260844A IL260844B (en) 2018-07-29 2018-07-29 System and method for locating and eliminating insects
US201862743593P 2018-10-10 2018-10-10
US62/743,593 2018-10-10

Publications (1)

Publication Number Publication Date
WO2020026230A1 true WO2020026230A1 (en) 2020-02-06

Family

ID=68069430

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2019/050839 WO2020026230A1 (en) 2018-07-29 2019-07-24 System and method for locating and eliminating insects

Country Status (12)

Country Link
US (1) US12063920B2 (en)
EP (1) EP3830755A4 (en)
JP (1) JP2021531806A (en)
KR (1) KR20210035252A (en)
CN (1) CN112513880A (en)
AR (1) AR115817A1 (en)
AU (1) AU2019313665A1 (en)
BR (1) BR112021001634A2 (en)
CA (1) CA3105655A1 (en)
IL (1) IL260844B (en)
TW (1) TW202022698A (en)
WO (1) WO2020026230A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220072429A (en) * 2020-11-25 2022-06-02 유한회사 평화스테인레스 Pest Control System Based on OLED and Big Date

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AT522373B1 (en) * 2019-03-18 2023-04-15 Univ Innsbruck DEVICE FOR DISTURBING THE OPTICAL NAVIGATION ABILITY OF ORGANISMS
US11176652B2 (en) * 2019-04-05 2021-11-16 Waymo Llc High bandwidth camera data transmission
US20220217962A1 (en) * 2019-05-24 2022-07-14 Anastasiia Romanivna ROMANOVA Mosquito monitoring and counting system
CN110674805B (en) * 2019-10-11 2022-04-15 杭州睿琪软件有限公司 Insect identification method and system
TWI763099B (en) * 2020-10-28 2022-05-01 李寬裕 Optical Pest Control Equipment
CN112674057A (en) * 2021-01-08 2021-04-20 中国人民解放军海军航空大学 Intelligent mosquito killing equipment and method
CN114431773B (en) * 2022-01-14 2023-05-16 珠海格力电器股份有限公司 Control method of sweeping robot
IL298319A (en) * 2022-11-16 2024-06-01 Bzigo Ltd Unmanned aerial vehicle for neutralizing insects
CN116391693B (en) * 2023-06-07 2023-09-19 北京市农林科学院智能装备技术研究中心 Method and system for killing longicorn
JP7445909B1 (en) 2023-08-21 2024-03-08 株式会社ヤマサ Pest control systems and pest control programs
US12022820B1 (en) * 2023-10-11 2024-07-02 Selina S Zhang Integrated insect control system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040076583A1 (en) * 2002-07-15 2004-04-22 Baylor College Of Medicine Method for indentification of biologically active agents
US20050025357A1 (en) * 2003-06-13 2005-02-03 Landwehr Val R. Method and system for detecting and classifying objects in images, such as insects and other arthropods
US20180046872A1 (en) * 2016-08-11 2018-02-15 DiamondFox Enterprises, LLC Handheld arthropod detection device
US20180204321A1 (en) * 2012-07-05 2018-07-19 Bernard Fryshman Object image recognition and instant active response with enhanced application and utility

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4015366A (en) 1975-04-11 1977-04-05 Advanced Decision Handling, Inc. Highly automated agricultural production system
JP3002719B2 (en) * 1996-07-19 2000-01-24 工業技術院長 Environmental cleanliness measurement system using a small biological monitor
US8400348B1 (en) * 1999-05-14 2013-03-19 Applied Information Movement and Management, Inc. Airborne biota monitoring and control system
US7057516B2 (en) 2001-06-01 2006-06-06 Dimitri Donskoy Device and method for detecting localization, monitoring, and identification of living organisms in structures
US7656300B2 (en) 2003-06-16 2010-02-02 Rønnau Development ApS Pest control system
JP2005021074A (en) * 2003-07-01 2005-01-27 Terada Seisakusho Co Ltd Method and system for image processing counting
US7286056B2 (en) 2005-03-22 2007-10-23 Lawrence Kates System and method for pest detection
JP5066575B2 (en) * 2006-10-23 2012-11-07 ダウ アグロサイエンシィズ エルエルシー Bedbug detection, monitoring and control technology
JP2008200002A (en) * 2007-02-22 2008-09-04 Matsushita Electric Works Ltd Nocturnal insect-trapping system
JP2010149727A (en) * 2008-12-25 2010-07-08 Aisin Aw Co Ltd System and program for preventing entrance of insect into cabin
US8705017B2 (en) 2009-01-15 2014-04-22 Tokitae Llc Photonic fence
BR112013009401A2 (en) * 2010-10-17 2016-07-26 Purdue Research Foundation automatic monitoring of insect populations
SG189915A1 (en) 2010-10-29 2013-06-28 Commw Scient Ind Res Org A real-time insect monitoring device
US9381646B1 (en) 2012-07-05 2016-07-05 Bernard Fryshman Insect and other small object image recognition and instant active response with enhanced application and utility
ES2763412T3 (en) * 2011-11-09 2020-05-28 Francois Gabriel Feugier Pest control system, pest control method and pest control program
US20150075060A1 (en) 2013-02-12 2015-03-19 Jody Arthur Balsam Apparatus and method for detection of insects
US20150085100A1 (en) * 2013-09-26 2015-03-26 Micholas Raschella System for detection of animals and pests
CN103914733B (en) 2014-03-31 2016-09-28 北京市农林科学院 A kind of pest trap counting assembly and number system
JP6274430B2 (en) * 2014-06-03 2018-02-07 みこらった株式会社 Pest capture and storage device and pest insecticide device
JP6479364B2 (en) * 2014-07-31 2019-03-06 近藤電子株式会社 Poultry health diagnosis device
US10568316B2 (en) 2014-08-15 2020-02-25 Monsanto Technology Llc Apparatus and methods for in-field data collection and sampling
DE202014007499U1 (en) 2014-09-19 2014-11-03 Florian Franzen Largely autonomous mini-drone (UAV helicopter drone) for killing mosquitoes and other small airborne insects in buildings and outdoor areas used by humans
US9693547B1 (en) 2014-10-20 2017-07-04 Jean François Moitier UAV-enforced insect no-fly zone
US10752378B2 (en) * 2014-12-18 2020-08-25 The Boeing Company Mobile apparatus for pest detection and engagement
JP2016136916A (en) * 2015-01-29 2016-08-04 シャープ株式会社 Injurious insect expelling device, and injurious insect expelling method
JP2016185076A (en) * 2015-03-27 2016-10-27 三菱自動車工業株式会社 Insect expelling device
US9828093B2 (en) * 2015-05-27 2017-11-28 First Principles, Inc. System for recharging remotely controlled aerial vehicle, charging station and rechargeable remotely controlled aerial vehicle, and method of use thereof
MX2018005714A (en) * 2015-11-08 2019-08-16 Agrowing Ltd A method for aerial imagery acquisition and analysis.
US20170231213A1 (en) * 2016-02-17 2017-08-17 International Business Machines Corporation Pest abatement utilizing an aerial drone
US9807996B1 (en) 2016-05-28 2017-11-07 Simon Siu-Chi Yu Bug eater
JP6410993B2 (en) * 2016-05-31 2018-10-24 株式会社オプティム Drone flight control system, method and program
US9856020B1 (en) 2016-07-27 2018-01-02 International Business Machines Corporation Drone-based mosquito amelioration based on risk analysis and pattern classifiers
CN107094734A (en) 2017-04-01 2017-08-29 史德成 Energy automatic identification and the laser mosquito killer killed off the insect pests
CN107041349A (en) 2017-04-06 2017-08-15 南京三宝弘正视觉科技有限公司 A kind of Pest killing apparatus and system
CN106940734A (en) 2017-04-24 2017-07-11 南京信息工程大学 A kind of Migrating Insects monitor recognition methods and device in the air
JP6512672B2 (en) * 2017-12-25 2019-05-15 みこらった株式会社 Pest control device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040076583A1 (en) * 2002-07-15 2004-04-22 Baylor College Of Medicine Method for indentification of biologically active agents
US20050025357A1 (en) * 2003-06-13 2005-02-03 Landwehr Val R. Method and system for detecting and classifying objects in images, such as insects and other arthropods
US20180204321A1 (en) * 2012-07-05 2018-07-19 Bernard Fryshman Object image recognition and instant active response with enhanced application and utility
US20180046872A1 (en) * 2016-08-11 2018-02-15 DiamondFox Enterprises, LLC Handheld arthropod detection device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3830755A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220072429A (en) * 2020-11-25 2022-06-02 유한회사 평화스테인레스 Pest Control System Based on OLED and Big Date
KR102531690B1 (en) * 2020-11-25 2023-05-12 유한회사 평화스테인레스 Pest Control System Based on OLED and Big Date

Also Published As

Publication number Publication date
TW202022698A (en) 2020-06-16
CN112513880A (en) 2021-03-16
US20210251209A1 (en) 2021-08-19
KR20210035252A (en) 2021-03-31
CA3105655A1 (en) 2020-02-06
AU2019313665A1 (en) 2021-01-28
AR115817A1 (en) 2021-03-03
BR112021001634A2 (en) 2021-05-04
JP2021531806A (en) 2021-11-25
EP3830755A4 (en) 2022-05-18
EP3830755A1 (en) 2021-06-09
IL260844B (en) 2019-09-26
US12063920B2 (en) 2024-08-20

Similar Documents

Publication Publication Date Title
US12063920B2 (en) System and method for locating and eliminating insects
US20210199973A1 (en) Hybrid reality system including beacons
US9811764B2 (en) Object image recognition and instant active response with enhanced application and utility
US10147177B2 (en) Object image recognition and instant active response with enhanced application and utility
CN108141579B (en) 3D camera
US10861239B2 (en) Presentation of information associated with hidden objects
US10026165B1 (en) Object image recognition and instant active response
US8111289B2 (en) Method and apparatus for implementing multipurpose monitoring system
US20190096058A1 (en) Object image recognition and instant active response with enhanced application and utility
CN107836012A (en) Mapping method between projection image generation method and its device, image pixel and depth value
McNeil et al. Autonomous fire suppression system for use in high and low visibility environments by visual servoing
US10650284B2 (en) Induction system for product authentication
EP3455827B1 (en) Object image recognition and instant active response with enhanced application and utility
Fehlman et al. Mobile robot navigation with intelligent infrared image interpretation
JP2021140561A (en) Detection device, tracking device, detection program, and tracking program
US20230342952A1 (en) Method for coordinative measuring by terrestrial scanning with image-based interference detection of moving objects
WO2012091537A1 (en) System and method for navigation and visualization
WO2024105676A1 (en) Unmanned aerial vehicle for neutralizing insects
Zhang et al. Jellyfish: non-radio frequency counter drone technology
Fayed Computer-Based Stereoscopic Parts Recognition for Robotic Applications

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application. Ref document number: 19843552; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase. Ref document number: 3105655; Country of ref document: CA
ENP Entry into the national phase. Ref document number: 2021504834; Country of ref document: JP; Kind code of ref document: A
ENP Entry into the national phase. Ref document number: 2019313665; Country of ref document: AU; Date of ref document: 20190724; Kind code of ref document: A
NENP Non-entry into the national phase. Ref country code: DE
REG Reference to national code. Ref country code: BR; Ref legal event code: B01A; Ref document number: 112021001634; Country of ref document: BR
ENP Entry into the national phase. Ref document number: 20217005289; Country of ref document: KR; Kind code of ref document: A
ENP Entry into the national phase. Ref document number: 2019843552; Country of ref document: EP; Effective date: 20210301
ENP Entry into the national phase. Ref document number: 112021001634; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20210128