WO2020026230A1 - System and method for locating and eliminating insects - Google Patents
- Publication number
- WO2020026230A1 (PCT/IL2019/050839)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- insect
- image
- space
- location
- images
Classifications
- A—HUMAN NECESSITIES
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M1/026—Stationary means for catching or killing insects with devices or substances (e.g. food, pheromones) attracting the insects, combined with devices for monitoring insect presence (e.g. termites)
- A01M1/06—Catching insects by using a suction effect
- A01M1/14—Catching by adhesive surfaces
- A01M1/2022—Poisoning or narcotising insects by vaporising an insecticide
- A01M1/223—Killing insects by electric means, by using electrocution
- A01M5/00—Catching insects in fields, gardens, or forests by movable appliances
- G—PHYSICS
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/40—Extraction of image or video features
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
Definitions
- The present invention is in the field of pest control; specifically, it uses computer vision to detect, locate and eliminate pests, such as flying insects.
- In one known approach, a system using an image sensor with a magnifying lens detects pests, typically in an agricultural setting, where the image sensor is moved, or items are moved in view of the image sensor, to enable surveillance of a large area.
- Another system that uses an image sensor tracks flying insects in an area of interest defined by a camera and a retroreflective surface spaced apart from the camera.
- The need to employ a retroreflective surface in addition to a camera renders this system obtrusive and cumbersome and thus less likely to be widely installed in homes, offices and other urban spaces.
- Embodiments of the invention provide a system and method for detecting and locating pests, such as flying insects, typically in an indoor or otherwise enclosed environment, to enable effortless and accurate action against the pests.
- Systems according to embodiments of the invention include a camera and processor to detect and locate pests from images obtained by the camera.
- the system may operate from a single housing, which includes the camera, and does not require additional elements separate from the single housing, to locate pests. Additionally, the camera of the system does not have to be attached to or embedded within a moveable platform in order to capture usable images.
- the system may be easily set up and unobtrusively located in a space such as a room in a house or office or public space such as a theater, a museum etc.
- Embodiments of the invention can distinguish an insect from noise and/or from non-insect objects.
- the system can provide a mark visible to humans, to indicate a location of the insect in the room, for further action.
- Embodiments of the invention provide a variety of types of solutions for acting against pests detected and located from images of the space.
- FIG. 1A is a schematic illustration of a system for locating an insect in a space, according to an embodiment of the invention.
- FIG. 1B is a schematic illustration of a method for detecting and locating an insect in a space, according to an embodiment of the invention.
- FIGs. 2A and 2B are schematic illustrations of a system for locating an insect in a space, according to another embodiment of the invention.
- FIG. 2C is a schematic illustration of a method for detecting and locating an insect in a space, according to another embodiment of the invention.
- FIG. 3 is a schematic illustration of a system including a projector of a visual mark, according to an embodiment of the invention.
- FIGs. 4A and 4B are schematic illustrations of systems including an auxiliary device for handling an insect, according to embodiments of the invention.
- FIG. 4C is a schematic illustration of a method for controlling an auxiliary device for handling an insect, according to an embodiment of the invention.
- FIG. 5 is a schematic illustration of an auxiliary device for handling an insect, according to an embodiment of the invention.
- FIG. 6 is a schematic illustration of a method for detecting an insect in images of a space, according to an embodiment of the invention.
- FIG. 7 is a schematic illustration of a method for determining if an object in an image is an insect, according to an embodiment of the invention.
- FIG. 8 is a schematic illustration of a method for determining if an object in an image is an insect based on prior images, according to an embodiment of the invention.
- Embodiments of the invention provide systems and methods for detecting a location of one or more insects in an enclosed space, such as a room, and indicating the detected location of the insect in the space.
- Examples described herein refer mainly to insect pests, especially flying insects such as mosquitoes; however, embodiments of the invention may be used to locate other pests as well.
- a system 100 for detecting and locating an insect includes a camera 103 to obtain an image of a space, such as, room 104 or portion of the room 104.
- An insect 105, such as one or more mosquitoes, may be in the room 104.
- The camera 103, which includes an image sensor and suitable optics, is in communication with a processor 102.
- Processor 102 receives an image of the room or portion of the room 104, obtained by camera 103, and detects the location of insect 105 in the image of the room. Based on the location of the insect 105 in the image, processor 102 generates a signal to enable creation of a location indicator, which is visible to a human eye, to indicate the location of the insect 105 in the room 104.
- The processor 102 may determine the location of the insect 105 in a space (e.g., room 104) based on an image of the space and may control a projector device to direct a light source to create an indication, visible to a human eye, in the vicinity of the location of the insect in the space.
- the location indicator is a visual mark 115 at the location of the insect 105 in the room 104.
- The visual mark 115 is created, in one embodiment, via projector 108, which projects a laser or other beam to the vicinity of the insect 105 in the room 104, forming a visual mark 115 near the location of the insect.
- Some or all of the components of system 100 are attached to or enclosed within a housing 101.
- camera 103 and processor 102 may be both included within a single housing 101.
- In some embodiments, some of the components of the system (e.g., processor 102) are remotely located.
- Housing 101, which may be made of materials practical and safe for use, such as plastic and/or metal, may include one or more pivoting elements, such as hinges, rotatable joints or ball joints, allowing for various movements of the housing 101.
- Thus, housing 101 can be stationed at one location in room 104 yet enable several fields of view (FOV) for camera 103, which is encased within the housing 101, by rotating and/or tilting the housing 101.
- housing 101 typically provides stability for camera 103 such that the camera is not moved while obtaining images.
- the camera 103 is positioned such that its focal plane is parallel to a surface in the room 104.
- A surface in the room may include the floor or ceiling of the room, a wall, the surface of a piece of furniture in the room, etc.
- Processor 102 detects, in the image, the location of the insect 105 on a surface in the room (e.g., on a wall, ceiling, surface of a piece of furniture in the room, etc.) and generates a signal to enable creating the visual mark 115 at the location of the insect 105 on the surface.
- the processor 102 detects a stationary (e.g., not flying) insect in an image of the room and the visual mark 115 is formed or directed to the location of the stationary insect.
- the processor 102 detects an alighting insect, e.g., the processor detects the insect flying and then settling down. The processor 102 then detects the location of the insect after alighting, e.g., after settling down, and the visual mark 115 is formed or directed to the location of the insect after alighting.
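The alighting detection described above can be sketched as a small state check over the insect's per-frame positions: the insect counts as alighted once it stays near the same position for several consecutive frames. This is an illustrative sketch, not the patent's algorithm; the function name and thresholds are hypothetical.

```python
import math

def find_alighting(track, still_px=5, still_frames=3):
    """Given a list of per-frame (x, y) pixel positions of a tracked insect,
    return the position where it settled (alighted), or None if it never did.
    A position counts toward "settled" when it moves at most `still_px`
    pixels from the previous frame, `still_frames` frames in a row."""
    run, last = 0, None
    for pos in track:
        if last is not None and math.dist(pos, last) <= still_px:
            run += 1
            if run >= still_frames:
                return pos          # insect has settled here
        else:
            run = 0                 # still flying: reset the stillness run
        last = pos
    return None

# Flying (large inter-frame jumps), then settling near pixel (50, 60):
settle = find_alighting([(0, 0), (30, 40), (50, 60), (51, 60), (50, 61), (50, 60)])
```

The visual mark would then be directed to `settle`, the post-alighting location, rather than to any in-flight position.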
- the camera 103 may include an image sensor, e.g., an appropriate chip such as a CCD or CMOS chip and may be a 2D or 3D camera.
- the camera 103 may include lenses and/or other optics to enable obtaining an image of the room (or part of the room) 104.
- camera 103 includes an infrared (IR) sensitive sensor and/or may include lenses and/or filters to filter out other wavelengths to eliminate noise, to enable obtaining images of room 104 in special illumination conditions.
- system 100 may include an IR illumination source 106.
- IR illumination source 106 may include an LED or other illumination source emitting in a range of about 750–950 nm. In one example, illumination source 106 illuminates at around 850 nm.
- IR illumination source 106 can enable use of system 100 even in a dark room by providing illumination that is not visible and/or irritating to the human eye but which enables camera 103 to obtain meaningful images of a dark room.
- Processor 102 may include, for example, one or more processors and may be a central processing unit (CPU), a digital signal processor (DSP), a microprocessor, a controller, a chip, a microchip, an integrated circuit (IC), or any other suitable multi-purpose or specific processor or controller.
- system 100 may include a warning device, e.g., a sound emitting device and/or a light source, such as a dedicated LED, and processor 102 may generate a warning signal, such as to cause a sound or light to be emitted, based on detection of the location of the insect.
- processor 102 is in communication with one or more memory unit(s) 112.
- Memory unit(s) 112 may include, for example, a random access memory (RAM), a dynamic RAM (DRAM), a flash memory, a volatile memory, a non-volatile memory, a cache memory, a buffer, a short term memory unit, a long term memory unit, or other suitable memory units or storage units.
- Components of system 100 may be connected to each other wirelessly, e.g., via suitable network hubs, or via appropriate cabling or suitable ports such as USB.
- Memory 112 may further store executable instructions that, when executed by the processor 102, facilitate methods as described herein.
- The method for detecting and locating an insect in an enclosed space includes the steps of obtaining an image of the space (1001), for example room 104, and detecting a location of an insect in the image (1003).
- the location of the insect in the image is translated to real-world coordinates (1005) and a location indicator is created to indicate the real-world coordinates (1007).
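The steps above (1001, 1003, 1005, 1007) can be sketched as a small control loop. This is a hedged illustration only: the function names, the pluggable `detector`/`to_world`/`indicator` callables, and the toy values are all hypothetical, not from the patent.

```python
def locate_and_indicate(image, detector, to_world, indicator):
    """Detect an insect in `image` (1003), translate its pixel location to
    real-world coordinates (1005), and create a location indicator there
    (1007). Returns the real-world coordinates, or None if nothing found."""
    pixel_xy = detector(image)        # (x, y) in the image, or None
    if pixel_xy is None:
        return None
    world_xy = to_world(pixel_xy)     # image -> room coordinates
    indicator(world_xy)               # e.g., aim a laser or notify a user
    return world_xy

# Toy wiring: a detector that "finds" an insect at pixel (320, 240), an
# identity image-to-world mapping, and an indicator that records the aim.
aimed = []
result = locate_and_indicate(
    image=None,
    detector=lambda img: (320, 240),
    to_world=lambda p: p,
    indicator=aimed.append,
)
```

In a real system the `indicator` callable would drive the projector or send the notification signal described next.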
- a signal is generated to notify a user.
- the signal may be sent (e.g., via Bluetooth, radio, etc.) to a user’s mobile device (such as the user’s mobile phone or to a dedicated device).
- the method includes detecting a stationary insect (e.g., an insect not flying and/or not changing locations in the space) in the image of the space and detecting the location of the stationary insect.
- a location indicator is created to indicate real-world coordinates of the stationary insect.
- the method includes detecting an alighting insect in images of the space and detecting the location of the insect after alighting.
- a location indicator is created to indicate real-world coordinates of the insect after alighting.
- the method includes projecting the location indicator (e.g., a beam of light visible to the human eye, such as, a visible light laser beam) to the location of the real-world coordinates in the space (1009) such that a visible mark is created at the location in space.
- the beam of light is directed at the location on the surface such that a circle (or other shape) of light on the surface marks the location of the insect.
- the location of the insect in the image can be translated to real-world coordinates (step 1005) by using projective geometry, for example, if the focal plane of the camera obtaining the image is parallel to a surface in the space on which the insect is located.
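Under the stated assumption that the camera's focal plane is parallel to the surface, the translation of step 1005 reduces to scaling by the surface depth over the focal length (the pinhole camera model). The sketch below is illustrative; the focal length, principal point and depth values are made-up examples, not parameters from the patent.

```python
def pixel_to_surface(u, v, f=1000.0, cx=320.0, cy=240.0, Z=2.5):
    """Map image pixel (u, v) to (X, Y) metres on a surface parallel to the
    focal plane at depth Z, for a camera with focal length f (in pixels)
    and principal point (cx, cy)."""
    X = (u - cx) * Z / f
    Y = (v - cy) * Z / f
    return X, Y

# An insect imaged 100 px right of the principal point, with the surface
# 2.5 m away and f = 1000 px, lies 0.25 m right of the optical axis.
X, Y = pixel_to_surface(420, 240)
```

With a calibrated camera and a known (or measured) distance to the wall or ceiling, this gives the real-world offset of the insect from the optical axis.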
- a system which includes an imager (e.g., camera 103) and projector (e.g., projector 108) may be pre-calibrated.
- the projector may be positioned in close proximity to the camera (for example see distance D described with reference to Fig. 3 below).
- a ray visible to the camera may be projected from the projector to several locations within the space and may be imaged by the camera at those locations.
- Each location in the image (e.g., each pixel or group of pixels) can be correlated in real time to an x,y coordinate in the space, such that the projector can be directed to locations in the space based on locations detected in the image.
- using a ray visible to the camera can enable correcting the direction of the projector in real-time based on the visible indication.
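The pre-calibration described above can be held as a simple pixel-to-direction lookup: during calibration the visible ray is driven to several known projector angles and imaged, and at run time the sample nearest the detected pixel supplies the projector direction. This is a minimal sketch under that assumption; the sample table and angle values are invented for illustration.

```python
# (pixel) -> (pan, tilt) samples collected during calibration, when the
# projected ray was imaged at these pixel locations.
CALIBRATION = {
    (100, 100): (10.0, 5.0),
    (300, 100): (20.0, 5.0),
    (100, 300): (10.0, 15.0),
    (300, 300): (20.0, 15.0),
}

def direction_for_pixel(px):
    """Return the projector (pan, tilt) angles for the calibrated sample
    nearest the detected pixel `px` (nearest-neighbour lookup)."""
    def dist2(sample):
        return (sample[0] - px[0]) ** 2 + (sample[1] - px[1]) ** 2
    nearest = min(CALIBRATION, key=dist2)
    return CALIBRATION[nearest]

# Insect detected near pixel (290, 110): the nearest sample is (300, 100).
pan, tilt = direction_for_pixel((290, 110))
```

A denser sample grid, or interpolation between samples, would give smoother aiming; the nearest-neighbour form keeps the idea visible.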
- In some embodiments, the projector includes one or more rotors to enable projection of a location indicator at different angles.
- Based on pre-calibration, each location in the image can be correlated to angular (a, b) coordinates of the rotor.
- Rotors may include a step motor, such that the change in angle per step is known.
- One or more physical stops may be used such that the angles of the rotor, at the limits of its movement, are known.
- each pixel can be correlated to a known angle.
- The number of steps required to direct the rotor to each angle can then be calculated. Since the projector is typically not located at the same position as the camera, the calculations may require adjustment for the distance between the projector and the camera.
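The step-count arithmetic above is direct once the per-step angle and a reference angle (from the physical stops) are known. The motor resolution below is an illustrative common value, not one specified by the patent.

```python
STEPS_PER_REV = 200                     # a common step-motor resolution: 1.8 deg/step
DEG_PER_STEP = 360.0 / STEPS_PER_REV

def steps_to_angle(target_deg, current_deg):
    """Signed number of whole steps to move the rotor from its current
    angle (known relative to a physical stop) to the target angle."""
    return round((target_deg - current_deg) / DEG_PER_STEP)

# To aim 45 degrees from the reference stop at 1.8 deg/step: 25 steps.
n = steps_to_angle(45.0, 0.0)
```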
- Other methods may be used to translate the location of the insect in the image to the real-world location.
- system 200 detects an insect, e.g., as described herein, and creates a location indicator, which is visible in an image of the room.
- processor 202 locates an insect 205 in an image 223 of the room and generates a signal to create a location indicator 225 in the image 223 at the location of the insect.
- the image 223 of the room is displayed together with the location indicator 225, which may be an icon or other graphic indication superimposed on the image 223.
- An example of an image 223 of a room is shown in Fig. 2B.
- a location indicator 225 is superimposed on the image 223 to indicate to a user viewing image 223, the location of the insect on the ceiling 226.
- images obtained by camera 203 can be stored locally (e.g., in memory unit 212) and/or remotely (e.g., the images may be transmitted over the internet or by using another suitable wireless communication, to remote storage, e.g., on the cloud).
- the images may then be retrieved and displayed on a device 209, such as a personal and/or mobile device (e.g., smartphone, tablet, etc.) or on a dedicated, typically mobile, device.
- the image 223 of the room is an image of the room in real-time and the location indicator 225 is superimposed on the same image in which the location of insect 205 is detected.
- the image 223 of the room is manipulated such that certain details (such as personal, private and/or confidential information) are obscured or removed from the image.
- a real-time image (the same image in which insect 205 is detected) can be displayed without compromising privacy and/or confidentiality.
- the image 223 can be manipulated to protect privacy and/or confidentiality by processor 202 or by a different processor (e.g., a processor in device 209).
- a set of images of the room is obtained by camera 203.
- Camera 203 is not moved or repositioned while obtaining the set of images such that all the images capture the same field of view.
- a first image may be an image of the room 204 only, with no occupants, whereas a second image of the room 204 may be a real-time image of the room (possibly with occupants) in which an insect 205 is detected.
- only the first image is transmitted to device 209 to be displayed and the location of the insect 205 in the second image, is indicated and displayed on the first image, which is the image being displayed to the user.
- the first image (which typically does not include personal information) may be an image chosen by a user from a set of images of the room.
- the first image may be a modified or manipulated image of the room in which personal information is obscured by modifying the personal information in the image.
- the first image may be a representative image, which enables a user to understand the layout of the space being imaged but is not necessarily a real image of the space.
- a representative image may be created from a combination of several images of the space, typically obtained by camera 203.
- the representative image may be an average of several images from a set of images of the space.
- a representative image may include a graphic representation of the space but not the actually imaged components of the space.
- using an average image (or other representative image) as a first image may be useful in case the camera (e.g., camera 203) is repositioned between images, such that the images are not all of exactly the same field of view.
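The "average of several images" mentioned above is a pixel-wise mean over equally sized frames. The sketch models images as flat lists of grayscale values for brevity; a real implementation would average full pixel arrays.

```python
def average_image(images):
    """Pixel-wise mean of equally sized images, forming a representative
    image that shows the layout of the space while smoothing out
    occupants or other transient content."""
    n = len(images)
    return [sum(px) / n for px in zip(*images)]

# Three tiny 3-pixel "images" of the same field of view:
imgs = [
    [10, 20, 30],
    [20, 30, 40],
    [30, 40, 50],
]
rep = average_image(imgs)
```

Because transient content (people, pets) differs between frames, it is attenuated in the mean, while the static layout of the room survives, which is what makes the result usable as the displayed "first image".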
- a method for detecting and locating an insect includes visually marking a location of an insect in the space on an image of the space.
- An exemplary method which is schematically illustrated in Fig. 2C, includes obtaining a first image of a space (2001) and storing the first image (2003).
- the first image includes the space empty of occupants and/or in which personal information is obscured.
- a second image of the space is obtained (2005).
- the second image is of about the same field of view as the first image but is obtained at a later time than the first image.
- the second image includes an insect in the space.
- the location of the insect in the second image is determined (2007) and a location indicator (e.g., a graphic mark) is created to mark that location in an image of the space (2009).
- the location indicator marks the location on the same image in which the insect was detected.
- the location indicator marks the location on a different image of the room.
- the different image of the room may be an image captured at an earlier time, e.g., the first image of the room.
- the method includes accepting input from a user and determining which image to use as a first image (namely, which image to display together with the location indicator) based on the input from the user.
- a user can choose an image to send to storage and/or display, which does not include information which the user regards as personal or private.
- the method includes a step of creating a representative image of the space (e.g., an average image) and using the representative image as the first image.
- the first image is retrieved from storage and displayed to a user, e.g., on the user’s personal mobile device or on a dedicated device, with the location indicator superimposed on it, at the same location as in the second image (2011).
- a grid may be used on all the images of the space which are of the same field of view (or about the same field of view), such that a location of the insect in one image can be given x,y coordinates of the grid which are the same x,y coordinates in all the other images of the same field of view.
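The shared-grid idea above can be sketched as a single mapping from pixels to grid cells: because all images cover (about) the same field of view, a cell computed in the live image addresses the same spot in the stored first image. The cell size is an illustrative choice.

```python
def to_grid_cell(x, y, cell_size=40):
    """Map pixel (x, y) to a coarse grid cell; the same grid is overlaid
    on every image of the (roughly) shared field of view."""
    return (x // cell_size, y // cell_size)

# Insect detected at pixel (130, 85) in the live "second image"...
cell = to_grid_cell(130, 85)
# ...and the location indicator is drawn at the same cell of the stored
# "first image" shown to the user.
```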
- a projector 308 may be controlled by processor 302 to project or direct a location indicator to the location of the insect in the real-world space, e.g., room 104.
- a projector 308 and a camera 303 are arranged in close proximity within housing 301.
- the projector 308 includes an indicator source, e.g., a light source, such as laser 316 and an indicator directing device 312, such as an optical system, including lenses and/or mirrors or other optical components to direct light from the light source in a desired direction or angle.
- the indicator directing device 312 includes rotating optical elements such as a mirror-bearing gimbal arranged to pivot about a single axis. A set of two or three such gimbals, one mounted on the other with orthogonal pivot axes, may be used to allow the light of laser 316 to be directed in any desired pitch, roll and yaw.
- Processor 302 controls indicator directing device 312 such that the indicator, e.g., laser 316, is directed to the real-world location of the insect. For example, control of the yaw and pitch of the gimbals of indicator directing device 312 enables directing an indicator, such as laser 316, to a real-world location.
- Camera 303 is located within a minimal distance D of the projector 308 (or of components of the projector, such as the laser and/or indicator directing device) to enable accurate aim of the indicator.
- For example, camera 303 and laser 316 or indicator directing device 312 are located within 20 cm of each other. In another example, they are located within 10 cm of each other.
- the laser 316 may include visible light such that the mark created by the laser at the detected location of the insect is visible and can be imaged by camera 303 and displayed to a user, for example on device 209.
- a user may receive an image of a room with a visual indication of the location of the insect created by laser 316, in the image of the room.
- the projector 308 is configured to eliminate or incapacitate the insect 305.
- Laser 316 may emit UV, IR or other light at high enough power such that, when directed at an insect 305 on a surface in the room, at a stationary insect, or at an insect after alighting, it may disable and/or kill insect 305.
- projector 308 which includes an indicator source, e.g., a light source, such as laser 316 and an indicator directing device 312 controlled by a processor, may be used in fields other than pest control.
- projector 308 may be used to produce visual effects, such as animation.
- projector 308 may be part of a toy.
- the processor controlling the directing device receives input from an image sensor and/or based on image processing and can be used in virtual reality games or other applications.
- projector 308 may be used as a directing device, for example, to direct users to a specific point in an enclosed or other space.
- Some embodiments of the invention provide devices for handling insects, such as eliminating or incapacitating the insects.
- a device may also include an apparatus such as an additional camera and/or illumination source, to assist in confirming the insect, e.g., confirming the existence and/or type of insect in an image.
- The devices, which are typically moveable, are controlled to approach the location of an insect in a space, such as an enclosed space, to handle the insect at close range, thereby limiting effects that may be hazardous to the surrounding space.
- Typically, devices for handling insects are controlled by systems for locating insects according to embodiments of the invention; however, in some embodiments, the devices for handling insects may be controlled by other systems.
- the systems as described above may include, in some embodiments, an auxiliary device to be used, together with the systems described herein, to eliminate and/or otherwise handle insects detected in images, according to embodiments of the invention.
- a system for detecting a location of an insect in a room includes a housing 401 which encases a camera 403 used to obtain an image of a space (such as a room in a house, office space and other public or private indoor spaces).
- Camera 403 is in communication with a processor 402 and memory 412, e.g., as described above.
- the system further includes an auxiliary device in communication with processor 402.
- the auxiliary device is an independently mobile device 415, which may be used to eliminate an insect or for other purposes, such as to remove, capture or analyze the insect, as further described in Fig. 5.
- the system described in Fig. 4A may also include a port 413, typically on housing 401, such as a docking station or other terminal for powering and/or loading the independently mobile device 415.
- the independently mobile device 415 is a flying device such as a drone.
- Independently mobile device 415 may be remotely controlled by processor 402.
- independently mobile device 415 may be in wireless communication (e.g., via Bluetooth, radio, etc.) with processor 402.
- the system schematically illustrated in Fig. 4A includes a camera 403 to obtain images of a space and a mobile device 415 that is separately mobile from the camera 403.
- the processor 402 may detect an insect in at least one of the images of the space obtained by camera 403 and may control the device 415 to move to vicinity of the insect, based on analysis of the images of the space.
- processor 402 controls the mobile device 415 to move to the vicinity of the insect, based on analysis of an image of the space having the insect and the mobile device 415 within a single frame.
- Processor 402 may control the mobile device 415 to move in a direct path from the camera 403 in the direction of the insect, wherein the direction to the insect can be estimated from the location of the image of the insect within the frame.
- processor 402 further controls movement of mobile device 415, such that it stays in the vicinity of the insect in the image, while guiding it away from the camera and towards the insect.
- processor 402 may periodically determine the angular distance of the mobile device 415 from the insect in the frame, which may be estimated using the distance, in pixels, between the two objects in the frame. If the determined angular distance is above a predetermined value, the processor 402 may calculate the distance and direction needed to move the mobile device 415 in order to bring it within the predetermined angular distance from the insect, and may cause the mobile device 415 to move the calculated distance in the calculated direction.
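The periodic angular-distance check described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the field of view, frame width, threshold and gain are assumed values, and `correction_step` is a hypothetical helper name.

```python
import math

# Illustrative sketch of the periodic angular-distance check: the pixel
# distance between the drone and the insect in the frame is converted to
# an angular distance using the camera's assumed field of view; if it
# exceeds the predetermined threshold, a correction step toward the
# insect is computed.

FOV_DEG = 60.0           # assumed horizontal field of view of camera 403
FRAME_WIDTH_PX = 1920    # assumed frame width in pixels
DEG_PER_PX = FOV_DEG / FRAME_WIDTH_PX
MAX_ANGULAR_DEG = 2.0    # predetermined angular-distance threshold (assumed)

def correction_step(drone_px, insect_px, step_gain=0.5):
    """Return (move_needed, dx, dy): whether a correction is needed and
    a step, in pixel coordinates, toward the insect."""
    dx = insect_px[0] - drone_px[0]
    dy = insect_px[1] - drone_px[1]
    angular_deg = math.hypot(dx, dy) * DEG_PER_PX
    if angular_deg <= MAX_ANGULAR_DEG:
        return (False, 0.0, 0.0)
    # Move a fraction of the separation in the direction of the insect.
    return (True, dx * step_gain, dy * step_gain)

move, dx, dy = correction_step((100, 100), (400, 500))
```

In a real system the pixel step would be mapped to a physical command for the device's propulsion mechanism; that mapping depends on the device's distance from the camera and is omitted here.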
- an elimination distance may be a distance from which the device can effectively handle the insect, for example, the distance from which an insecticide can be effectively sprayed on the insect.
- once within a predetermined distance (e.g., an elimination distance), device 415 and/or member 426 may be controlled to eliminate the insect, e.g., by using chemical, mechanical or electrical methods.
- processor 402 estimates a direction of the insect from the camera 403 and controls the device to move approximately in that direction.
- determining whether an elimination distance was reached can be done by utilizing an additional camera on the mobile device 415 to obtain an image of the insect.
- the image of the insect may be analyzed (e.g. by comparing its size in the image to an expected size of this type of insect from the desired distance).
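The size-based check mentioned above can be sketched with a pinhole camera model: an insect of known physical size appears smaller the farther it is from the camera, so its apparent size in pixels yields a distance estimate. The focal length, insect size and elimination distance below are assumed values, not taken from the patent.

```python
# Illustrative sketch of estimating distance from apparent size, using a
# pinhole camera model. All numeric values are assumptions.

FOCAL_PX = 1000.0        # assumed focal length of the on-board camera, in pixels
INSECT_SIZE_M = 0.006    # assumed body length of the target insect (~6 mm)

def estimated_distance_m(apparent_size_px):
    # Pinhole model: apparent_size_px = FOCAL_PX * real_size / distance
    return FOCAL_PX * INSECT_SIZE_M / apparent_size_px

def within_elimination_distance(apparent_size_px, max_distance_m=0.3):
    return estimated_distance_m(apparent_size_px) <= max_distance_m
```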
- a processor e.g., processor 402 or another processor, which may be attached to mobile device 415) may be in communication with a rangefinder or similar system (which may be attached to the mobile device 415 or at another location within the system) to determine, based on input from the rangefinder, whether an elimination distance was reached.
- determining whether an elimination distance was reached can be done by the mobile device 415 emitting light in a known direction, creating a point of light on a surface near the insect.
- the location of the mobile device 415 relative to camera 403 is known (as described herein). Therefore the angle from the mobile device 415 to the location of the point of light is known.
- the angle from camera 403 to the location of the point of light can be calculated by detecting the pixel (or group of pixels) of the point in the image.
- the distance to the point of light can be triangulated, from which the distance of the mobile device 415 to the insect can be estimated, since the insect is often on the same surface as the point of light.
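The triangulation described above can be sketched in two dimensions: camera 403 sits at the origin, the mobile device's position relative to it is known, and each observes the projected point of light at a known angle. The point is recovered as the intersection of the two rays; the device's distance to that point then estimates its distance to the insect. All coordinates and angles below are illustrative.

```python
import math

# Illustrative 2D sketch of triangulating the projected point of light
# from the camera and the mobile device, whose relative position is known.

def intersect_rays(origin_a, angle_a, origin_b, angle_b):
    """Intersect two 2D rays given their origins and direction angles (radians)."""
    ax, ay = origin_a
    bx, by = origin_b
    dax, day = math.cos(angle_a), math.sin(angle_a)
    dbx, dby = math.cos(angle_b), math.sin(angle_b)
    denom = dax * dby - day * dbx          # zero if the rays are parallel
    t = ((bx - ax) * dby - (by - ay) * dbx) / denom
    return (ax + t * dax, ay + t * day)

camera = (0.0, 0.0)
drone = (2.0, 0.0)                         # known position relative to camera
light_point = intersect_rays(camera, math.radians(45.0),
                             drone, math.radians(90.0))
drone_to_point = math.hypot(light_point[0] - drone[0],
                            light_point[1] - drone[1])
```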
- mobile device 415 may include a projector to project a beam of a form of energy to the vicinity of the insect, to create the point of light and/or to handle the insect. Additionally, mobile device 415 may include an additional camera (e.g., camera 503 in Fig. 5). The direction and/or distance of the mobile device 415 from an insect may be calculated (e.g., as described above) using the projector and/or additional camera of the mobile device 415.
- once within the predetermined distance, mobile device 415 may use a member, possibly extendable from the device to the vicinity of the insect, e.g., to handle the insect, as described below.
- the auxiliary device is attached to housing 401 at attachment point 411 and may be in communication with a power source and/or reservoir within housing 401, via attachment point 411.
- the auxiliary device may include a handling tool, such as a moveable and typically extendible member 426, such as a telescopic arm. Member 426 may be controlled by processor 402 to extend from the housing 401 and move to the location of the insect to handle the insect at the location, for example, to capture or kill the insect, as described below.
- member 426 is a telescopic and/or deformable arm or spring made of, for example, shape memory material that is usually in a folded or coiled form and can be extended and moved to interact with the insect at the location of the insect, upon a signal from processor 402.
- Handling the insect may include using mechanical and/or chemical methods. In some cases, both mechanical and chemical means or methods are used to handle the insect.
- member 426 serves as a conduit for instruments or agents used to handle the insect.
- member 426 may include or may be in communication with a chamber containing a chemical substance (e.g., in the form of gas, liquid or powder) that can be sprayed at or dropped on the insect from a relatively close range, thereby limiting the effect of the chemical substance to the insect itself and not affecting the surrounding space.
- the chamber may contain a pesticide.
- the chamber may include a repellant such as citronella oil, which is a plant-based insect repellent.
- housing 401 includes a reservoir of the chemical substance. In other embodiments housing 401 stores capsules (or other containers) of the chemical substance, which can be loaded into the member 426.
- member 426 may include a nozzle attached to the distal end 427 of member 426.
- the member 426, carrying a nozzle may be directed to the location of the insect and a pulse or spray of a chemical substance (e.g., as described above) may be directed at the insect at close range via the nozzle.
- member 426 may include or may be in communication with a suction chamber to draw in and capture (and/or kill) the insect.
- member 426 may include an electrifying element by which to electrocute the insect.
- member 426 may include an adhesive element by which to capture (and/or kill) the insect.
- Member 426 does not have human or other predator characteristics and is therefore typically not identified by insects (such as mosquitoes) as humans or predators and can thus approach the insect and get within close range of the insect without scaring it off.
- an auxiliary device may include, for example, a projector (e.g., in addition to projector 108) to project a beam of any form of energy harmful or lethal to the insect to the location of the insect.
- a single projector (e.g., projector 108) may be used to indicate a location of an insect and to project a beam to handle (e.g., incapacitate) the insect.
- a projector may be controlled by a signal generated from processor 102 to project a beam of a form of energy such as light, heat, and the like, to the location of the insect, to handle the insect.
- neural networks such as convolutional neural networks, or other computer vision software and algorithms are used to detect and identify details of the insect from an image or a plurality of images of the location.
- shape and/or motion and/or color detection algorithms may be used to determine the shape and/or color and/or movement pattern and/or other details of the insect.
- Movement pattern may include, for example, direction of movement, size of movement, velocity of movement, etc.
- processor 102 controls the auxiliary device based on the determination of the type of insect.
- a projector may be controlled to handle the insect only if it is a specific type of insect.
- an auxiliary device may include, for example, a tool to enhance the image of the room at the location of the insect.
- the system (e.g., system 100) may include a camera (e.g., in addition to camera 103) with optics to enable enhancing the image of the location of the insect, for example, to confirm the existence and/or type of insect at the location, based on an enlarged image of the location.
- a long focus lens (e.g., a telephoto lens) may be used to zoom in on the location of the insect to enable seeing the shape or other details of the insect in better detail and focus.
- the additional camera may be directed and/or moved to the location of the suspected insect, for example, to confirm the existence and/or type of insect.
- a camera with a long-focus lens may be attached to or located on indicator directing device 312, e.g., on a gimbal, such that the enlarging optics can be moved in parallel to the indicator directing device, automatically directing the optics at the location of a suspected insect.
- differential analysis may be used to confirm a suspected insect and/or to detect an insect. For example, an area may be scanned at low resolution to detect a suspected insect, and the area of the suspected insect may then be analyzed at high resolution, e.g., to confirm the existence and/or type of insect. Using differential analysis of images reduces processing, thereby providing a cost-effective solution.
- camera 103 may obtain a wide FOV image of the room and an auxiliary device, such as an additional camera that enables zooming-in, obtains a detailed image of a portion of the room.
- Processor 102 can detect a location of a suspected insect in the wide FOV image of the room, direct the additional camera to the location of suspected insect (e.g., by controlling movement of the gimbals) and confirm the insect (e.g., confirm the existence and/or type of insect) in the detailed image of the portion of the room (the location of the suspected insect).
- a system for handling an insect may include an auxiliary illumination source to allow higher resolution imaging of a location of a suspected insect and to assist in confirming the insect.
- an illumination source which may also be attached to the gimbal such that it is moved in parallel to the indicator directing device, may be used, e.g., to obtain a brighter image.
- the illumination source may have a relatively short wavelength (e.g. blue light) so as to reduce the diffraction limit and allow higher resolution imaging of the suspected insect.
- the illumination source and the location indicator are the same element.
- processor 102 can control projector 108 to indicate the location of the confirmed insect and possibly control another auxiliary device to eliminate or otherwise handle the confirmed insect.
- by using an auxiliary device, such as an additional camera and/or additional illumination source, a less powerful CPU may be used with camera 103, thereby providing a cost-effective solution.
- a single camera may be used to provide images from which to detect a location of an insect or suspected insect and to magnify or otherwise enhance the image at the detected location.
- one optical element may be employed to image a large area (e.g., a room) and another optical element may be employed to image a small area within the large area (e.g., the detected location within the room).
- differential analysis may be used to locally enhance regions within an image of a large area, for example, to assist in identifying an insect.
- the tool to enhance the image of the room at the location of the insect may be controlled by processor 102.
- a method for eliminating, incapacitating or otherwise handling an insect, includes obtaining an image of a space (4001) and detecting a location of an insect in the image (4003). The location of the insect in the image is translated to real-world coordinates (4005).
- Processor 402 or another processor then controls an auxiliary device (such as independently mobile device 415 or member 426) based on the real-world coordinates. For example, the auxiliary device can be directed to the real-world coordinates (4007).
- an auxiliary device is only employed to eliminate or otherwise handle an insect if it is determined that there are no other susceptible objects that can be harmed by the action of the auxiliary device.
- Susceptible objects may include, for example, living beings (e.g., humans, pets, etc.) and/or other objects or materials, such as paper or fabric, or objects including such materials, that can be harmed by the action of the auxiliary device.
- a method for eliminating an insect may include a step of determining if there is a living being (or object or material that may be harmed by the action of the auxiliary device) in the vicinity of the location of the insect and directing the auxiliary device at the real-world coordinates detected in step (4005) only if no living being (or object or material) is detected in vicinity of the insect.
- Existence of a living being in the vicinity of the location of the insect may be determined, for example, by determining motion in the space. Motion above a predetermined size may indicate a person or other living being in the space. In one embodiment, motion or a size of motion is determined by detecting changes over time in the images of the space.
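The motion-size test above can be sketched by differencing two frames and counting changed pixels: motion larger than a threshold suggests a person or pet rather than an insect, in which case handling would be withheld. Frames here are plain 2D lists of grayscale values, and both thresholds are assumed values.

```python
# Illustrative sketch of detecting large motion (a possible living being)
# by frame differencing. All threshold values are assumptions.

DIFF_THRESHOLD = 30      # per-pixel change considered significant
LARGE_MOTION_PIXELS = 4  # changed-pixel count indicating a large body

def large_motion_present(prev_frame, curr_frame):
    changed = sum(
        1
        for prev_row, curr_row in zip(prev_frame, curr_frame)
        for p, c in zip(prev_row, curr_row)
        if abs(c - p) > DIFF_THRESHOLD
    )
    return changed > LARGE_MOTION_PIXELS

prev = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
curr = [[10, 200, 200], [10, 200, 200], [200, 200, 10]]
person_like = large_motion_present(prev, curr)
```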
- existence of a person or other living being (or specific object or material) in the space may be determined by using computer vision techniques, e.g., to detect from the image (e.g., an image obtained by camera 103 or an additional camera) a shape, color or other attribute of a person or object or material.
- computer vision techniques e.g., to detect from the image (e.g., an image obtained by camera 103 or an additional camera) a shape, color or other attribute of a person or object or material.
- a system for eliminating an insect in a room includes a camera to obtain an image of the room and a processor to detect a location of the insect in the image of the room.
- the processor detects, from the image of the room, an insect after alighting and/or an insect on a surface in a space.
- the processor may then translate the location of the insect (e.g., the insect after alighting) in the image to real-world coordinates and control an auxiliary device based on the real-world coordinates to eliminate or otherwise handle the insect.
- the processor may determine if there is a person (or other living being) or specific susceptible object or material, in vicinity of the insect and may control the auxiliary device to eliminate or otherwise handle the insect based on the determination.
- the processor may confirm the existence and/or type of the insect at the location and may control the auxiliary device to eliminate or otherwise handle the insect based on the confirmation of the existence and/or type of the insect at the location.
- the processor may control the camera or an additional camera to obtain an enlarged or more detailed image of the insect to confirm the existence and/or type of the insect at the location.
- the control of the auxiliary device which may be via wireless communication, can be, for example, control of a propulsion mechanism of the auxiliary device and/or control of a handling tool of the auxiliary device.
- an example of an auxiliary device, which is independently mobile, is schematically illustrated in Fig. 5.
- device 515 is a flying device (e.g., drone) which includes a propulsion mechanism 525 to move the device without assistance and an insect handling tool 526, or, alternatively or in addition, including an attachment point configured to releasably receive and secure a handling tool to the device 515.
- Handling tool 526 may apply mechanical and/or chemical and/or electrical methods by which to handle an insect. In some embodiments the handling tool 526 applies both mechanical and chemical means or methods by which to handle the insect.
- handling tool 526 may include a suction chamber to draw in and capture (and/or kill) the insect.
- handling tool 526 may include an electrifying element by which to electrocute the insect.
- handling tool 526 may include an adhesive element by which to capture (and/or kill) the insect.
- Other electrical and/or mechanical solutions may be employed by handling tool 526.
- handling tool 526 may include, for example, a telescopic arm or deformable arm or spring made of, for example, shape memory material that can be in a folded or coiled form while device 515 is in transit and can be extended to interact with the insect upon a signal from processor 402.
- handling tool 526 may include a chamber containing a chemical substance (e.g., as described above) that can be sprayed at or dropped on the insect from a relatively close range, thereby limiting the effect of the chemical substance to the insect itself and not affecting the surrounding space.
- port 413 includes a reservoir of the chemical substance to enable the device 515 to dock at the port, recharge and stock the handling tool 526 with the chemical substance.
- port 413 stores capsules (or other containers) of the chemical substance. A capsule can be loaded into the handling tool 526 while the device 515 is docking at port 413. A capsule may last several events of handling insects before being depleted, and may be replaced at port 413 when depleted.
- device 515 may include a combination of different handling tools and may use a combination of methods (e.g., chemical and/or mechanical) for handling insects.
- Device 515 does not have human or other predator characteristics and is therefore typically not identified by insects (such as mosquitoes) as a human or predator and can thus approach the insect and get within close range of the insect without scaring it off.
- device 515 is an aerial drone and the propulsion mechanism 525 includes a propeller mechanism suitable for aerial flight.
- the propulsion mechanism 525 includes a propeller mechanism suitable for aerial flight.
- Different types of independently mobile devices may have different types of propulsion mechanisms, or multiple types of propulsion mechanisms.
- a terrestrial drone may have a propulsion mechanism that includes a motor, transmission, and wheels.
- Device 515 typically includes a control circuit (not shown) in communication with a processor (e.g., processor 402) and is configured to receive input regarding location of an insect.
- device 515 may further include one or more sensors such as an image sensor (e.g., camera 503) and/or a distance sensor (such as a rangefinder).
- device 515 is controlled to handle a stationary insect or an insect after alighting (e.g., an insect on a surface in a space).
- the device 515 or member 426 receives direction information (e.g., a vector) from processor 402, based on the detected location of the stationary insect and is propelled according to the received information.
- a distance sensor in device 515 (or member 426) can detect the distance of the device 515 (or member 426) from the insect (and/or from the surface) and stop propelling at a predetermined distance from the insect.
- device 515 may include a signal source (such as a light source or audio transmitter) to emit a signal that can be received and analyzed by processor 402 and may be used to estimate or calculate the distance of the device 515 or member 426 from the insect (and/or from the surface).
- device 515 may include a projector to project a visible mark to the vicinity of the insect.
- Processor 402 can then control the device 515 (e.g., to control handling tool 526) or member 426 based on the calculated distance.
- a dedicated image sensor attached to or within housing 401 can be used to capture an image of the insect (and possibly of the visible mark projected from a projector of device 515), which may be used to direct the device 515 or member 426 to the insect.
- the visual mark can be detected from an image obtained by camera 403 or by the dedicated camera, and device 515 or member 426 can thus be directed to the location of the visual mark as imaged.
- Using a device and/or extendable member controlled by a processor based on a location of an insect in an image enables accurate and environmentally friendly action to remove or eliminate pests such as flying insects.
- embodiments of the invention can distinguish an insect from noise, such as electronic noise on the image sensor and/or ambient noise, such as dust particles in the space, variations in ambient illumination, reflections, etc. Additionally, a specific insect type (e.g., mosquito) can be differentiated from another insect type (e.g., fly).
- in some embodiments, a method is provided for differentiating, from images of a space, between a target insect and a non-target object.
- a target insect may be an insect, as opposed to a non-insect object (e.g., noise or other object) and/or a specific type of insect, as opposed to a different type of insect.
- the method which may be carried out by a system such as system 100, includes using multiple images to determine if an object in an image is a target insect.
- processor 102 may detect an object by comparing two (or more) images of the space and may determine that the object is a target insect based on a characteristic of the object in an image of the space. In some embodiments, an object is detected if it fulfills a predetermined criterion.
- camera 103 may capture an image (also named a "current image"), from which it is desirable to determine if an insect is present in the space.
- Processor 102 may obtain a subtraction image by subtracting the current image of the space from a different, second, image of the space.
- the subtraction image highlights changes in the space since objects that have not changed (e.g. have not moved or have not changed position) in between images, do not typically show up in the subtraction image.
- Processor 102 may detect in the subtraction image an object having a predetermined criterion and determine that the object is a target insect.
- a device may be controlled based on the determination that an object is a target insect.
- two or more images of the space are compared, in order to detect an object which fulfills a predetermined criterion.
- a current image may be compared to a second image that was previously captured, to detect an object that is present in the current image but not in the previous image.
- the second image may include a representation of a plurality of images of the space.
- the second image may be an average (or other suitable statistical representation) of multiple images of the space.
- the second image may include a background image constructed using images of the space captured over time, by understanding constant and temporary elements in the images of the space, and constructing an image of the constant elements (e.g. walls and furniture, but not people and pets).
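The second image described above, a statistical representation of many images, can be sketched as a per-pixel running average: it converges to the constant elements (e.g., walls and furniture) while transient elements (people, pets, insects) average out. Frames here are plain 2D lists of grayscale values, and the blending factor is an assumed value.

```python
# Illustrative sketch of constructing a background image as a running
# average of frames of the space.

def update_background(background, frame, alpha=0.1):
    """Blend a new frame into the running-average background image."""
    return [
        [(1 - alpha) * b + alpha * f for b, f in zip(b_row, f_row)]
        for b_row, f_row in zip(background, frame)
    ]

background = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(50):
    static_frame = [[100, 100], [100, 100]]   # unchanging scene content
    background = update_background(background, static_frame)
# background now approaches the static scene value of 100
```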
- an example of this embodiment is schematically illustrated in Fig. 6.
- Two images of a space are obtained (step 602).
- the images are compared by subtraction, e.g., a current image, is subtracted from another image of the space to obtain a subtraction image (step 604).
- an object fulfilling a predetermined criterion is detected in the subtraction image.
- a predetermined criterion may relate to one or more characteristics of the object.
- a characteristic of the object may include size, shape, location in the subtraction image, color, transparency and other such attributes of the object in the subtraction image.
- a predetermined criterion may be, for example, a size range (e.g., in pixels), a specific shape (e.g., as determined by a shape detection algorithm applied on the subtraction image), a specific location or range of locations of the object within the subtraction image, specific colors (e.g., as determined by applying a color detection algorithm on the subtraction image), etc.
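Steps 604-606 can be sketched as follows: the current image is compared to a second image by subtraction, and connected groups of changed pixels whose size (in pixels) falls within a predetermined range are kept as candidate objects. Images here are 2D lists of grayscale values, and all threshold values are assumptions.

```python
# Illustrative sketch of detecting candidate objects in a subtraction
# image using a size-range criterion. All thresholds are assumed values.

def detect_candidates(current, second, diff_thresh=30, min_px=2, max_px=10):
    h, w = len(current), len(current[0])
    # Subtraction image, thresholded to changed/unchanged pixels.
    changed = [[abs(current[y][x] - second[y][x]) > diff_thresh
                for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    candidates = []
    for y in range(h):
        for x in range(w):
            if not changed[y][x] or seen[y][x]:
                continue
            # Flood-fill one 4-connected component of changed pixels.
            stack, blob = [(y, x)], []
            seen[y][x] = True
            while stack:
                cy, cx = stack.pop()
                blob.append((cy, cx))
                for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                               (cy, cx - 1), (cy, cx + 1)):
                    if 0 <= ny < h and 0 <= nx < w and \
                            changed[ny][nx] and not seen[ny][nx]:
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            if min_px <= len(blob) <= max_px:   # the size criterion
                candidates.append(blob)
    return candidates

second = [[10] * 5 for _ in range(5)]           # e.g., a background image
current = [row[:] for row in second]
for y, x in [(1, 1), (1, 2), (2, 1)]:
    current[y][x] = 200                         # insect-sized blob
current[4][4] = 200                             # single-pixel noise, below min_px
candidates = detect_candidates(current, second)
```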
- Processor 102 determines if the object fulfilling the predetermined criterion is a target insect. For example, one or more characteristics of the object (such as movement pattern, shape, color or transparency) may be determined and the object may be determined to be a target insect based on the determined characteristic. For example, mosquitoes are more transparent and of lighter color than some other common insects; thus, in one example in which the target insect is a mosquito, if the colors of the pixels associated with the object are typical of mosquitoes, the object would be determined to be a mosquito. In another embodiment, if an object is determined to have a certain level of transparency or to have a predetermined pattern of transparent areas, it may be determined to be a mosquito.
- Transparency of an object may be determined, for example, based on a known color of background in the space. If an object is determined to have the color of the background (e.g., if the background color is not a color typical of the target insect), the object may be determined to be partially transparent. In another example, different insects have different shapes, thus a target insect may be determined based on its shape in the subtraction image.
- an object may be detected from a plurality of images, whereas detecting whether the object fulfills a predetermined criterion and determining that the object is a target insect are done from a single image.
- the same characteristic of an object may be used to detect an object fulfilling a predetermined criterion in a first image and to determine if the object is a target insect, in the same image or in a second image.
- different characteristics are used to detect an object fulfilling a predetermined criterion in a first image and to determine if the object is a target insect in the same image or in a second image.
- a subtraction image may include several objects, but only two that are within a predetermined size range; in this case, the two objects are detected in the subtraction image.
- One or more characteristic(s), other than size, may be determined for the two objects, e.g., the color and/or transparency and/or movement pattern of the two objects may be determined and the objects may be determined to be target insects or not, based on their color and/or transparency and/or movement pattern.
- a high resolution image of the object may be obtained and the object can be determined to be a target insect based on the high resolution image.
- an object may be detected in a first image, e.g., in a subtraction image, possibly, based on its size or other characteristic, and may then be determined to be a target insect (or not) from a second image which is of higher resolution than the first image.
- characteristics such as color and/or movement may be spatially correlated. For example, if a number of pixels that are close to each other have properties indicative of a target insect, these pixels may be given more weight in determining the presence of a target insect than a number of pixels having the same properties but which are not closely grouped.
- several correlated characteristics or pixel properties (e.g., the same movement patterns and/or changes in illumination) detected in several locations in an image may point to movement of a larger object and/or reflections, and may be assigned a lower weight in determining the presence of a target insect than single, uncorrelated characteristics.
- Different weights may be assigned to characteristics (or pixels representing these characteristics) based on the behavior of the characteristic in a plurality of images. For example, a characteristic persisting over time is less likely to be noise and may therefore be assigned a higher weight.
- Machine vision techniques such as object detection algorithms, segmentation, etc., may be used to detect an object in images of the space (e.g., a subtraction image) and to determine the pixels associated with the object.
- a learning model may be applied on images of the space to determine that the object is a target insect.
- a learning model may be applied, for example, on the subtraction image to detect an object having a predetermined criterion and/or on a current image to determine if the object is a target insect.
- a learning model may be applied at other steps as well, such as integrating the various inputs (color, transparency, size, movement pattern, etc.) into a single decision of determining whether the object is a target insect.
- If the object is determined to be a target insect (step 608), processor 102 generates a signal to control a device (step 610). If the object is not determined to be a target insect, another current image is obtained and processed.
- a device controlled based on the determination that an object is a target insect may include an auxiliary device, e.g., as described above.
- a device, such as a projector of a light source, may create a location indicator visible to a human eye (e.g., visual mark 115).
- a method may include determining a real-world location of the target insect from the images of the space and controlling a device to create a location indicator visible to a human eye and indicative of the real-world location of the target insect.
- a device may be used to eliminate and/or otherwise handle target insects.
- a method may include determining a real-world location of the target insect from the images of the space and controlling a device to eliminate (or otherwise handle) the target insect at the real-world location.
- the device may include an auxiliary device for handling an insect, e.g., as described above.
- the device may include a projector to project a form of energy at the real-world location of the target insect.
- the device may include a remotely controlled independently mobile device and/or a telescopic arm and/or nozzle.
- in some embodiments, an object (e.g., the object detected in a subtraction image) is tracked in multiple images of the space and to multiple locations in the space, and the object may be determined to be a target insect (or not) based on the tracking.
- a movement pattern of an object is detected and the object is determined to be a target insect (or not) based on the movement pattern.
- An object is detected in images of a space (step 702) and a movement pattern of the object is determined (step 704). If the movement pattern is similar to a predetermined pattern (step 706), then the object is determined to be a target insect (step 708). If the movement pattern is not similar to the predetermined movement pattern (step 706), then the object is not determined to be a target insect (step 710).
- a predetermined movement pattern will be a pattern consistent with a pattern expected from the target insect.
- a predetermined movement pattern can include an alighting pattern (e.g., flying and then settling down), which is typical of mosquitoes.
- the predetermined movement pattern can include predominantly a non-repetitive movement, since a predominantly repetitive motion is characteristic of an unintended motion (such as movement of a fan, wind-blown objects and/or electronic noise).
- a movement pattern can include a change in direction, and a predetermined movement pattern may include a change in direction at a specific angle or range of angles.
- mosquitoes often change direction at a less sharp angle than flies do.
- a predetermined movement pattern may include a change of direction at an angle in a predetermined range.
- mosquitoes move more slowly than flies; thus, a predetermined movement pattern can include a specific velocity (or range of velocities).
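The pattern check of steps 702-710 can be sketched as follows. This is an illustrative sketch only, assuming a track of per-frame (x, y) positions; the velocity and turn-angle thresholds are assumptions for illustration and are not taken from the disclosure:

```python
import math

# Illustrative thresholds (assumptions, not specified in the disclosure).
MAX_SPEED = 40.0      # pixels/frame; mosquitoes move more slowly than flies
MAX_TURN_DEG = 120.0  # mosquitoes change direction at less sharp angles

def turn_angles(track):
    """Angles (degrees) between consecutive displacement vectors."""
    angles = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(track, track[1:], track[2:]):
        v1 = (x1 - x0, y1 - y0)
        v2 = (x2 - x1, y2 - y1)
        n1 = math.hypot(*v1)
        n2 = math.hypot(*v2)
        if n1 == 0 or n2 == 0:
            continue  # no displacement, no defined turn angle
        cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
        cos_a = max(-1.0, min(1.0, cos_a))  # clamp rounding error
        angles.append(math.degrees(math.acos(cos_a)))
    return angles

def is_target_insect(track):
    """Steps 702-710: compare the object's movement pattern to a
    predetermined pattern (here, a velocity range and a turn-angle range)."""
    if len(track) < 3:
        return False
    speeds = [math.hypot(x1 - x0, y1 - y0)
              for (x0, y0), (x1, y1) in zip(track, track[1:])]
    if max(speeds) > MAX_SPEED:
        return False  # too fast for the target insect
    if any(a > MAX_TURN_DEG for a in turn_angles(track)):
        return False  # turns too sharply (fly-like, not mosquito-like)
    return True
```

A track with slow motion and gentle turns passes the check, while a fast track or one with a sharp reversal does not; in practice, the thresholds would be tuned to the target insect species.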
- determining characteristics of objects (e.g., a moving object such as an insect) may be more accurate when using multiple images and/or comparing images over time.
- historical data may be used in determining if an object is a target insect. For example, determining if an object in a later-captured image is a target insect can be based on a weight assigned to pixels in an earlier-captured image.
- an object is detected at a location in a first image (e.g., first current image) of a space (step 802). If it is determined that the object is not a target insect (step 804), then a first weight is assigned to pixels at that location (step 806). If it is determined that the object is a target insect (step 804), then a second weight is assigned to pixels at that location (step 808).
- An object is detected at a location in a second image (e.g., a second current image) (step 810) and the weights from steps 806 and 808 are assigned to the pixels of the second image based on their location in the second image.
- the object in the second image may then be determined to be a target insect (or not) based on the weighted pixels associated with the object in the second image (step 812).
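The weighting of steps 802-812 can be sketched as a per-pixel weight map carried from earlier images into the decision on a later image. This is a minimal sketch under assumed weight values (the disclosure does not specify the weights, nor that scores are combined by multiplication):

```python
import numpy as np

# Assumed weights for illustration (steps 806 and 808).
NON_TARGET_WEIGHT = 0.2  # first weight: location held a non-target object
TARGET_WEIGHT = 1.0      # second weight: location held a target insect

class WeightMap:
    """Carries per-pixel weights from earlier images (steps 802-808)
    into the decision on a later image (steps 810-812)."""

    def __init__(self, shape):
        # Start with uniform weights before any history is available.
        self.weights = np.ones(shape, dtype=float)

    def update(self, location, was_target):
        """Steps 804-808: assign a weight to pixels at the object's location."""
        y, x = location
        self.weights[y, x] = TARGET_WEIGHT if was_target else NON_TARGET_WEIGHT

    def score(self, location, detection_score):
        """Step 812: weight a new detection by the history at its location."""
        y, x = location
        return detection_score * self.weights[y, x]
```

For example, after a false detection at a given location, a later detection at that location is down-weighted (0.9 becomes 0.18 under the assumed weights), while detections elsewhere keep their original score.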
- images of a space may include windows, a TV screen, a fan, reflections and more, which may create "noisy" areas in the images.
- Such noise may be detected, for example, by high variation in pixel values over time, by many false positives (e.g., falsely detected target insects), or by applying object detection algorithms to identify the objects likely to create noise (e.g., window, TV, etc.).
- characteristics of objects (or pixels representing these characteristics) detected in relatively “noisy” areas of an image may be assigned less weight than characteristics (or pixels) of objects detected in other areas of the image.
- characteristics (or pixels) of objects detected in an area of the image in which a target insect was erroneously detected in past cases may be assigned less weight than characteristics (or pixels) detected in other areas of the image.
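One of the noise cues named above, high variation in pixel values over time, can be turned into a weight map directly. A sketch assuming a stack of grayscale frames; the variance threshold and the down-weight value are illustrative assumptions:

```python
import numpy as np

def noise_weight_map(frames, var_threshold=500.0, noisy_weight=0.2):
    """Down-weight pixels with high variation over time (e.g., a fan,
    a TV screen, or a window), which would otherwise produce false
    target-insect detections.

    frames: array of shape (num_frames, height, width), grayscale.
    Returns a per-pixel weight map: noisy_weight where the temporal
    variance exceeds var_threshold, 1.0 elsewhere.
    """
    variance = np.var(frames.astype(float), axis=0)
    return np.where(variance > var_threshold, noisy_weight, 1.0)
```

A detector would then multiply its per-pixel detection scores by this map, so that objects found in flickering regions must clear a higher effective bar before being declared target insects.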
Landscapes
- Life Sciences & Earth Sciences (AREA)
- Pest Control & Pesticides (AREA)
- Engineering & Computer Science (AREA)
- Insects & Arthropods (AREA)
- Wood Science & Technology (AREA)
- Zoology (AREA)
- Environmental Sciences (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Biodiversity & Conservation Biology (AREA)
- Ecology (AREA)
- Forests & Forestry (AREA)
- General Health & Medical Sciences (AREA)
- Toxicology (AREA)
- Health & Medical Sciences (AREA)
- Catching Or Destruction (AREA)
- Software Systems (AREA)
- Burglar Alarm Systems (AREA)
- Image Analysis (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- Mathematical Physics (AREA)
Abstract
Description
Claims
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP19843552.1A EP3830755A4 (en) | 2018-07-29 | 2019-07-24 | System and method for locating and eliminating insects |
CA3105655A CA3105655A1 (en) | 2018-07-29 | 2019-07-24 | System and method for locating and eliminating insects |
AU2019313665A AU2019313665A1 (en) | 2018-07-29 | 2019-07-24 | System and method for locating and eliminating insects |
KR1020217005289A KR20210035252A (en) | 2018-07-29 | 2019-07-24 | Systems and methods for locating and removing insects |
BR112021001634-1A BR112021001634A2 (en) | 2018-07-29 | 2019-07-24 | system and method for locating and eliminating insects |
JP2021504834A JP2021531806A (en) | 2018-07-29 | 2019-07-24 | Systems and methods for locating and eliminating insects |
CN201980049892.5A CN112513880A (en) | 2018-07-29 | 2019-07-24 | System and method for locating and eliminating insects |
US17/259,205 US12063920B2 (en) | 2018-07-29 | 2019-07-24 | System and method for locating and eliminating insects |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IL260844 | 2018-07-29 | ||
IL260844A IL260844B (en) | 2018-07-29 | 2018-07-29 | System and method for locating and eliminating insects |
US201862743593P | 2018-10-10 | 2018-10-10 | |
US62/743,593 | 2018-10-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020026230A1 true WO2020026230A1 (en) | 2020-02-06 |
Family
ID=68069430
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2019/050839 WO2020026230A1 (en) | 2018-07-29 | 2019-07-24 | System and method for locating and eliminating insects |
Country Status (12)
Country | Link |
---|---|
US (1) | US12063920B2 (en) |
EP (1) | EP3830755A4 (en) |
JP (1) | JP2021531806A (en) |
KR (1) | KR20210035252A (en) |
CN (1) | CN112513880A (en) |
AR (1) | AR115817A1 (en) |
AU (1) | AU2019313665A1 (en) |
BR (1) | BR112021001634A2 (en) |
CA (1) | CA3105655A1 (en) |
IL (1) | IL260844B (en) |
TW (1) | TW202022698A (en) |
WO (1) | WO2020026230A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20220072429A (en) * | 2020-11-25 | 2022-06-02 | 유한회사 평화스테인레스 | Pest Control System Based on OLED and Big Date |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AT522373B1 (en) * | 2019-03-18 | 2023-04-15 | Univ Innsbruck | DEVICE FOR DISTURBING THE OPTICAL NAVIGATION ABILITY OF ORGANISMS |
US11176652B2 (en) * | 2019-04-05 | 2021-11-16 | Waymo Llc | High bandwidth camera data transmission |
US20220217962A1 (en) * | 2019-05-24 | 2022-07-14 | Anastasiia Romanivna ROMANOVA | Mosquito monitoring and counting system |
CN110674805B (en) * | 2019-10-11 | 2022-04-15 | 杭州睿琪软件有限公司 | Insect identification method and system |
TWI763099B (en) * | 2020-10-28 | 2022-05-01 | 李寬裕 | Optical Pest Control Equipment |
CN112674057A (en) * | 2021-01-08 | 2021-04-20 | 中国人民解放军海军航空大学 | Intelligent mosquito killing equipment and method |
CN114431773B (en) * | 2022-01-14 | 2023-05-16 | 珠海格力电器股份有限公司 | Control method of sweeping robot |
IL298319A (en) * | 2022-11-16 | 2024-06-01 | Bzigo Ltd | Unmanned aerial vehicle for neutralizing insects |
CN116391693B (en) * | 2023-06-07 | 2023-09-19 | 北京市农林科学院智能装备技术研究中心 | Method and system for killing longicorn |
JP7445909B1 (en) | 2023-08-21 | 2024-03-08 | 株式会社ヤマサ | Pest control systems and pest control programs |
US12022820B1 (en) * | 2023-10-11 | 2024-07-02 | Selina S Zhang | Integrated insect control system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040076583A1 (en) * | 2002-07-15 | 2004-04-22 | Baylor College Of Medicine | Method for indentification of biologically active agents |
US20050025357A1 (en) * | 2003-06-13 | 2005-02-03 | Landwehr Val R. | Method and system for detecting and classifying objects in images, such as insects and other arthropods |
US20180046872A1 (en) * | 2016-08-11 | 2018-02-15 | DiamondFox Enterprises, LLC | Handheld arthropod detection device |
US20180204321A1 (en) * | 2012-07-05 | 2018-07-19 | Bernard Fryshman | Object image recognition and instant active response with enhanced application and utility |
Family Cites Families (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4015366A (en) | 1975-04-11 | 1977-04-05 | Advanced Decision Handling, Inc. | Highly automated agricultural production system |
JP3002719B2 (en) * | 1996-07-19 | 2000-01-24 | 工業技術院長 | Environmental cleanliness measurement system using a small biological monitor |
US8400348B1 (en) * | 1999-05-14 | 2013-03-19 | Applied Information Movement and Management, Inc. | Airborne biota monitoring and control system |
US7057516B2 (en) | 2001-06-01 | 2006-06-06 | Dimitri Donskoy | Device and method for detecting localization, monitoring, and identification of living organisms in structures |
US7656300B2 (en) | 2003-06-16 | 2010-02-02 | Rønnau Development ApS | Pest control system |
JP2005021074A (en) * | 2003-07-01 | 2005-01-27 | Terada Seisakusho Co Ltd | Method and system for image processing counting |
US7286056B2 (en) | 2005-03-22 | 2007-10-23 | Lawrence Kates | System and method for pest detection |
JP5066575B2 (en) * | 2006-10-23 | 2012-11-07 | ダウ アグロサイエンシィズ エルエルシー | Bedbug detection, monitoring and control technology |
JP2008200002A (en) * | 2007-02-22 | 2008-09-04 | Matsushita Electric Works Ltd | Nocturnal insect-trapping system |
JP2010149727A (en) * | 2008-12-25 | 2010-07-08 | Aisin Aw Co Ltd | System and program for preventing entrance of insect into cabin |
US8705017B2 (en) | 2009-01-15 | 2014-04-22 | Tokitae Llc | Photonic fence |
BR112013009401A2 (en) * | 2010-10-17 | 2016-07-26 | Purdue Research Foundation | automatic monitoring of insect populations |
SG189915A1 (en) | 2010-10-29 | 2013-06-28 | Commw Scient Ind Res Org | A real-time insect monitoring device |
US9381646B1 (en) | 2012-07-05 | 2016-07-05 | Bernard Fryshman | Insect and other small object image recognition and instant active response with enhanced application and utility |
ES2763412T3 (en) * | 2011-11-09 | 2020-05-28 | Francois Gabriel Feugier | Pest control system, pest control method and pest control program |
US20150075060A1 (en) | 2013-02-12 | 2015-03-19 | Jody Arthur Balsam | Apparatus and method for detection of insects |
US20150085100A1 (en) * | 2013-09-26 | 2015-03-26 | Micholas Raschella | System for detection of animals and pests |
CN103914733B (en) | 2014-03-31 | 2016-09-28 | 北京市农林科学院 | A kind of pest trap counting assembly and number system |
JP6274430B2 (en) * | 2014-06-03 | 2018-02-07 | みこらった株式会社 | Pest capture and storage device and pest insecticide device |
JP6479364B2 (en) * | 2014-07-31 | 2019-03-06 | 近藤電子株式会社 | Poultry health diagnosis device |
US10568316B2 (en) | 2014-08-15 | 2020-02-25 | Monsanto Technology Llc | Apparatus and methods for in-field data collection and sampling |
DE202014007499U1 (en) | 2014-09-19 | 2014-11-03 | Florian Franzen | Largely autonomous mini-drone (UAV helicopter drone) for killing mosquitoes and other small airborne insects in buildings and outdoor areas used by humans |
US9693547B1 (en) | 2014-10-20 | 2017-07-04 | Jean François Moitier | UAV-enforced insect no-fly zone |
US10752378B2 (en) * | 2014-12-18 | 2020-08-25 | The Boeing Company | Mobile apparatus for pest detection and engagement |
JP2016136916A (en) * | 2015-01-29 | 2016-08-04 | シャープ株式会社 | Injurious insect expelling device, and injurious insect expelling method |
JP2016185076A (en) * | 2015-03-27 | 2016-10-27 | 三菱自動車工業株式会社 | Insect expelling device |
US9828093B2 (en) * | 2015-05-27 | 2017-11-28 | First Principles, Inc. | System for recharging remotely controlled aerial vehicle, charging station and rechargeable remotely controlled aerial vehicle, and method of use thereof |
MX2018005714A (en) * | 2015-11-08 | 2019-08-16 | Agrowing Ltd | A method for aerial imagery acquisition and analysis. |
US20170231213A1 (en) * | 2016-02-17 | 2017-08-17 | International Business Machines Corporation | Pest abatement utilizing an aerial drone |
US9807996B1 (en) | 2016-05-28 | 2017-11-07 | Simon Siu-Chi Yu | Bug eater |
JP6410993B2 (en) * | 2016-05-31 | 2018-10-24 | 株式会社オプティム | Drone flight control system, method and program |
US9856020B1 (en) | 2016-07-27 | 2018-01-02 | International Business Machines Corporation | Drone-based mosquito amelioration based on risk analysis and pattern classifiers |
CN107094734A (en) | 2017-04-01 | 2017-08-29 | 史德成 | Energy automatic identification and the laser mosquito killer killed off the insect pests |
CN107041349A (en) | 2017-04-06 | 2017-08-15 | 南京三宝弘正视觉科技有限公司 | A kind of Pest killing apparatus and system |
CN106940734A (en) | 2017-04-24 | 2017-07-11 | 南京信息工程大学 | A kind of Migrating Insects monitor recognition methods and device in the air |
JP6512672B2 (en) * | 2017-12-25 | 2019-05-15 | みこらった株式会社 | Pest control device |
- 2018
- 2018-07-29 IL IL260844A patent/IL260844B/en active IP Right Grant
- 2019
- 2019-07-19 TW TW108125706A patent/TW202022698A/en unknown
- 2019-07-19 AR ARP190102042A patent/AR115817A1/en not_active Application Discontinuation
- 2019-07-24 WO PCT/IL2019/050839 patent/WO2020026230A1/en unknown
- 2019-07-24 AU AU2019313665A patent/AU2019313665A1/en not_active Abandoned
- 2019-07-24 BR BR112021001634-1A patent/BR112021001634A2/en not_active IP Right Cessation
- 2019-07-24 US US17/259,205 patent/US12063920B2/en active Active
- 2019-07-24 CA CA3105655A patent/CA3105655A1/en active Pending
- 2019-07-24 KR KR1020217005289A patent/KR20210035252A/en active Search and Examination
- 2019-07-24 EP EP19843552.1A patent/EP3830755A4/en active Pending
- 2019-07-24 JP JP2021504834A patent/JP2021531806A/en active Pending
- 2019-07-24 CN CN201980049892.5A patent/CN112513880A/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040076583A1 (en) * | 2002-07-15 | 2004-04-22 | Baylor College Of Medicine | Method for indentification of biologically active agents |
US20050025357A1 (en) * | 2003-06-13 | 2005-02-03 | Landwehr Val R. | Method and system for detecting and classifying objects in images, such as insects and other arthropods |
US20180204321A1 (en) * | 2012-07-05 | 2018-07-19 | Bernard Fryshman | Object image recognition and instant active response with enhanced application and utility |
US20180046872A1 (en) * | 2016-08-11 | 2018-02-15 | DiamondFox Enterprises, LLC | Handheld arthropod detection device |
Non-Patent Citations (1)
Title |
---|
See also references of EP3830755A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20220072429A (en) * | 2020-11-25 | 2022-06-02 | 유한회사 평화스테인레스 | Pest Control System Based on OLED and Big Date |
KR102531690B1 (en) * | 2020-11-25 | 2023-05-12 | 유한회사 평화스테인레스 | Pest Control System Based on OLED and Big Date |
Also Published As
Publication number | Publication date |
---|---|
TW202022698A (en) | 2020-06-16 |
CN112513880A (en) | 2021-03-16 |
US20210251209A1 (en) | 2021-08-19 |
KR20210035252A (en) | 2021-03-31 |
CA3105655A1 (en) | 2020-02-06 |
AU2019313665A1 (en) | 2021-01-28 |
AR115817A1 (en) | 2021-03-03 |
BR112021001634A2 (en) | 2021-05-04 |
JP2021531806A (en) | 2021-11-25 |
EP3830755A4 (en) | 2022-05-18 |
EP3830755A1 (en) | 2021-06-09 |
IL260844B (en) | 2019-09-26 |
US12063920B2 (en) | 2024-08-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12063920B2 (en) | System and method for locating and eliminating insects | |
US20210199973A1 (en) | Hybrid reality system including beacons | |
US9811764B2 (en) | Object image recognition and instant active response with enhanced application and utility | |
US10147177B2 (en) | Object image recognition and instant active response with enhanced application and utility | |
CN108141579B (en) | 3D camera | |
US10861239B2 (en) | Presentation of information associated with hidden objects | |
US10026165B1 (en) | Object image recognition and instant active response | |
US8111289B2 (en) | Method and apparatus for implementing multipurpose monitoring system | |
US20190096058A1 (en) | Object image recognition and instant active response with enhanced application and utility | |
CN107836012A (en) | Mapping method between projection image generation method and its device, image pixel and depth value | |
McNeil et al. | Autonomous fire suppression system for use in high and low visibility environments by visual servoing | |
US10650284B2 (en) | Induction system for product authentication | |
EP3455827B1 (en) | Object image recognition and instant active response with enhanced application and utility | |
Fehlman et al. | Mobile robot navigation with intelligent infrared image interpretation | |
JP2021140561A (en) | Detection device, tracking device, detection program, and tracking program | |
US20230342952A1 (en) | Method for coordinative measuring by terrestrial scanning with image-based interference detection of moving objects | |
WO2012091537A1 (en) | System and method for navigation and visualization | |
WO2024105676A1 (en) | Unmanned aerial vehicle for neutralizing insects | |
Zhang et al. | Jellyfish: non-radio frequency counter drone technology | |
Fayed | Computer-Based Stereoscopic Parts Recognition for Robotic Applications |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19843552 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 3105655 Country of ref document: CA |
ENP | Entry into the national phase |
Ref document number: 2021504834 Country of ref document: JP Kind code of ref document: A |
ENP | Entry into the national phase |
Ref document number: 2019313665 Country of ref document: AU Date of ref document: 20190724 Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112021001634 Country of ref document: BR |
ENP | Entry into the national phase |
Ref document number: 20217005289 Country of ref document: KR Kind code of ref document: A |
ENP | Entry into the national phase |
Ref document number: 2019843552 Country of ref document: EP Effective date: 20210301 |
ENP | Entry into the national phase |
Ref document number: 112021001634 Country of ref document: BR Kind code of ref document: A2 Effective date: 20210128 |