WO2023197042A1 - Système et procédé de tri d'arthropodes selon leur sexe (System and method for sorting arthropods by sex) - Google Patents

System and method for sorting arthropods by sex

Info

Publication number
WO2023197042A1
Authority
WO
WIPO (PCT)
Prior art keywords
chamber
sorting
sex
arthropod
behavioural
Prior art date
Application number
PCT/AU2023/050307
Other languages
English (en)
Inventor
Brendan TREWIN
Xiaobei WANG
Stephen GENSEMER
Maciej GOLEBIEWSKI
Original Assignee
Commonwealth Scientific And Industrial Research Organisation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2022901010A0
Application filed by Commonwealth Scientific And Industrial Research Organisation
Publication of WO2023197042A1

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00 Stationary means for catching or killing insects
    • A01M1/02 Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones attracting the insects
    • A01M1/026 Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones attracting the insects combined with devices for monitoring insect presence, e.g. termites
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K67/00 Rearing or breeding animals, not otherwise provided for; New or modified breeds of animals
    • A01K67/033 Rearing or breeding invertebrates; New breeds of invertebrates
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00 Stationary means for catching or killing insects
    • A01M1/22 Killing insects by electric means
    • A01M1/226 Killing insects by electric means by using waves, fields or rays, e.g. sound waves, microwaves, electric waves, magnetic fields, light rays
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/62 Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/70 Labelling scene content, e.g. deriving syntactic or semantic representations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00 Stationary means for catching or killing insects
    • A01M1/06 Catching insects by using a suction effect
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00 Stationary means for catching or killing insects
    • A01M1/08 Attracting and catching insects by using combined illumination or colours and suction effects
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00 Stationary means for catching or killing insects
    • A01M1/22 Killing insects by electric means
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M2200/00 Kind of animal
    • A01M2200/01 Insects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Definitions

  • the present disclosure relates to sex sorting of arthropods.
  • the present disclosure relates to a computer vision based tracking and targeting system for sex based sorting of arthropods.
  • Vector borne diseases are a leading cause of mortality and morbidity throughout tropical regions.
  • the Aedes aegypti and Aedes albopictus mosquito species are both vectors for the dengue, chikungunya, Zika and yellow fever viruses.
  • Many Anopheles species of mosquito are vectors for malaria.
  • Other arthropod species such as the Mediterranean fruit fly (Ceratitis capitata), moths, and beetles represent threats to agriculture. There is thus great interest in controlling populations of many arthropod species.
  • IIT: incompatible insect technique
  • SIT: sterile insect technique
  • Wolbachia bacteria naturally occur in many arthropod species (~60%) and it has been discovered that some newly established infections prevent (or at least significantly suppress) replication of virus in infected mosquitoes. Further, when males infected with a Wolbachia bacteria strain mate with wild females that don’t contain the same Wolbachia strain or contain a different Wolbachia strain to the males, the resultant eggs are infertile in a process called cytoplasmic incompatibility. In other approaches males may be rendered infertile through genetic engineering, chemical or radiation treatments. [0005] A problem with such IIT (or SIT) systems is that they require robust mass rearing of mosquitoes and highly accurate sorting of females from males.
  • a problem with existing sorting systems is that they are often manual, have low sorting accuracy (i.e., high contamination), are slow to sort sexes, impact male fitness and/or have high wastage rates (e.g., many dead pupae or adult male mosquitoes).
  • a computer vision based three-dimensional tracking and targeting method for sex based sorting of arthropods comprising: capturing a stream of video images either from each of at least two image sensors wherein the at least two image sensors are spaced apart and each have a respective field of view that share an overlapping region including at least a portion of a targeting chamber, or from a single image sensor and a mirror arrangement configured such that a field of view of the single image sensor is split into two portions where each portion has a respective field of view that share an overlapping region including at least a portion of a targeting chamber, wherein a plurality of arthropods move through the targeting chamber from an entry aperture to an exit aperture and an air flow system is configured to direct air flow through the targeting chamber; tracking a three dimensional position of one or more arthropods using the stream of video images; determining a sex of an arthropod by identifying the arthropod in at least one image in the stream of video images and using a trained machine learning classifier to estimate the sex of the arthropod; predicting a target position at a future time of an arthropod classified as a target sex using the tracked three dimensional positions of the arthropod; and directing a directed energy beam at the target position at the future time to kill or incapacitate the arthropod.
  • the method further comprises generating a tracking stream of images and generating a classification stream of images from the stream of video images wherein each of the tracking streams is synchronised and each image in the tracking stream is down converted to a predefined lower resolution image size for tracking, and the step of tracking a three dimensional position is performed using the lower resolution tracking stream of images and the step of determining a sex of an arthropod is performed using the classification stream of images by identifying one or more arthropods in an image in the classification stream of images and using a trained machine learning classifier to estimate the sex of each of the identified one or more arthropods, and the steps of tracking a three dimensional position and determining a sex of an arthropod are performed in parallel.
  • the at least two image sensors comprises at least four image sensors divided into two sets of at least two image sensors, wherein the first set is used to generate a tracking stream of images and the second set is used to generate a classification stream of images, and the stream of images from each of the image sensors in the first set are synchronised, and images in the tracking stream are of a lower resolution than the images in the classification stream, and the step of tracking a three dimensional position is performed using the lower resolution tracking stream of images and the step of determining a sex of an arthropod is performed using the classification stream of images by identifying one or more arthropods in an image in the classification stream of images and using a trained machine learning classifier to estimate the sex of each of the identified one or more arthropods, and the steps of tracking a three dimensional position and determining a sex of an arthropod are performed in parallel.
  • tracking the three dimensional position of one or more arthropods comprises: generating a first subtracted image captured at a first time by comparing a first image at the first time with at least the previous frame from a first tracking stream of images; identifying one or more arthropods in the first subtracted image and estimating a first position for each identified arthropod; generating a second subtracted image captured at the first time by comparing a second image at the first time with at least the previous frame from the second tracking stream of images; identifying one or more arthropods in the second subtracted image and estimating a second position for each identified arthropod; determining a three dimensional position for each identified arthropod by using a ray-tracing method that uses the position of the first sensor and the first position to generate a first ray, and the position of the second sensor and the second position to generate a second ray, and the three dimensional position is determined based on identifying an intersection point of the first ray and the second ray within the targeting chamber;
  • determining a sex of an arthropod comprises: identifying one or more arthropods in each image in the classification stream of images and generating a candidate arthropod image by cropping the image to a predetermined classification image size around an identified arthropod; and providing each respective candidate arthropod image to the trained machine learning classifier and obtaining an estimate of the sex of the arthropod and storing the estimate of the sex of the arthropod with the identifier of the arthropod from the tracking database.
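As a non-authoritative illustration of the crop-and-classify step above, the sketch below crops a fixed-size window around each detected arthropod and passes it to a trained classifier. The crop size, detection format and classifier API (a Keras-style model exposing predict()) are assumptions for illustration, not details from the patent.

```python
# Hedged sketch of the crop-and-classify step described above.
import numpy as np

CROP = 299  # assumed crop size (the native input size of Xception)

def classify_candidates(frame, detections, classifier):
    """frame: HxWx3 image; detections: [(track_id, cx, cy), ...]."""
    h, w = frame.shape[:2]
    results = []
    for track_id, cx, cy in detections:
        x0 = int(np.clip(cx - CROP // 2, 0, w - CROP))   # keep crop inside frame
        y0 = int(np.clip(cy - CROP // 2, 0, h - CROP))
        candidate = frame[y0:y0 + CROP, x0:x0 + CROP]
        p_female = float(classifier.predict(candidate[None] / 255.0)[0, 0])
        results.append((track_id, "female" if p_female > 0.5 else "male", p_female))
    return results  # each estimate is stored against the tracking database identifier
```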
  • the directed energy beam is a laser and mirror galvanometer system wherein the orientation of the mirror galvanometer is controlled to direct the laser beam at the predicted target position of the identifier of the arthropod wherein the estimated sex is matched with a target sex from the tracking database, and the laser is fired at the target position at the future time.
  • the targeting chamber is illuminated by a lighting system which is provided on each side of the targeting chamber distal from an image sensor to backlight the targeting chamber with respect to the image sensor.
  • the method further comprises behaviourally sorting the arthropods using one or more sequential behavioural sorting chambers, each having an exit aperture, and providing the behaviourally sorted arthropods to the entry aperture of the targeting chamber, wherein the one or more sorting chambers comprise: one or more lures wherein the air flow system is controlled to direct air into or out of the exit aperture of each behavioural sorting chamber based on a type of lure to lure arthropods of the non-target sex through the one or more sequential behavioural sorting chambers.
  • At least one of the one or more lures is an audio lure or an optical lure located adjacent to the exit aperture of a behavioural sorting chamber and is repeatedly switched on and off, wherein the respective lure is switched on for a lure time period and when the respective lure is switched on the air flow system is configured to provide negative air flow to suck air through the exit aperture so as to suck lured arthropods out of the sorting chamber, and the air is then either directed into a next behavioural sorting chamber or into the entry aperture of the targeting chamber if the behavioural sorting chamber is a last behavioural sorting chamber in a sequence of the one or more behavioural sorting chambers.
  • the air flow system is configured to provide zero or positive air flow into the exit aperture to prevent lured arthropods exiting the behavioural sorting chamber.
  • each of the at least one lure located adjacent the exit aperture is an audio lure which broadcasts an audio signal at a frequency or frequency range determined from a wingbeat frequency range of the target sex.
  • the one or more lures comprises at least one chemical attractant lure for the non-target sex located in a collection chamber after the targeting chamber, and the air flow system is configured to direct air through the targeting chamber and into each of the one or more behavioural sorting chambers via the associated exit aperture.
  • each behavioural sorting chamber comprises at least one lure within the chamber and each chamber further comprises: a swarm marker structure comprising a first colour portion comprising the exit aperture and a second colour portion surrounding the first colour portion wherein the second colour is a contrasting colour to the first colour, and the at least one lure within the chamber is mounted to the swarm marker structure.
  • the swarm marker is a tubular structure with a black or dark colour that projects into the behavioural sorting chamber from a base wall of the sorting chamber which is connected to a containment structure with a white or light colour, wherein the exit aperture is located at a distal end of the tubular structure with respect to the base wall of the behavioural sorting chamber.
  • the method further comprises sieving pupae through a sieving apparatus and into a collection container, wherein the sieve apparatus comprises a plurality of apertures with a shape and size configured to preferentially allow pupae of a first sex to pass through the sieving apparatus and to retain pupae of a target sex, wherein the collection container is further connected to a first behavioural sorting chamber of the at least one sorting chamber to receive a plurality of adult arthropods emerging from the sieved pupae.
  • a sex based arthropod sorting system comprising: a targeting chamber comprising an entry aperture and an exit aperture; an air flow system configured to direct air flow through the chamber from the entry aperture to the exit aperture; at least two image sensors wherein the at least two image sensors are spaced apart and each have a respective field of view that share an overlapping region including at least a portion of the targeting chamber, or a single image sensor and a mirror arrangement configured such that a field of view of the single image sensor is split into two portions where each portion has a respective field of view that share an overlapping region including at least a portion of the targeting chamber; a directed energy system comprising a directed energy source and a targeting apparatus for directing a directed energy beam at a target position within the targeting chamber; and at least one processor and at least one memory, wherein the at least one memory comprises instructions for configuring the at least one processor to perform the computer vision tracking and targeting method for sex based sorting of arthropods according to the first aspect.
  • the directed energy source is a laser
  • the directed energy beam is a laser beam
  • the targeting apparatus is a mirror galvanometer system wherein the orientation of the mirror galvanometer is controlled by the at least one processor to direct the laser beam at the predicted target position, and the laser is fired at the future time.
  • the system further comprises a lighting system comprising a pair of translucent walls provided on each side of the targeting chamber distal from an image sensor to backlight the targeting chamber with respect to the image sensor and which are illuminated by a lighting panel behind each translucent wall.
  • the method further comprises one or more sequential behavioural sorting chambers, each having an exit aperture and the exit aperture of a last sorting behavioural chamber is connected to the entry aperture of the targeting chamber, wherein the one or more behavioural sorting chambers comprise: one or more lures wherein the air flow system is controlled to direct air into or out of the exit aperture of each behavioural sorting chamber based on a type of lure to lure arthropods of the non-target sex through the one or more sequential behavioural sorting chambers.
  • At least one of the one or more lures is an audio lure or an optical lure located adjacent the exit aperture of a behavioural sorting chamber and is repeatedly switched on and off, wherein the respective lure is switched on for a lure time period and when the respective lure is switched on the air flow system is configured to provide negative air flow to suck air through the exit aperture from the chamber so as to suck lured arthropods out of the sorting chamber, and the air is then either directed into a next behavioural sorting chamber or into the entry aperture of the targeting chamber if the behavioural sorting chamber is a last behavioural sorting chamber in a sequence of the one or more sorting chambers.
  • the air flow system, when the respective lure is switched off, is configured to provide zero or positive air flow into the exit aperture to prevent lured arthropods exiting the behavioural sorting chamber.
  • each of the at least one lure located adjacent the exit aperture is an audio lure which broadcasts an audio signal at a frequency or frequency range determined from a wingbeat frequency range of the target sex.
  • the one or more lures comprises at least one chemical attractant lure for the non-target sex located in a collection chamber after the targeting chamber, and the air flow system is configured to direct air through the targeting chamber and into each of the one or more behavioural sorting chambers via the associated exit aperture.
  • each behavioural sorting chamber comprises at least one lure within the chamber and each behavioural sorting chamber further comprises: a swarm marker structure comprising a first colour portion comprising the exit aperture and a second colour portion surrounding the first colour portion wherein the second colour is a contrasting colour to the first colour, and the at least one lure within the behavioural sorting chamber is mounted to the swarm marker structure.
  • the swarm marker is a tubular structure with a black or dark colour that projects into the behavioural sorting chamber from a base wall of the behavioural sorting chamber which is connected to a base structure with a white or light colour, wherein the exit aperture is located at a distal end of the tubular structure with respect to the base wall of the behavioural sorting chamber.
  • the system further comprises a sieving apparatus which is configured to sieve pupae into a collection container, wherein the sieve apparatus comprises a plurality of apertures with a shape and size configured to preferentially allow pupae of a first sex to pass through the sieving apparatus and to retain pupae of a target sex, wherein the collection container is further connected to a first sorting chamber of the at least one sorting chamber to receive a plurality of adult arthropods emerging from the sieved pupae.
  • the system further comprises a collection chamber connected to the exit aperture of the targeting chamber to collect arthropods of the non-target sex after traversing the targeting chamber.
  • Figure 1A is a schematic diagram of a modular system for sex based sorting of arthropods according to an embodiment;
  • Figure 1B is a flow chart of a method for sex sorting of arthropods using a computer vision system for tracking and targeting of arthropods with a target sex according to an embodiment;
  • Figure 2 is a schematic diagram of a computer vision tracking and targeting method for sex based sorting of arthropods according to an embodiment;
  • Figure 3A is an exploded view of a targeting chamber according to an embodiment;
  • Figure 3B shows a pair of images captured by a pair of cameras with approximately orthogonal views of the targeting chamber according to an embodiment;
  • Figure 3C is a schematic view illustrating the relative geometry of each camera with respect to the target chamber and ray traces used for estimating positions of mosquitoes according to an embodiment;
  • Figure 4A is a schematic illustration of generation of a subtracted image according to an embodiment;
  • Figure 4B is a schematic illustration of an Xception architecture and associated input, middle layer and output images according to an embodiment;
  • Figure 5A is a visualisation of prediction error rates relative to mosquito movement within the target chamber, where darker grey colours represent an increase in error (inaccuracy) relative to the velocity of mosquito movement, measured as the difference in mm between two video frames at 30 fps, according to an embodiment;
  • Figure 5B is a plot showing a linear relationship between distance moved between video frames (x axis) and the predicted location of the mosquito to be targeted by the laser (y axis) according to an embodiment;
  • Figure 5C is a plot showing the mean and standard deviation in target prediction inaccuracy compared to mosquito movement between video frames according to an embodiment;
  • Figure 5D is a schematic diagram of delays within the system when targeting a single mosquito with the laser tracking and targeting according to an embodiment;
  • Figure 6A is a plot showing the proportion of total male (black) and female (white) mosquitoes emerging since time of hatching according to an embodiment;
  • Figure 6B is a representation of a sieve according to an embodiment;
  • Figure 7 is a representation of a swarm marker according to an embodiment;
  • Figure 8A is a schematic diagram of a modular integrated sex sorting apparatus according to one embodiment;
  • Figure 8B is a schematic diagram of the internal frame and chamber of the modular system shown in Figure 8A;
  • Figure 9A is a schematic diagram of a modular integrated sex sorting apparatus according to one embodiment;
  • Figure 9B is a schematic diagram of a modular integrated sex sorting apparatus according to one embodiment;
  • Figure 9C is a schematic diagram of a modular integrated sex sorting apparatus according to one embodiment;
  • Figure 10 is a schematic diagram of a chamber incorporating a curved rear wall and curved lighting panel according to an embodiment; and
  • Figure 11 is a schematic diagram of a single camera system incorporating a mirror according to one embodiment.
  • Referring to Figure 1A, there is shown a sex based arthropod sorting system 1, and Figure 1B is a flow chart of a method for sex sorting of arthropods using a computer vision system for tracking and targeting of arthropods with a target sex according to an embodiment.
  • the sex based arthropod sorting system 1 is a modular system comprising a behavioural sorting module 40, a computer vision based tracking and targeting module 70 including a computing apparatus 60 and directed energy module 80, and a final collection module 98.
  • the system also includes a rearing module 10, a sieving module 20, a collection (emergence) container 30 which is connected to the behavioural sorting module 40.
  • arthropods could be collected into a collection container using other collection apparatus such as insect traps and connected or transferred to the behavioural sorting module 40.
  • An integrated air flow system controls the direction of air flow through the system (and various components) to direct or assist one-way movement of arthropods through the system.
  • the different modules could each be provided separately, or various combinations of modules may be provided as components for a sex based sorting system, for example the rearing module 10, the sieving module 20, and collection (emergence) container 30 may be provided in an integrated arthropod supply apparatus, and/or the behavioural sorting module 40, computer vision based tracking and targeting module 70 and final collection module 98 provided in a sex sorting apparatus, or a complete system comprising all modules may be provided.
  • the arthropods are mosquitoes
  • males are the selected sex or the collected sex (i.e., the sex to be collected or retained) and females are the target sex (the sex to be sorted/excluded and targeted by the directed energy source).
  • a rearing module 10 is used to rear larvae to the pupal stage for subsequent sieving.
  • This rearing module controls the temperature and humidity and provides food sources for larvae.
  • the rearing module may be configured to implement a standard rearing protocol. In one embodiment the rearing module is maintained at a constant temperature of 26°C (±1°C) and a humidity of 60% (±2%).
  • the rearing module comprises water bays supplemented with a food source to ensure that larvae feed to satiation. Around 7 days after placing the eggs in the rearing module the larvae begin to pupate. Thus, after a predefined time period the larvae/pupae are collected and mechanically sorted using a sieve module 20.
  • the predefined time period at which collection is begun, and/or the duration of collection may be selected to bias the population of pupae towards male pupae.
  • Several trials were conducted based on rearing and sieving larvae over a period of up to 17 days. The sieved larvae were counted and their sex determined; Figure 6A is a plot showing the proportion of total male (black) and female (white) mosquitoes emerging since time of hatching.
  • the predefined time period is 9 days
  • the collection period is 3 days (that is collection is performed on days 9 to 12) after which the rearing module is emptied and reset to begin a new rearing cycle.
  • the sieving module performs mechanical sorting of the reared larvae/pupae.
  • manual sorting is performed.
  • an automated sieving system may be used in which rearing containers are automatically poured into a sieve which is mechanically agitated using servomotors or similar electronically controlled motor arrangements (e.g., belts, cams, gears, etc.).
  • the reared larvae/pupae are forced through the sieve apparatus by a lift/drop and lengthwise shake, passing through the water column of a collection container 30.
  • in many arthropod species there are morphological differences between male and female pupae.
  • the sieve apparatus 20 comprises a plurality of apertures 22 with a shape and size configured to exploit morphological differences between male and female pupae. This preferentially allows male pupae (the first sex or the selected sex) to pass through the sieving apparatus and to retain female pupae (the target sex to remove).
  • the sieve has a slotted base structure with perpendicular struts 24 and the slots 22 are angled to form a groove (e.g., a regular trapezoid cross sectional profile or a curved cross-sectional profile) to enhance mechanical action.
  • FIG. 6B is a representation of a sieve apparatus 20 in one embodiment.
  • the slots 22 are formed between elongated struts 26, each with a regular trapezoidal profile, such that the slots also have a regular trapezoidal profile with the lower length smaller than the upper length.
  • the apertures are cut using a router to provide a curved profile.
  • a sieve was constructed from Aluminium with a slot gap of 0.9mm.
  • a female contamination rate (that is, the proportion of females passing through the sieve) of between 0% and 4.5% was achieved (with an average of 2.7%).
  • Larvae that do not reach pupation may be removed manually or passively through a second sieve system with a finer slot gap (to allow larvae, but not pupae through).
  • Larvae exhibit phototaxis and thus a lighting system may be used to repel the larvae through the second sieve system.
  • the male wastage rate, which is the proportion of males that did not move through the sieve system and were classified as female, varied between 18.5% and 45.5% (with an average of 27%).
  • a sieving system can obtain a mosquito population with around 95-99% males and 1-5% females.
  • the sensitivity, specificity and accuracy ranges were 0.91-0.99, 0.91-0.98 and 0.82-0.83 respectively.
  • Sensitivity measures the proportion of positives that are correctly identified. In this case it would be the proportion of males correctly sorted through the sieves.
  • Specificity measures the proportion of negatives that are correctly identified. In this case it would be the proportion of females correctly sorted by the sieve.
  • Accuracy combines both the sensitivity and specificity into one metric.
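For concreteness, these three metrics can be computed from the four sieve outcome counts for a batch. The sketch below uses illustrative numbers, not counts from the reported trials.

```python
# Worked example of sensitivity, specificity and accuracy for the sieve.
def sieve_metrics(males_passed, males_retained, females_retained, females_passed):
    tp = males_passed       # males correctly sorted through the sieve
    fn = males_retained     # male wastage: males held back / classed as female
    tn = females_retained   # females correctly retained by the sieve
    fp = females_passed     # female contamination passing through
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

print(sieve_metrics(950, 50, 930, 70))  # -> (0.95, 0.93, 0.94)
```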
  • the sieved pupae are transferred to a collection (or emergence) container 30 where they may rest to allow adult mosquitoes to emerge and be allowed to enter a behavioural sorting module 40.
  • a permanent connection may be provided between the collection container and the first behavioural sorting chamber 42 to allow emerging mosquitoes to fly into the behavioural sorting chamber 42.
  • the pupae are left in the collection container 30 for a predefined time period, such as 1-2 days and then any adult mosquitoes are placed in the behavioural sorting chamber 42. This may be achieved by removing a barrier between the chambers and/or using an air flow system to push or suck mosquitoes into the first behavioural sorting chamber 42 from the collection container.
  • an opening or barrier between the collection container 30 and the behavioural sorting chamber 42 may be periodically opened, such as every six hours, and an air flow system used to push or suck emergent mosquitoes into the first behavioural sorting chamber 42 from the collection container.
  • the behavioural sorting module 40 comprises one or more behavioural sorting chambers 42 each having an exit aperture 52.
  • a single behavioural sorting chamber is used.
  • multiple behavioural sorting chambers may be arranged as a sequence or series to progressively reduce the female contamination rate.
  • the behavioural sorting chamber(s) comprise one or more lures and an air flow system 91 configured (or controlled) to direct air into or out of the exit aperture of each sorting chamber based on a type of lure to lure arthropods of the non-target sex through the one or more sequential sorting chambers.
  • Various lures may be used including audio, visual, and chemical attractant lures (including chemical attractant lures for both sexes).
  • a swarm marker may also be included in the chamber and integrated into the exit aperture.
  • each behavioural sorting chamber 42 comprises a swarm marker structure 50, an audio lure 44, and an air flow system 91.
  • In the wild, male mosquitoes tend to swarm (clump or aggregate) over specific landmarks known as swarm markers. These typically comprise a central dark coloured object surrounded by a contrasting light coloured area. The male mosquitoes continuously fly around the dark object, and female mosquitoes that fly into or near the swarm are mated with.
  • the swarm marker 50 comprises a first colour portion 54 comprising an exit aperture 52 and a second colour portion 56 surrounding the first colour portion wherein the second colour is a contrasting colour to the first colour.
  • An embodiment of a swarm marker structure is further illustrated in Figure 7.
  • the swarm marker is a tubular structure with a black colour (although it could be another dark colour such as red, dark grey or navy blue) that projects into the sorting chamber from a base wall of the sorting chamber 42.
  • the base structure 56 is an annular disk embedded in or resting on the base wall and constructed of a white colour, although other light colours such as off-white, cream, or light yellow could be used.
  • the exit aperture 52 is located at a distal end of the tubular structure 54 with respect to the base wall of the sorting chamber 42.
  • the tube 54 connects the exit aperture 52 to an aperture 53 (dashed white circle in Figure 7) in the base wall of the sorting chamber 42.
  • the exit aperture 52 of the swarm marker 50 thus acts as an exit aperture for the behavioural sorting chamber 42.
  • the swarm marker 50 is located in the centre of the base of the sorting chamber 42.
  • an audio lure 44 may be located adjacent the exit aperture of the swarm marker structure.
  • the audio lure is a speaker which plays an audio signal at a frequency or frequency range determined from a wingbeat frequency range or wingbeat tone of the female mosquito (target sex) 45.
  • a collection tube is located below and connected to the base of swarm marker and a suction system is configured to draw air from the sorting chamber through the exit aperture (swarm marker) and collection tube 58.
  • a bladeless fan 92, which is part of the air flow system 91, is embedded or located in the walls of the collection tube 58 just below the base of the swarm marker 56.
  • the air flow system is configured to suck air through the exit aperture (e.g., a negative pressure air flow) so as to pull/suck the male mosquitoes (non-target arthropods) through the exit aperture of the swarm marker 50 and into the collection tube.
  • the air flow is changed such that the air flow system provides zero or positive air flow into the exit aperture to prevent the lured mosquitoes exiting the sorting chamber.
  • fan 92 in the walls of the collection tube 58 may be switched on when the audio lure 44 is switched on.
  • the air flow system 91 and audio lure may be configured to switch state at the same time, or a delay may be introduced such that the air flow system (e.g., turning fan 92 on) starts, or switches state, a short time (e.g., 1-2 seconds) after the audio lure is started.
  • the air flow system may change state (e.g., turn fan 92 off) at the same time the audio lure switches off, or a short time after the audio lure switches off.
  • the delay could shift both start and stop times by the same amount. In some embodiments alternative arrangements could be used.
  • the air flow system could be configured to create zero or positive air flow when the audio lure is on in order to allow a swarm to form around the swarm marker and the audio lure, and when the audio lure is switched off, or just prior to the audio lure switching off, the air flow system is configured to create negative air flow to suck the swarmed mosquitoes through the exit aperture.
  • a computer may be configured to repeatedly switch on the audio lure 44 for a short time period (which we will define as a lure time period) to lure the male mosquitoes and to control the air flow system 91.
  • the air flow system 91 comprises one or more fans 92, blower 94, and a controller such as microprocessor 60' configured to switch the fans on or off and control the fan speed.
  • the air flow system may also comprise connective tubing such as collection tube 58 which connects the sorting chamber 42 to target chamber 72 and final collection chamber 98.
  • alternatives to fans, such as pump systems, could be used to create controlled air flows.
  • the audio lure 44 and a first bladeless fan are synchronised so that the first fan is switched on for the lure time period to create a negative air flow in the chamber (to suck or draw lured mosquitoes to leave the chamber through the exit aperture) and then the first fan is switched off together with the audio lure for a second time period.
  • a second fan 92 adjacent the target chamber may also be used to create a negative pressure air flow from the connecting tube 58 into the target chamber 72. Note that when the first fan is creating negative airflow with respect to the exit aperture of the sorting chamber 42, this air flow is positive with respect to the second fan.
  • the operation of the first fan and second fan are synchronised and both spin in the same direction to draw arthropods through the connecting tube 58.
  • the second fan is switched on to draw arthropods through the connecting tube 58 towards the target chamber 72.
  • the second fan is continuously switched on, with the speed of the second fan varied based on the state of the first fan, for example speeding up when the first fan is off.
  • the lure time period and second time period are both 10 seconds. In other embodiments the lure time period and second time period are different, with the lure time period being shorter than the second time period.
  • the lure time period may be a short time period, such as less than 30 seconds. This allows the mosquitoes to be pulled through the system in cohorts or batches.
  • the second fan may alternate operation with the first fan or may operate continuously with the first fan wherein the first fan is of sufficient power to override any negative airflow created by the second fan. In other embodiments the first and second fans could be combined, with the fan switching between creating positive air flow when the audio lure is off, and then reversing direction to create a negative air flow when the audio lure is on.
  • a single fan at the base of the exit aperture is switched off (no air flow) when the audio lure is off, and switched on to create a negative air flow through the exit aperture when the audio lure is on.
  • the computing apparatus 60 is a dedicated microprocessor board 60’ such as an Arduino microprocessor board comprising processor 62 and memory 63, and is configured to control the audio lure and the air flow system, including controlling the speed and timing of individual fans 92 to create desired air flows (e.g., direction and pressure).
  • the Arduino microprocessor may be programmed or configured to provide synchronised control of the audio lure and fan.
  • Controlling the fan may include setting a fan speed such as by controlling a voltage level or using pulse width modulation (PWM) and controlling a delay between switching state of the audio lure and state of the fan 92.
  • the microcontroller may control ramp up/ramp down to a desired fan speed.
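A minimal sketch of this synchronised lure/fan duty cycle is shown below. set_lure() and set_fan_pwm() are hypothetical stand-ins for the microcontroller outputs (e.g. a GPIO pin and a PWM channel), not a real API, and the timings simply mirror the 10 second lure and rest periods and the 1-2 second fan delay described above.

```python
# Sketch of the synchronised lure/fan duty cycle, under assumed timings.
import time

LURE_PERIOD = 10.0   # seconds the audio lure (and suction fan) are on
REST_PERIOD = 10.0   # seconds both are off before the next cohort is drawn
FAN_DELAY = 1.5      # assumed 1-2 s lag between lure-on and fan-on

def set_lure(on: bool): ...        # drive the speaker playing the wingbeat tone
def set_fan_pwm(duty: float): ...  # set fan speed via PWM (0.0 = off, 1.0 = full)

def run_sorting_cycles(n_cycles: int) -> None:
    for _ in range(n_cycles):
        set_lure(True)               # broadcast the female wingbeat frequency
        time.sleep(FAN_DELAY)        # let males orient to the lure first
        set_fan_pwm(1.0)             # negative air flow: suck lured males out
        time.sleep(LURE_PERIOD - FAN_DELAY)
        set_lure(False)
        set_fan_pwm(0.0)             # zero flow so remaining arthropods stay put
        time.sleep(REST_PERIOD)
```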
  • a female chemical attractant lure such as a blood feeding pad or heat pad is located on the walls of the behavioural sorting chamber 42.
  • the blood feeding pad may be spiked with an insecticide to kill females.
  • Other forms of lures include visual lures and male attractant lures, and multiple lures or combinations of lures may be used such as an audio lure near the exit aperture and a second blood feeding pad (female chemical attractant) lure on the walls, or audio and visual lures.
  • Visual lures include light of a specific frequency or a frequency range (for example by using a light source and band pass filter) and/or lures of specific shapes including visual contrast patterns.
  • a male chemical attractant is placed in the final collection chamber 98 and the air flow system 91 is configured to gently blow the attractant through the system and thus create a positive air flow through the target chamber 72 and into the exit aperture 52 of each sorting chamber 42.
  • the behavioural sorting module 40 may use multiple behavioural sorting chambers 42 connected in series by connection tubes 58. Using multiple chained behavioural sorting chambers can thus further reduce the female contamination rate. Further, the module only requires low power, sufficient to run the microcontroller 60 and suction/fan system 92, making it suitable for use in areas and countries without access to reliable energy sources. For example, a 12V battery source can provide adequate power to run the behavioural sorting module 40.
  • the computer vision tracking and targeting module 70 comprises a targeting chamber 72 comprising an entry aperture 73 and an exit aperture 74, and an air flow system 91 configured to direct air flow through the chamber from the entry aperture 73 to the exit aperture 74.
  • the air flow system 91 may include bladeless fans 92 located in connection tubes 58 prior to the entry aperture and after the exit aperture 74, as well as a blower 94 for removing immobilized mosquitoes from the system.
  • the air flow system may be controlled by the dedicated microprocessor board 60. Bladed fans and other suction or blower systems may be used to create directed controlled air flow through the system. Bladeless fans have an advantage over bladed fans as they reduce the likelihood of damaging mosquitoes flying through the system.
  • the computer vision tracking and targeting module 70 further comprises at least two image sensors 76, 78 which are spaced apart and each has a respective field of view 77, 79 that share an overlapping region including at least a portion of the targeting chamber 72, or a single image sensor 76 and a mirror arrangement 76' configured such that a field of view 77 of the single image sensor is split into two portions where each portion has a respective field of view that share an overlapping region including at least a portion of the targeting chamber 72.
  • the at least two image sensors or single image sensor and mirror arrangement are configured to generate a stream of video images. Each output image from the single image sensor may be split to generate two images equivalent to the two images that would have been collected from the two image sensors 76 78.
  • the image sensors are cameras comprising one or more image sensors and associated optical assemblies, including video cameras capable of capturing 4K video, and may be configured to capture colour or monochrome images.
  • a range of image sensors may be used including CCD and CMOS based image sensors with a range of optical assemblies, filters and sensor sizes (e.g., number of pixels).
  • the image sensor may be an integrated camera system such as a smart phone camera, or digital video camera and may comprise multiple image sensors each with an optical assembly, such that each image sensor has a different field of view, magnification range, and/or colour sensitivity.
  • a 4K Blackfly camera may be used.
  • the two image sensors 76, 78 are spaced apart and their fields of view share an overlapping region including at least a portion of the targeting chamber 72 (i.e., an overlapping field of view).
  • the targeting chamber 72 is a rectangular box, with a first camera (including a first image sensor 76) located in front of (and looking into) the targeting chamber 72 with a first field of view 77, and a second camera (including a second image sensor 78) located above (and looking down into) the targeting chamber 72 with a second field of view 79.
  • the two cameras are orientated substantially orthogonally (e.g., the angle between the two view vectors is within 5° of 90°) so that the intersection of the fields of view 77, 79 includes the entire targeting chamber 72.
  • additional cameras could be added to provide additional fields of view of the targeting chamber 72.
  • the computer vision tracking and targeting module 70 may comprise a single image sensor 76 (e.g. a single camera) and a mirror 76' configured (or located) such that the field of view 77 of the image sensor 76 is split into two portions where each portion has a respective field of view that share an overlapping region including at least a portion of the targeting chamber 72.
  • the second image sensor 78 is replaced with a mirror 76' to provide the single image sensor with a view through the top panel 72c of the chamber 72 in addition to a view of the front panel 72d.
  • the field of view of the camera 77 thus simultaneously views the front panel 72d in a lower portion of the field of view and views the top panel 72c in an upper portion (upper and lower portions could alternatively be referred to as first and second portions).
  • the use of a mirror to replace the second camera 78 allows a reduction in the height of the housing 88 and may reduce the cost of the camera system, and/or allow the use of a single image sensor or camera with better performance characteristics (such as higher quality lenses, resolution, dynamic range, capture and download speeds, etc.) compared to systems with two image sensors or cameras.
  • the computer vision tracking and targeting module 70 also includes a directed energy system comprising a directed energy source 80 and a targeting apparatus 82 for directing the directed energy beam 84 at a target position 86 within the targeting chamber 72 and a computing apparatus 60 comprising at least one processor 61 and at least one memory 62.
  • the at least one memory 62 comprises instructions for configuring the at least one processor 61 to perform a computer vision tracking and targeting method for sex based sorting of arthropods.
  • Figure 2 is a schematic diagram of an embodiment of a computer vision tracking and targeting module 70 for sex based sorting of arthropods implemented in the system shown in Figure 1A.
  • the computer vision tracking and targeting module 70 is configured to capture a stream of video images either from each of at least two image sensors 76, 78 or from the single image sensor 76 (e.g., a single camera) and a mirror 76'.
  • the at least two image sensors are spaced apart and share an overlapping region including at least a portion of a targeting chamber 72, or a single image sensor and a mirror arrangement is configured such that a field of view of the single image sensor is split into two portions where each portion has a respective field of view that share an overlapping region including at least a portion of the targeting chamber, through which multiple arthropods (mosquitoes) move from the entry aperture 73 to the exit aperture 74 under assistance of an air flow system that directs air flow through the chamber from the entry aperture to the exit aperture.
  • Each portion comprises a different set of pixels in the image, and thus the output image from the single image sensor may be split to generate two images equivalent to the two images that would have been collected from the two image sensors 76 78, or the system may be configured to store the respective pixel regions of the image sensor corresponding to the two fields of view such that each region can be processed separately, i.e. as separate images, as would be the case where two image sensors were used.
  • the tracking and targeting method further comprises tracking a three dimensional position of one or more arthropods using the stream of video images from each of the at least two image sensors 105, and determining a sex of an arthropod by identifying the arthropod in at least one image in the stream of video images and using a trained machine learning classifier to estimate the sex of the arthropod 106.
  • the method proceeds to predict a target position at a future time of an arthropod classified as a target sex (e.g. female) using the tracked three dimensional positions of the arthropod 107; and directing a directed energy beam 84 at the target position 86 at the future time to kill or incapacitate (e.g. maim/sterilise/immobilize) the target arthropod (female mosquito).
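One simple way to realise this prediction step, consistent with the linear frame-to-frame movement relationship shown in Figure 5B, is a constant-velocity extrapolation of the last two tracked 3D positions over the total system latency. The 50 ms latency in the usage example is an assumed placeholder, not a figure from the patent.

```python
# Hedged sketch: constant-velocity (linear) extrapolation to the firing time.
import numpy as np

def predict_target(p_prev, p_curr, frame_dt, latency):
    """p_prev, p_curr: successive 3D positions (mm); times in seconds."""
    p_prev, p_curr = np.asarray(p_prev, float), np.asarray(p_curr, float)
    velocity = (p_curr - p_prev) / frame_dt        # mm per second
    return p_curr + velocity * latency             # position to aim the beam at

# e.g. 30 fps frames and an assumed 50 ms camera/compute/galvo delay:
aim_at = predict_target([10.0, 20.0, 30.0], [10.5, 20.2, 30.1], 1 / 30, 0.050)
```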
  • the directed energy source 80 is a laser
  • the directed energy beam 84 is a laser beam
  • the targeting apparatus 82 is a mirror galvanometer system.
  • a laser enclosure 88 is located around the directed energy system 80, cameras 76 78 and targeting chamber 72 to safely enclose and shield the system.
  • the orientation of the mirror galvanometer 82 is controlled by the computing apparatus 60 to direct the laser beam 84 at the predicted target position 86 within the targeting chamber at the future time.
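The mapping from a predicted 3D position to mirror angles is not spelled out above; the following sketch assumes a simplified single-pivot model in which both mirrors rotate about a common point, whereas a real two-mirror galvanometer needs a calibration that accounts for the separation between the mirrors.

```python
# Illustrative target-position-to-galvo-angle conversion (single-pivot model).
import numpy as np

def galvo_angles(target, pivot):
    """Return (pan, tilt) in radians to aim the beam from `pivot` at `target`."""
    dx, dy, dz = np.asarray(target, float) - np.asarray(pivot, float)
    pan = np.arctan2(dx, dz)                   # rotation about the vertical axis
    tilt = np.arctan2(dy, np.hypot(dx, dz))    # elevation above the x-z plane
    return pan, tilt
```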
  • other directed energy sources such as radio frequency sources including microwave and terahertz sources, and even X-ray sources may be used.
  • the laser is a 15 Watt laser with a 1x3 mm spot beam.
  • higher powered lasers, such as a 35 W laser with a 6x6 mm spot, can also be used.
  • the laser energy can also be controlled using a PWM controller.
  • the sex sorted arthropods are then collected 109 in a collection chamber 98 which is connected to the exit aperture 74 of the targeting chamber 72 via a connecting tube to collect arthropods of the sorted (i.e., non-targeted) sex after traversing the targeting chamber 72.
  • the connecting tube may include a female (or target sex) collection chamber to collect immobilised (killed) mosquitoes hit by the directed energy beam.
  • Figure 2 is a schematic diagram of the computer vision tracking and targeting method for sex based sorting of arthropods implemented by the computing apparatus 60 according to an embodiment.
  • the computer vision tracking and targeting method comprises a three dimensional realtime tracking module 320 and an Artificial Intelligence/Machine Learning Classification module 210 which process the incoming stream of images in parallel.
  • the locations are provided to a targeting module 230.
  • the AI/ML classification module 210 processes a classification image stream and the 3D realtime tracking module processes a tracking image stream.
  • Each of the classification image stream and the tracking image stream is generated from at least two image sensors each in different locations, and thus each stream contains sub-streams from each image sensor, or in the case of a single image sensor and a mirror 76', each of the classification image stream and the tracking image stream is generated from each portion of the image corresponding to the two fields of view and thus each stream contains sub-streams from the two portions of the image sensor.
  • the output of an image sensor may be split to generate a tracking stream and classification stream, or additional image sensors may be used (e.g. four image sensors in total).
  • In one embodiment two image sensors (e.g. cameras) are used and the incoming image stream from each image sensor is split into the two image streams, one of which is passed to the classification image stream and the other to the tracking image stream. That is, a tracking stream of images is generated for each of the image sensors (e.g., cameras) and each of the tracking streams is synchronised. Further, each image is down converted to a predefined lower resolution image size.
  • the incoming stream of images are 4K images (3840x2160 pixels) which are down converted by a factor of approximately 4 to 1024 by 768 pixel images (e.g., XGA resolution) for tracking.
  • a classification image stream is generated for each of the image sensors (or cameras).
  • This second image stream (classification stream of images) comprises the input stream of full frame 4K images from the image sensors/cameras.
  • the tracking of the three dimensional position is performed using the tracking stream of images and determining a sex of an arthropod is performed using the classification stream of (2D) images, and the steps of tracking a three dimensional position and determining a sex of an arthropod are performed in parallel.
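A minimal sketch (assuming OpenCV) of deriving the two parallel streams from a single 4K frame, i.e. a down converted tracking frame alongside the full-resolution classification frame, is:

```python
# Sketch of splitting one 4K camera frame into tracking and classification streams.
import cv2

def split_streams(frame_4k):
    """frame_4k: 3840x2160 image -> (tracking_frame, classification_frame)."""
    # Tracking stream: down convert to 1024x768 (XGA) so motion tracking is fast.
    tracking = cv2.resize(frame_4k, (1024, 768), interpolation=cv2.INTER_AREA)
    # Classification stream: keep full resolution for the sex classifier.
    return tracking, frame_4k
```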
  • in some embodiments two image sensors are used and the output image stream of each image sensor is split to create the two image streams (classification and tracking) for that location.
  • additional image sensors are used to generate independent image streams to avoid splitting of output image streams.
  • four image sensors are used and are divided into two sets of two image sensors. The first set is used only to generate the tracking image stream and the second set is used only for generating the classification image stream.
  • image sensors are paired such that an image sensor from the first set and an image sensor from the second set are co-located, with each pair at a different location.
  • the images used in the tracking stream are of lower resolution than the images used for the classification stream, and thus different types of image sensors can be used for the respective stream.
  • the image sensors used for generating the tracking stream are of lower native resolution, or operate at a lower resolution, than the image sensors used for generating the classification stream.
  • some of the image sensors generate single image streams (i.e., for either the classification image stream or the tracking image stream) and other image sensors generate image streams which are split into two image streams (i.e., one for the classification image stream and one for the tracking image stream).
  • each portion of the image, corresponding to the respective fields of view, is split into a classification image stream and a tracking image stream (i.e., a single image sensor generates four sub-streams of images).
  • we generate a first subtracted image captured at a first time by comparing a first image at the first time with at least the previous frame from the tracking stream of images (from the same image sensor).
  • we subtract the previous image from the first image to eliminate stationary pixels (or mosquitoes).
  • several previous images could be combined to use as the image to be subtracted.
  • we generate a second subtracted image captured at the first time by comparing a second image at the first time with at least the previous frame from the second tracking stream of images.
  • FIG. 4A shows a tracking stream of images from a first camera 410 in which previous image 412 is subtracted from current image 413 to generate subtracted image 420.
  • An embodiment of a subtracted image 420 is illustrated.
  • Mosquitoes 422 which have moved since the previous frame stand out in the subtracted image.
  • An object detector algorithm may be run over the subtracted image to generate a bounding box enclosing an arthropod whose position we wish to estimate.
  • the object detector may be a machine learning based object detector trained on images of mosquitoes and may be implemented in a computer vision library such as OpenCV or TensorFlow.
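A minimal OpenCV sketch of this frame differencing and detection stage might look as follows; the threshold, dilation and minimum-area values are assumed placeholders, not parameters disclosed in the patent.

```python
# Sketch: subtract consecutive frames, then box the moving arthropods.
import cv2

def detect_moving_arthropods(prev_gray, curr_gray, min_area=4):
    diff = cv2.absdiff(curr_gray, prev_gray)            # the subtracted image
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)         # join fragmented pixels
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # One bounding box (x, y, w, h) per moving arthropod
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```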
  • Tracking of the three dimensional position of one or more arthropods is based on a ray tracing approach.
  • each position may be an estimate of a centroid of the arthropod within the bounding box where the two rays intersect.
  • the position may also be determined using pattern matching, for example to identify the thorax or abdomen of the mosquito.
  • the three dimensional position is determined based on identifying an intersection point of the first ray and the second ray within the targeting chamber.
  • Figure 3A is an exploded view of the targeting chamber 72 illustrating a lighting system according to an embodiment.
  • the lighting system comprises a pair of translucent walls 302, 304 which are backlit with lighting panels 312, 314.
  • the translucent walls are provided on each side of the targeting chamber 72 distal from each camera to backlight the targeting chamber 72 with respect to the camera.
  • a first lighting panel 312 is located behind the target chamber 72 to backlight the view of the target chamber from the first camera 76 located in front of the target chamber 72.
  • a second lighting panel 314 is located below (and supporting) the target chamber to backlight the view of the target chamber from the second camera 78 located above the target chamber 72.
  • Backlighting may be provided by LED strips mounted to a panel, or PCB mounted LED, a commercial off the shelf light box, or an LCD backlight panel.
  • each lighting panel comprises an array of 600 LED lights generating 4000-5000 lumen of light.
  • Figure 3B shows a pair of images 320, 321 captured by a pair of cameras with approximately orthogonal views of the targeting chamber according to an embodiment. Moving mosquitoes 322, 323 and 324 are identified by bounding boxes.
  • FIG. 3C is a schematic view illustrating the relative geometry of each camera with respect to the targeting chamber and the ray traces used for estimating positions of mosquitoes in the chamber according to an embodiment.
  • First camera 76 has field of view 77 and projects first ray 341.
  • Second camera 78 has a second field of view 79 and projects second ray 342.
  • the first ray 341 and second ray 342 intersect at point 343.
  • the relative geometry of the cameras, their pointing directions and the location and dimensions of the targeting chamber are stored in a memory 63 of a computing apparatus 60 associated with the computer vision tracking and targeting module 70. This allows the three dimensional position of the mosquito to be triangulated and estimated.
  • the cameras are located orthogonally with respect to each other.
  • the two cameras could be in non-orthogonal geometries, including side by side placement, provided the positions, pointing directions and fields of view of each camera are known. Additional cameras could also be provided to give additional rays for use in triangulating the position. In some embodiments there may be no exact intersection, and an intersection point may be determined based on an assumed error radius or error threshold: if any two rays pass within a threshold distance of each other, they are deemed to intersect. In another embodiment each ray may be assigned a volume.
  • the ray may be the axis of a cylinder with a fixed radius, or of a cone in which the ray is the axis and the walls are based on an angular error radius to create a solid angle.
  • intersection volumes may be determined and intersection points may be a centroid or mid-point of the intersection volumes.
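  • A minimal sketch of the ray near-intersection test is given below, assuming each ray is given by a camera origin and direction vector in chamber coordinates; the 5mm distance threshold is an illustrative assumption.

```python
import numpy as np

def ray_intersection(o1, d1, o2, d2, max_dist=5.0):
    """Find the closest points on two rays (origin o, direction d) and
    return their midpoint as the estimated 3D position if the rays pass
    within max_dist of each other, else None (no intersection deemed)."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    n = np.cross(d1, d2)
    denom = np.dot(n, n)
    if denom < 1e-12:                       # rays are (near) parallel
        return None
    t1 = np.dot(np.cross(o2 - o1, d2), n) / denom
    t2 = np.dot(np.cross(o2 - o1, d1), n) / denom
    p1, p2 = o1 + t1 * d1, o2 + t2 * d2     # closest point on each ray
    if np.linalg.norm(p1 - p2) > max_dist:
        return None
    return (p1 + p2) / 2
```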
  • Once a three dimensional position has been estimated, we next determine whether the identified arthropod exists in a tracking database (that is, whether it was identified in a previous image). This may be performed by comparing the position with a previous position or an estimate of a future position of a previously identified arthropod. Image comparison methods may also be used, for example by comparing the image with a previous image. If the identified arthropod exists in the tracking database, we obtain its identifier; otherwise we register a new identifier for the identified arthropod. We also store information used to identify the arthropod for future comparisons/searches. We then store the three dimensional position of the arthropod, and the associated order (e.g., an index), time and the arthropod identifier in the tracking database. A sketch of this bookkeeping is given below.
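  • The following is an illustrative sketch of the tracking-database bookkeeping; nearest-neighbour association by distance to the last known position is a simplification (the description above also allows matching against predicted future positions or by image comparison), and the match radius is an assumption.

```python
import itertools
import numpy as np

class TrackingDatabase:
    """Stores, per arthropod identifier, a history of (order, time, position)."""

    def __init__(self, match_radius=10.0):
        self.tracks = {}                   # id -> list of (order, time, xyz)
        self.match_radius = match_radius   # association threshold (mm, assumed)
        self._ids = itertools.count(1)

    def update(self, position, order, time):
        """Return the identifier of an existing track close to this position,
        or register a new identifier; either way, store the new record."""
        position = np.asarray(position, dtype=float)
        for track_id, history in self.tracks.items():
            if np.linalg.norm(position - history[-1][2]) < self.match_radius:
                history.append((order, time, position))
                return track_id
        new_id = next(self._ids)
        self.tracks[new_id] = [(order, time, position)]
        return new_id
```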
  • Determining a sex of an arthropod is performed by first identifying one or more arthropods in each image in the classification stream of images.
  • We generate a candidate arthropod image by cropping the original image to a standard (i.e., fixed or predetermined) classification image size around an identified arthropod, for example to a bounding box with a predetermined size which contains the identified arthropod. In one embodiment this is a 160x160 pixel image. However, in other embodiments other sizes may be used depending upon the processing capabilities available.
  • Multiple candidate arthropod images may be generated from the original image. Each candidate arthropod image is provided to a trained machine learning classifier to obtain an estimate of the sex of the arthropod.
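  • The crop-and-classify step might look like the following sketch; the model file name is hypothetical, and detections are assumed to be centre points in image coordinates.

```python
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("sex_classifier.h5")  # hypothetical file

def classify_candidates(frame, centres, crop_size=160):
    """Crop a fixed-size box around each detected arthropod and score it
    with the trained classifier (output near 0 = male, near 1 = female)."""
    h, w = frame.shape[:2]
    crops = []
    for (cx, cy) in centres:
        # Clamp the crop window so it stays inside the frame.
        x0 = int(np.clip(cx - crop_size // 2, 0, w - crop_size))
        y0 = int(np.clip(cy - crop_size // 2, 0, h - crop_size))
        crops.append(frame[y0:y0 + crop_size, x0:x0 + crop_size])
    if not crops:
        return np.array([])
    batch = np.stack(crops).astype("float32") / 255.0
    return model.predict(batch)[:, 0]      # one sex score per candidate
```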
  • If the estimated sex matches a target sex (e.g., female), we then obtain the identifier of the arthropod from the tracking database. This can be performed based on the position of the arthropod in the original image from which it was cropped (in image coordinates) by searching the tracking database (generated from the tracking image stream). For example, a record in the tracking database may store the identifier, the three dimensional position, and the original position (in image coordinates) of the mosquito in each image used to generate the three dimensional position. A search of the database could then be performed for a record with original image position(s) that match the current original image position. Alternatively, a coarse 3D position could be estimated using ray tracing and matched to a position in the tracking database.
  • the targeting module 230 continues to track the position of the target mosquito. This may comprise analysing the previous set of positions using a Kalman filter to predict the future position at a future time and then firing the laser at the future time to incapacitate the mosquito.
  • the future time takes into account processing delays and time taken to orient the galvanometer mirror to target the mosquito with the laser pulse. This is further illustrated in Figure 5D.
  • frames are captured every 33ms and processing of images and estimation of the future position requires 13ms.
  • the future time is thus selected to be at most the sum of 13ms and 33ms, such as 46ms after the current frame.
  • this future time is reduced through the use of monochrome cameras which allow faster image processing resulting in an 8ms delay between frames.
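  • As a simplified stand-in for the Kalman-filter prediction described above, the sketch below extrapolates the last two tracked positions at constant velocity to a firing time that covers the processing delay plus one frame interval (the 13ms and 33ms figures from the text).

```python
import numpy as np

FRAME_INTERVAL_MS = 33    # frame period from the text
PROCESSING_MS = 13        # processing and prediction budget from the text

def predict_intercept(positions, times_ms):
    """Linearly extrapolate the track to the firing time (<= 46ms ahead)."""
    p0, p1 = np.asarray(positions[-2]), np.asarray(positions[-1])
    t0, t1 = times_ms[-2], times_ms[-1]
    velocity = (p1 - p0) / (t1 - t0)              # mm per ms
    lead_ms = PROCESSING_MS + FRAME_INTERVAL_MS   # 46ms after the current frame
    return p1 + velocity * lead_ms, t1 + lead_ms  # aim point and firing time
```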
  • each camera observes the full chamber.
  • the cameras are only required to view an overlapping portion which can be targeted with a laser system.
  • Multiple camera and laser systems could be provided, either for use with the same target chamber or as multiple modules provided in series.
  • Camera systems are selected to ensure they have a sufficient depth of field across dimensions of the targeting chamber.
  • a 4K (~8 megapixel) camera operating at 30fps was used.
  • Camera systems that generate a 4K uncompressed image stream are preferred over compressed streams as this ensures video processing is as fast as possible.
  • Testing also indicated that colour carried less than 5% of the information in the images, and thus monochrome cameras may be used.
  • Monochrome image sensors/cameras often have the advantage of providing higher resolution and lower processing time than comparable colour image sensors/cameras.
  • the machine learning classifier is trained using an Xception architecture, although other architecture and algorithms may be used.
  • Figure 4B is a schematic illustration of an Xception architecture and associated input, middle layer and output images according to an embodiment.
  • the training system should be as similar as possible to the usage system. In the case of a change to the system (different dimensions or lighting) the system can be rapidly retrained in a few days. In one embodiment 1024-dimensional vectors were used rather than 2048. However, it is considered that either value would work well. All weights in the model were randomly initialised.
  • Pixels in these images were easily converted to grayscale by combining their channel values using: p_gray = (1/3)·p_red + (1/3)·p_green + (1/3)·p_blue (Equation 1), where p_gray is the pixel value of the grayscale image, and p_red, p_green and p_blue are the pixel values from the three colour channels of the original image.
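  • Assuming the equal-weight reading of Equation 1 above (the coefficients are reconstructed from garbled text), the conversion is a one-line array operation:

```python
import numpy as np

def to_grayscale(rgb):
    """Equal-weight grayscale conversion: (p_red + p_green + p_blue) / 3."""
    return np.asarray(rgb, dtype=float).mean(axis=-1)
```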
  • the architecture employed for training the models was based on the Xception model (Chollet, 2017).
  • the outputs of the Xception model were then fed into a layer that performed two-dimensional average pooling, then into a densely connected layer with 1024 nodes and a ReLU activation function, before finally being fed to a single output node with a sigmoid activation function.
  • Training was undertaken using binary cross-entropy loss and the “RMSProp” algorithm.
  • Each model was trained for 24 hours using a single Nvidia Tesla P100 GPU, 16GB RAM and a single CPU.
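  • A sketch of the described model in Keras follows; the 160x160 input size matches the candidate image size above, global average pooling is assumed for the two-dimensional average pooling layer, and random initialisation (weights=None) matches the text.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Xception backbone with randomly initialised weights, no classification top.
backbone = tf.keras.applications.Xception(
    include_top=False, weights=None, input_shape=(160, 160, 3))

model = models.Sequential([
    backbone,
    layers.GlobalAveragePooling2D(),         # two-dimensional average pooling
    layers.Dense(1024, activation="relu"),   # densely connected layer, 1024 nodes
    layers.Dense(1, activation="sigmoid"),   # single output node (0=male, 1=female)
])

model.compile(optimizer=tf.keras.optimizers.RMSprop(),
              loss="binary_crossentropy",    # binary cross-entropy loss
              metrics=["accuracy"])
```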
  • Tables 1 - 4 outline the proportions of males and females that would be eliminated by each of the models developed under different classification thresholds. Each row of these tables is computed by applying the classifier to 187,172 female images and 78,640 male images. For each image, each classifier outputs a value between zero and one, where zero corresponds to male and one to female. The threshold column corresponds to where we draw the line in this interval when assigning an image to the male or female categories.
  • The Xception architecture provides a highly accurate classifier of mosquito images extracted from video. Models set at a low threshold, such as 0.01, lead to a very low female contamination probability and also a very low proportion of males destroyed. It is also evident that there is much greater classification accuracy from models trained using a top mounted camera compared to a side mounted camera. The use of colour images and a top mounted camera resulted in a female contamination rate of 2.67 per 100,000 females, whilst for the side mounted camera with colour images the contamination rate was 4.48 per 1,000 females. A sketch of how such rates are computed from classifier scores is given below.
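  • Given arrays of classifier scores for labelled male and female images, the rows of such tables can be computed as in this sketch (array and function names are illustrative); images scoring above the threshold are classed as female and eliminated.

```python
import numpy as np

def rates_at(threshold, female_scores, male_scores):
    """Female contamination (females wrongly passed as male) and the
    proportion of males destroyed (males wrongly classed as female)."""
    contamination = np.mean(np.asarray(female_scores) <= threshold)
    males_destroyed = np.mean(np.asarray(male_scores) > threshold)
    return contamination, males_destroyed
```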
  • the first measurement is the real position of the mosquito in three-dimensional space in the laser sorting chamber. As discussed above this was performed by detecting the mosquito centroids from the difference of sequential frames. The centroids of the same mosquito from each camera were then matched up using the ray-tracing method discussed above. With the correct matchup of the two centroids, the actual mosquito 3D location is calculated.
  • This 3D location of the mosquito is then matched to the database where locations of each mosquito were identified in the past.
  • the prediction of a mosquito’s future location is where the laser is targeted. It is converted to X and Y voltage for the galvanometric controller, which turns the two-axis mirror and redirects the laser beam to intercept the mosquito.
  • the targeting error presented here was calculated by measuring the difference in the next 3D position of the mosquito with that predicted by the laser targeting point.
  • Figure 5A is a visualization of prediction error rates relative to mosquito movement within the target chamber, where darker gray colours (darker red colours in the drawing in the original priority application) represent an increase in error (inaccuracy) relative to the velocity of mosquito movement, measured as the difference in mm between two video frames captured 33ms apart, according to an embodiment.
  • Each point in the flight path is assigned a colour corresponding to the error according to the legend on the side, with grey corresponding to a zero error and red corresponding to an error of 40mm (i.e., error range from 0-40mm).
  • a second test of the AI/ML tracking system was performed within the system. Males and females were hand sorted and released into the behavioural sorting box, where males attracted to a lure are then tracked and classified by the computer vision and neural network systems in real-time.
  • a full test run of an embodiment of the system was performed using adult mosquitoes.
  • 1,227 males and 14 females were placed into the collection (emergence) chamber 30 to be sorted by the system.
  • a total of 916 males (75%) and 4 females (30%) passed through the behavioural sorting system 50 within 180 minutes and into the laser targeting area 40. All four females that made it through the behavioural sorting system were identified and immobilized by the laser (i.e., a 100% success rate).
  • Embodiments of the system enable efficient sorting of male from female mosquitoes (or arthropods) by using a series of behavioural and physical characteristics for identification and classification.
  • the system is modular and comprises a sieve module to remove >95% of females from the cohort, and a behavioural sorting module that attracts males to a “swarm marker” and plays a female wingbeat tone through a speaker to act as an audio lure, coordinated with an air flow system, such as a bladeless fan, to suck/pull males into the next sorting cage.
  • the audio lure takes advantage of a difference in the wingbeat frequency of males and females.
  • the exact frequency (or frequency range) for mosquitoes depends upon the species as well as environmental factors such as temperature and humidity but is typically in the range of 100Hz to 1000Hz.
  • the specific frequency can thus be selected based on specific species and environmental conditions.
  • a female wing beat tone of 480Hz is used.
  • This module achieved accuracies of around 99%.
  • other lures including visual lures, chemical attractants and blood feeding pads may be used.
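  • A sketch of generating the 480Hz female wingbeat tone used by the audio lure is given below; the library choice, amplitude and duration are assumptions.

```python
import numpy as np
import sounddevice as sd

def play_wingbeat_tone(freq_hz=480.0, seconds=5.0, rate=44100):
    """Play a pure sine tone at the female wingbeat frequency."""
    t = np.linspace(0.0, seconds, int(rate * seconds), endpoint=False)
    tone = 0.5 * np.sin(2.0 * np.pi * freq_hz * t)   # pure sine lure
    sd.play(tone.astype(np.float32), rate)
    sd.wait()
```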
  • the individual modules provide many advantageous features and may be supplied as standalone units, or systems may be provided as various combinations of the modules described above, including the embodiment shown in Figure 1 which both rears arthropods and then sex sorts the emergent arthropods.
  • the system may comprise a sieving module and a behavioural sorting module comprising multiple behavioural sorting chambers arranged in sequence with an air flow system. In this embodiment each sorting chamber multiplicatively reduces the proportion of the target sex.
  • the system may comprise a behavioural sorting module 40 and a computer vision based tracking and targeting module 70 which, when combined, significantly improve performance compared with embodiments of the system having only one of the two modules.
  • the system 1 comprises an integrated sex sorting apparatus comprising the behavioural sorting module 40, computer vision based tracking and targeting module 70, including the computing apparatus 60 and directed energy module 80, and the final collection module 98 with integrated air flow system 91.
  • a sex sorting apparatus is an integrated apparatus as illustrated in Figures 8A and 8B.
  • the behavioural sorting module 40, and the final collection module 98 are both connected to a housing 88 containing the computer vision based tracking and targeting module 70 (including the computing apparatus 60 and directed energy module 80) via an integrated air flow system comprised of tubing and fans 92.
  • a container containing arthropods is connected to the behavioural sorting module to act as a supply source of arthropods to be sorted.
  • the walls of the housing 88 are shown as laser safe translucent walls to show internal components but could be formed of opaque materials such as sheet metal.
  • the entry aperture 73 and exit aperture 74 of the targeting chamber 72 are formed as tubular interface structures, for example as injection moulded parts, which project through corresponding apertures in the side walls of the housing for connection to the tubing of the air flow system.
  • Tubular piping with an integrated fan 92 connects the exit aperture 52 of the behavioural sorting module 40 to the entry aperture 73 of the targeting chamber.
  • tubular piping connects the exit aperture 74 of the targeting chamber to the final collection module 98.
  • the collection chamber 98 is a removable translucent or transparent chamber which is removably mounted to a base section of the tubular piping, comprising a transparent or translucent section followed by a fan 92 section.
  • a gate arrangement may be placed in the tube near the entrance of the collection chamber. This may be manual or automatic and may be used to direct dead arthropods down into the collection chamber under the influence of the air flow system, or to close the entrance when the collection chamber is removed and emptied.
  • FIG 8B is a schematic diagram of an internal frame of the housing shown in Figure 8A.
  • the internal frame comprises a base portion 88a and a rear portion 88b with the chamber 72 supported at the intersection of the base and rear portions.
  • the frame may be constructed of T-slot railing, including extruded railings, to provide flexibility in mounting locations, although other structural members such as tubes or rods may be used.
  • a first camera 76 is mounted to a front portion of the base portion and a second camera 78 is mounted to an upper portion of the rear portion.
  • the base portion 88a comprises two pairs of legs which support side rails to elevate the first camera 76 above the bottom surface of the housing, on which is mounted the laser 80 and targeting apparatus (e.g., galvanometer 82).
  • the computer apparatus 60 may be mounted to the rear of the rear frame portion 88b.
  • two calibration plates 72a and 72b are slidably mounted respectively to the rails of the rear and base portions.
  • the slidable mountings allow each calibration plate to move from a retracted storage location to a calibration location behind a respective wall of the chamber.
  • First calibration plate 72a slides down to a location behind the rear wall of the chamber 72 such that it is visible in the field of view of the first camera 76.
  • the second calibration plate 72b slides along the base rails to a location under the base wall of the chamber 72 such that it is visible in the field of view of the second camera 78.
  • the slidable mountings may allow manual movement of the calibration plates or may be automated using a belt drive or pneumatic actuators.
  • the calibration plates could be folded out to be located in front of the front and top panels of the targeting chamber.
  • the calibration plates comprise a known calibration pattern 72b' of rectangular grid lines of known (or predetermined) dimensions, such as 10cm squares. Images captured with the calibration plates in place can be used, together with knowledge of the location of the plates and the calibration pattern, to calibrate the three dimensional geometry of the target chamber 72 to assist in determining the three dimensional positions of arthropods detected in images. A sketch of such a calibration step is given below.
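  • Such a calibration step might be sketched as follows; treating the grid as a chessboard-like pattern detectable by OpenCV, and the inner corner count, are assumptions (only the 10cm square size comes from the text).

```python
import cv2
import numpy as np

SQUARE_MM = 100.0   # 10cm grid squares from the text
GRID = (9, 6)       # hypothetical count of inner grid corners

def calibrate(images):
    """Recover camera intrinsics and the per-view pose of the plate."""
    # 3D coordinates of the grid corners on the (planar) calibration plate.
    objp = np.zeros((GRID[0] * GRID[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:GRID[0], 0:GRID[1]].T.reshape(-1, 2) * SQUARE_MM
    obj_pts, img_pts = [], []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, GRID)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    return cv2.calibrateCamera(obj_pts, img_pts, gray.shape[::-1], None, None)
```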
  • FIGs 9A, 9B, and 9C are schematic diagrams showing three embodiments of a modular integrated sex sorting apparatus.
  • the housing is made of sheet metal with a hinged access portion incorporating a laser safe viewing window in the front surface made of a translucent laser safe material.
  • a light is located on a top surface to indicate when the system is in use and/or when the laser is being fired.
  • An internal frame such as illustrated in Figure 8B, may be used to support the cameras, chamber, targeting system and computers.
  • the hinged portion provides access to base portion 88a of the frame which supports and houses the cameras, targeting system and chamber, and side access panels provide access to the rear of the housing which houses the computing apparatus 60 mounted to the rear portion 88b of the frame.
  • FIG. 9B is a schematic diagram of a modular integrated sex sorting apparatus according to another embodiment.
  • a double hinged access door is provided in the front panel to allow access to the cameras, chamber and targeting system, and the computing apparatus 60 is mounted in the base portion with access provided via drawers.
  • Figure 9C is a schematic diagram of a modular integrated sex sorting apparatus according to another embodiment. This system is fully enclosed, hiding the tubing of the air flow system.
  • a rolling access panel is provided in the front and side portion of the housing 88.
  • the computing apparatus 60 is side mounted, located adjacent to the door. Handles may be provided for ease of movement.
  • connections of the air flow tubing to the entry aperture 73 and exit aperture 74 may be removable connections such as by using a clamping arrangement (i.e. a screw adjustable band), a snap fit arrangement (i.e. one part has an annular ridge and the matching other part snap fits over the ridge), or a keyed arrangement (i.e. a key on one component and a matching retaining slot on the other part).
  • a gasket may be used to provide an air seal between the parts.
  • the connection may be a permanent connection which may be formed by gluing or welding the components together.
  • the tubing has a 90mm diameter, although other suitable diameters (e.g., 50, 70, 120 or 150mm) which allow arthropods to move through the system may be used.
  • Vibration damping connections may also be provided in the tubing on one or both sides of a fan module 92.
  • the cameras may be mounted on isolated mounts, for example using rubber or soft spacer elements.
  • a separate air compressor may be used to provide the directed air for the air flow system and, through isolation, reduce vibration in the system.
  • the chamber may be a rectangular chamber formed of flat panels connected at the edges, for example by welding or gluing, or it may be formed as an extruded box.
  • the targeting chamber panels are 103mm by 197mm, although it will be understood that other sizes could be used (larger or smaller).
  • Other geometries could also be used, such as an extruded tube or a chamber with flat front and top panels connected by a curved rear panel. This is illustrated in Figure 10, which shows flat top panel 72c and front panel 72d with a curved rear panel 72e. In this embodiment a curved LED panel 72f located behind the curved rear panel 72e provides light into the chamber 72.
  • arthropods are collected in a container which is connected to the behavioural sorting module 40.
  • the arthropods may be collected by an arthropod collection apparatus.
  • the arthropod collection apparatus may be directly connected to the behavioural sorting module 40 to provide arthropods for sorting, optionally with a controllable barrier, such as door or gate apparatus (manual or automatic), to control when arthropods are allowed to enter the behavioural sorting module 40.
  • the arthropod collection apparatus may collect the arthropods into a container which is then temporarily connected to the behavioural sorting module 40, or the arthropod collection apparatus may collect the arthropods, and these are then transferred to a transport (or transfer) container which is transported to, and then connected to, the behavioural sorting module 40.
  • This allows collection, whether by capture or rearing, to be performed in a location different from the location of the sex sorting apparatus.
  • the arthropod collection apparatus may be an integrated apparatus comprising a rearing module 10, the sieving module 20, and a collection (emergence) container 30.
  • the arthropod collection apparatus may be an arthropod (or insect) trap which collects the arthropods into a collection container.
  • multiple arthropod collection apparatus may be used, including a combination of traps and rearing systems, or multiple rearing systems.
  • the collection (emergence) container 30 may be disconnected from the sieving module and used to transport the arthropods to the behavioural sorting module 40 (where it is connected).
  • the arthropods are reared in batches, and the collection (emergence) container 30 is connected, or a door or gating apparatus is opened, when the arthropods are expected to emerge, or as discussed above, for some predefined time period around emergence (e.g., days 9-12).
  • the door or gate apparatus may be manual or automatically opened and closed for example based on the predefined time period around emergence.
  • the system can also perform real-time tracking of crawling or moving arthropods.
  • The sieving, 2D identification (classification) and 3D tracking systems all take into account morphological differences between arthropods of different sexes.
  • the 3D tracking system is further configured to track behavioural differences of arthropods within the target chamber 72.
  • the dimensions and arrangement of slots in the sieves, including the use of grooved slots may preferentially allow males through the sieve.
  • the image classifier may be trained to recognise the morphological differences between the sexes. The timing of sieving may also be selected to exploit sex related timing differences in when pupae emerge.
  • the behavioural sorting system also takes account of behavioural differences between sexes by providing a swarm marker and an audio lure which is synchronised with an air flow system.
  • the air flow system creates directional flow through the system and can be used in combination with other chemical lures.
  • the use of bladeless fans reduces the risk of harm to mosquitoes passing through the system (thus increasing the fitness of individuals collected in the final chamber 98).
  • the system may be implemented using one or more computing apparatus as described herein.
  • the computing apparatus may comprise one or more processors including multi-core CPUs and Graphical Processing Units (GPUs) operatively connected to one or more memories which store instructions to configure the processor to perform embodiments of the method.
  • the computing system may include, for example, one or more processors (CPUs, GPUs), memories, storage, and input/output devices (e.g., monitor, keyboard, disk drive, network interface, internet connection, etc.).
  • the computing apparatus may include circuitry or other specialized hardware for carrying out some or all aspects of the processes.
  • the computing apparatus may be an all-in-one computer, a desktop computer, a laptop, a tablet, a mobile computing apparatus, a server, a microcontroller, or a microprocessor board, and any associated peripheral devices.
  • the computer apparatus may be a distributed system including server based systems and cloud-based computing systems.
  • the computing apparatus may be a unitary computing or programmable device, or a distributed system or device comprising several components operatively (or functionally) connected via wired or wireless connections.
  • the computing system may be configured as a system that includes one or more devices, each of which is configured to carry out some aspects of the processes either in software, hardware, or some combination thereof.
  • a microcontroller or microprocessor board (e.g., an Arduino board) may be used.
  • a separate image processing desktop incorporating GPUs may be used to perform the image processing, classification, tracking and targeting.
  • a user interface may be provided on another computing apparatus such as a laptop which interfaces with the microcontroller and image processing desktop to allow a user to interact, monitor and configure the system.
  • the user interface may be provided as a web portal or interface allowing a user to remotely interact, monitor and configure the system.
  • data processing may be performed remotely on a server based system including cloud based server systems, and the user interface is configured to communicate with such servers to exchange data and results.
  • An embodiment of a computing apparatus 60 is illustrated in Figure 1A and comprises a central processing unit (CPU) 61, a memory 62, and may include a GPU 63, an output device 64 such as a display apparatus, and/or an input device 65 such as a keyboard, mouse, etc.
  • the display apparatus may be a touch screen which also acts as an input device.
  • the CPU 61 comprises an Input/Output Interface, an Arithmetic and Logic Unit (ALU) and a Control Unit and Program Counter element which is in communication with input and output devices (e.g., input device 65 and display apparatus 64) through the Input/Output Interface.
  • the Input/Output Interface may comprise a network interface and/or communications module for communicating with an equivalent communications module in another device using a predefined communications protocol (e.g., Bluetooth, Zigbee, IEEE 802.15, IEEE 802.11, TCP/IP, UDP, etc.).
  • a graphical processing unit (GPU) 63 may also be included.
  • the display apparatus may comprise a flat screen display (e.g., LCD, LED, plasma, touch screen, etc.), a projector, CRT, etc.
  • the computing device may comprise a single CPU (core) or multiple CPUs (multiple cores), or multiple processors.
  • the computing device may use a parallel processor, a vector processor, or be a distributed computing device.
  • the memory is operatively coupled to the processor(s) and may comprise RAM and ROM components and may be provided within or external to the device.
  • the memory may be used to store the operating system and additional software modules or instructions.
  • the processor(s) may be configured to load and execute the software modules or instructions stored in the memory.
  • the computing system may be an embedded AI system similar to an NVIDIA Jetson.
  • a computer program may be written, for example, in a general-purpose programming language (e.g., Python, Java, C++, C, C#, etc.) or some specialized application-specific language, and may utilise or call software libraries or packages, for example to implement data interfaces (e.g. JSON) or utilise machine learning (e.g. TensorFlow, CUDA).
  • processing may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, or other electronic units designed to perform the functions described herein, or a combination thereof.
  • middleware and computing platforms may be used.
  • the processor module comprises one or more Central Processing Units (CPUs) or Graphical processing units (GPU) configured to perform some of the steps of the methods.
  • a computing apparatus may comprise one or more CPUs and/or GPUs.
  • a CPU may comprise an Input/Output Interface, an Arithmetic and Logic Unit (ALU) and a Control Unit and Program Counter element which is in communication with input and output devices through the Input/Output Interface.
  • the Input/Output Interface may comprise a network interface and/or communications module for communicating with an equivalent communications module in another device using a predefined communications protocol (e.g., Bluetooth, Zigbee, IEEE 802.15, IEEE 802.11, TCP/IP, UDP, etc.).
  • the computing apparatus may comprise a single CPU (core) or multiple CPUs (multiple cores), or multiple processors.
  • the computing apparatus may be a cloud based computing apparatus using GPU clusters, a parallel processor, a vector processor, or be a distributed computing device.
  • Memory is operatively coupled to the processor(s) and may comprise RAM and ROM components and may be provided within or external to the device or processor module.
  • the memory may be used to store an operating system and additional software modules or instructions.
  • the processor(s) may be configured to load and execute the software modules or instructions stored in the memory.
  • Software modules, also known as computer programs, computer codes, or instructions, may contain a number of source code or object code segments or instructions, and may reside in any computer readable medium such as a RAM memory, flash memory, ROM memory, EPROM memory, registers, hard disk, a removable disk, a CD-ROM, a DVD-ROM, a Blu-ray disc, or any other form of computer readable medium.
  • the computer-readable media may comprise non-transitory computer-readable media (e.g., tangible media).
  • computer-readable media may comprise transitory computer-readable media (e.g., a signal). Combinations of the above should also be included within the scope of computer-readable media.
  • the computer readable medium may be integral to the processor.
  • the processor and the computer readable medium may reside in an ASIC or related device.
  • the software codes may be stored in a memory unit and the processor may be configured to execute them.
  • the memory unit may be implemented within the processor or external to the processor, in which case it can be communicatively coupled to the processor via various means as is known in the art.
  • modules and/or other appropriate means for performing the methods and techniques described herein can be downloaded and/or otherwise obtained by a computing device.
  • a device can be coupled to a server to facilitate the transfer of means for performing the methods described herein.
  • various methods described herein can be provided via storage means (e.g., RAM, ROM, a physical storage medium such as a compact disc (CD) or floppy disk, etc.), such that a computing device can obtain the various methods upon coupling or providing the storage means to the device.
  • the methods disclosed herein comprise one or more steps or actions for achieving the described method.
  • the method steps and/or actions may be interchanged with one another without departing from the scope of the claims.
  • the order and/or use of specific steps and/or actions may be modified without departing from the scope of the claims.
  • The term “determining” encompasses a wide variety of actions. For example, “determining” may include calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database or another data structure), ascertaining and the like. Also, “determining” may include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory) and the like. The reference to any prior art in this specification is not, and should not be taken as, an acknowledgement or any form of suggestion that such prior art forms part of the common general knowledge.
  • a single embodiment may, for succinctness and/or to assist in understanding the scope of the disclosure, combine multiple features. It is to be understood that in such a case, these multiple features may be provided separately (in separate embodiments), or in any other suitable combination. Alternatively, where separate features are described in separate embodiments, these separate features may be combined into a single embodiment unless otherwise stated or implied. This also applies to the claims, which can be recombined in any combination; that is, a claim may be amended to include a feature defined in any other claim. Further, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover: a, b, c, a-b, a-c, b-c, and a-b-c.

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Pest Control & Pesticides (AREA)
  • Environmental Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Zoology (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Wood Science & Technology (AREA)
  • Insects & Arthropods (AREA)
  • Animal Behavior & Ethology (AREA)
  • Animal Husbandry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Human Computer Interaction (AREA)
  • Computational Linguistics (AREA)
  • Catching Or Destruction (AREA)

Abstract

Described are a system and method for sorting arthropods by sex, which preferentially collects individuals of a selected sex in a final collection module. The system comprises a behavioural sorting module, a computer vision based tracking and targeting module, and a final collection chamber with an associated air flow system that controls the direction of air flow through the system to provide unidirectional movement. The system may also comprise a rearing module, a sieving module and an emergence collection container which is connected to the behavioural sorting module. The behavioural sorting module exploits behavioural differences between the sexes to preferentially collect the selected individuals. An integrated air flow system controls the direction of air flow through the system to provide unidirectional movement, including through a chamber of the computer vision based tracking and targeting module. The sex of individuals is recognised, and a directed energy beam, such as a laser, is used to kill or sterilise individuals of the non-selected sex (the target sex), while allowing individuals of the selected sex to pass through the chamber and enter the final collection chamber.
PCT/AU2023/050307 2022-04-14 2023-04-14 Système et procédé de tri d'arthropodes selon leur sexe WO2023197042A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2022901010A AU2022901010A0 (en) 2022-04-14 Sex based arthropod sorting system and method
AU2022901010 2022-04-14

Publications (1)

Publication Number Publication Date
WO2023197042A1 true WO2023197042A1 (fr) 2023-10-19

Family

ID=88328485

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2023/050307 WO2023197042A1 (fr) 2022-04-14 2023-04-14 Système et procédé de tri d'arthropodes selon leur sexe

Country Status (1)

Country Link
WO (1) WO2023197042A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102726358A (zh) * 2012-05-31 2012-10-17 湖北天罡投资有限公司 激光灭蚊器
US20200154685A1 (en) * 2017-01-22 2020-05-21 Senecio Ltd. Method for sex sorting of mosquitoes and apparatus therefor
US20180206473A1 (en) * 2017-01-23 2018-07-26 Verily Life Sciences Llc Insect singulator system
US20200281164A1 (en) * 2017-07-06 2020-09-10 Senecio Ltd. Method and apparatus for sex sorting of mosquitoes
US20220053743A1 (en) * 2018-12-17 2022-02-24 Senecio Ltd. System and method for mosquito pupae sorting, counting and packaging

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
CHEN CHING-HSIN, CHIANG ANN-SHYN, TSAI HUNG-YIN: "Three-Dimensional Tracking of Multiple Small Insects by a Single Camera", JOURNAL OF INSECT SCIENCE, vol. 21, no. 6, 1 November 2021 (2021-11-01), XP093102558, DOI: 10.1093/jisesa/ieab079 *
KOHLHOFF KAI J., JAHN THOMAS R., LOMAS DAVID A., DOBSON CHRISTOPHER M., CROWTHER DAMIAN C., VENDRUSCOLO MICHELE: "The iFly tracking system for an automated locomotor and behavioural analysis of Drosophila melanogaster", INTEGRATIVE BIOLOGY, RSC PUBL., CAMBRIDGE, vol. 3, no. 7, 1 January 2011 (2011-01-01), Cambridge , pages 755, XP093102555, ISSN: 1757-9694, DOI: 10.1039/c0ib00149j *
MULLEN EMMA R., RUTSCHMAN PHILLIP, PEGRAM NATHAN, PATT JOSEPH M., ADAMCZYK JOHN J., JOHANSON: "Laser system for identification, tracking, and control of flying insects", OPTICS EXPRESS, vol. 24, no. 11, 30 May 2016 (2016-05-30), pages 11828, XP093102552, DOI: 10.1364/OE.24.011828 *
PODA SERGE B., NIGNAN CHARLES, GNANKINÉ OLIVIER, DABIRÉ ROCH K., DIABATÉ ABDOULAYE, ROUX OLIVIER: "Sex aggregation and species segregation cues in swarming mosquitoes: role of ground visual markers", PARASITES & VECTORS, vol. 12, no. 1, 1 December 2019 (2019-12-01), XP093102554, DOI: 10.1186/s13071-019-3845-5 *
RAKHMATULIN ILDARR: "Raspberry PI for Kill Mosquitoes by Laser", MEDIUM, 9 March 2021 (2021-03-09), XP093102553, Retrieved from the Internet <URL:https://medium.com/nerd-for-tech/raspberry-pi-for-kill-mosquitoes-by-laser-e99334a97d68> [retrieved on 20231116] *
SAVALL JOAN, HO ERIC TATT WEI, HUANG CHENG, MAXEY JESSICA R, SCHNITZER MARK J: "Dexterous robotic manipulation of alert adult Drosophila for high-content experimentation", NATURE METHODS, NATURE PUBLISHING GROUP US, NEW YORK, vol. 12, no. 7, 1 July 2015 (2015-07-01), New York, pages 657 - 660, XP093102560, ISSN: 1548-7091, DOI: 10.1038/nmeth.3410 *

Similar Documents

Publication Publication Date Title
US11785926B2 (en) Method for sex sorting of mosquitoes and apparatus therefor
DK181307B1 (en) System for external fish parasite monitoring in aquaculture
Roosjen et al. Deep learning for automated detection of Drosophila suzukii: potential for UAV‐based monitoring
DK181498B1 (en) System for external fish parasite monitoring in aquaculture
AU2018387736B2 (en) Method and system for external fish parasite monitoring in aquaculture
Liu et al. A review of recent sensing technologies to detect invertebrates on crops
US20150049919A1 (en) apparatus for diagnosis and control of honeybee varroatosis, image processing method and software for recognition of parasite
Combes et al. Linking biomechanics and ecology through predator–prey interactions: flight performance of dragonflies and their prey
Corcoran et al. Sonar jamming in the field: effectiveness and behavior of a unique prey defense
CN106793768A (zh) 光子栅栏
WO2021038561A1 (fr) Système et procédé de tri par sexe d&#39;insectes pré-adultes
Al-Saqer et al. Artificial neural networks based red palm weevil (Rynchophorus Ferrugineous, Olivier) recognition system
EP3132680A1 (fr) Appareil et procédé de séparation d&#39;un insecte et appareil de sélection d&#39;insectes à partir d&#39;un groupe d&#39;insectes
Giakoumoglou et al. White flies and black aphids detection in field vegetable crops using deep learning
de Castro Pereira et al. Detection and classification of whiteflies and development stages on soybean leaves images using an improved deep learning strategy
WO2023197042A1 (fr) Système et procédé de tri d&#39;arthropodes selon leur sexe
Giannetti et al. First use of unmanned aerial vehicles to monitor Halyomorpha halys and recognize it using artificial intelligence
US20210342597A1 (en) Apparatus and method for identifying organisms
DE102019131858A1 (de) System zur automatischen Erfassung und Bestimmung von sich bewegenden Objekten
Janisch et al. Deciphering choreographies of elaborate courtship displays of golden‐collared manakins using markerless motion capture
Hoffmann et al. Quantifying the movement of multiple insects using an optical insect counter
Yuana et al. Mobile sensing in Aedes aegypti larva detection with biological feature extraction
Hoffmann et al. Active smelling in the American cockroach
RU2777572C2 (ru) Система для мониторинга наружного паразита рыбы в аквакультуре
Madokoro et al. Development of long-term night-vision video analyzing system for physical pest control

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23787313

Country of ref document: EP

Kind code of ref document: A1