WO2019126971A1 - Machine vision-based pest monitoring method - Google Patents

Machine vision-based pest monitoring method

Info

Publication number
WO2019126971A1
WO2019126971A1 PCT/CN2017/118423 CN2017118423W WO2019126971A1 WO 2019126971 A1 WO2019126971 A1 WO 2019126971A1 CN 2017118423 W CN2017118423 W CN 2017118423W WO 2019126971 A1 WO2019126971 A1 WO 2019126971A1
Authority
WO
WIPO (PCT)
Prior art keywords
pest
pests
image
machine vision
area
Prior art date
Application number
PCT/CN2017/118423
Other languages
English (en)
French (fr)
Inventor
唐宇
骆少明
钟震宇
雷欢
侯超钧
庄家俊
黄伟锋
陈再励
林进添
朱立学
Original Assignee
仲恺农业工程学院
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 仲恺农业工程学院 filed Critical 仲恺农业工程学院
Priority to PCT/CN2017/118423 priority Critical patent/WO2019126971A1/zh
Priority to US16/637,480 priority patent/US10729117B2/en
Publication of WO2019126971A1 publication Critical patent/WO2019126971A1/zh

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01MCATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00Stationary means for catching or killing insects
    • A01M1/02Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones attracting the insects
    • A01M1/026Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones attracting the insects combined with devices for monitoring insect presence, e.g. termites
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01MCATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00Stationary means for catching or killing insects
    • A01M1/02Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones attracting the insects
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01MCATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00Stationary means for catching or killing insects
    • A01M1/02Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones attracting the insects
    • A01M1/04Attracting insects by using illumination or colours
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01MCATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M23/00Traps for animals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/217Validation; Performance evaluation; Active pattern learning techniques
    • G06F18/2193Validation; Performance evaluation; Active pattern learning techniques based on specific statistical tests
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/255Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Definitions

  • The invention relates to the field of pest monitoring, and in particular to a machine vision-based pest monitoring method.
  • At present, the occurrence of pest damage caused by the citrus psyllid is mainly monitored by manual visual inspection, and the degree of pest occurrence is likewise predicted manually, which is not only time-consuming and labor-intensive, but the activity and migration of adult insects also affect the accuracy of the monitoring data.
  • Other traditional pest monitoring methods, such as monitoring with traps, are still unsatisfactory in accuracy and timeliness, provide poor guidance for orchard pest control, and incur high control costs with poor results.
  • The present invention provides a machine vision-based pest monitoring method that can monitor pests in real time and predict the degree of pest occurrence.
  • A machine vision-based pest monitoring method, the steps of which include:
  • the regions of the image where the identified pests are located are extracted as a plurality of suspected pest images, and the recognition accuracy of each suspected pest image is determined;
  • the pest prediction level is calculated from the number of pests and the recognition accuracy of each suspected pest image.
  • By having the image acquisition device automatically capture pest images facing the insect trapping device, the invention eliminates the time- and labor-consuming drawbacks of manual visual inspection and also enables real-time monitoring of pests. Calculating the pest prediction level by combining the number of pests with the recognition accuracy of each suspected pest image is more accurate than the prior art, in which the prediction level is calculated from the number of pests alone; the resulting prediction level is more meaningful and provides stronger guidance for pest control.
  • A statistical analysis model is established in advance, and the pest prediction level is calculated by using the statistical analysis model to combine the number of pests with the recognition accuracy of each suspected pest image.
  • Because the statistical analysis model is obtained through training, it can fit the correspondence between the pest prediction level and both the number of pests and the recognition accuracy of each suspected pest image, so that the final prediction level is more targeted and more useful for guiding pest control.
  • where n is the number of pests, allow_max is the pest-count threshold, and p_i is the recognition accuracy of the i-th suspected pest image.
  • When the number of pests does not reach the pest-count threshold, the pest prediction level H(n) is zero, i.e. no pest outbreak has occurred; when the number of pests is greater than or equal to the threshold, the recognition accuracies of the suspected pest images are accumulated, taking each suspected pest image and its likelihood into account, which helps obtain a more scientifically grounded prediction level H(n) and improves its value for guiding pest control.
  • The insect trapping device comprises a box and a trap lamp installed in the box; the box is a polyhedron with at least one open side; and the image acquisition device is arranged to capture images facing the open side of the box.
  • The box concentrates the light of the trap lamp so that the images captured by the image acquisition device are clear, which facilitates subsequent recognition of pests in the image, improves recognition accuracy, further improves the practicality of the method, enhances prediction accuracy, and makes it easier to control pests in time.
  • The opening of the box facing the image acquisition device is covered with a light-transmissive film.
  • The light-transmissive film makes the light received by the image acquisition device more uniform and soft, improving image quality, facilitating recognition of the pests in the image, improving recognition accuracy, further improving the practicality of the method, enhancing prediction accuracy, and making it easier to control pests in time.
  • The step of identifying pests in the captured image is specifically: identifying regions of the captured image that block the light of the trap lamp, determining whether the geometric features of each region match the shape of a pest, and if so, identifying the corresponding region as a pest.
  • Whether a region matches the shape of a pest is judged at least from its area and perimeter; combining the three important features of area, perimeter, and the ratio between them is enough to greatly reduce the false-positive rate and also improves recognition efficiency, which speeds up obtaining the pest prediction level and makes pest control more timely.
  • The area and perimeter are calculated from the pixels in the region.
  • The area is obtained by summing all the pixels in the region, and the perimeter is obtained by counting the pixels on the region boundary; because a region is generally an irregular polygon, this simple pixel accumulation avoids the complex formulas otherwise needed for the area and perimeter of irregular polygons.
  • A pest discrimination model is established in advance, and the recognition accuracy of each suspected pest image is determined by the pest discrimination model. Because the model is obtained through training, it can fit the correspondence between each suspected pest image and its recognition accuracy, so that the resulting recognition accuracies are more targeted and the final pest prediction level is more useful for guiding pest control.
  • The specific steps of establishing the pest discrimination model are: preparing a set of positive samples and a set of negative samples of pest images, where the positive samples are pest images under various conditions and the negative samples are images without pests; the neural network is then trained on the positive and negative sample sets to generate the pest discrimination model.
  • Before the pests in the captured image are identified and counted, the captured image is also subjected to denoising preprocessing.
  • Denoising the image captured by the image acquisition device removes noise, allows the pests in the image to be identified more accurately, and likewise improves the accuracy of the pest prediction level;
  • the recognition accuracy of each suspected pest image is determined by the pest discrimination model, which is obtained through neural-network training.
  • This is more intelligent, avoids the strong subjectivity of manual judgment, and is highly accurate, which also improves the accuracy of the pest prediction level;
  • compared with the prior art, in which the pest prediction level is calculated from the number of pests alone, the present invention establishes a statistical analysis model in advance and uses it to calculate the prediction level by combining the number of pests with the recognition accuracy of each suspected pest image; judging from the statistical analysis model, the number of pests, and the recognition accuracies together greatly reduces judgment error, improves the effectiveness of the pest prediction level, and provides better guidance for pest control;
  • Figure 1 is a block diagram of the method of the present patent.
  • Figure 2 is a schematic illustration of a region of the image that blocks the light of the trap lamp.
  • A machine vision-based pest monitoring method as shown in FIG. 1 includes the steps of: installing an insect trapping device where pests gather, and arranging an image acquisition device to capture images facing the trapping device;
  • the regions of the image where the identified pests are located are extracted as a plurality of suspected pest images, and the recognition accuracy of each suspected pest image is determined;
  • the pest-count threshold may be 3: pests are phototactic (the citrus psyllid, for example, is attracted to light), so if an outbreak occurs the number of identified pests is very likely to exceed 3, even allowing for environmental interference such as fallen leaves and bees in the identified count; therefore, when the number of pests is below the threshold of 3, it can be concluded that no outbreak has occurred and crop growth is unaffected; more preferably, a better threshold can be obtained from repeated trials in areas with different degrees of infestation or from past experience;
  • the pest-related parameters include the pest prediction level, the number of pests, the region where each pest is located in the image, and the corresponding image captured by the image acquisition device.
  • Manual judgment is specifically as follows: a person uses the above pest-related parameters to determine whether the actual number of pests in the image is equal to or greater than the identified number, and if so, corresponding measures are taken according to the warning level.
  • a specific image capture device can be a camera.
  • By having the image acquisition device automatically capture pest images facing the insect trapping device, the invention eliminates the time- and labor-consuming drawbacks of manual visual inspection and also enables real-time monitoring of pests. Calculating the pest prediction level by combining the number of pests with the recognition accuracy of each suspected pest image is more accurate than the prior art, in which the prediction level is calculated from the number of pests alone; the resulting prediction level is more meaningful and provides stronger guidance for pest control.
  • The statistical analysis model is established in advance, and the pest prediction level is calculated by using the statistical analysis model to combine the number of pests with the recognition accuracy of each suspected pest image.
  • the calculation formula of pest prediction level H(n) based on the statistical analysis model is:
  • where n is the number of pests, allow_max is the pest-count threshold, and p_i is the recognition accuracy of the i-th suspected pest image;
  • H(n) is in the range [0, 1].
  • A pest severity grade can be set according to the value of H(n), for example below 0.5 is grade one, 0.5-0.7 is grade two, and 0.7-0.9 is grade three, and the warning is issued according to the grade corresponding to the value of H(n).
  • Because the statistical analysis model is obtained through training, it can fit the correspondence between the pest prediction level and both the number of pests and the recognition accuracy of each suspected pest image, so that the final prediction level is more targeted and more useful for guiding pest control. Alternatively, when the number of pests does not reach the pest-count threshold, the pest prediction level H(n) is zero, i.e. no outbreak has occurred; when the number of pests is greater than or equal to the threshold, the average of the recognition accuracies of all suspected pest images is computed, taking each suspected pest image and its likelihood into account, which helps obtain a more scientifically grounded prediction level H(n) and improves its value for guiding pest control.
  • The insect trapping device comprises a box and a trap lamp installed in the box; the box is a polyhedron, specifically a rectangular cuboid, with at least one open side; the trap lamp uses a white light source with a good backlighting effect; and the box contains a volatile attractant for luring pests.
  • The image acquisition device captures pest images facing the open side of the box.
  • The box concentrates the light of the trap lamp so that the images captured by the image acquisition device are clear, which facilitates subsequent recognition of pests in the image, improves recognition accuracy, further improves the practicality of the method, enhances prediction accuracy, and makes it easier to control pests in time.
  • The side of the box facing the image acquisition device is covered with a light-transmissive film, and the image acquisition device should be placed at a certain distance from the box so that its field of view just covers the film.
  • The light-transmissive film makes the light received by the image acquisition device more uniform and soft, improving image quality, facilitating recognition of the pests in the image, improving recognition accuracy, further improving the practicality of the method, enhancing prediction accuracy, and making it easier to control pests in time.
  • Because the light received by the image acquisition device is more uniform and soft, the background of the captured image is also cleaner and noise can be distinguished effectively, which is why the captured image can be denoised in preprocessing to make it sharper.
  • The step of using the blob algorithm to identify the pests in the image captured by the image acquisition device is specifically: identifying regions of the image that block the light of the trap lamp, determining whether the geometric features of each region match the shape of a pest, and if so, identifying the corresponding region as a pest.
  • Whether a region matches the shape of a pest is judged at least from its area and perimeter. Combining the three important features of area, perimeter, and the ratio between them is enough to greatly reduce the false-positive rate and improve recognition efficiency, which also speeds up obtaining the final pest prediction level and makes pest control more timely. More preferably, in addition to the area and perimeter, the minimum bounding rectangle and the centroid position of each region are also calculated.
  • The minimum bounding rectangle of the region where each pest is located is in fact already calculated in the steps above; the minimum bounding rectangle is used to locate the region of each pest in the image, and the multiple suspected pest images are cropped out accordingly.
  • The area and perimeter are calculated from the pixels in the region.
  • The area is obtained by summing all the pixels in the region, and the perimeter is obtained by counting the pixels on the region boundary; because a region is generally an irregular polygon, this simple pixel accumulation avoids the complex formulas otherwise needed for the area and perimeter of irregular polygons.
  • Let the i-th region be R_i(x, y), and let f(x, y) be the binarized pixel value at pixel (x, y) in the image captured by the image acquisition device; the area S(R_i(x, y)) of the i-th region is then the sum of f(x, y) over all pixels (x, y) in R_i(x, y).
  • The binarized pixel value f(x, y) is obtained in preprocessing: f(x, y) is set to 1 at pixels in the dark regions of the image, i.e. the regions blocking the light of the trap lamp, and to 0 at pixels in the bright regions, so accumulating the values of f(x, y) over R_i(x, y) gives the area of the region R_i(x, y).
  • The perimeter of the i-th region is the number of pixels (x, y) on the region boundary (labeled 5 in FIG. 2);
  • the centroid of the i-th region (labeled 0 in FIG. 2) is (x_0, y_0), which is calculated from the geometric moments of the region;
  • the origin of the above coordinates (x, y) is the top-left vertex of the image, with the X axis pointing horizontally to the right and the Y axis pointing vertically downward; Left, bottom, right, and top therefore correspond to labels 1, 2, 3, and 4 in FIG. 2 respectively, where the left side of the minimum bounding rectangle takes the minimum X coordinate, the bottom side takes the maximum Y coordinate, and the top side takes the minimum Y coordinate.
  • A pest discrimination model is established in advance, and the recognition accuracy of each suspected pest image is determined by the pest discrimination model. Because the model is obtained through training, it can fit the correspondence between each suspected pest image and its recognition accuracy, so that the resulting recognition accuracies are more targeted and the final pest prediction level is more useful for guiding pest control.
  • The specific steps of establishing the pest discrimination model are: preparing a positive sample set and a negative sample set of pest images, where the positive sample set contains pest images under various conditions and the negative sample set contains multiple images without pests;
  • the neural network is then trained on the positive and negative sample sets to generate the pest discrimination model.
  • the neural network is specifically a VGGNet convolutional neural network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Pest Control & Pesticides (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Environmental Sciences (AREA)
  • Zoology (AREA)
  • Wood Science & Technology (AREA)
  • Insects & Arthropods (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Medical Informatics (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Probability & Statistics with Applications (AREA)
  • Catching Or Destruction (AREA)

Abstract

The present invention relates to a machine vision-based pest monitoring method, the steps of which include: installing an insect trapping device where pests gather, and arranging an image acquisition device to capture images facing the trapping device; identifying the pests in the captured image and obtaining the number of pests; if the number of pests is greater than or equal to a preset pest-count threshold, extracting the regions of the image where the identified pests are located as a plurality of suspected pest images, and determining the recognition accuracy of each suspected pest image; and calculating the pest prediction level from the number of pests and the recognition accuracy of each suspected pest image. By having the image acquisition device automatically capture pest images facing the trapping device, the invention eliminates the time- and labor-consuming drawbacks of manual visual inspection and enables real-time monitoring of pests; calculating the pest prediction level by combining the number of pests with the recognition accuracy of each suspected pest image is more accurate, yields more meaningful results, and provides stronger guidance for pest control.

Description

Machine vision-based pest monitoring method
Technical Field
The present invention relates to the field of pest monitoring, and in particular to a machine vision-based pest monitoring method.
Background Art
In recent years, insect pest infestations have become severe in some regions of China and have already caused considerable losses; since insect pests are the main vectors of infestation, controlling them is considered the key to controlling plant diseases and pests. In particular, for the citrus orchards planted over large areas in southern China, Huanglongbing (citrus greening disease) spread by the citrus psyllid has severely disrupted normal orchard operation and greatly reduced citrus yield and quality. At present, in the prevention and control of citrus Huanglongbing, the occurrence pattern of the pest damage caused by the citrus psyllid is mainly observed by manual visual inspection, and the degree of pest occurrence is likewise predicted manually, which is not only labor-intensive and time-consuming, while the activity and migration of adult insects also affect the accuracy of the monitoring data. Other traditional pest monitoring approaches, such as monitoring with traps, are still unsatisfactory in accuracy and timeliness, provide poor guidance for orchard pest control, and incur high control costs with poor results.
Summary of the Invention
To overcome the deficiencies of the prior art, the present invention provides a machine vision-based pest monitoring method that can monitor pests in real time and predict the degree of pest occurrence.
In view of the above technical problems, the solution adopted by this patent is a machine vision-based pest monitoring method, the steps of which include:
installing an insect trapping device where pests gather, and arranging an image acquisition device to capture images facing the trapping device;
identifying the pests in the captured image and obtaining the number of pests;
if the number of pests is greater than or equal to a preset pest-count threshold, extracting the regions of the image where the identified pests are located as a plurality of suspected pest images, and determining the recognition accuracy of each suspected pest image;
calculating the pest prediction level from the number of pests and the recognition accuracy of each suspected pest image.
By having the image acquisition device automatically capture pest images facing the trapping device, the invention eliminates the time- and labor-consuming drawbacks of manual visual inspection and also enables real-time monitoring of pests. Calculating the pest prediction level by combining the number of pests with the recognition accuracy of each suspected pest image is more accurate than the prior art, in which the prediction level is calculated from the number of pests alone; the resulting prediction level is more meaningful and provides stronger guidance for pest control.
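For illustration only, the monitoring loop formed by the four steps above could be organized as in the following Python sketch; every helper passed in (find_pest_regions, crop_region, recognition_accuracy, prediction_level) is a hypothetical placeholder for a step detailed later in this description, and the default threshold of 3 is merely the example value discussed below.

```python
def monitor_once(image, find_pest_regions, crop_region,
                 recognition_accuracy, prediction_level,
                 pest_count_threshold=3):
    """One monitoring cycle over a single captured image.

    All callables are injected placeholders for the steps described in the
    text; this is a sketch of the control flow, not the patented implementation.
    """
    regions = find_pest_regions(image)            # regions blocking the trap-lamp light
    n = len(regions)                              # number of pests
    if n < pest_count_threshold:
        return 0.0                                # below threshold: no outbreak, H(n) = 0
    crops = [crop_region(image, r) for r in regions]        # suspected pest images
    accuracies = [recognition_accuracy(c) for c in crops]   # p_i for each crop
    return prediction_level(n, accuracies)        # combine count and accuracies into H(n)
```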
Further, a statistical analysis model is established in advance, and the pest prediction level is calculated by using the statistical analysis model to combine the number of pests with the recognition accuracy of each suspected pest image.
Because the statistical analysis model is obtained through training, it can fit the correspondence between the pest prediction level and both the number of pests and the recognition accuracy of each suspected pest image, so that the final prediction level is more targeted and more useful for guiding pest control.
Further, the formula for calculating the pest prediction level H(n) based on the statistical analysis model is:
H(n) = 0, if n < allow_max;    H(n) = (1/n)·∑_{i=1}^{n} p_i, if n ≥ allow_max
where n is the number of pests, allow_max is the pest-count threshold, and p_i is the recognition accuracy of the i-th suspected pest image.
When the number of pests does not reach the pest-count threshold, the pest prediction level H(n) is zero, i.e. no pest outbreak has occurred; when the number of pests is greater than or equal to the threshold, the recognition accuracies of the suspected pest images are accumulated, taking each suspected pest image and its likelihood into account, which helps obtain a more scientifically grounded prediction level H(n) and improves its value for guiding pest control.
Further, the insect trapping device comprises a box and a trap lamp installed in the box; the box is a polyhedron with at least one open side; and the image acquisition device is arranged to capture images facing the open side of the box.
The box concentrates the light of the trap lamp so that the images captured by the image acquisition device are clear, which facilitates subsequent recognition of pests in the image, improves recognition accuracy, further improves the practicality of the method, enhances prediction accuracy, and makes it easier to control pests in time.
Further, the opening of the box facing the image acquisition device is covered with a light-transmissive film. The film makes the light received by the image acquisition device more uniform and soft, improving image quality, facilitating recognition of the pests in the image, improving recognition accuracy, further improving the practicality of the method, enhancing prediction accuracy, and making it easier to control pests in time.
Further, the step of identifying pests in the captured image is specifically: identifying regions of the captured image that block the light of the trap lamp, determining whether the geometric features of each region match the shape of a pest, and if so, identifying the corresponding region as a pest. Given the arrangement of the trap lamp, it is only necessary to judge whether the geometric features of the regions blocking the lamp light correspond to pests, which avoids a complex image recognition process, improves recognition efficiency, and ensures that the method runs in real time so that control measures can be taken more quickly.
Further, whether a region matches the shape of a pest is judged at least from its area and perimeter. Combining the three important features of area, perimeter, and the ratio between them is enough to greatly reduce the false-positive rate, improves recognition efficiency, speeds up obtaining the pest prediction level, and makes pest control more timely.
Further, the area and perimeter are calculated from the pixels in the region. The area is obtained by summing all the pixels in the region, and the perimeter is obtained by counting the pixels on the region boundary; because a region is generally an irregular polygon, this simple pixel accumulation avoids the complex formulas otherwise needed for the area and perimeter of irregular polygons.
Further, a pest discrimination model is established in advance, and the recognition accuracy of each suspected pest image is determined by the pest discrimination model. Because the model is obtained through training, it can fit the correspondence between each suspected pest image and its recognition accuracy, so that the resulting recognition accuracies are more targeted and the final pest prediction level is more useful for guiding pest control.
Further, the specific steps of establishing the pest discrimination model are: preparing a set of positive samples and a set of negative samples of pest images, where the positive samples are pest images under various conditions and the negative samples are images without pests; and training a neural network on the positive and negative sample sets to generate the pest discrimination model.
Further, before the pests in the captured image are identified and counted, the captured image is also subjected to denoising preprocessing.
Compared with the prior art, the beneficial effects of this patent are:
(1) Capturing images with the image acquisition device facing the trapping device yields more accurate images and improves the accuracy of the pest prediction level;
(2) Denoising the image captured by the image acquisition device removes noise, allows the pests in the image to be identified more accurately, and likewise improves the accuracy of the pest prediction level;
(3) Given the arrangement of the trap lamp, it is only necessary to judge whether the geometric features of the regions blocking the lamp light correspond to a pest shape, which removes the recognition of irrelevant regions, avoids complex image recognition steps, improves recognition efficiency, and ensures the method runs in real time so that control measures can be taken more quickly; the light-transmissive film softens the trap-lamp light received by the image acquisition device, so the captured images are of higher quality and the pest images obtained are more useful as a reference;
(4) Whether a region matches the shape of a pest is judged at least from its area and perimeter, because combining the three important features of area, perimeter, and the ratio between them is enough to greatly reduce the false-positive rate while also improving recognition efficiency;
(5) The recognition accuracy of each suspected pest image is determined by the pest discrimination model, which is obtained through neural-network training; this is more intelligent, avoids the strong subjectivity of manual judgment, is highly accurate, and likewise improves the accuracy of the pest prediction level;
(6) Compared with the prior art, in which the pest prediction level is calculated from the number of pests alone, the present invention establishes a statistical analysis model in advance and uses it to calculate the prediction level by combining the number of pests with the recognition accuracy of each suspected pest image; judging from the statistical analysis model, the number of pests, and the recognition accuracies together greatly reduces judgment error, improves the effectiveness of the pest prediction level, and provides better guidance for pest control;
(7) The area is obtained by summing all the pixels in a region and the perimeter by counting the pixels on the region boundary, which avoids complex area and perimeter formulas for each irregular polygonal region, improves computational efficiency, allows the pest prediction level to be obtained quickly, and helps people respond to infestations promptly.
Brief Description of the Drawings
Figure 1 is a block diagram of the method of this patent.
Figure 2 is a schematic diagram of a region of the image that blocks the light of the trap lamp in this patent.
Detailed Description of the Embodiments
The patent is further explained below with reference to the drawings. The drawings are for illustration only and should not be construed as limiting this patent; to better illustrate the embodiment, some components in the drawings may be omitted, enlarged, or reduced; and it will be understood by those skilled in the art that certain well-known structures and their descriptions may be omitted from the drawings.
As shown in FIG. 1, a machine vision-based pest monitoring method includes the steps of: installing an insect trapping device where pests gather, and arranging an image acquisition device to capture images facing the trapping device;
performing denoising preprocessing on the captured image, and using a blob algorithm to identify the multiple pests in the image captured by the image acquisition device and obtain the number of pests;
if the number of pests is greater than or equal to a preset pest-count threshold, extracting the regions of the image where the identified pests are located as a plurality of suspected pest images, and determining the recognition accuracy of each suspected pest image; here the pest-count threshold may be 3, because pests are phototactic (the citrus psyllid, for example, is attracted to light), so if an outbreak occurs the number of identified pests is very likely to exceed 3, even allowing for environmental interference such as fallen leaves and bees in the identified count; therefore, when the number of pests is below the threshold of 3, it can be concluded that no outbreak has occurred and crop growth is unaffected; more preferably, a better threshold can be obtained from repeated trials in areas with different degrees of infestation or from past experience;
calculating the pest prediction level from the number of pests and the recognition accuracy of each suspected pest image;
issuing warnings of different levels according to the pest prediction level, and sending the pest-related parameters to a remote terminal for further confirmation and judgment by a human operator.
The pest-related parameters include the pest prediction level, the number of pests, the region where each pest is located in the image, and the corresponding image captured by the image acquisition device. Manual judgment is specifically as follows: a person uses the above pest-related parameters to determine whether the actual number of pests in the image is equal to or greater than the identified number, and if so, corresponding measures are taken according to the warning level.
A specific image acquisition device may be a camera.
By having the image acquisition device automatically capture pest images facing the trapping device, the invention eliminates the time- and labor-consuming drawbacks of manual visual inspection and also enables real-time monitoring of pests. Calculating the pest prediction level by combining the number of pests with the recognition accuracy of each suspected pest image is more accurate than the prior art, in which the prediction level is calculated from the number of pests alone; the resulting prediction level is more meaningful and provides stronger guidance for pest control.
A statistical analysis model is established in advance, and the pest prediction level is calculated by using the statistical analysis model to combine the number of pests with the recognition accuracy of each suspected pest image; the formula for the pest prediction level H(n) based on the statistical analysis model is:
H(n) = 0, if n < allow_max;    H(n) = (1/n)·∑_{i=1}^{n} p_i, if n ≥ allow_max
where n is the number of pests, allow_max is the pest-count threshold, p_i is the recognition accuracy of the i-th suspected pest image, and H(n) takes values in [0, 1]. A pest severity grade can be set according to the value of H(n), for example below 0.5 is grade one, 0.5-0.7 is grade two, and 0.7-0.9 is grade three, and the warning is issued according to the grade corresponding to the value of H(n).
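A small sketch of the calculation just described, written under the assumption, suggested by the [0, 1] range and the averaging mentioned in the next paragraph, that H(n) is the mean of the recognition accuracies once the pest count reaches allow_max; the grade boundaries are the example values 0.5, 0.7, and 0.9 given above.

```python
def pest_prediction_level(accuracies, allow_max=3):
    """H(n): zero below the pest-count threshold, otherwise the mean of the
    recognition accuracies p_i of the suspected pest images (assumed form)."""
    n = len(accuracies)
    if n < allow_max:
        return 0.0
    return sum(accuracies) / n

def warning_grade(h):
    """Map H(n) to the example grades: below 0.5 is grade one, 0.5-0.7 grade
    two, 0.7-0.9 grade three; values above 0.9 are not graded in the text and
    are treated as grade three here."""
    if h < 0.5:
        return 1
    if h < 0.7:
        return 2
    return 3

# Example: four suspected pest images with these recognition accuracies.
h = pest_prediction_level([0.92, 0.80, 0.65, 0.88], allow_max=3)
grade = warning_grade(h)   # h = 0.8125 -> grade three
```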
Because the statistical analysis model is obtained through training, it can fit the correspondence between the pest prediction level and both the number of pests and the recognition accuracy of each suspected pest image, so that the final prediction level is more targeted and more useful for guiding pest control. Alternatively, when the number of pests does not reach the pest-count threshold, the pest prediction level H(n) is zero, i.e. no outbreak has occurred, and when the number of pests is greater than or equal to the threshold, the average of the recognition accuracies of all suspected pest images is computed; taking each suspected pest image and its likelihood into account helps obtain a more scientifically grounded prediction level H(n) and improves its value for guiding pest control.
The insect trapping device comprises a box and a trap lamp installed in the box. The box is a polyhedron, specifically a rectangular cuboid, with at least one open side; the trap lamp uses a white light source with a good backlighting effect; and the box contains a volatile attractant for luring pests, for example a mixed β-caryophyllene and terpinolene volatile for attracting the citrus psyllid. The image acquisition device captures pest images facing the open side of the box. The box concentrates the light of the trap lamp so that the images captured by the image acquisition device are clear, which facilitates subsequent recognition of pests in the image, improves recognition accuracy, further improves the practicality of the method, enhances prediction accuracy, and makes it easier to control pests in time.
The side of the box facing the image acquisition device is covered with a light-transmissive film, and the image acquisition device should be placed at a certain distance from the box so that its field of view just covers the film. The film makes the light received by the image acquisition device more uniform and soft, improves image quality, facilitates recognition of the pests in the image, improves recognition accuracy, further improves the practicality of the method, enhances prediction accuracy, and makes it easier to control pests in time.
In addition, because the light-transmissive film makes the light received by the image acquisition device more uniform and soft, the background of the captured image is also cleaner and noise can be distinguished effectively; it is for this reason that the captured image can be denoised in preprocessing, making it sharper.
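A minimal sketch of the denoising preprocessing mentioned here, assuming the captured image is available as a grayscale NumPy array; the patent does not name a particular filter, so the median filter and 3x3 window below are only one plausible choice.

```python
import numpy as np
from scipy import ndimage

def denoise(gray: np.ndarray, size: int = 3) -> np.ndarray:
    """Suppress isolated noise pixels before blob detection.

    The filter type and window size are assumptions; the text only states
    that the captured image is denoised before the pests are identified.
    """
    return ndimage.median_filter(gray, size=size)
```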
The step of using the blob algorithm to identify the multiple pests in the image captured by the image acquisition device is specifically: identifying the regions of the image that block the light of the trap lamp, determining whether the geometric features of each region match the shape of a pest, and if so, identifying the corresponding region as a pest. Given the arrangement of the trap lamp, it is only necessary to judge whether the geometric features of the regions blocking the lamp light correspond to pests, which avoids a complex image recognition process, improves recognition efficiency, and ensures that the method runs in real time so that control measures can be taken more quickly.
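A minimal sketch of this blob step, assuming a grayscale NumPy image in which the regions blocking the trap-lamp light appear dark; the binarization threshold and the pest-shape bounds are illustrative values only, and SciPy's connected-component labeling stands in for whatever blob implementation is actually used.

```python
import numpy as np
from scipy import ndimage

def find_pest_regions(gray, dark_threshold=80,
                      area_range=(20, 2000), ratio_range=(0.05, 1.0)):
    """Identify dark regions that block the trap-lamp light and keep those
    whose area, perimeter, and area/perimeter**2 ratio look pest-like.
    All numeric bounds are illustrative assumptions."""
    f = (gray < dark_threshold).astype(np.uint8)      # dark pixels -> 1, bright -> 0
    labels, num = ndimage.label(f)                    # connected regions R_i
    regions = []
    for i in range(1, num + 1):
        mask = labels == i
        area = int(mask.sum())                        # S(R_i) = sum of f over the region
        eroded = ndimage.binary_erosion(mask)
        perimeter = int((mask & ~eroded).sum())       # boundary pixel count
        if perimeter == 0:
            continue
        ratio = area / float(perimeter ** 2)          # scale-free shape cue
        if area_range[0] <= area <= area_range[1] and ratio_range[0] <= ratio <= ratio_range[1]:
            regions.append(mask)
    return regions
```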
Whether a region matches the shape of a pest is judged at least from its area and perimeter. Combining the three important features of area, perimeter, and the ratio between them is enough to greatly reduce the false-positive rate and improve recognition efficiency, which likewise speeds up obtaining the final pest prediction level and makes pest control more timely. More preferably, in addition to the area and perimeter, the minimum bounding rectangle and the centroid position of each region are also calculated.
When the regions where the identified pests are located are extracted as suspected pest images, the minimum bounding rectangle of each region has in fact already been calculated in the steps above; the minimum bounding rectangle is used to locate the region of each pest in the image, and the multiple suspected pest images are cropped out accordingly.
The area and perimeter are calculated from the pixels in the region. The area is obtained by summing all the pixels in the region, and the perimeter is obtained by counting the pixels on the region boundary; because a region is generally an irregular polygon, this simple pixel accumulation avoids the complex formulas otherwise needed for the area and perimeter of irregular polygons.
Let the i-th region be R_i(x, y), and let f(x, y) be the binarized pixel value at pixel (x, y) in the image captured by the image acquisition device; the area S(R_i(x, y)) of the i-th region is then:
S(R_i(x, y)) = ∑_{(x,y)∈R_i(x,y)} f(x, y)
The binarized pixel value f(x, y) is obtained in preprocessing. In a specific implementation, f(x, y) is set to 1 at pixels in the dark regions of the image, i.e. the regions blocking the light of the trap lamp, and to 0 at pixels in the bright regions, so accumulating the values of f(x, y) over R_i(x, y) gives the area of the region R_i(x, y).
The perimeter of the i-th region is the number of pixels (x, y) on the region boundary (labeled 5 in FIG. 2); a pixel-level sketch of these computations is given after the bounding-rectangle formulas below;
the centroid of the i-th region (labeled 0 in FIG. 2) is (x_0, y_0), which is calculated as:
x_0 = M_10(R_i(x, y)) / M_00(R_i(x, y)),    y_0 = M_01(R_i(x, y)) / M_00(R_i(x, y))
where the moment M_pq(R_i(x, y)) = ∑_{(x,y)∈R_i(x,y)} f(x, y)·x^p·y^q; for example, in the above formula M_10(R_i(x, y)) = ∑_{(x,y)∈R_i(x,y)} f(x, y)·x^1·y^0, and the other terms are obtained analogously;
the minimum bounding rectangle of the i-th region is calculated as follows:
left_i = min{ x : (x, y) ∈ R_i(x, y) }
bottom_i = max{ y : (x, y) ∈ R_i(x, y) }
right_i = max{ x : (x, y) ∈ R_i(x, y) }
top_i = min{ y : (x, y) ∈ R_i(x, y) }
The origin of the above coordinates (x, y) is the top-left vertex of the image, with the X axis pointing horizontally to the right and the Y axis pointing vertically downward; Left, bottom, right, and top therefore correspond to labels 1, 2, 3, and 4 in FIG. 2 respectively, where the left side of the minimum bounding rectangle takes the minimum X coordinate, the bottom side takes the maximum Y coordinate, and the top side takes the minimum Y coordinate.
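The pixel-level quantities defined above (area, perimeter, centroid, and minimum bounding rectangle) can be written directly as the accumulations described, as in the following sketch over a boolean region mask. The function names and the 4-neighbour boundary test used for the perimeter are assumptions, since the text does not state how boundary pixels are determined; the last function crops the suspected pest image from the bounding rectangle for illustration.

```python
import numpy as np

def region_area(f, region_mask):
    """S(R_i) = sum of the binarized values f(x, y) over the pixels of R_i."""
    return int(f[region_mask].sum())

def region_perimeter(region_mask):
    """Perimeter = number of region pixels on the boundary, here taken as
    pixels with at least one 4-neighbour outside the region (assumption)."""
    padded = np.pad(region_mask, 1, mode="constant", constant_values=False)
    inside = padded[1:-1, 1:-1]
    surrounded = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                  padded[1:-1, :-2] & padded[1:-1, 2:])
    return int((inside & ~surrounded).sum())

def centroid(region_mask):
    """(x0, y0) = (M10/M00, M01/M00); with f = 1 on the region, M_pq reduces
    to the sum of x**p * y**q over the region's pixels."""
    ys, xs = np.nonzero(region_mask)      # row index = y, column index = x
    m00 = xs.size
    return float(xs.sum()) / m00, float(ys.sum()) / m00

def min_bounding_rect(region_mask):
    """(left, top, right, bottom): origin at the image's top-left corner,
    x to the right, y downward, as in the description."""
    ys, xs = np.nonzero(region_mask)
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

def crop_suspected_pest(image, region_mask):
    """Cut the suspected pest image out of the captured image using the
    minimum bounding rectangle of the region."""
    left, top, right, bottom = min_bounding_rect(region_mask)
    return image[top:bottom + 1, left:right + 1]
```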
A pest discrimination model is established in advance, and the recognition accuracy of each suspected pest image is determined by the pest discrimination model. Because the model is obtained through training, it can fit the correspondence between each suspected pest image and its recognition accuracy, so that the resulting recognition accuracies are more targeted and the final pest prediction level is more useful for guiding pest control.
The specific steps of establishing the pest discrimination model are: preparing a positive sample set and a negative sample set of pest images, where the positive sample set contains pest images under various conditions and the negative sample set contains multiple images without pests; and training a neural network on the positive and negative sample sets to generate the pest discrimination model. The neural network is specifically a VGGNet convolutional neural network.
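A minimal training sketch assuming PyTorch and torchvision. The text only specifies that a VGGNet convolutional network is trained on positive (pest) and negative (pest-free) sample sets; the directory layout, input size, optimizer, number of epochs, and the use of the softmax probability of the pest class as the recognition accuracy p_i are all assumptions.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Positive/negative samples arranged as ImageFolder classes, e.g.
# samples/pest/*.jpg and samples/no_pest/*.jpg (hypothetical layout).
tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
loader = DataLoader(datasets.ImageFolder("samples", transform=tfm),
                    batch_size=32, shuffle=True)

model = models.vgg16(weights=None)               # VGG-style CNN, trained from scratch
model.classifier[6] = nn.Linear(4096, 2)         # two classes: pest / no pest
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

model.train()
for epoch in range(10):                          # number of epochs is an assumption
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

def recognition_accuracy(crop_tensor):
    """Interpret the softmax probability of the 'pest' class as p_i (assumption)."""
    model.eval()
    with torch.no_grad():
        probs = torch.softmax(model(crop_tensor.unsqueeze(0)), dim=1)
    return float(probs[0, 1])                    # index of the pest class is assumed
```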
It should be noted that the above are only preferred embodiments of the present invention and are not intended to limit it. Although the invention has been described in detail with reference to the embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments or replace some of their technical features with equivalents; any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (10)

  1. A machine vision-based pest monitoring method, characterized in that the steps include: installing an insect trapping device where pests gather, and arranging an image acquisition device to capture images facing the trapping device;
    identifying the pests in the captured image and obtaining the number of pests;
    if the number of pests is greater than or equal to a preset pest-count threshold, extracting the regions of the image where the identified pests are located as a plurality of suspected pest images, and determining the recognition accuracy of each suspected pest image;
    calculating the pest prediction level from the number of pests and the recognition accuracy of each suspected pest image.
  2. The machine vision-based pest monitoring method according to claim 1, characterized in that a statistical analysis model is established in advance, and the pest prediction level is calculated by using the statistical analysis model to combine the number of pests with the recognition accuracy of each suspected pest image.
  3. The machine vision-based pest monitoring method according to claim 2, characterized in that the formula for calculating the pest prediction level H(n) based on the statistical analysis model is:
    H(n) = 0, if n < allow_max;    H(n) = (1/n)·∑_{i=1}^{n} p_i, if n ≥ allow_max
    where n is the number of pests, allow_max is the pest-count threshold, and p_i is the recognition accuracy of the i-th suspected pest image.
  4. The machine vision-based pest monitoring method according to claim 1, characterized in that the insect trapping device comprises a box and a trap lamp installed in the box, the box is a polyhedron with at least one open side, and the image acquisition device is arranged to capture images facing the open side of the box.
  5. The machine vision-based pest monitoring method according to claim 4, characterized in that the opening of the box facing the image acquisition device is covered with a light-transmissive film.
  6. The machine vision-based pest monitoring method according to claim 1, characterized in that the step of identifying pests in the captured image is specifically: identifying regions of the captured image that block the light of the trap lamp, determining whether the geometric features of each region match the shape of a pest, and if so, identifying the corresponding region as a pest.
  7. The machine vision-based pest monitoring method according to claim 6, characterized in that whether a region matches the shape of a pest is judged at least from its area and perimeter.
  8. The machine vision-based pest monitoring method according to claim 1, characterized in that a pest discrimination model is established in advance, and the recognition accuracy of each suspected pest image is determined by the pest discrimination model.
  9. The machine vision-based pest monitoring method according to claim 8, characterized in that the specific steps of establishing the pest discrimination model are: preparing a set of positive samples and a set of negative samples of pest images, where the positive samples are pest images under various conditions and the negative samples are images without pests; and training a neural network on the positive and negative sample sets to generate the pest discrimination model.
  10. The machine vision-based pest monitoring method according to any one of claims 1 to 9, characterized in that, before the pests in the captured image are identified and counted, the captured image is further subjected to denoising preprocessing.
PCT/CN2017/118423 2017-12-25 2017-12-25 Machine vision-based pest monitoring method WO2019126971A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2017/118423 WO2019126971A1 (zh) 2017-12-25 2017-12-25 Machine vision-based pest monitoring method
US16/637,480 US10729117B2 (en) 2017-12-25 2017-12-25 Pest monitoring method based on machine vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/118423 WO2019126971A1 (zh) 2017-12-25 2017-12-25 Machine vision-based pest monitoring method

Publications (1)

Publication Number Publication Date
WO2019126971A1 true WO2019126971A1 (zh) 2019-07-04

Family

ID=67064330

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/118423 WO2019126971A1 (zh) 2017-12-25 2017-12-25 Machine vision-based pest monitoring method

Country Status (2)

Country Link
US (1) US10729117B2 (zh)
WO (1) WO2019126971A1 (zh)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111062295A (zh) * 2019-12-10 2020-04-24 上海秒针网络科技有限公司 区域定位方法和装置、存储介质
CN111476238A (zh) * 2020-04-29 2020-07-31 中国科学院合肥物质科学研究院 一种基于区域尺度感知技术的害虫图像检测方法
CN113688517A (zh) * 2021-08-20 2021-11-23 浙江大学 一种茶园诱虫板失效时间预测方法及系统
CN116310658A (zh) * 2023-05-17 2023-06-23 中储粮成都储藏研究院有限公司 一种基于球形摄像机建立储粮害虫图像数据集的方法
CN117036090A (zh) * 2023-09-08 2023-11-10 广州市坤盛信息科技有限公司 适配多种物联网设备实现精准林业管理的系统
CN117237820A (zh) * 2023-09-26 2023-12-15 中化现代农业有限公司 害虫危害程度的确定方法、装置、电子设备和存储介质
CN118120721A (zh) * 2024-04-17 2024-06-04 武汉新烽光电股份有限公司 一种野外土栖白蚁危害程度的环境识别方法

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BR112021002477A2 (pt) * 2018-09-21 2021-07-27 Bayer Aktiengesellschaft imagem assistida por sensor
US10957036B2 (en) * 2019-05-17 2021-03-23 Ceres Imaging, Inc. Methods and systems for crop pest management utilizing geospatial images and microclimate data
US20220217962A1 (en) * 2019-05-24 2022-07-14 Anastasiia Romanivna ROMANOVA Mosquito monitoring and counting system
CN115471747B (zh) * 2022-08-30 2023-05-09 广东省农业科学院环境园艺研究所 一种山茶花病虫害和生理病害的ai快速判识方法及应用
US12022820B1 (en) * 2023-10-11 2024-07-02 Selina S Zhang Integrated insect control system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101976350A (zh) * 2010-10-20 2011-02-16 中国农业大学 基于视频分析的储粮害虫检测识别方法及其系统
US20130136312A1 (en) * 2011-11-24 2013-05-30 Shih-Mu TSENG Method and system for recognizing plant diseases and recording medium
CN103246872A (zh) * 2013-04-28 2013-08-14 北京农业智能装备技术研究中心 一种基于计算机视觉技术的广谱虫情自动测报方法
CN105850930A (zh) * 2016-04-23 2016-08-17 上海大学 基于机器视觉的病虫害预警系统和方法
CN107292891A (zh) * 2017-06-20 2017-10-24 华南农业大学 一种基于机器视觉的南方蔬菜重大害虫的检测计数方法

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7496228B2 (en) * 2003-06-13 2009-02-24 Landwehr Val R Method and system for detecting and classifying objects in images, such as insects and other arthropods
US7286056B2 (en) * 2005-03-22 2007-10-23 Lawrence Kates System and method for pest detection
CN104582478B (zh) * 2012-08-24 2017-03-08 国立大学法人香川大学 害虫聚集装置及害虫聚集方法
CN105636436B (zh) * 2014-09-24 2019-06-11 上海星让实业有限公司 一种智能成像系统以及安装有该智能成像系统的捕虫装置
US11241002B2 (en) * 2016-03-22 2022-02-08 Matthew Jay Remote insect monitoring systems and methods
ZA201704685B (en) * 2016-07-12 2019-06-26 Tata Consultancy Services Ltd Systems and methods for pest forecasting using historical pesticide usage information
GB2568002A (en) * 2016-09-08 2019-05-01 Walmart Apollo Llc Systems and methods for identifying pests in crop-containing areas via unmanned vehicles based on crop damage detection
US10796161B2 (en) * 2017-07-14 2020-10-06 Illumitex, Inc. System and method for identifying a number of insects in a horticultural area
US10375947B2 (en) * 2017-10-18 2019-08-13 Verily Life Sciences Llc Insect sensing systems and methods

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101976350A (zh) * 2010-10-20 2011-02-16 中国农业大学 基于视频分析的储粮害虫检测识别方法及其系统
US20130136312A1 (en) * 2011-11-24 2013-05-30 Shih-Mu TSENG Method and system for recognizing plant diseases and recording medium
CN103246872A (zh) * 2013-04-28 2013-08-14 北京农业智能装备技术研究中心 一种基于计算机视觉技术的广谱虫情自动测报方法
CN105850930A (zh) * 2016-04-23 2016-08-17 上海大学 基于机器视觉的病虫害预警系统和方法
CN107292891A (zh) * 2017-06-20 2017-10-24 华南农业大学 一种基于机器视觉的南方蔬菜重大害虫的检测计数方法

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111062295A (zh) * 2019-12-10 2020-04-24 上海秒针网络科技有限公司 区域定位方法和装置、存储介质
CN111062295B (zh) * 2019-12-10 2023-06-13 上海秒针网络科技有限公司 区域定位方法和装置、存储介质
CN111476238A (zh) * 2020-04-29 2020-07-31 中国科学院合肥物质科学研究院 一种基于区域尺度感知技术的害虫图像检测方法
CN111476238B (zh) * 2020-04-29 2023-04-07 中国科学院合肥物质科学研究院 一种基于区域尺度感知技术的害虫图像检测方法
CN113688517A (zh) * 2021-08-20 2021-11-23 浙江大学 一种茶园诱虫板失效时间预测方法及系统
CN113688517B (zh) * 2021-08-20 2023-11-14 浙江大学 一种茶园诱虫板失效时间预测方法及系统
CN116310658A (zh) * 2023-05-17 2023-06-23 中储粮成都储藏研究院有限公司 一种基于球形摄像机建立储粮害虫图像数据集的方法
CN116310658B (zh) * 2023-05-17 2023-08-01 中储粮成都储藏研究院有限公司 一种基于球形摄像机建立储粮害虫图像数据集的方法
CN117036090A (zh) * 2023-09-08 2023-11-10 广州市坤盛信息科技有限公司 适配多种物联网设备实现精准林业管理的系统
CN117036090B (zh) * 2023-09-08 2024-01-26 广州市坤盛信息科技有限公司 适配多种物联网设备实现精准林业管理的系统
CN117237820A (zh) * 2023-09-26 2023-12-15 中化现代农业有限公司 害虫危害程度的确定方法、装置、电子设备和存储介质
CN118120721A (zh) * 2024-04-17 2024-06-04 武汉新烽光电股份有限公司 一种野外土栖白蚁危害程度的环境识别方法

Also Published As

Publication number Publication date
US10729117B2 (en) 2020-08-04
US20200178511A1 (en) 2020-06-11

Similar Documents

Publication Publication Date Title
WO2019126971A1 (zh) 一种基于机器视觉的虫害监测方法
CN108040997B (zh) 一种基于机器视觉的虫害监测方法
CN111275679B (zh) 一种基于图像的太阳能电池缺陷检测系统及方法
CN106128022B (zh) 一种智慧金睛识别暴力动作报警方法
CN102419819B (zh) 人脸图像识别方法和系统
CN107782733A (zh) 金属表面缺陷的图像识别无损检测装置及方法
CN101577812A (zh) 一种岗位监测的方法和系统
CN109298785A (zh) 一种监测设备的人机联控系统及方法
CN111852792B (zh) 一种基于机器视觉的风机叶片缺陷自诊断定位方法
CN112396658A (zh) 一种基于视频的室内人员定位方法及定位系统
CN111401310B (zh) 基于人工智能的厨房卫生安全监督管理方法
CN104156729B (zh) 一种教室人数统计方法
CN108921004A (zh) 安全帽佩戴识别方法、电子设备、存储介质及系统
CN105096305A (zh) 绝缘子状态分析的方法及装置
CN105023272A (zh) 农作物叶子虫害检测方法和系统
CN108010242A (zh) 一种基于视频识别的安防报警方法、系统及存储介质
CN111325133A (zh) 一种基于人工智能识别的影像处理系统
CN110909703A (zh) 一种基于人工智能的明厨亮灶场景下厨师帽的检测方法
CN107133592A (zh) 电力变电站采用红外热成像及可见光成像技术融合的人体目标特征检测算法
CN112434545A (zh) 一种智能场所管理方法及系统
CN105894003A (zh) 一种基于机器视觉的大田果树病害监测预警系统
CN106650735B (zh) 一种led字符自动定位识别方法
CN107977531A (zh) 一种基于图像处理和领域数学模型进行接地电阻软测量的方法
CN116524224A (zh) 一种基于机器视觉的烤后烟叶类型检测方法及系统
CN115410114A (zh) 一种基于多特征的城轨防汛预警方法及系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17935838

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17935838

Country of ref document: EP

Kind code of ref document: A1