CN117274566B - Real-time weeding method based on deep learning and inter-plant weed distribution conditions


Info

Publication number
CN117274566B
CN117274566B (application CN202311239329.9A)
Authority
CN
China
Prior art keywords
weeding
crop
detection frame
minimum detection
weeds
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311239329.9A
Other languages
Chinese (zh)
Other versions
CN117274566A
Inventor
薛惠丹
胡睿
苏文浩
Current Assignee
Beijing University of Technology
China Agricultural University
Original Assignee
Beijing University of Technology
China Agricultural University
Priority date
Filing date
Publication date
Application filed by Beijing University of Technology, China Agricultural University filed Critical Beijing University of Technology
Priority to CN202311239329.9A
Publication of CN117274566A
Application granted
Publication of CN117274566B
Legal status: Active

Classifications

    • G06V 10/22: Image preprocessing by selection of a specific region containing or referencing a pattern; locating or processing of specific regions to guide the detection or recognition
    • A01M 21/02: Apparatus for the destruction of unwanted vegetation, e.g. weeds, by mechanical destruction
    • A01M 21/046: Apparatus for the destruction of unwanted vegetation by electricity
    • G06N 3/0464: Convolutional networks [CNN, ConvNet]
    • G06N 3/08: Learning methods (neural networks)
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06V 10/774: Generating sets of training patterns; bootstrap methods, e.g. bagging or boosting
    • G06V 10/776: Validation; performance evaluation
    • G06V 10/82: Image or video recognition or understanding using neural networks


Abstract

The application provides a real-time weeding method based on deep learning and inter-plant weed distribution conditions, which comprises the following steps: inputting an image training data set into a convolutional neural network for training to obtain a trained model; using the trained model to identify and position images of crop seedlings and weeds in the field acquired by a camera in real time, so as to obtain the center pixel coordinates, width and height of the minimum detection frame of each seedling and the center pixel coordinates of the minimum detection frame of each weed, and solving the four-corner coordinates of the minimum detection frame of the seedling from these results; constructing real-time field single crop row map information; determining a weeding strategy according to the field single crop row map information; and controlling the weeding mode of the mechanical-laser combined weeding device according to the weeding strategy and the field single crop row map information. The application can select a suitable weeding method according to the inter-plant weed conditions, reducing the seedling injury rate while maintaining weeding efficiency.

Description

Real-time weeding method based on deep learning and inter-plant weed distribution conditions
Technical Field
The invention relates to the technical field of intelligent agricultural weeding, in particular to a real-time weeding method based on deep learning and inter-plant weed distribution conditions.
Background
With the rapid development of computer technology, intelligent weeding devices are replacing traditional manual weeding and have become a popular development trend. The main intelligent weeding approaches at present include mechanical weeding, chemical weeding and laser weeding. Chemical weeding comprises two modes, indiscriminate spraying and precisely positioned weeding: indiscriminate spraying has high weeding efficiency but heavily pollutes the environment, while precisely positioned chemical weeding greatly reduces pesticide usage but is unsuitable for situations with high weed density. Laser weeding is likewise a precisely positioned weeding method with little environmental impact, but it is also unsuitable for high weed density. Mechanical weeding is suitable for high weed density, causes no pollution and meets the requirements of sustainable development; however, current mainstream mechanical weeding strategies only identify and avoid crops, and enter the rows periodically for weeding regardless of whether weeds are present, which increases the probability of damaging crops, harms beneficial animals in the rows and disturbs the balance of the microbial ecosystem.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention aims to provide a real-time weeding method based on deep learning and inter-plant weed distribution conditions.
In order to achieve the above object, the present invention provides the following solutions:
A real-time weeding method based on deep learning and inter-plant weed distribution conditions is applied to a mechanical-laser combined weeding device, and comprises the following steps:
adjusting the height of a camera of the mechanical-laser combined weeding device and the focal length parameter of the camera so that the camera can acquire images of two adjacent crops in the same row;
Acquiring internal parameters and external parameters of a camera;
acquiring an image of crop seedlings and an image of weeds in a field by using a camera;
Determining an image training dataset and a verification dataset of seedlings according to the images of crop seedlings and the images of weeds in the field;
inputting the training data set into a convolutional neural network for training to obtain a training model, and checking the training model by using the verification data set to obtain a trained model;
Based on the trained model, identifying and positioning the images of crop seedlings and weeds in the field acquired by the camera in real time, so as to obtain the center pixel coordinates, width and height of the minimum detection frame of the seedling and the center pixel coordinates of the minimum detection frame of the weed;
solving four-corner coordinates of the minimum detection frame of the seedling according to the central pixel coordinates and the width and height of the minimum detection frame of the seedling;
constructing real-time field single crop row map information according to the center coordinates and four-corner coordinates of the minimum detection frame of the seedlings and the center pixel coordinates of the minimum detection frame of the weeds;
determining a weeding strategy according to the field single crop row map information;
And controlling the weeding mode of the mechanical-laser combined weeding device according to the weeding strategy and the field single crop row map information.
Preferably, the calculation formulas for solving the four-corner coordinates of the minimum detection frame of the seedling according to the center pixel coordinates and the width and height of the minimum detection frame of the seedling are:

u_m^l = u_m - w_m/2,  u_m^r = u_m + w_m/2,  v_m^t = v_m - h_m/2,  v_m^b = v_m + h_m/2

wherein u_m^l is the abscissa value shared by the upper-left and lower-left corner coordinates of the minimum detection frame of the m-th crop, u_m^r is the abscissa value shared by the upper-right and lower-right corner coordinates, v_m^t is the ordinate value shared by the upper-left and upper-right corner coordinates, v_m^b is the ordinate value shared by the lower-left and lower-right corner coordinates, w_m is the width of the minimum detection frame of the m-th crop, h_m is its height, and (u_m, v_m) is its center pixel coordinate.
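As a concrete illustration of the formulas above, a minimal sketch (function and key names are illustrative, not from the patent; pixel coordinates follow the convention stated later in the description, origin at the top-left, u right, v down):

```python
def corners_from_box(u_m, v_m, w_m, h_m):
    """Four corner pixel coordinates of the m-th crop's minimum detection
    frame, given its center (u_m, v_m), width w_m and height h_m."""
    u_left, u_right = u_m - w_m / 2, u_m + w_m / 2
    v_top, v_bottom = v_m - h_m / 2, v_m + h_m / 2
    return {
        "top_left": (u_left, v_top),
        "top_right": (u_right, v_top),
        "bottom_left": (u_left, v_bottom),
        "bottom_right": (u_right, v_bottom),
    }
```

For example, a frame centered at (100, 60) with width 40 and height 20 has its upper-left corner at (80, 50) and lower-right corner at (120, 70).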
Preferably, the constructing real-time field single crop row map information according to the center coordinates and four-corner coordinates of the minimum detection frame of the seedling and the center pixel coordinates of the minimum detection frame of the weed comprises:
Dividing a region to be worked into a safe region and a dangerous region according to the center coordinates and the four-corner coordinates of the minimum detection frame of the young seedling;
dividing the safety zone and the danger zone into a left safety zone, a right safety zone, a left danger zone and a right danger zone;
constructing field single crop row map information according to the central pixel coordinates of the minimum weed detection frame based on the pixel coordinates of the left safety zone, the right safety zone, the left danger zone and the right danger zone;
the judgment formula for judging that a pixel belongs to the left safe zone is:

u_m - w_m/2 ≤ u_m^{ls} ≤ u_m,  v_m + h_m/2 < v_m^{ls} < v_{m+1} - h_{m+1}/2

the judgment formula for judging that a pixel belongs to the right safe zone is:

u_m < u_m^{rs} ≤ u_m + w_m/2,  v_m + h_m/2 < v_m^{rs} < v_{m+1} - h_{m+1}/2

the judgment formula for judging that a pixel belongs to the right danger zone is:

u_m < u_m^{rd} ≤ u_m + w_m/2,  v_m - h_m/2 ≤ v_m^{rd} ≤ v_m + h_m/2

the judgment formula for judging that a pixel belongs to the left danger zone is:

u_m - w_m/2 ≤ u_m^{ld} ≤ u_m,  v_m - h_m/2 ≤ v_m^{ld} ≤ v_m + h_m/2

wherein u_m^{ld} and v_m^{ld} are the abscissa and ordinate values of pixel points in the left danger zone of the m-th crop, u_m^{rd} and v_m^{rd} are the abscissa and ordinate values of pixel points in the right danger zone of the m-th crop, u_m^{ls} and v_m^{ls} are the abscissa and ordinate values of pixel points in the left safe zone of the m-th crop, and u_m^{rs} and v_m^{rs} are the abscissa and ordinate values of pixel points in the right safe zone of the m-th crop; (u_m, v_m), w_m and h_m are the center pixel coordinates, width and height of the minimum detection frame of the m-th crop, and (u_{m+1}, v_{m+1}) and h_{m+1} those of the (m+1)-th crop.
Preferably, the determining a weeding strategy according to the field single crop row map information comprises:
judging whether the number of weeds in the area to be worked is equal to zero according to the field single crop row map information to obtain a first judgment result; if the first judgment result is yes, taking no weeding measure; if the first judgment result is no, judging whether the number of weeds is larger than a threshold value to obtain a second judgment result; if the second judgment result is yes, judging that the weed density is high and performing indiscriminate weeding by a mechanical weeding method; if the second judgment result is no, calculating the included angles between the lines connecting adjacently numbered weeds and the vertical direction to obtain a first included angle and a second included angle, wherein the first included angle is the angle between the line connecting the center points of the b-th weed and the (b+1)-th weed and the vertical direction, and the second included angle is the angle between the line connecting the center points of the first weed in the left safe zone and the last weed in the left danger zone and the vertical direction;
calculating the included angle determined by the maximum transverse moving speed of the laser weeding module and the advancing speed of the weeding equipment, to obtain a third included angle;
Judging whether the third included angle is larger than the first included angle or the second included angle, if so, adopting a laser mode to weed, and if not, adopting a mechanical mode to weed.
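The decision logic above can be sketched as follows. This is a simplified reading under our own assumptions (function and parameter names, the use of degrees, and the comparison of the third angle against every weed angle are not stated verbatim in the patent):

```python
import math

def choose_weeding_mode(n_weeds, density_threshold, weed_angles_deg,
                        lateral_speed_max, forward_speed):
    """Return 'none', 'mechanical' or 'laser' following the strategy
    described above. weed_angles_deg holds the first and second included
    angles (lines between adjacent weed centers vs. the vertical)."""
    if n_weeds == 0:
        return "none"                      # first judgment: nothing to weed
    if n_weeds > density_threshold:
        return "mechanical"                # second judgment: high weed density
    # third included angle: how far the laser module can sweep laterally
    # while the rig advances at forward_speed
    third_angle = math.degrees(math.atan2(lateral_speed_max, forward_speed))
    if all(third_angle > a for a in weed_angles_deg):
        return "laser"                     # every weed is reachable in time
    return "mechanical"
```

With equal lateral and forward speeds the third angle is 45°, so weeds whose connecting lines lie within 45° of the travel direction would be handled by laser, and steeper configurations would fall back to the mechanical knife.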
Preferably, the method further comprises:
acquiring the time taken by the trained model to identify crops and weeds, formulating a time delay according to the weeding strategy, and obtaining the system delay time from the two;
based on the weeding strategy, controlling the entry time of the weeding knife or the delay time of laser emission according to the system delay time.
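One plausible way to derive the system delay time described above is as a travel-time budget. This is a sketch under the assumption (ours, not the patent's) that the weeding tool sits a fixed distance behind the camera's field of view:

```python
def system_delay(tool_offset_m, forward_speed, recognition_time):
    """Time to wait between recognizing a weed and firing the actuator
    (knife entry or laser emission): the travel time of the tool to the
    weed position, minus the time already consumed by recognition."""
    delay = tool_offset_m / forward_speed - recognition_time
    return max(delay, 0.0)  # clamp: fire immediately if recognition was slow
```

For instance, a tool 0.5 m behind the camera on a rig moving at 1 m/s, with 0.1 s of recognition latency, would fire 0.4 s after detection.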
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
The application provides a real-time weeding method based on deep learning and inter-plant weed distribution conditions, which is applied to a mechanical-laser combined weeding device and comprises the following steps: adjusting the height of the camera of the mechanical-laser combined weeding device and the focal length parameter of the camera so that the camera can acquire images of two adjacent crops in the same row; acquiring the internal and external parameters of the camera; acquiring images of crop seedlings and weeds in the field with the camera; determining an image training data set and a verification data set of seedlings from these images; inputting the training data set into a convolutional neural network for training to obtain a training model, and checking the training model with the verification data set to obtain a trained model; based on the trained model, identifying and positioning the images of crop seedlings and weeds acquired by the camera in real time, so as to obtain the center pixel coordinates, width and height of the minimum detection frame of the seedling and the center pixel coordinates of the minimum detection frame of the weed; solving the four-corner coordinates of the minimum detection frame of the seedling from its center pixel coordinates, width and height; constructing real-time field single crop row map information from the center coordinates and four-corner coordinates of the minimum detection frame of the seedling and the center pixel coordinates of the minimum detection frame of the weed; determining a weeding strategy according to the field single crop row map information; and controlling the weeding mode of the mechanical-laser combined weeding device according to the weeding strategy and the field single crop row map information. The application can select a suitable weeding method according to the inter-plant weed conditions, reducing the seedling injury rate while maintaining weeding efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings that are needed in the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a real-time weeding method based on deep learning and inter-plant weed distribution conditions provided by an embodiment of the invention;
FIG. 2 is a work flow chart of the mechanical-laser combined weeding device according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a mechanical-laser combined weeding device according to an embodiment of the present invention;
fig. 4 is a schematic diagram of dividing areas based on four-corner coordinates and center point coordinates of crops according to an embodiment of the present invention;
Symbol description:
1. A camera; 2. weeding knives; 3. a laser emitter.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
The embodiment also provides a real-time weeding method based on deep learning and inter-plant weed distribution conditions, which comprises the following steps:
step 100: adjusting the height of the camera and the focal length parameter of the camera so that the camera acquires images of two adjacent crops in the same row;
Step 200: acquiring internal parameters and external parameters of a camera;
Step 300: acquiring an image of crop seedlings and an image of weeds in a field by using a camera;
Step 400: determining an image training dataset and a verification dataset of seedlings according to the images of crop seedlings and the images of weeds in the field;
Step 500: inputting the training data set into a convolutional neural network for training to obtain a training model, and checking the training model by using the verification data set to obtain a trained model;
Step 600: based on the trained model, identifying and positioning the images of crop seedlings and weeds in the field acquired by the camera in real time, so as to obtain the center pixel coordinates, width and height of the minimum detection frame of the seedling and the center pixel coordinates of the minimum detection frame of the weed;
Step 700: solving four-corner coordinates of the minimum detection frame of the seedling according to the central pixel coordinates and the width and height of the minimum detection frame of the seedling;
Step 800: constructing real-time field single crop row map information according to the center coordinates and four-corner coordinates of the minimum detection frame of the seedlings and the center pixel coordinates of the minimum detection frame of the weeds;
step 900: determining a weeding strategy according to the field single crop row map information;
step 1000: and controlling the weeding mode of the mechanical-laser combined weeding device according to the weeding strategy and the field single crop row map information.
The present example also provides a real-time weeding method based on deep learning and inter-plant weed distribution conditions, as shown in fig. 1, applied to a mechanical-laser combined weeding device (as shown in fig. 3), wherein the mechanical-laser combined weeding device comprises: the weeding device comprises a camera 1, an image processing unit, a controller, a mechanical weeding device, a weeding cutter 2 and a laser emitter 3, wherein the image processing unit is respectively in communication connection with the camera 1 and the controller, the controller is in communication connection with the mechanical weeding device, and the weeding cutter 2 and the laser emitter 3 are arranged on the mechanical weeding device; specifically, the real-time weeding method in the example comprises the following steps:
Flow 100: adjusting parameters such as the camera height and focal length so that the camera can capture at most two adjacent crops in the same row;
flow 200: acquiring internal parameters and external parameters of a camera by using a Zhang Zhengyou camera calibration method;
Flow 300: acquiring an image of crop seedlings and an image of weeds in a field;
Flow 400: determining a training data set and a verification data set according to the crop seedling images and the field weed images, sending the training data set into a convolutional neural network for training to obtain a training model, and checking the quality of the training model with the verification data set; the training model is used for detecting and positioning field crops and weeds in real time, obtaining the center pixel coordinates (u_m, v_m) and the width and height (w_m, h_m) of the minimum detection frame of the m-th crop and the center pixel coordinates (u_n, v_n) of the minimum detection frame of the n-th weed; the pixel coordinate system takes the upper-left corner of the picture as the origin, horizontally right as the u axis and vertically downward as the v axis;
the process 500 comprises the following steps: solving four-corner coordinates of the minimum detection frame according to the center coordinates and the width and height of the minimum detection frame;
flow 600: constructing real-time field single crop row map information according to the center coordinates and four-corner coordinates of the minimum detection frame of the crop and the center coordinates of weeds;
Flow 700: determining a weeding strategy according to the field single crop row map information;
Flow 800: and controlling the mechanical-laser combined weeding equipment according to the weeding strategy and the real-time position information of weeds and crops.
Optionally, this embodiment includes eight processes, as shown in fig. 1, specifically as follows:
Sub-scheme 1: placing the mechanical-laser combined weeding equipment in a field, and adjusting each system of the weeding equipment to an optimal state;
sub-process 2: the camera starts shooting crops and weeds in real time;
Sub-flow 3: determining a training data set and a verification data set according to the field crop seedling images and field weed images, sending the training data set into a convolutional neural network for training to obtain a training model, and checking the training model with the verification data set; the training model detects and positions field crops and weeds in real time from the data captured by the camera, obtaining the center pixel coordinates (u_m, v_m) and the width and height (w_m, h_m) of the minimum detection frame of the m-th crop and the center pixel coordinates (u_n, v_n) of the minimum detection frame of the n-th weed;
sub-flow 4: calculating four-corner coordinates of a target detection frame of the crop according to coordinate information of a crop center point and width and height of the detection frame, wherein the coordinate information is returned to the upper computer through target detection;
Sub-process 5: constructing real-time field single-crop row map information according to the center coordinates, the four-corner coordinates and the weed center coordinates of the minimum detection frame of the crops, wherein the coordinates of the map information are actual coordinates in a world coordinate system;
sub-flow 6: determining a weeding strategy according to the field single crop row map information;
sub-flow 7: controlling a mechanical-laser combined weeding device according to the weeding strategy judging result and the real-time position information of weeds and crops;
Sub-flow 8: after the above processes are finished, returning to the crop-identification decision box and repeating until the weeding equipment completes all field weeding tasks.
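Sub-flow 5 states that the map coordinates are actual coordinates in a world coordinate system, which requires converting detected pixel coordinates using the camera parameters obtained in Flow 200. A minimal sketch, under simplifying assumptions of ours (a distortion-free camera looking straight down at a flat ground plane; all names illustrative):

```python
def pixel_to_ground(u, v, fx, fy, cx, cy, cam_height):
    """Back-project pixel (u, v) onto the ground plane.
    fx, fy: focal lengths in pixels; (cx, cy): principal point
    (both from intrinsic calibration); cam_height: camera height above
    the ground in meters. Returns (X, Y) in meters in a ground frame
    centered directly under the camera."""
    X = (u - cx) * cam_height / fx
    Y = (v - cy) * cam_height / fy
    return X, Y
```

In a real system the extrinsic parameters from the Zhang Zhengyou calibration would replace the straight-down assumption with a full rotation and translation.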
Specifically, the calculation formulas for solving the four-corner coordinates of the minimum detection frame of the seedling according to the center pixel coordinates and the width and height of the minimum detection frame of the seedling are:

u_m^l = u_m - w_m/2,  u_m^r = u_m + w_m/2,  v_m^t = v_m - h_m/2,  v_m^b = v_m + h_m/2

wherein u_m^l is the abscissa value shared by the upper-left and lower-left corner coordinates of the minimum detection frame of the m-th crop, u_m^r is the abscissa value shared by the upper-right and lower-right corner coordinates, v_m^t is the ordinate value shared by the upper-left and upper-right corner coordinates, v_m^b is the ordinate value shared by the lower-left and lower-right corner coordinates, w_m is the width of the minimum detection frame of the m-th crop, h_m is its height, and (u_m, v_m) is its center pixel coordinate; (u_m^l, v_m^t), (u_m^r, v_m^t), (u_m^l, v_m^b) and (u_m^r, v_m^b) are respectively the upper-left, upper-right, lower-left and lower-right corner coordinates of the minimum detection frame of the m-th crop.
Specifically, the construction of real-time field single crop row map information according to the center coordinates and four-corner coordinates of the minimum detection frame of the seedling and the center pixel coordinates of the minimum detection frame of the weed comprises the following steps:
Dividing a region to be worked into a safe region and a dangerous region according to the center coordinates and the four-corner coordinates of the minimum detection frame of the young seedling;
dividing the safety zone and the danger zone into a left safety zone, a right safety zone, a left danger zone and a right danger zone;
as shown in fig. 1 to 2, the center coordinates and four-corner coordinates of the minimum detection frame of the crop are obtained from the center coordinates and the width and height, and the method comprises the following steps:
defining the interior of the minimum crop detection frame as the danger zone according to its center and four-corner coordinates; the judgment formula for a coordinate value lying in the danger zone is:

u_m - w_m/2 ≤ u_m^d ≤ u_m + w_m/2,  v_m - h_m/2 ≤ v_m^d ≤ v_m + h_m/2

wherein (u, v) denotes the coordinate value of any pixel point in the image, (u_m^d, v_m^d) denotes the coordinate value of a pixel point in the danger zone of the m-th crop, and m is a positive integer;
defining the safe zone according to the four-corner coordinates of the minimum detection frames of two adjacent crops; the judgment formula for a coordinate value lying in the safe zone is:

v_m + h_m/2 < v^s < v_{m+1} - h_{m+1}/2

wherein (u^s, v^s) denotes the coordinate value of a pixel point in the safe zone between the m-th and (m+1)-th plants, v_{m+1} - h_{m+1}/2 is the ordinate of the top edge of the minimum detection frame of the (m+1)-th crop in the image, and v_m + h_m/2 is less than v_{m+1} - h_{m+1}/2;
the danger zone (safe zone) is further divided into a left zone and a right zone at the center abscissa of the minimum crop detection frame, named the left danger zone (left safe zone) and right danger zone (right safe zone) respectively; the judgment formulas for the left and right danger zones (left and right safe zones) are:

left danger zone: u_m - w_m/2 ≤ u_m^{ld} ≤ u_m;  right danger zone: u_m < u_m^{rd} ≤ u_m + w_m/2
left safe zone: u_m^{ls} ≤ u_m;  right safe zone: u_m^{rs} > u_m

wherein (u_m^{ld}, v_m^{ld}) and (u_m^{rd}, v_m^{rd}) respectively denote the coordinates of pixel points in the left and right danger zones of the m-th crop, (u_m^{ls}, v_m^{ls}) and (u_m^{rs}, v_m^{rs}) respectively denote the coordinates of pixel points in the left and right safe zones of the m-th crop, and (u^s, v^s) denotes the coordinates of pixel points in the safe zone between the m-th and (m+1)-th plants.
The dangerous area is the region in which a weeding knife would be liable to injure the crop while weeding; the region outside it is the safety area;
screening and partitioning all weeds according to the left and right safety areas and the left and right dangerous areas, each weed center point being assigned to the zone whose judgment formula it satisfies;
wherein (u_a^{DL}, v_a^{DL}), (u_a^{DR}, v_a^{DR}), (u_b^{SL}, v_b^{SL}) and (u_b^{SR}, v_b^{SR}) respectively represent the coordinates of the weed center points screened into the left dangerous area, right dangerous area, left safety area and right safety area, where a and b are positive integers;
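The zone partitioning and weed screening described above can be sketched in code. This is an illustrative reconstruction, not the patent's implementation: the box format `(u_c, v_c, w, h)`, the ordering of the crop boxes from the bottom of the image upward (decreasing v), and the zone labels are all assumptions taken from the description.

```python
def crop_box_corners(u, v, w, h):
    """Left/right/top/bottom extents of a minimum detection box from its
    center pixel coordinate (u, v) and its width w and height h."""
    return u - w / 2, u + w / 2, v - h / 2, v + h / 2

def classify_weed(weed, boxes):
    """Assign a weed center (u, v) to 'danger_left', 'danger_right',
    'safe_left' or 'safe_right' of plant index m, or (None, None) if it
    falls outside the working band. `boxes` are (u_c, v_c, w, h) crop
    boxes ordered along the row from the image bottom upward."""
    u, v = weed
    for m, (uc, vc, w, h) in enumerate(boxes):
        left, right, top, bottom = crop_box_corners(uc, vc, w, h)
        # inside the detection box: dangerous area, split at the center abscissa
        if left <= u <= right and top <= v <= bottom:
            return ('danger_left' if u <= uc else 'danger_right', m)
        # band between plant m and plant m+1: safety area
        if m + 1 < len(boxes):
            nxt_bottom = boxes[m + 1][1] + boxes[m + 1][3] / 2  # lower edge of plant m+1
            if left <= u <= right and nxt_bottom < v < top:
                return ('safe_left' if u <= uc else 'safe_right', m)
    return (None, None)
```

A weed lying in neither band is simply ignored, which matches the method's focus on the single crop row under the camera.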
As shown in fig. 4, the pixel coordinate system is converted into a world coordinate system to obtain field single crop row map information; the conversion method comprises the following steps:
Converting the pixel coordinate system (o-uv) in which the weeds (crops) are located into an image coordinate system (o-xy) with the optical center as origin; the conversion formula is as follows:
x = (u − u_0)·d_x,  y = (v − v_0)·d_y
wherein the x-axis of the image coordinate system points horizontally right and the y-axis vertically down; (u_0, v_0) represents the coordinate of the origin of the image coordinate system in the pixel coordinate system; (u, v) represents the pixel coordinate, namely the weed center point coordinates and the four-corner coordinates of the minimum crop detection frame in the invention; d_x, d_y respectively represent the physical length of an image pixel along the two coordinate axes; (x, y) represents the coordinates of (u, v) in the image coordinate system;
then converting the image coordinates of the weeds (crops) into the camera coordinate system (O_C-X_C Y_C Z_C) with the optical center as origin; the conversion formula is as follows:
X_C1 = x·Z_C/f,  Y_C1 = y·Z_C/f
wherein the X_C axis of the camera coordinate system points horizontally right and Y_C vertically up, (X_C1, Y_C1, Z_C1) represents the coordinates of the weeds (crops) in the camera coordinate system, f is the focal length of the camera, and Z_C represents the camera depth information;
In the weeding method, only the horizontal distance relation between the related coordinates of weeds (crops) and the world coordinate system is needed to be known, and the depth relation is not needed to be known, so that the transformation in the Z-axis direction can be omitted; meanwhile, the camera coordinate system and the world coordinate system are kept relatively static through the four-degree-of-freedom camera stabilizing system, so that no angle change exists, and the two coordinate systems have a translation relationship;
Finally, converting the camera coordinate system into the world coordinate system (O-XY) with the midpoint of the mechanical-laser combined weeding equipment as origin; the conversion formula is as follows:
X_1 = X_C1 + T_x,  Y_1 = Y_C1 + T_y
wherein (X_1, Y_1) represents the coordinates of the weeds (crops) in the world coordinate system, and T_x, T_y respectively represent the translation distances between the camera coordinate system axes and the corresponding world coordinate system axes;
In a specific example, T_x = 0 and T_y = 75 cm;
what is ultimately required are the coordinates, in the world coordinate system, of the weed center points in the left and right dangerous areas and the left and right safety areas after the screening and partitioning;
and constructing real-time field single-crop row map information according to the center coordinates, the four-corner coordinates and the weed center coordinates of the minimum detection frame of the crops, and updating in real time along with the field operation of weeding equipment.
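The three-step pixel-to-world conversion above can be sketched as follows. This is a minimal reconstruction under the patent's stated assumptions (camera held static relative to the world frame, pure translation, depth dropped); the intrinsic values and offsets used in the test are placeholders, not the patent's.

```python
def pixel_to_world(u, v, u0, v0, dx, dy, f, Zc, Tx, Ty):
    """Convert a pixel coordinate (u, v) to horizontal world coordinates.
    (u0, v0): pixel coordinate of the optical center; dx, dy: physical
    pixel size; f: focal length; Zc: fixed camera depth; Tx, Ty: camera
    to world translation."""
    # step 1: pixel coordinate system -> image coordinate system
    x = (u - u0) * dx
    y = (v - v0) * dy
    # step 2: image plane -> camera coordinates (pinhole model, depth Zc)
    Xc = x * Zc / f
    Yc = y * Zc / f
    # step 3: camera -> world is a pure translation, since the stabilizer
    # keeps the two frames relatively static; the Z axis is dropped because
    # only horizontal distances are needed
    return Xc + Tx, Yc + Ty
```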
Preferably, the determining a weeding strategy according to the field single crop row map information comprises:
According to the real-time field single crop row map information, a frame of image containing two crop plants is captured;
the weed conditions in the left and right safety areas are respectively judged according to the center coordinates of the weeds screened into each zone in that frame (the left safety area is used below as the example):
Firstly, judging whether the number of weeds is equal to zero;
If the number of weeds is equal to zero, no weeding measure is taken in the row;
if not, judging whether the number of weeds is greater than a threshold value N;
If the number of weeds is greater than the threshold value N, the weed density in the left safety zone is judged too high for the laser weeding method, and the mechanical weeding method is adopted for indiscriminate weeding;
if the number of weeds is smaller than the threshold value N, the included angle between the line connecting adjacently numbered weeds and the vertical direction is calculated; the included angle calculation formula is as follows:
θ_b = arctan( |X_{b+1} − X_b| / |Y_{b+1} − Y_b| )
wherein θ_b represents the included angle between the line connecting the center points of the b-th and (b+1)-th weeds and the vertical direction, (X_b, Y_b) is the world coordinate of the b-th weed center, and θ_0 represents the included angle between the line connecting the center point of the first weed in the left safety area with that of the last weed in the left dangerous area and the vertical direction; the weeds are numbered in order of the Y-direction coordinate of their center points;
In a specific example, N is 10;
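The angle between a weed-to-weed connecting line and the vertical (direction of travel) can be computed as below. This is a sketch of the reconstructed formula above, not the patent's code; using `atan2` is a small robustness choice so that a pure lateral offset (equal Y values) yields 90° instead of a division by zero.

```python
import math

def link_angle(p1, p2):
    """Included angle (radians) between the segment p1 -> p2 and the
    vertical axis, where p1 and p2 are (X, Y) world coordinates of two
    weed centers sorted by Y."""
    dx = abs(p2[0] - p1[0])
    dy = abs(p2[1] - p1[1])
    return math.atan2(dx, dy)  # atan2 handles dy == 0 gracefully
```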
As shown in fig. 4, specifically, the determining a weeding strategy according to the field single crop row map information includes:
judging whether the number of weeds in the area to be worked is equal to zero according to the field single crop row map information to obtain a first judgment result; if the first judgment result is yes, taking no weeding measure; if the first judgment result is no, judging whether the number of weeds is larger than a threshold value to obtain a second judgment result; if the second judgment result is yes, judging that the weed density is high and adopting the mechanical weeding method for indiscriminate weeding; if the second judgment result is no, calculating the included angles between the lines connecting adjacently numbered weeds and the vertical direction to obtain a first included angle and a second included angle; the first included angle is the angle between the line connecting the center points of the b-th and (b+1)-th weeds and the vertical direction, and the second included angle is the angle between the line connecting the center points of the first weed in the left safety area and the last weed in the left dangerous area and the vertical direction;
calculating an included angle between the maximum transverse moving speed of the laser weeding module and the advancing speed of weeding equipment to obtain a third included angle;
judging whether the third included angle is larger than both the first included angle and the second included angle; if so, weeding by the laser method, and if not, weeding by the mechanical method.
Calculating the included angle determined by the maximum lateral movement speed of the laser weeding module and the advancing speed of the weeding equipment; the calculation formula of the included angle is as follows:
α = arctan( v_laser / v_car )
wherein α represents the included angle between the trajectory of the laser weeding module moving laterally at the highest speed and the vertical direction, v_car represents the advancing speed of the weeding vehicle, and v_laser represents the maximum lateral movement speed of the laser weeding module;
In a specific example, v_laser is fixed by the choice of motor, and v_car is set according to the speed of each operation;
if every included angle θ_b between adjacently numbered weeds and the included angle θ_0 between the last danger-zone weed and the first safe-zone weed are all smaller than α, the weed distribution is suitable for laser weeding, so the laser weeding method is adopted in the area;
if any θ_b or θ_0 is larger than α, the weed distribution is not suitable for laser weeding, so the mechanical weeding method is adopted in the area;
weeds in the dangerous areas are always removed by the laser weeding method, so as to minimize damage to the crops during weeding.
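The decision flow for one safety zone can be sketched as below. The capability angle `alpha = atan(v_laser / v_car)` is a reconstruction of the missing formula, not the patent's literal text: the laser spot can track a weed line only while the line's angle to the vertical stays within what the module's maximum lateral speed allows at the current forward speed.

```python
import math

def weeding_strategy(weed_angles, n_weeds, N, v_laser, v_car):
    """Choose 'none', 'mechanical' or 'laser' for one safety zone.
    weed_angles: included angles (radians) of all weed connecting lines
    with the vertical; n_weeds: weed count; N: density threshold;
    v_laser: max lateral speed of the laser module; v_car: forward speed."""
    if n_weeds == 0:
        return 'none'                     # nothing to remove in this row
    if n_weeds > N:
        return 'mechanical'               # too dense for weed-by-weed lasing
    alpha = math.atan(v_laser / v_car)    # steepest trackable line angle
    if all(theta < alpha for theta in weed_angles):
        return 'laser'
    return 'mechanical'
```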
Preferably, the control of the mechanical-laser combined weeding apparatus according to the weeding strategy and the real-time position information of weeds (crops) comprises:
Acquiring the time taken by the trained model to identify crops and weeds and the time taken to formulate the weeding strategy, and obtaining the system delay time Δt from these delays;
according to the result of weeding strategy formulation, controlling the corresponding device to weed:
If the mechanical weeding method is adopted (taking the left safety zone as the example), the times at which the weeding knife enters and exits the row are calculated with a delay-time prediction method based on the crop position information; the calculation formulas are as follows:
t_in1 = (Y_1^{SL} + S)/v_car − Δt,  t_out = (Y_2^{SL} + S)/v_car − Δt,  t_in2 = (Y_3^{SL} + S)/v_car − Δt
wherein Y_1^{SL}, Y_2^{SL} and Y_3^{SL} respectively represent the safety-zone boundary coordinates after transformation into the world coordinate system, S represents the Y-direction distance between the tip of the weeding knife and the world coordinate system origin, t_in1 represents the delay time for the weeding knife to enter the row, t_out represents the delay time for the weeding knife to exit the row, and t_in2 represents the delay time for the weeding knife to enter the second safety zone when two consecutive left safety zones are both judged to use the mechanical weeding method;
in a specific example, S = −60 mm;
When only one safety zone requires the mechanical weeding method, the upper computer, counting from the start of the delay-time calculation, sends an instruction after t_in1 for the motor to rotate so that the weeding knife enters the row, and after t_out sends an instruction for the motor to rotate in reverse so that the weeding knife exits the row. When two consecutive safety zones are both judged to use the mechanical weeding method, the upper computer likewise sends the rotation instruction after t_in1 to bring the knife into the row, the reverse-rotation instruction after t_out to bring it out of the row, and a further rotation instruction after t_in2 to bring the knife into the next safety zone; t_out and t_in2 are repeated in this way until the knife exits for the last time, when only a t_out instruction is issued;
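The knife scheduling can be sketched as below. The delay form `(Y + S)/v_car - dt` is a reconstruction of the missing position-based formula: the travel time for a zone boundary at world coordinate Y to reach the knife tip (offset S along Y), minus the system latency dt already consumed by detection and planning. Function and variable names are illustrative.

```python
def knife_delay(Y_boundary, S, v_car, dt):
    """Seconds to wait before issuing the motor command for this boundary
    (reconstructed delay-time prediction based on position information)."""
    return (Y_boundary + S) / v_car - dt

def schedule_knife(zone_boundaries, S, v_car, dt):
    """Alternating enter/exit delays for one or more consecutive safety
    zones; zone_boundaries = [enter1, exit1, enter2, exit2, ...] are the
    world-Y values of the zone edges in travel order."""
    actions = ('enter', 'exit')
    return [(actions[i % 2], knife_delay(Y, S, v_car, dt))
            for i, Y in enumerate(zone_boundaries)]
```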
If the laser weeding method is adopted (taking the left safety zone as the example), the delay time for the laser emitter to fire is calculated based on the weed center point coordinate information; the calculation formula is as follows:
t_open = (Y_b + S_1)/v_car − Δt
wherein Y_b is the Y coordinate of the b-th weed center in the world coordinate system, S_1 represents the Y-axis distance between the laser emitter and the world coordinate system origin, and t_open represents the delay time (counted from the moment t_open is calculated) after which the laser emitter clears the b-th weed;
the lateral movement distance of the laser emitter is as follows:
ΔX_b^L = X_b − X_laser
wherein ΔX_b^L represents the lateral displacement between the current coordinate of the left laser emitter and the coordinate of the b-th weed, and X_laser represents the current abscissa of the left laser emitter;
in a specific example, S_1 = 112.5 mm;
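The laser scheduling for one weed can be sketched as below. The `(Y + S1)/v_car - dt` delay form is a reconstruction based on the description (travel time of the weed to the emitter line, minus system latency), and the names are illustrative rather than the patent's.

```python
def laser_schedule(weed_xy, x_laser_now, S1, v_car, dt):
    """Return (t_open, delta_x) for one weed: the fire delay and the
    signed lateral slide of the emitter needed to line up on the weed.
    weed_xy: (X_b, Y_b) world coordinate of the weed center;
    x_laser_now: current emitter abscissa; S1: emitter Y offset."""
    X_b, Y_b = weed_xy
    t_open = (Y_b + S1) / v_car - dt   # when to open the laser
    delta_x = X_b - x_laser_now        # lateral displacement to traverse
    return t_open, delta_x
```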
Preferably, the laser emitter is also responsible for removing weeds in the dangerous area; a slope-based reverse exhaustive method is adopted to remove as many weeds as possible in the dangerous area while still joining up with the subsequent laser weeding in the safety zone (taking the left dangerous area as the example);
The judgment mode of the slope-based reverse exhaustion method is as follows:
calculating the slope of the line between the a-th weed and each other weed in the left dangerous area, and excluding those whose included angle with the vertical direction exceeds α; the calculation formula is as follows:
θ_{a,j} = arctan( |X_j − X_a| / |Y_j − Y_a| )
wherein X_a is the abscissa of the last weed of the left dangerous area (the weeds being ordered by ascending Y coordinate) and X_j is the abscissa of a weed other than the a-th weed;
then, taking each weed that meets the condition as a new starting point, the same included angle is calculated for the remaining weeds and those exceeding α are again excluded, using the same formula;
The above operation is repeated in a loop, yielding the path that passes through the greatest number of weeds; the laser emitter then clears the weeds along this path, ensuring that the greatest number of weeds is removed while the crop is damaged least.
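The slope-based reverse exhaustive search can be sketched as a small recursive enumeration. This is an assumed, simplified reading of the description: weeds are chained in order of increasing Y, a weed is reachable only if the line to it makes an angle with the vertical no greater than the capability angle `alpha`, and the longest feasible chain wins.

```python
import math

def reachable(p, q, alpha):
    """True if the segment p -> q stays within the angle limit alpha
    (radians) measured from the vertical."""
    return math.atan2(abs(q[0] - p[0]), abs(q[1] - p[1])) <= alpha

def best_path(start, weeds, alpha):
    """Longest chain of weed centers reachable from `start`, visiting
    weeds only in increasing Y order (the direction of travel)."""
    best = [start]
    for i, w in enumerate(weeds):
        if w[1] > start[1] and reachable(start, w, alpha):
            cand = [start] + best_path(w, weeds[:i] + weeds[i + 1:], alpha)
            if len(cand) > len(best):
                best = cand
    return best
```

The exhaustive recursion is fine for the small weed counts involved (the strategy already caps the laser case at N weeds); a dynamic-programming variant would be the natural next step for larger inputs.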
Specifically, the method further comprises the following steps:
Acquiring the time taken by the training model to identify crops and weeds and the time taken to formulate the weeding strategy, and obtaining the system delay time from these delays;
based on the weeding strategy, controlling the row-entry time of the weeding knife or the delay time of laser emission according to the system delay time.
The beneficial effects of the invention are as follows:
based on deep-learning detection results, the invention provides a weeding method that adapts to the distribution of inter-plant weeds, giving a more reasonable weeding scheme: mechanical weeding is adopted where the weed density is high, and laser weeding where it is low.
In accordance with this weeding method, the invention designs a mechanical-laser combined weeding device that combines the advantages of mechanical and laser weeding, removing weeds to the greatest extent while minimizing crop injury.
In the present specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the others, and identical or similar parts among the embodiments may be cross-referenced.
The principles and embodiments of the present invention are described herein with reference to specific examples; the above description is intended only to assist in understanding the method of the present invention and its core ideas. Meanwhile, those of ordinary skill in the art may, in accordance with the ideas of the present invention, make modifications to the specific embodiments and the scope of application. In view of the foregoing, this description should not be construed as limiting the invention.

Claims (4)

1. A real-time weeding method based on deep learning and inter-plant weed distribution conditions is applied to a mechanical-laser combined weeding device, and is characterized by comprising the following steps:
adjusting the height of a camera of the mechanical-laser combined weeding device and the focal length parameter of the camera so that the camera can acquire images of two adjacent crops in the same row;
Acquiring internal parameters and external parameters of a camera;
acquiring an image of crop seedlings and an image of weeds in a field by using a camera;
Determining an image training dataset and a verification dataset of seedlings according to the images of crop seedlings and the images of weeds in the field;
inputting the training data set into a convolutional neural network for training to obtain a training model, and checking the training model by using the verification data set to obtain a trained model;
Based on a trained model, identifying and positioning the image of the crop seedling and the image of the weed in the field, which are acquired by the camera in real time, so as to obtain the center pixel coordinates of the minimum detection frame of the seedling, the width and the height and the center pixel coordinates of the minimum detection frame of the weed;
solving four-corner coordinates of the minimum detection frame of the seedling according to the central pixel coordinates and the width and height of the minimum detection frame of the seedling;
constructing real-time field single crop row map information according to the center coordinates and four-corner coordinates of the minimum detection frame of the seedlings and the center pixel coordinates of the minimum detection frame of the weeds;
determining a weeding strategy according to the field single crop row map information;
controlling a weeding mode of the mechanical-laser combined weeding device according to the weeding strategy and the field single crop row map information;
The determining the weeding strategy according to the field single crop row map information comprises the following steps:
judging whether the number of weeds in the area to be worked is equal to zero according to the field single crop row map information to obtain a first judgment result; if the first judgment result is yes, taking no weeding measure; if the first judgment result is no, judging whether the number of weeds is larger than a threshold value to obtain a second judgment result; if the second judgment result is yes, judging that the weed density is high and adopting the mechanical weeding method for indiscriminate weeding; if the second judgment result is no, calculating the included angles between the lines connecting adjacently numbered weeds and the vertical direction to obtain a first included angle and a second included angle; the first included angle is the angle between the line connecting the center points of the b-th and (b+1)-th weeds and the vertical direction, and the second included angle is the angle between the line connecting the center points of the first weed in the left safety area and the last weed in the left dangerous area and the vertical direction;
calculating an included angle between the maximum transverse moving speed of the laser weeding module and the advancing speed of weeding equipment to obtain a third included angle;
judging whether the third included angle is larger than both the first included angle and the second included angle; if so, weeding by the laser method, and if not, weeding by the mechanical method.
2. The real-time weeding method based on deep learning and inter-plant weed distribution according to claim 1, wherein the calculation formula for solving the four corner coordinates of the minimum detection frame of the seedling according to the center pixel coordinates and width and height of the minimum detection frame of the seedling is:
x_m^l = u_m − w_m/2,  x_m^r = u_m + w_m/2,  y_m^t = v_m − h_m/2,  y_m^b = v_m + h_m/2
wherein x_m^l is the abscissa value of the upper-left and lower-left corner coordinates of the minimum detection frame of the m-th crop plant, x_m^r is the abscissa value of the upper-right and lower-right corner coordinates, y_m^t is the ordinate value of the upper-left and upper-right corner coordinates, y_m^b is the ordinate value of the lower-left and lower-right corner coordinates, w_m is the width of the minimum detection frame of the m-th crop plant, h_m is its height, u_m is the abscissa value of its center pixel coordinate, and v_m is the ordinate value of its center pixel coordinate.
3. The real-time weeding method based on deep learning and inter-plant weed distribution according to claim 1, wherein said constructing real-time field single crop row map information according to the center coordinates and four corner coordinates of the minimum detection frame of the seedling and the center pixel coordinates of the minimum detection frame of the weed comprises:
Dividing a region to be worked into a safe region and a dangerous region according to the center coordinates and the four-corner coordinates of the minimum detection frame of the young seedling;
dividing the safety zone and the danger zone into a left safety zone, a right safety zone, a left danger zone and a right danger zone;
constructing field single crop row map information according to the central pixel coordinates of the minimum weed detection frame based on the pixel coordinates of the left safety zone, the right safety zone, the left danger zone and the right danger zone;
the judgment formula for judging the safety zone as the left safety zone is:
u_m − w_m/2 ≤ u_m^{SL} ≤ u_m,  v_{m+1} + h_{m+1}/2 ≤ v_m^{SL} ≤ v_m − h_m/2  (left safety area);
the judgment formula for judging the safety zone as the right safety zone is:
u_m ≤ u_m^{SR} ≤ u_m + w_m/2,  v_{m+1} + h_{m+1}/2 ≤ v_m^{SR} ≤ v_m − h_m/2  (right safety area);
the judgment formula for judging the danger zone as the right danger zone is:
u_m ≤ u_m^{DR} ≤ u_m + w_m/2,  v_m − h_m/2 ≤ v_m^{DR} ≤ v_m + h_m/2  (right danger zone);
the judgment formula for judging the danger zone as the left danger zone is:
u_m − w_m/2 ≤ u_m^{DL} ≤ u_m,  v_m − h_m/2 ≤ v_m^{DL} ≤ v_m + h_m/2  (left danger zone); wherein (u_m^{DL}, v_m^{DL}) and (u_m^{DR}, v_m^{DR}) respectively represent the coordinates of pixel points in the left and right danger zones of the m-th crop plant, (u_m^{SL}, v_m^{SL}) and (u_m^{SR}, v_m^{SR}) respectively represent the coordinates of pixel points in the left and right safety areas of the m-th crop plant, (u_m, v_m) is the center pixel coordinate of the minimum detection frame of the m-th crop plant, and w_m, h_m are the width and height of the minimum detection frame of the m-th crop plant.
4. The real-time weeding method based on deep learning and inter-plant weed distribution according to claim 1, further comprising:
acquiring the time taken by the training model to identify crops and weeds and the time taken to formulate the weeding strategy, and obtaining the system delay time from these delays;
based on the weeding strategy, controlling the row-entry time of the weeding knife or the delay time of laser emission according to the system delay time.
CN202311239329.9A 2023-09-25 2023-09-25 Real-time weeding method based on deep learning and inter-plant weed distribution conditions Active CN117274566B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311239329.9A CN117274566B (en) 2023-09-25 2023-09-25 Real-time weeding method based on deep learning and inter-plant weed distribution conditions

Publications (2)

Publication Number Publication Date
CN117274566A CN117274566A (en) 2023-12-22
CN117274566B true CN117274566B (en) 2024-04-26

Family

ID=89215504

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311239329.9A Active CN117274566B (en) 2023-09-25 2023-09-25 Real-time weeding method based on deep learning and inter-plant weed distribution conditions

Country Status (1)

Country Link
CN (1) CN117274566B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118235753A (en) * 2024-05-21 2024-06-25 内蒙古八爪智能科技有限公司 Weeding method and device of laser weeding robot, medium and laser weeding robot

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013050007A1 (en) * 2011-10-03 2013-04-11 Ustav Pristrojove Techniky Akademie Ved Cr, V.V.I. Method and apparatus for measuring shape deviations of mechanical parts
WO2020011318A1 (en) * 2018-07-12 2020-01-16 Agro Intelligence Aps A system for use when performing a weeding operation in an agricultural field
CN111929699A (en) * 2020-07-21 2020-11-13 北京建筑大学 Laser radar inertial navigation odometer considering dynamic obstacles and mapping method and system
WO2021043904A1 (en) * 2019-09-05 2021-03-11 Basf Se System and method for identification of plant species
CN114067207A (en) * 2021-11-16 2022-02-18 福州大学 Vegetable seedling field weed detection method based on deep learning and image processing
WO2022047830A1 (en) * 2020-09-04 2022-03-10 浙江大学 Method for detecting field navigation line after ridge closing of crops
CN114419407A (en) * 2021-12-14 2022-04-29 中国农业大学 Inline weed automatic identification method and device for seedling stage of transplanted crops
CN114937078A (en) * 2022-05-19 2022-08-23 苏州大学 Automatic weeding method, device and storage medium
CN115251024A (en) * 2022-08-29 2022-11-01 北京大学现代农业研究院 Weeding mode determining method and device, electronic equipment and weeding system
CN115560754A (en) * 2022-08-25 2023-01-03 邯郸科技职业学院 Visual navigation method based on weed removal
CN116110081A (en) * 2023-04-12 2023-05-12 齐鲁工业大学(山东省科学院) Detection method and system for wearing safety helmet based on deep learning
CN116540708A (en) * 2023-05-11 2023-08-04 江苏大学 Autonomous navigation method and system for paddy field mechanical weeding equipment under repeated operation scene

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11393049B2 (en) * 2020-09-24 2022-07-19 Centure Applications LTD Machine learning models for selecting treatments for treating an agricultural field
WO2022147156A1 (en) * 2020-12-29 2022-07-07 Blue River Technology Inc. Generating a ground plane for obstruction detection

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Crop identification and positioning method for an intra-row mechanical weeding device based on machine vision; Hu Lian; Luo Xiwen; Zeng Shan; Zhang Zhigang; Chen Xiongfei; Lin Chaoxing; Transactions of the Chinese Society of Agricultural Engineering; 2013-05-15 (10); 20-26 *
Research on mechanical inter-seedling weeding devices and technology; Ma Keming; Hebei Farm Machinery; 2021-05-24; 5-6 *
Design and experiment of a combined navigation system for maize inter-tillage weeding; Zhang Man; Xiang Ming; Wei Shuang; Ji Yuhan; Qiu Ruicheng; Meng Qingkuan; Transactions of the Chinese Society for Agricultural Machinery; 2015-12-30 (S1); 13-19 *

Similar Documents

Publication Publication Date Title
CN117274566B (en) Real-time weeding method based on deep learning and inter-plant weed distribution conditions
CN113276106B (en) Climbing robot space positioning method and space positioning system
CN109483573A (en) Machine learning device, robot system and machine learning method
CN113597874B (en) Weeding robot and weeding path planning method, device and medium thereof
CN108748149B (en) Non-calibration mechanical arm grabbing method based on deep learning in complex environment
CN106097322A (en) A kind of vision system calibration method based on neutral net
CN110433467B (en) Operation method and device of table tennis ball picking robot based on binocular vision and ant colony algorithm
CN105116902A (en) Mobile robot obstacle avoidance navigation method and system
CN112734739B (en) Visual building crack identification method based on attention mechanism and ResNet fusion
CN113129373B (en) Indoor mobile robot vision positioning method based on convolutional neural network
CN113593035A (en) Motion control decision generation method and device, electronic equipment and storage medium
CN117021059B (en) Picking robot, fruit positioning method and device thereof, electronic equipment and medium
CN113703444A (en) Intelligent robot inspection obstacle avoidance method and system
CN113065562A (en) Crop ridge row extraction and leading route selection method based on semantic segmentation network
CN115139315A (en) Grabbing motion planning method for picking mechanical arm
CN110866548A (en) Infrared intelligent matching identification and distance measurement positioning method and system for insulator of power transmission line
You et al. Semiautonomous precision pruning of upright fruiting offshoot orchard systems: An integrated approach
CN116740036A (en) Method and system for detecting cutting point position of steel pipe end arc striking and extinguishing plate
CN116380899A (en) Methane concentration distribution construction method based on unmanned aerial vehicle
CN114946439B (en) Intelligent and accurate topping device for field cotton
CN115097833A (en) Automatic obstacle avoidance method and system for pesticide application robot and storage medium
CN114830911A (en) Intelligent weeding method and device and storage medium
US20230089195A1 (en) Control device, control system, control method, and recording medium with control program recorded thereon
CN113524216A (en) Fruit and vegetable picking robot based on multi-frame fusion and control method thereof
CN117823741B (en) Pipe network non-excavation repairing method and system combined with intelligent robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant