CN114937078A - Automatic weeding method, device and storage medium - Google Patents

Automatic weeding method, device and storage medium Download PDF

Info

Publication number
CN114937078A
CN114937078A
Authority
CN
China
Prior art keywords
seedling
crops
image
distance
position information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210544629.7A
Other languages
Chinese (zh)
Inventor
王蓬勃
王天健
戴广林
周辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou University
Original Assignee
Suzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou University filed Critical Suzhou University
Priority to CN202210544629.7A priority Critical patent/CN114937078A/en
Publication of CN114937078A publication Critical patent/CN114937078A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01MCATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M21/00Apparatus for the destruction of unwanted vegetation, e.g. weeds
    • A01M21/02Apparatus for mechanical destruction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/0014Image feed-back for automatic industrial control, e.g. robot with camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Insects & Arthropods (AREA)
  • Pest Control & Pesticides (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Guiding Agricultural Machines (AREA)
  • Soil Working Implements (AREA)

Abstract

The application discloses an automatic weeding method, an automatic weeding device and a storage medium, which belong to the technical field of automatic operation. The method comprises the following steps: acquiring a crop image through a camera; identifying position information of crops in the crop image through a position detection model; acquiring the target height of the camera from the ground; converting the position information into target position information in a camera coordinate system according to the target height; determining the knife-seedling distance and the seedling spacing according to the target position information; and carrying out a weeding operation according to the determined knife-seedling distance and seedling spacing. The method solves the problem in the prior art that crops may be damaged during automatic weeding, and achieves the effect of acquiring the knife-seedling distance and the seedling spacing in real time and thereby accurately avoiding damage to the crops.

Description

Automatic weeding method, device and storage medium
Technical Field
The invention relates to an automatic weeding method, an automatic weeding device and a storage medium, and belongs to the technical field of automatic operation.
Background
In traditional Chinese agriculture, weeds directly affect agricultural production and reduce the quality of crops. At present, manual weeding is inefficient, while large-area chemical weeding improves operational efficiency at the cost of environmental pollution, food-safety problems and the like; automatic, intelligent mechanical weeding devices have therefore become a research direction of green, sustainable agriculture.
At present, scholars at home and abroad have carried out a great deal of research on intelligent intertillage weeding machines, the key technologies of which include: seedling and weed identification and positioning, complete machine design and accurate servo control, and the design of weeding manipulators that synchronously cut weeds between rows and between plants. To complete the inter-plant weeding operation, the primary technical difficulty in automating the intertillage weeding machine is how to accurately position crops in real time and effectively treat inter-plant weeds without damaging the crops.
Disclosure of Invention
The invention aims to provide an automatic weeding method, an automatic weeding device and a storage medium, which are used for solving the problems in the prior art.
In order to achieve the purpose, the invention provides the following technical scheme:
according to a first aspect, embodiments of the present invention provide an automatic weeding method, including:
acquiring an image of the crop through a camera;
identifying position information of crops in the crop image through a position detection model;
acquiring the target height of the camera from the ground;
converting the position information into target position information in a camera coordinate system according to the target height;
determining the knife-seedling distance and the seedling spacing according to the target position information;
and carrying out a weeding operation according to the determined knife-seedling distance and seedling spacing.
Optionally, the acquiring a target height of the camera from the ground includes:
aligning the crop image with the acquired depth image;
extracting a region of interest in the crop image;
generating a binary image of the region of interest;
and acquiring the target height according to the binarized image and the aligned depth image.
Optionally, the generating a binarized image of the region of interest includes:
and separating green pixels in the region of interest by adopting an ultragreen algorithm, and obtaining the separated binary image.
Optionally, the obtaining the target height according to the binarized image and the aligned depth image includes:
performing AND operation on the binarized image and the aligned depth image;
and calculating the average value of the depth values in the processed binary image, and determining the average value as the target height.
Optionally, the determining the knife-seedling distance and the seedling spacing according to the target position information includes:
dividing the crops into a left column and a right column according to the abscissa coordinate value of each pixel in the target position information;
calculating the knife-seedling distance according to the left and right columns of crops;
and calculating the seedling spacing according to the spacing between adjacent crops in each of the left and right columns.
Optionally, the calculating the knife-seedling distance includes:
acquiring the lower boundary of the field of view;
for each column, acquiring the difference between the ordinate of the crop nearest the lower field-of-view boundary and the lower boundary;
and determining the absolute value of the obtained difference as the knife-seedling distance of each column.
Optionally, the calculating the seedling spacing includes:
for each column, respectively calculating the difference of the ordinates of each pair of adjacent crops;
and determining the absolute value of the calculated difference as the seedling spacing of each column.
Optionally, the image acquisition device is arranged at a preset height from the ground.
In a second aspect, there is provided an automatic weeding apparatus comprising a memory having at least one program instruction stored therein and a processor for implementing the method according to the first aspect by loading and executing the at least one program instruction.
In a third aspect, there is provided a computer storage medium having stored therein at least one program instruction which is loaded and executed by a processor to implement the method of the first aspect.
A crop image is acquired through a camera; position information of the crops in the crop image is identified through a position detection model; the target height of the camera from the ground is acquired; the position information is converted into target position information in a camera coordinate system according to the target height; the knife-seedling distance and the seedling spacing are determined according to the target position information; and a weeding operation is carried out according to the determined knife-seedling distance and seedling spacing. This solves the problem in the prior art that crops may be injured during automatic weeding, and achieves the effect of acquiring the knife-seedling distance and the seedling spacing in real time and thereby accurately avoiding injury to the crops.
The foregoing description is only an overview of the technical solutions of the present invention, and in order to make the technical solutions of the present invention more clearly understood and to make the technical solutions of the present invention practical in accordance with the contents of the specification, the following detailed description is given of preferred embodiments of the present invention with reference to the accompanying drawings.
Drawings
FIG. 1 is a flow chart of a method for automatically weeding according to an embodiment of the present invention;
fig. 2 is a schematic view of a weeding robot provided in an embodiment of the present invention in a possible field operation;
fig. 3 is a schematic diagram of possible location information of each crop detected by a location detection model according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a relationship between a pixel coordinate system and an image coordinate system according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a camera imaging principle provided by an embodiment of the invention;
fig. 6 is a schematic diagram of one possible calculated blade-seedling distance and seedling-seedling distance according to an embodiment of the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
Furthermore, the technical features involved in the different embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Referring to fig. 1, a flow chart of an automatic weeding method provided in an embodiment of the present application is shown, and as shown in fig. 1, the method includes:
Step 101, acquiring a crop image through a camera;
the camera described in this embodiment may be an RGBD camera. The RGBD camera can be arranged at a preset height from the ground, and the preset height can be set to different values according to the height of crops. For example, if the crop is cabbage seedling, the preset height may be 70 cm.
Certainly, in practical implementation, in order to ensure the image quality of the acquired crop image, the angle of the RGBD camera may be parallel to the ground.
In one possible implementation manner of this embodiment, the RGBD camera may be disposed in the weeding robot, as shown in fig. 2, the weeding robot is composed of a moving platform, the RGBD camera and a bracket, the RGBD camera is disposed at a distance of 70cm from the ground, and the RGBD camera can acquire the crop image of the position in real time as the weeding robot moves.
Optionally, since weeds may be present in the crop planting field, the obtained crop image may further include weeds, which is not limited in this embodiment.
Step 102, identifying the position information of the crops in the crop image through a position detection model;
the position detection model is a model obtained by pre-training, and during actual implementation, the training step of the position detection model comprises the following steps:
firstly, acquiring a training set;
in this step, crop image sets with different illumination and different growth states can be acquired through the RGBD camera. Wherein, each crop image can comprise crops and weeds growing around the crops.
To ensure training accuracy, the crop image set may include a large number of crop images, for example 8000 crop images. In addition, in actual implementation, in order to improve generalization ability, each crop image acquired by the RGBD camera may be subjected to image enhancement, which is not limited in this embodiment. The image enhancement modes may include random scaling, stitching, cropping and the like.
In practical implementation, the acquired crop image set can be divided into a training set, a verification set and a test set.
Secondly, training an initial position detection model according to the training set, and obtaining a trained position detection model.
In practical implementation, a YOLOv5s detection model may be constructed on the PyTorch deep learning framework, and the training set may then be input into the YOLOv5s detection model for training to obtain the trained position detection model.
Optionally, the trained position detection model may be validated with the validation set, and the test set may be input into the trained position detection model for testing; the training result is thereby verified to obtain the final position detection model. Please refer to fig. 3, which shows one possible schematic diagram of the position information of each crop detected by the trained position detection model.
In the step, the position of the crops in the crop image and the growth state of the crops can be detected by using the trained position detection model.
Step 103, acquiring the target height of the camera from the ground;
the method comprises the following steps:
first, aligning the crop image with the acquired depth image;
Optionally, the color image acquired by the RGBD camera may be aligned with the depth image; the color image is the RGB three-channel image, and the depth image is the D-channel image.
Secondly, extracting a region of interest in the crop image;
In practice, a region of interest (ROI) centered on the crop image's center pixel may be extracted, for example a central ROI of 200 × 200 pixels.
Thirdly, generating a binary image of the region of interest;
The green pixels in the region of interest are separated by adopting an excess-green algorithm, and the separated binarized image is obtained. In practical implementation, the excess-green algorithm (2g − r − b) and the Otsu threshold segmentation algorithm can be adopted to separate the green pixels and obtain the binarized ROI image.
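By way of illustration only, this binarization step may be sketched as follows. The function name and the pure-NumPy Otsu implementation are not part of the patent; they are an assumed stand-in for an image-processing library call.

```python
import numpy as np

def excess_green_mask(roi_rgb):
    """Binarize an ROI with the excess-green index (2g - r - b) followed by
    Otsu thresholding, as described in the patent. `roi_rgb` is an (H, W, 3)
    uint8 RGB array; the returned mask is uint8 with values 0/255."""
    rgb = roi_rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b                         # excess-green index per pixel
    exg = np.clip(exg, 0, 255).astype(np.uint8)   # scale into the 8-bit range

    # Otsu's method: pick the threshold maximizing between-class variance.
    hist = np.bincount(exg.ravel(), minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                       # cumulative class-0 probability
    mu = np.cumsum(prob * np.arange(256))         # cumulative class-0 mass
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    threshold = int(np.nanargmax(sigma_b))

    return np.where(exg > threshold, 255, 0).astype(np.uint8)
```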
Fourthly, acquiring the target height according to the binarized image and the aligned depth image.
(1) Performing an AND operation on the binarized image and the aligned depth image;
(2) calculating the average of the depth values in the processed binarized image, and determining the average as the target height.
Optionally, after performing the AND operation, abnormal values may be removed before the average of the depth values is calculated and determined as the target height.
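A minimal sketch of this fourth step follows. The 2-sigma rule is an assumption standing in for the unspecified "abnormal value" removal, and the names are illustrative:

```python
import numpy as np

def estimate_target_height(mask, depth_aligned):
    """Estimate the target height (step 103): AND the binarized plant mask
    with the aligned depth image, drop invalid and abnormal depth values,
    and return the mean of the remaining depths (e.g. in millimetres)."""
    depths = depth_aligned[mask > 0].astype(np.float64)  # AND: keep masked pixels
    depths = depths[depths > 0]          # zero depth means no sensor return
    mu, sigma = depths.mean(), depths.std()
    if sigma > 0:                        # assumed rule: discard values beyond 2 sigma
        depths = depths[np.abs(depths - mu) <= 2.0 * sigma]
    return float(depths.mean())
```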
Step 104, converting the position information into target position information in a camera coordinate system according to the target height;
the method comprises the following steps:
firstly, acquiring camera internal reference information of a camera;
the camera internal reference information includes the pixel sizes dx, dy in the camera and the center points PPx, PPy of the pixel coordinate system.
Secondly, converting the acquired position information into target position information in a camera coordinate system according to the camera internal reference information and the target height.
In actual implementation, the acquired position information can be converted into target position information in a camera coordinate system according to the camera internal reference information, the target height and the camera imaging principle.
Optionally, please refer to fig. 4, which shows the relationship between the pixel coordinate system and the image coordinate system. In addition, please refer to fig. 5, which shows a schematic diagram of a possible camera imaging principle of the camera.
Specifically, the method comprises the following steps: let (Xp, Yp) be image-plane coordinate points, (Xc, Yc, Zc) be camera coordinate points, z be the acquired target height, and f be the focal length of the camera. From the geometric (similar-triangle) relations of the pinhole imaging model:

Xc = Xp · z / f

Yc = Yp · z / f

If (Xu, Yu) are pixel coordinate points, the conversion to image-plane coordinates is:

Xp = (Xu − PPx) · dx

Yp = (Yu − PPy) · dy

Substituting the latter equations into the former gives the conversion relationship between the pixel coordinate system and the camera coordinate system:

Xc = (Xu − PPx) · dx · z / f

Yc = (Yu − PPy) · dy · z / f

Zc = z
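This conversion can be sketched as below. The focal length f is not listed among the intrinsics above and is an assumption of this illustration, as are all names:

```python
def pixel_to_camera(xu, yu, z, ppx, ppy, dx, dy, f):
    """Convert a pixel coordinate (Xu, Yu) into camera coordinates
    (Xc, Yc, Zc) following step 104: shift by the principal point, scale by
    the physical pixel size to image-plane coordinates, then scale by z / f
    (similar triangles of the pinhole model)."""
    xp = (xu - ppx) * dx      # image-plane x
    yp = (yu - ppy) * dy      # image-plane y
    return xp * z / f, yp * z / f, z
```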
Step 105, determining the knife-seedling distance and the seedling spacing according to the target position information;
Firstly, dividing the crops into a left column and a right column according to the abscissa coordinate value of each pixel in the target position information;
after the target position information is acquired, whether the abscissa of the pixel coordinate is larger than a preset threshold value is judged, if so, the Right column Right [ ], and if not, the Left column Left [ ]isjudged. The above description is given by way of example only with reference to the division using the target position information in the camera coordinate system, and in actual implementation, the determination may be performed using the position information detected by the position detection model. At this time, as shown in fig. 4, the coordinates of the two rows are sorted from large to small with the vertical axis v along the pixel coordinate system as a positive direction. The method for acquiring the knife seedling distance and the seedling distance converts two rows of sorted coordinates from pixel coordinates to camera coordinates by using the coordinate conversion relation in the step 104.
Secondly, calculating the knife-seedling distance according to the left and right columns of crops;
(1) Acquiring the lower boundary of the field of view;
In actual implementation, the lower field-of-view boundary Yc can be obtained by substituting Yu = 800 into the expression in step 104.
(2) For each column, acquiring the difference between the ordinate of the crop nearest the lower field-of-view boundary and the lower boundary;
the ordinate of Right [0] under the camera coordinate system close to the visual field boundary is marked as Yc _ Right [0] and ordinate Yc _ Left [0] of Left [0], and the absolute value is obtained by subtracting the visual field lower boundary Yc and adding with Dk2, the distance between the Left and Right two rows of seedlings and the cutter is the distance between the seedlings, and is marked as DKc _ Right [0] and DKc _ Left [0] respectively.
(3) Determining the absolute value of the obtained difference as the knife-seedling distance of each column.
Referring to fig. 6, the knife-seedling distance, i.e. the distance from the cutting blade to the seedling nearest the lower field-of-view boundary, is DK1 + DK2, where DK1 is the absolute value of the difference calculated in the step above and DK2 is a known, fixed value.
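Under the same naming, a minimal sketch of the knife-seedling distance calculation (the function name is illustrative; DK2 is the known blade offset):

```python
def knife_seedling_distance(yc_nearest, yc_boundary, dk2):
    """Knife-seedling distance of one column (step 105, fig. 6):
    DK1 = |ordinate of the crop nearest the lower boundary - boundary Yc|,
    plus the known fixed offset DK2 between that boundary and the blade."""
    dk1 = abs(yc_nearest - yc_boundary)
    return dk1 + dk2
```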
Thirdly, calculating the seedling spacing according to the spacing between adjacent crops in each of the left and right columns.
(1) For each column, respectively calculating the difference of the ordinates of each pair of adjacent crops;
(2) determining the absolute value of the calculated difference as the seedling spacing of each column.
In the same way, the ordinates of Right[1] and Left[1] in the sorted left and right columns of camera coordinates are differenced with the corresponding ordinates of Right[0] and Left[0]; the absolute values are the spacings between the front and rear seedlings of the left and right columns. Referring to fig. 6, the seedling spacing is DP.
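The seedling-spacing step reduces to adjacent differences within each sorted column, as this illustrative sketch shows:

```python
def seedling_spacings(column_yc):
    """Seedling spacings of one column (step 105): absolute differences of the
    camera-frame ordinates of each pair of adjacent crops in the sorted column."""
    return [abs(a - b) for a, b in zip(column_yc, column_yc[1:])]
```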
Step 106, performing the weeding operation according to the determined knife-seedling distance and seedling spacing.
Weeding can be carried out once the knife-seedling distance and the seedling spacing have been determined.
In summary, a crop image is acquired by the camera; position information of the crops in the crop image is identified through a position detection model; the target height of the camera from the ground is acquired; the position information is converted into target position information in the camera coordinate system according to the target height; the knife-seedling distance and the seedling spacing are determined according to the target position information; and the weeding operation is carried out according to the determined knife-seedling distance and seedling spacing. This solves the problem in the prior art that crops may be damaged during automatic weeding, and achieves the effect of acquiring the knife-seedling distance and the seedling spacing in real time and thereby accurately avoiding damage to the crops.
The present application also provides an automatic weeding apparatus comprising a memory having at least one program instruction stored therein and a processor for implementing the method as described above by loading and executing the at least one program instruction.
The present application also provides a computer storage medium having stored therein at least one program instruction, which is loaded and executed by a processor to implement the method as described above.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. An automated weeding method, comprising:
acquiring a crop image through a camera;
identifying position information of crops in the crop image through a position detection model;
acquiring the target height of the camera from the ground;
converting the position information into target position information in a camera coordinate system according to the target height;
determining the knife-seedling distance and the seedling spacing according to the target position information;
and carrying out a weeding operation according to the determined knife-seedling distance and seedling spacing.
2. The method of claim 1, wherein the obtaining the target height of the camera from the ground comprises:
aligning the crop image with the acquired depth image;
extracting a region of interest in the crop image;
generating a binary image of the region of interest;
and acquiring the target height according to the binarized image and the aligned depth image.
3. The method according to claim 2, wherein said generating a binarized image of the region of interest comprises:
and separating green pixels in the region of interest by adopting an ultragreen algorithm, and obtaining the separated binary image.
4. The method according to claim 2, wherein the obtaining the target height from the binarized image and the aligned depth image comprises:
performing AND operation on the binarized image and the aligned depth image;
and calculating the average value of the depth values in the processed binary image, and determining the average value as the target height.
5. The method of claim 1, wherein the determining the knife-seedling distance and the seedling spacing based on the target position information comprises:
dividing the crops into a left column and a right column according to the abscissa coordinate value of each pixel in the target position information;
calculating the knife-seedling distance according to the left and right columns of crops;
and calculating the seedling spacing according to the spacing between adjacent crops in each of the left and right columns.
6. The method of claim 5, wherein the calculating the knife-seedling distance comprises:
acquiring the lower boundary of the field of view;
for each column, obtaining the difference between the ordinate of the crop nearest the lower field-of-view boundary and the lower boundary;
and determining the absolute value of the obtained difference as the knife-seedling distance of each column.
7. The method of claim 5, wherein the calculating the seedling spacing comprises:
for each column, respectively calculating the difference of the ordinates of each pair of adjacent crops;
and determining the absolute value of the calculated difference as the seedling spacing of each column.
8. The method according to any one of claims 1 to 7, wherein the image acquisition device is positioned at a predetermined height from the ground.
9. An automated weeding apparatus, comprising a memory having stored therein at least one program instruction, and a processor for implementing the method according to any one of claims 1 to 8 by loading and executing the at least one program instruction.
10. A computer storage medium having stored therein at least one program instruction which is loaded and executed by a processor to implement the method of any one of claims 1 to 8.
CN202210544629.7A 2022-05-19 2022-05-19 Automatic weeding method, device and storage medium Pending CN114937078A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210544629.7A CN114937078A (en) 2022-05-19 2022-05-19 Automatic weeding method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210544629.7A CN114937078A (en) 2022-05-19 2022-05-19 Automatic weeding method, device and storage medium

Publications (1)

Publication Number Publication Date
CN114937078A true CN114937078A (en) 2022-08-23

Family

ID=82864426

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210544629.7A Pending CN114937078A (en) 2022-05-19 2022-05-19 Automatic weeding method, device and storage medium

Country Status (1)

Country Link
CN (1) CN114937078A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117274566A (en) * 2023-09-25 2023-12-22 北京工业大学 Real-time weeding method based on deep learning and inter-plant weed distribution conditions
CN117274566B (en) * 2023-09-25 2024-04-26 北京工业大学 Real-time weeding method based on deep learning and inter-plant weed distribution conditions

Similar Documents

Publication Publication Date Title
Ge et al. Fruit localization and environment perception for strawberry harvesting robots
CN109886155B (en) Single-plant rice detection and positioning method, system, equipment and medium based on deep learning
CN108133471B (en) Robot navigation path extraction method and device based on artificial bee colony algorithm
CN114818909B (en) Weed detection method and device based on crop growth characteristics
CN112990103B (en) String mining secondary positioning method based on machine vision
Lin et al. Automatic detection of plant rows for a transplanter in paddy field using faster r-cnn
CN114239756B (en) Insect pest detection method and system
Yoshida et al. A tomato recognition method for harvesting with robots using point clouds
CN114937078A (en) Automatic weeding method, device and storage medium
CN115661650A (en) Farm management system based on data monitoring of Internet of things
CN112395984A (en) Method for detecting seedling guide line of unmanned agricultural machine
WO2022222822A1 (en) Method and device for identifying and positioning abelmoschus manihot on basis of cameras placed in non-parallel manner
CN103186773A (en) Early-stage ribbing ridge line recognition algorithm based on one-dimensional Hough transform and expert system
Wang et al. The seedling line extraction of automatic weeding machinery in paddy field
CN114821268A (en) Weed and crop identification method based on machine learning
CN112883915B (en) Automatic wheat head identification method and system based on transfer learning
CN111369497B (en) Walking type tree fruit continuous counting method and device
CN114387343A (en) Method for detecting picking position of mushroom stem of clustered oyster mushroom
CN103593840A (en) Method for detecting phenotype of Arabidopsis
CN108734054B (en) Non-shielding citrus fruit image identification method
CN117036926A (en) Weed identification method integrating deep learning and image processing
CN114830911B (en) Intelligent weeding method, intelligent weeding device and storage medium
He et al. Extracting the navigation path of an agricultural plant protection robot based on machine vision
Li et al. Vision-based Navigation Line Extraction by Combining Crop Row Detection and RANSAC Algorithm
Yihang et al. Automatic recognition of rape seeding emergence stage based on computer vision technology

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination