CN111080103A - Method for automatically evaluating crop yield - Google Patents
- Publication number
- CN111080103A (application CN201911236153.5A)
- Authority
- CN
- China
- Prior art keywords
- fruit
- image
- camera
- pixels
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0631—Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q10/06315—Needs-based resource requirements planning or analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/02—Agriculture; Fishing; Forestry; Mining
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/60—Rotation of whole images or parts thereof
- G06T3/608—Rotation of whole images or parts thereof by skew deformation, e.g. two-pass or three-pass rotation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
Abstract
The invention relates to the technical field of data analysis and processing, and discloses a method for automatically evaluating crop yield, comprising the following steps: setting up a camera and calibrating its position; acquiring an original image of the planting area through the camera; acquiring a plant image and a fruit image from the original image, and processing the fruit image into a binary image; carrying out boundary processing on the binary image, tracking its effective contour, and taking the effective contour as the tracking result; analyzing the pixels of the tracking result, marking initial pixels, boundary pixels, and undetermined pixels on the effective contour, and mapping plant density, fruit volume, and fruit mass to the pixels of the tracking result; calculating the fruit volume, fruit mass, and plant density from the initial, boundary, and undetermined pixels of the tracking result according to the mapping relationship; and calculating the overall yield from the fruit volume, fruit mass, and plant density. The method greatly improves the efficiency and accuracy of yield evaluation and provides a scientific reference for planting.
Description
Technical Field
The invention relates to the technical field of data analysis and processing, and in particular to a method for automatically evaluating crop yield.
Background
Crop production depends on scientific cultivation techniques and on industrially manufactured machinery that can assist agricultural work. As consumers increasingly demand environmentally friendly, healthy food, establishing a traceable production mechanism for agricultural products has become a trend in agricultural development: recording the growth history of produce through data collection makes products greener and healthier. At present, sowing, tending, and harvesting are still carried out largely by manual labor. The whole planting process consumes a great deal of manpower, places a heavy physical burden on workers, and has become an important factor affecting the health of agricultural workers.
During cultivation, accurately judging the planting scale, spacing, and growth condition of the crops makes it possible to estimate their future yield. This helps establish an optimal planting scheme and allows the planting strategy to be adjusted in time, so that the overall yield can be improved.
In practice, crop yield evaluation still relies on human experience. Such empirical judgment is often too subjective, insufficiently accurate, and applicable only over small areas, so it cannot meet the needs of crop yield evaluation. A more reasonable technical solution is therefore needed to solve these problems in the prior art.
Disclosure of Invention
The invention provides a method for automatically evaluating crop yield. Its aim is an efficient evaluation method based on automatic detection and judgment that performs a timely, objective, and comprehensive assessment of the crops in a planting area and gives a yield estimate under the current conditions.
To achieve the above effects, the invention adopts the following technical scheme:
A method for automatically evaluating crop yield, comprising:
setting a camera and calibrating the position of the camera;
acquiring an original image of a planting area through a camera;
acquiring a plant image and a fruit image from an original image, and processing the fruit image into a binary image;
carrying out boundary processing on the binary image, tracking the effective contour of the binary image, and acquiring the effective contour as a tracking result;
analyzing the pixels of the tracking result, marking initial pixels, boundary pixels, and undetermined pixels on the effective contour, and mapping plant density, fruit volume, and fruit mass to the pixels of the tracking result;
calculating the fruit volume, fruit mass, and plant density from the initial pixels, boundary pixels, and undetermined pixels of the tracking result according to the mapping relationship;
and calculating the overall yield from the fruit volume, fruit mass, and plant density.
According to the disclosed evaluation method, a camera is set up to capture images of the planting area; after the images are processed, the fruit volume, fruit mass, and plant density are back-calculated from the image pixels, and from these the yield of the whole planting area is computed. The yield can thus be estimated at an early stage, the current expected output of the planting area is obtained, and the planting scheme can be conveniently adjusted against the production plan so that the output meets expectations.
Further, the camera disclosed in the above technical solution is refined. The camera is used to collect images of the planting area; specifically, a single camera is used, the original images of the planting area are acquired in a single-camera multi-view mode, and the original images are processed according to viewing-angle and spatial-object restoration rules, so that restoration and comparison are realized.
Further, the above technical solution discloses that the original image is processed according to the viewing-angle and spatial-object restoration rules, so that restoration and comparison are realized. Specifically, the restoration and comparison comprise: dividing the effective contour evenly into regions by three X-direction axes and three Y-direction axes, obtaining the number of pixels spanned by each X-direction and Y-direction axis, and comparing these values with preset pixel values in a database to obtain the fruit mass. With this arrangement, the acquired parameters can be recognized automatically in a standardized manner.
Still further, the three X-direction axes and three Y-direction axes disclosed in the above technical solution can be determined as follows: determine the intersection of the horizontal center line and the vertical center line of the effective contour in the binary image; the lines through this intersection that coincide with the horizontal and vertical center lines are taken as one X-direction axis and one Y-direction axis, respectively; the portion of each of these two axes lying inside the effective contour is divided into four equal parts; the lines through the quartile points on the Y-direction axis are the remaining X-direction axes, and the lines through the quartile points on the X-direction axis are the remaining Y-direction axes.
Further, before the above restoration and comparison, the orientation of the binary image needs to be adjusted to realize spatial restoration. Specifically, the spatial restoration comprises vertically correcting the binary image, that is, rotating the central axis of the binary image to the vertical direction.
Still further, after the camera acquires the images and the multi-view original images are restored, a view of the real environment in the current planting area is obtained. From the camera position, the distance between the camera and the fruits and plants in the planting area and the pitch angle of the corresponding pixels can then be determined, so that the fruit volume, fruit mass, and plant density can be back-calculated.
Compared with the prior art, the invention has the beneficial effects that:
By using this technology, the labor intensity of workers can be greatly reduced: the manual confirmation step is eliminated and the yield no longer has to be judged from experience. Evaluating in the manner provided by the invention greatly improves the efficiency and accuracy of evaluation and provides a scientific reference for planting.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed for the embodiments are briefly described below. It should be understood that the following drawings show only some embodiments of the invention and therefore should not be regarded as limiting its scope; those skilled in the art can derive other related drawings from them without inventive effort.
FIG. 1 is a schematic process diagram of the present invention;
FIG. 2 is a schematic diagram of the relative position relationship of pixels in the present invention;
FIG. 3 is an original image of the acquired fruit;
FIG. 4 is the effective contour image obtained by tracking after post-processing the original image;
FIG. 5 is the binary image converted from the effective contour image;
FIG. 6 is a schematic diagram of the region division of the binary image by the X-direction and Y-direction axes.
Detailed Description
The invention is further described with reference to the following figures and specific embodiments. It should be noted that the description of the embodiments is provided to help understanding of the present invention, but the present invention is not limited thereto. Specific structural and functional details disclosed herein are merely illustrative of example embodiments of the invention. This invention may, however, be embodied in many alternate forms and should not be construed as limited to the embodiments set forth herein.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example embodiments of the present invention.
It should be understood that the term "and/or" herein merely describes an association between objects, meaning that three relationships may exist; for example, A and/or B may mean: A exists alone, B exists alone, or A and B exist together. The term "/and" herein describes another association, meaning that two relationships may exist; for example, A/and B may mean: A exists alone, or A and B exist together. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects.
It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a similar manner (e.g., "between" versus "directly between", "adjacent" versus "directly adjacent", etc.).
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments of the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, and/or groups thereof.
It should also be noted that, in some alternative implementations, the functions/acts noted may occur out of the order indicated in the figures. For example, two steps shown in succession may in fact be executed substantially concurrently, or sometimes in the reverse order, depending on the functionality/acts involved.
In the following description, specific details are provided to facilitate a thorough understanding of example embodiments. However, it will be understood by those of ordinary skill in the art that the example embodiments may be practiced without these specific details. For example, systems may be shown in block diagrams in order not to obscure the examples in unnecessary detail. In other instances, well-known processes, structures and techniques may be shown without unnecessary detail in order to avoid obscuring example embodiments.
Examples
As shown in fig. 1 to 6, the present embodiment discloses a method for automatically evaluating crop yield, which includes:
S01: setting up a camera and calibrating the camera position;
S02: acquiring an original image of the planting area through the camera;
S03: acquiring a plant image and a fruit image from the original image, and processing the fruit image into a binary image;
S04: carrying out boundary processing on the binary image, tracking the effective contour of the binary image, and acquiring the effective contour as the tracking result;
S05: analyzing the pixels of the tracking result, marking initial pixels, boundary pixels, and undetermined pixels on the effective contour, and mapping plant density, fruit volume, and fruit mass to the pixels of the tracking result;
S06: calculating the fruit volume, fruit mass, and plant density from the initial pixels, boundary pixels, and undetermined pixels of the tracking result according to the mapping relationship;
S07: calculating the overall yield from the fruit volume, fruit mass, and plant density.
According to the disclosed evaluation method, a camera is set up to capture images of the planting area; after the images are processed, the fruit volume, fruit mass, and plant density are back-calculated from the image pixels, and from these the yield of the whole planting area is computed. The yield can thus be estimated at an early stage, the current expected output of the planting area is obtained, and the planting scheme can be conveniently adjusted against the production plan so that the output meets expectations.
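For illustration only, steps S01 to S07 can be sketched as a minimal pipeline. Everything in this sketch — the binarization threshold, the per-pixel scale, the assumed fruit density, and the spherical-fruit approximation — is a hypothetical stand-in for the calibration and database lookups the method relies on, not the patented procedure itself.

```python
import math

# Hypothetical per-pixel calibration constants (would come from the
# camera calibration of S01); the values are illustrative only.
MM_PER_PIXEL = 2.0          # ground-sample distance at the fruit plane
DENSITY_G_PER_CM3 = 0.95    # assumed mean fruit density

def binarize(gray, threshold=128):
    """S03: turn a grayscale fruit image into a 0/1 binary image."""
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

def effective_contour_area(binary):
    """S04/S05: count foreground pixels as a stand-in for the area
    enclosed by the tracked effective contour."""
    return sum(sum(row) for row in binary)

def fruit_volume_cm3(area_px):
    """S06: back-calculate volume from pixel area, assuming a roughly
    spherical fruit: area -> equivalent radius -> volume."""
    area_mm2 = area_px * MM_PER_PIXEL ** 2
    r_mm = math.sqrt(area_mm2 / math.pi)
    return (4.0 / 3.0) * math.pi * (r_mm / 10.0) ** 3  # mm -> cm

def overall_yield_kg(volumes_cm3, plants_per_m2, area_m2, fruits_per_plant):
    """S07: overall yield = mean fruit mass * fruits per plant * plants."""
    mean_mass_g = DENSITY_G_PER_CM3 * sum(volumes_cm3) / len(volumes_cm3)
    n_plants = plants_per_m2 * area_m2
    return mean_mass_g * fruits_per_plant * n_plants / 1000.0

# Tiny synthetic 8x8 grayscale image with one bright "fruit" blob.
gray = [[200 if 2 <= i <= 5 and 2 <= j <= 5 else 30 for j in range(8)]
        for i in range(8)]
vol = fruit_volume_cm3(effective_contour_area(binarize(gray)))
print(round(overall_yield_kg([vol], plants_per_m2=2, area_m2=100,
                             fruits_per_plant=20), 2))
```

In the real method, the mapping from pixels to volume, mass, and density would come from the calibrated standard images stored in the database rather than from the constants assumed here.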
When the binary image is tracked, a standard image is acquired, measured, and calibrated; the more rigorous the calibrated quantities and types, the higher the accuracy. A learning function or a compensation method can also be provided in the system for this measurement: after the camera retrieves the original image, binarization is performed (the color image is converted to black and white to recognize the effective boundary contour, which is then compared with the standard image in the database) to obtain the corresponding result.
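As a sketch of the S05 pixel labelling, the three pixel classes can be assigned from the binary image alone. The raster-order choice of the initial pixel and the 4-neighbour boundary test are illustrative assumptions; the source does not specify the exact rules.

```python
def classify_pixels(binary):
    """Label foreground pixels of a 0/1 image in the spirit of S05:
    the first foreground pixel in raster order is the initial pixel,
    foreground pixels with at least one background 4-neighbour (or an
    image border) are boundary pixels, and the rest are left
    undetermined (interior) for the later mapping step."""
    h, w = len(binary), len(binary[0])
    initial, boundary, undetermined = None, [], []
    for i in range(h):
        for j in range(w):
            if not binary[i][j]:
                continue
            if initial is None:
                initial = (i, j)
            on_edge = any(
                not (0 <= i + di < h and 0 <= j + dj < w
                     and binary[i + di][j + dj])
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)))
            (boundary if on_edge else undetermined).append((i, j))
    return initial, boundary, undetermined

# A 4x4 foreground blob inside an 8x8 binary image.
blob = [[1 if 2 <= i <= 5 and 2 <= j <= 5 else 0 for j in range(8)]
        for i in range(8)]
initial, boundary, undetermined = classify_pixels(blob)
print(initial, len(boundary), len(undetermined))  # (2, 2) 12 4
```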
The camera disclosed in the above technical solution is refined. The camera is used to collect images of the planting area; specifically, a single camera is used, the original images of the planting area are acquired in a single-camera multi-view mode, and the original images are processed according to viewing-angle and spatial-object restoration rules, so that restoration and comparison are realized.
The above technical solution discloses that the original image is processed according to the viewing-angle and spatial-object restoration rules, so that restoration and comparison are realized. Specifically, the restoration and comparison comprise: dividing the effective contour evenly into regions by three X-direction axes and three Y-direction axes, obtaining the number of pixels spanned by each X-direction and Y-direction axis, and comparing these values with preset pixel values in a database to obtain the fruit mass. With this arrangement, the acquired parameters can be recognized automatically in a standardized manner.
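A minimal sketch of this restoration-and-comparison step follows. The axis placement, the preset table, and the nearest-entry lookup are all hypothetical: the source only says that the measured pixel counts are compared with preset values in a database to obtain the fruit mass.

```python
def axis_pixel_counts(binary, x_rows, y_cols):
    """Count the foreground pixels crossed by each X-direction axis
    (an image row) and each Y-direction axis (an image column)."""
    xs = [sum(binary[r]) for r in x_rows]
    ys = [sum(row[c] for row in binary) for c in y_cols]
    return xs, ys

def lookup_mass(xs, ys, table):
    """Return the mass of the preset database entry whose stored pixel
    counts are closest (L1 distance) to the measured counts."""
    measured = xs + ys
    best = min(table, key=lambda e: sum(abs(a - b)
                                        for a, b in zip(e["counts"], measured)))
    return best["mass_g"]

# A 4x4 foreground blob; the axis rows/columns are chosen by hand here.
blob = [[1 if 2 <= i <= 5 and 2 <= j <= 5 else 0 for j in range(8)]
        for i in range(8)]
xs, ys = axis_pixel_counts(blob, x_rows=[2, 3, 5], y_cols=[2, 3, 5])
presets = [{"counts": [4] * 6, "mass_g": 120.0},   # small fruit
           {"counts": [8] * 6, "mass_g": 310.0}]   # large fruit
print(lookup_mass(xs, ys, presets))  # 120.0
```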
The three X-direction axes and three Y-direction axes disclosed in the above technical solution can be determined as follows: determine the intersection of the horizontal center line and the vertical center line of the effective contour in the binary image; the lines through this intersection that coincide with the horizontal and vertical center lines are taken as one X-direction axis and one Y-direction axis, respectively; the portion of each of these two axes lying inside the effective contour is divided into four equal parts; the lines through the quartile points on the Y-direction axis are the remaining X-direction axes, and the lines through the quartile points on the X-direction axis are the remaining Y-direction axes.
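The centre-line-plus-quartile construction above can be sketched as follows, using the bounding box of the foreground as the extent of the effective contour (an assumption; the method measures along the in-contour portion of the axes):

```python
def axis_positions(binary):
    """Return the three X-direction axes (row positions) and three
    Y-direction axes (column positions): the centre line of the
    effective contour plus the lines through the quartile points."""
    rows = [i for i, row in enumerate(binary) if any(row)]
    cols = [j for j in range(len(binary[0])) if any(row[j] for row in binary)]
    top, bottom = rows[0], rows[-1]
    left, right = cols[0], cols[-1]
    # k = 2 gives the centre line; k = 1 and k = 3 give the quartile lines.
    x_axes = [top + k * (bottom - top) / 4 for k in (1, 2, 3)]
    y_axes = [left + k * (right - left) / 4 for k in (1, 2, 3)]
    return x_axes, y_axes

# A 4x4 foreground blob spanning rows/columns 2..5.
blob = [[1 if 2 <= i <= 5 and 2 <= j <= 5 else 0 for j in range(8)]
        for i in range(8)]
print(axis_positions(blob))  # ([2.75, 3.5, 4.25], [2.75, 3.5, 4.25])
```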
Before the above restoration and comparison, the orientation of the binary image needs to be adjusted to realize spatial restoration. Specifically, the spatial restoration comprises vertically correcting the binary image, that is, rotating the central axis of the binary image to the vertical direction.
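The vertical correction can be sketched with image moments: estimate the orientation of the contour's central (major) axis and rotate the foreground coordinates so that the axis becomes vertical. Using second-order moments for the orientation is an assumption; the source does not say how the central axis is found.

```python
import math

def principal_angle(points):
    """Orientation of the major axis of a point set (x, y) relative to
    the image x-axis, from second-order central moments."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    mu20 = sum((x - cx) ** 2 for x, _ in points) / n
    mu02 = sum((y - cy) ** 2 for _, y in points) / n
    mu11 = sum((x - cx) * (y - cy) for x, y in points) / n
    return 0.5 * math.atan2(2 * mu11, mu20 - mu02)

def verticalize(points):
    """Rotate the coordinates about their centroid so that the major
    axis becomes vertical -- the spatial-restoration step."""
    a = math.pi / 2 - principal_angle(points)
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    return [((x - cx) * math.cos(a) - (y - cy) * math.sin(a) + cx,
             (x - cx) * math.sin(a) + (y - cy) * math.cos(a) + cy)
            for x, y in points]

# A diagonal streak of foreground pixels; after correction all points
# share (up to rounding) the same x coordinate, i.e. the axis is vertical.
pts = [(float(i), float(i)) for i in range(5)]
xs = [round(x, 6) for x, _ in verticalize(pts)]
print(len(set(xs)) == 1)  # True
```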
After the camera acquires the images and the multi-view original images are restored, a view of the real environment in the current planting area is obtained. From the camera position, the distance between the camera and the fruits and plants in the planting area and the pitch angle of the corresponding pixels can then be determined, so that the fruit volume, fruit mass, and plant density can be back-calculated.
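The back-calculation from the camera position can be sketched with a pinhole model: the pitch angle of a pixel's ray follows from its row offset and the focal length, and the ground distance from the camera height and that angle. The focal length, principal point, and flat-ground assumption here are illustrative; real calibration data would come from step S01.

```python
import math

def pixel_pitch_angle(row, principal_row, focal_px):
    """Pitch of the viewing ray through an image row for a pinhole
    camera with the given principal-point row and focal length in
    pixels. Positive values point below the optical axis."""
    return math.atan2(row - principal_row, focal_px)

def ground_distance(camera_height_m, pitch_rad):
    """Horizontal distance at which a ray pitched `pitch_rad` below
    horizontal, from a camera `camera_height_m` above flat ground,
    hits the ground."""
    return camera_height_m / math.tan(pitch_rad)

# A pixel 800 rows below the principal point with an 800 px focal
# length looks 45 degrees downward; from 3 m up that ray lands 3 m away.
angle = pixel_pitch_angle(1200, principal_row=400, focal_px=800)
print(round(math.degrees(angle), 1), round(ground_distance(3.0, angle), 2))
```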
The invention is not limited to the above alternative embodiments. As long as no contradiction arises, the technical features may be combined arbitrarily to obtain new technical solutions, and those skilled in the art can obtain further embodiments by combining the above embodiments under the teaching of the invention. The above detailed description should not be taken as limiting the scope of the invention, which is defined by the claims; the description is to be interpreted accordingly.
Claims (6)
1. A method for automatically evaluating crop yield, comprising:
setting a camera and calibrating the position of the camera;
acquiring an original image of a planting area through a camera;
acquiring a plant image and a fruit image from an original image, and processing the fruit image into a binary image;
carrying out boundary processing on the binary image, tracking the effective contour of the binary image, and acquiring the effective contour as a tracking result;
analyzing the pixels of the tracking result, marking initial pixels, boundary pixels, and undetermined pixels on the effective contour, and mapping plant density, fruit volume, and fruit mass to the pixels of the tracking result;
calculating the fruit volume, fruit mass, and plant density from the initial pixels, boundary pixels, and undetermined pixels of the tracking result according to the mapping relationship;
and calculating the overall yield from the fruit volume, fruit mass, and plant density.
2. The method for automatically evaluating crop yield according to claim 1, characterized in that: a single camera is used, the original image of the planting area is acquired in a single-camera multi-view mode, and the original image is processed according to viewing-angle and spatial-object restoration rules, so that restoration and comparison are realized.
3. The method for automatically evaluating crop yield according to claim 2, characterized in that: the restoration and comparison comprise: dividing the effective contour evenly into regions by three X-direction axes and three Y-direction axes, obtaining the number of pixels spanned by each X-direction and Y-direction axis, and comparing these values with preset pixel values in a database to obtain the fruit mass.
4. The method for automatically evaluating crop yield according to claim 3, characterized in that: the intersection of the horizontal center line and the vertical center line of the effective contour in the binary image is determined; the lines through this intersection that coincide with the horizontal and vertical center lines are taken as one X-direction axis and one Y-direction axis, respectively; the portion of each of these two axes lying inside the effective contour is divided into four equal parts; the lines through the quartile points on the Y-direction axis are the remaining X-direction axes, and the lines through the quartile points on the X-direction axis are the remaining Y-direction axes.
5. The method for automatically evaluating crop yield according to claim 2, characterized in that: the spatial restoration comprises vertically correcting the binary image and rotating the central axis of the binary image to the vertical direction.
6. The method for automatically evaluating crop yield according to claim 1, characterized in that: the distance between the camera and the fruits and plants in the planting area and the pitch angle of the corresponding pixels are determined according to the camera position, so that the fruit volume, fruit mass, and plant density are back-calculated.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911236153.5A CN111080103A (en) | 2019-12-05 | 2019-12-05 | Method for automatically evaluating crop yield |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911236153.5A CN111080103A (en) | 2019-12-05 | 2019-12-05 | Method for automatically evaluating crop yield |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111080103A true CN111080103A (en) | 2020-04-28 |
Family
ID=70313205
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911236153.5A Pending CN111080103A (en) | 2019-12-05 | 2019-12-05 | Method for automatically evaluating crop yield |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111080103A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160307329A1 (en) * | 2015-04-16 | 2016-10-20 | Regents Of The University Of Minnesota | Robotic surveying of fruit plants |
CN107358627A (en) * | 2017-07-12 | 2017-11-17 | 西北农林科技大学 | Fruit size detection method based on Kinect cameras |
CN109087377A (en) * | 2018-08-03 | 2018-12-25 | 北京字节跳动网络技术有限公司 | Method and apparatus for handling image |
US20180373932A1 (en) * | 2016-12-30 | 2018-12-27 | International Business Machines Corporation | Method and system for crop recognition and boundary delineation |
CN109886094A (en) * | 2019-01-08 | 2019-06-14 | 中国农业大学 | A kind of crop growth of cereal crop seedlings seedling gesture capturing analysis method and device |
US20190258859A1 (en) * | 2016-09-07 | 2019-08-22 | Precision Hawk Usa, Inc. | Systems and methods for mapping emerged plants |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113807129A (en) * | 2020-06-12 | 2021-12-17 | 广州极飞科技股份有限公司 | Crop area identification method and device, computer equipment and storage medium |
CN114910147A (en) * | 2021-12-14 | 2022-08-16 | 成都农业科技职业学院 | Internet of things-based maturity and yield estimation method and device |
CN114910147B (en) * | 2021-12-14 | 2023-10-24 | 成都农业科技职业学院 | Maturity and yield estimation method and device based on Internet of things |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11734925B2 (en) | Large-scale crop phenology extraction method based on shape model fitting method | |
Hui et al. | Image-based dynamic quantification and high-accuracy 3D evaluation of canopy structure of plant populations | |
Kurtser et al. | In-field grape cluster size assessment for vine yield estimation using a mobile robot and a consumer level RGB-D camera | |
CN107748886B (en) | Track type modern standardized orchard information sensing system based on depth camera | |
WO2010139628A1 (en) | Device and method for recording a plant | |
CN111080103A (en) | Method for automatically evaluating crop yield | |
US10602665B2 (en) | Two armed robotic system for adjusting the height of an agricultural tool | |
DE102015221085A1 (en) | Method and information system for recognizing at least one plant planted in a field | |
CN103808263A (en) | High-throughput detection method for grain form parameters | |
KR101974638B1 (en) | Apparatus for processing plant images and method thereof | |
CN109470179A (en) | A kind of extensive water ploughs vegetables growing way detection system and method | |
CN115272187A (en) | Vehicle-mounted dynamic field frame-to-frame relevance based field rice and wheat lodging global evaluation method | |
CN111985724B (en) | Crop yield estimation method, device, equipment and storage medium | |
CN104766135A (en) | Method, device and system for predicting crop yield | |
CN116485412B (en) | Agricultural product tracing method and system based on blockchain technology | |
Constantino et al. | Plant height measurement and tiller segmentation of rice crops using image processing | |
CN108323389A (en) | The detection method and device of the rice transplanting rice shoot spacing in the rows and cave rice shoot number of rice transplanter | |
CN114066842A (en) | Method, system and device for counting number of ears and storage medium | |
DE102014107143A1 (en) | System and method for efficient surface measurement using a laser displacement sensor | |
CN110738133B (en) | Method and device for identifying image contour boundaries of different agricultural facilities | |
CN109598215A (en) | Orchard modeling analysis system and method based on positioning shooting of unmanned aerial vehicle | |
CN113932712B (en) | Melon and fruit vegetable size measurement method based on depth camera and key points | |
CN118411607A (en) | Flower yield estimation method, device and medium based on computer vision | |
CN103919556A (en) | Cow body shape trait index data collecting method based on three-dimensional measuring | |
Kurtser et al. | PointNet and geometric reasoning for detection of grape vines from single frame RGB-D data in outdoor conditions |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20200428 |