CN111964760B - Fruit and vegetable weight calculation method and device based on intelligent planter

Info

Publication number: CN111964760B
Authority: CN (China)
Prior art keywords: fruit, weight, detection sensor, weight detection, crop
Legal status: Active (granted)
Application number: CN202010691092.8A
Other languages: Chinese (zh)
Other versions: CN111964760A
Inventor: 翁园林
Current assignee: Wuhan Ainong Yunlian Technology Co., Ltd.
Original assignee: Wuhan Ainong Yunlian Technology Co., Ltd.
Application filed by Wuhan Ainong Yunlian Technology Co., Ltd.
Priority application: CN202010691092.8A
Related publications: CN111964760A (application), CN111964760B (granted patent)

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01G WEIGHING
    • G01G 19/00 Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups
    • G01G 23/00 Auxiliary devices for weighing apparatus

Abstract

The invention discloses a fruit and vegetable weight calculation method and device based on an intelligent planter. The fruit and vegetable weight calculation method comprises the following steps: acquiring a multi-dimensional image of the crop in the intelligent planter and generating a crop model according to it, wherein the crop model comprises the distribution information of the fruits; acquiring the unit density information of each fruit detected by the weight detection sensors; establishing mapping parameters between the unit density information and the fruit positions according to the initial positions and/or motion tracks of the weight detection sensors; and determining the weight of each fruit in the crop model according to the mapping parameters and the crop model. The method determines the actual position of each fruit on the crop model (that is, on the crop as it actually grows) and calculates the weight of each fruit, so that on one hand people can be guided to pick fruits whose weight has reached the standard, improving yield, and on the other hand the factors influencing fruit weight can be determined through big data analysis to guide people to plant correctly.

Description

Fruit and vegetable weight calculation method and device based on intelligent planter
Technical Field
The invention belongs to the technical field of crop planting and detection, and particularly relates to a fruit and vegetable weight calculation method and device based on an intelligent planter.
Background
With rapid socio-economic development and rising living standards, both seasonal and out-of-season vegetables have become increasingly popular, and to meet the demand for healthy and safe vegetables, advanced planting technologies are gradually being introduced into everyday life.
However, during crop growth the weight of a fruit is difficult to judge from its size alone. Fruits that look large but are actually light are often picked even though they may still need to grow for some time to reach their optimal state, and picking them too early reduces the yield.
In view of the above, overcoming the drawbacks of the prior art is an urgent problem in the art.
Disclosure of Invention
In view of the defects or improvement needs of the prior art, the invention provides a fruit and vegetable weight calculation method and device based on an intelligent planter. The method determines the actual position of each fruit on the crop model (that is, on the crop as it actually grows) and calculates the weight of each fruit, so that on one hand people can be guided to pick fruits whose weight has reached the standard, improving yield, and on the other hand the factors influencing fruit weight can be determined through big data analysis to guide people to plant correctly.
In order to achieve the above object, according to one aspect of the present invention, there is provided a fruit and vegetable weight calculation method based on an intelligent planter, the fruit and vegetable weight calculation method including:
acquiring a multi-dimensional graph of a crop in an intelligent planter, and generating a crop model according to the multi-dimensional graph of the crop, wherein the crop model comprises distribution information of fruits;
acquiring unit density information of each fruit detected by a weight detection sensor;
establishing mapping parameters of unit density information and fruit positions according to the initial position and/or the motion track of the weight detection sensor;
and determining the weight of each fruit in the crop model according to the mapping parameters and the crop model.
Preferably, the establishing of the mapping parameter of the unit density information and the fruit position according to the initial position and/or the movement track of the weight detection sensor comprises:
acquiring an initial coordinate of each weight detection sensor in a preset coordinate system, and a rotation angle of each weight detection sensor in the process of detecting the unit density of the fruits;
determining the coordinates of fruits distributed on the crops under a preset coordinate system based on the initial coordinates of the weight detection sensor and the rotation angle of the weight detection sensor;
and establishing mapping parameters of unit density information and fruit positions according to the coordinates of each fruit in a preset coordinate system and the unit density detection result of the corresponding weight detection sensor.
Preferably, the determining coordinates of the fruits distributed on the crop under the preset coordinate system based on the initial coordinates of the weight detecting sensors and the rotation angles of the weight detecting sensors comprises:
after unit density information sent by any weight detection sensor is received, the rotation angle of the corresponding weight detection sensor is obtained;
and determining the coordinates of the fruits distributed on the crops under a preset coordinate system by combining the initial coordinates and the corresponding rotation angles of the at least two weight detection sensors.
Preferably, the establishing a mapping parameter of the unit density information and the fruit position according to the initial position and/or the movement track of the weight detection sensor further comprises:
judging whether the coordinates of the fruits distributed on the crops under a preset coordinate system are the same or not;
if the fruit to be verified exists, determining the fruit to be verified according to the same coordinate, selecting a first weight detection sensor and a second weight detection sensor, and respectively adjusting the detection angles of the first weight detection sensor and the second weight detection sensor so that the detection signals of the first weight detection sensor and the second weight detection sensor are both emitted to the same fruit to be verified;
and correcting the coordinates of the fruit to be verified under a preset coordinate system according to the detection conditions of the first weight detection sensor and the second weight detection sensor.
Preferably, the correcting the coordinates of the fruit to be verified in the preset coordinate system according to the detection conditions of the first weight detecting sensor and the second weight detecting sensor comprises:
determining a first distance between the first weight detection sensor and the fruit to be verified and a second distance between the second weight detection sensor and the fruit to be verified according to the signal strength received by the first weight detection sensor and the second weight detection sensor respectively;
and correcting the coordinates of the fruit to be verified under a preset coordinate system according to the first distance, the second distance, the coordinates of the first weight detection sensor and the coordinates of the second weight detection sensor.
Preferably, the obtaining a multi-dimensional map of a crop in an intelligent planter, and the generating a crop model from the multi-dimensional map of the crop comprises:
receiving a plurality of pictures of crops in the intelligent planter reported by a user;
fitting the pictures of the crops in the intelligent planting machines to obtain a multi-dimensional image of the crops in the intelligent planting machines;
and generating a crop model according to the multi-dimensional graph of the crop.
Preferably, for the receiving of the pictures of the crops in the intelligent planter reported by the user, the request to report the pictures is generated either when the user actively triggers the intelligent terminal, or automatically by the intelligent terminal after it identifies that the pictures contain an intelligent planter object.
Preferably, the determining the weight of each fruit in the crop model according to the mapping parameters and the crop model comprises:
determining the unit density of each fruit in the crop model according to the mapping parameters and the crop model;
calculating the volume of each fruit according to the crop model and the selected reference object in the crop model;
the weight of each fruit was calculated from the unit density of each fruit and the volume of each fruit.
Preferably, the server is connected with a plurality of intelligent terminals, and the fruit and vegetable weight calculation method further comprises the following steps:
and aiming at each intelligent terminal, after the weight of the fruit corresponding to each intelligent terminal is calculated, ranking is carried out according to the weight condition, and the planting condition of each user is displayed in a ranking mode.
According to another aspect of the invention, there is provided a fruit and vegetable weight calculation apparatus comprising at least one processor and a memory communicatively coupled to the at least one processor; the memory stores instructions executable by the at least one processor, and the instructions are programmed to perform the fruit and vegetable weight calculation method described above.
Generally, compared with the prior art, the technical scheme of the invention has the following beneficial effects. The fruit and vegetable weight calculation method comprises: acquiring a multi-dimensional image of the crop in the intelligent planter and generating a crop model according to it, wherein the crop model comprises the distribution information of the fruits; acquiring the unit density information of each fruit detected by the weight detection sensors; establishing mapping parameters between the unit density information and the fruit positions according to the initial positions and/or motion tracks of the weight detection sensors; and determining the weight of each fruit in the crop model according to the mapping parameters and the crop model. The method determines the actual position of each fruit on the crop model (that is, on the crop as it actually grows) and calculates the weight of each fruit, so that on one hand people can be guided to pick fruits whose weight has reached the standard, improving yield, and on the other hand the factors influencing fruit weight can be determined through big data analysis to guide people to plant correctly.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required to be used in the embodiments of the present invention will be briefly described below. It is obvious that the drawings described below are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
FIG. 1 is a schematic structural diagram of an intelligent planter according to the present invention;
fig. 2a is a schematic top view of a planter body of an intelligent planter according to the present invention (only a partial structure is shown);
FIG. 2b is a schematic diagram of an implementation of the rotation of the rotating base relative to the planter body according to an embodiment of the present invention;
FIG. 2c is a schematic cross-sectional view of FIG. 2b according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a light supplement band according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a support frame provided in the present invention;
FIG. 5 is a schematic view of another intelligent planter according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of another support frame provided in the embodiment of the present invention;
FIG. 7a is a schematic diagram of another implementation of the rotation of the rotating base relative to the planter body according to an embodiment of the present invention;
FIG. 7b is a schematic cross-sectional view of FIG. 7a according to an embodiment of the present invention;
FIG. 8 is a schematic flowchart illustrating a method for calculating a weight of fruits and vegetables based on an intelligent planting machine according to an embodiment of the present invention;
FIG. 9 is a detailed flowchart of step 102 in FIG. 8;
FIG. 10 is a diagram illustrating how the preset coordinate system is established according to an embodiment of the present invention;
FIG. 11a is a schematic diagram illustrating the rotation of the supporting frame by an angle a according to the embodiment of the present invention;
FIG. 11b is a schematic diagram illustrating coordinate transformation for mapping the real object to the predetermined coordinate system in FIG. 11a according to an embodiment of the present invention;
FIG. 12 is a schematic view of the detection angle range of the weight detection sensor provided in the embodiment of the present invention;
FIG. 13 is a schematic flow chart of steps further included between steps 102b and 102c in FIG. 9;
FIG. 14 is a schematic diagram of a first implementation for accurately determining the position of a fruit to be verified according to an embodiment of the present invention;
FIG. 15 is a schematic diagram of a second implementation for accurately determining the position of a fruit to be verified according to an embodiment of the present invention;
fig. 16 is a schematic structural diagram of a fruit and vegetable weight calculating device provided by an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Example 1:
This embodiment provides an intelligent planter that can be used to grow crops such as various vegetables or fruits. The intelligent planter is suitable not only for small-scale home planting but also for large-scale farming; crops grown with it grow fast, can be planted densely, and give a high yield.
As shown in fig. 1 and fig. 2a, the intelligent planter of the present embodiment comprises a planter body 1, a support frame 2, a plurality of weight detecting sensors 20, and a rotating base 3; the weight detection sensors 20 are distributed on the support frame 2, and the weight detection sensors 20 are used for detecting and reporting the unit density information of the fruits on the crops in the intelligent planting machine.
The rotating base 3 is connected with the support frame 2 and is used to drive the support frame 2 to rotate around the crop in the intelligent planter, so as to adjust the positions and/or motion tracks of the weight detection sensors 20 and thereby detect the fruits on the crop in the intelligent planter from all directions. As shown in fig. 2c, a receiving groove 32 is formed on the rotating base 3 and is used for receiving the bottom of the support frame 2; the receiving groove 32 and the support frame 2 form a detachable connection, for example a magnetic attachment or a screw connection.
In an optional scheme, a circular track 11 is provided on the planter body 1, and the rotating base 3 can rotate 360 degrees along the circular track 11, driving the support frame 2 to rotate around the crop in the intelligent planter so that the fruits on the crop can be detected from all directions.
In a practical application scenario, as shown in fig. 2b and fig. 2c, a driving wheel 12 is arranged on the planter body 1 and a driven wheel 31 is arranged on the rotating base 3; a first gear 121 is arranged on the driving wheel 12, a second gear 311 is arranged on the driven wheel 31, and the driving wheel 12 and the driven wheel 31 are engaged through the first gear 121 and the second gear 311. The driving wheel 12 and the driven wheel 31 cooperate to drive the rotating base 3 to rotate along the track 11. In one optional scheme, the intelligent planter further includes a motor connected with the driven wheel 31; the driving wheel 12 remains fixed relative to the planter body 1, and the motor drives the driven wheel 31 to roll along the driving wheel 12, thereby driving the support frame 2 to rotate along the track 11. As shown in fig. 2a, the rotating base 3 can start from the initial position (right-side position), move along the track 11 to the lower, left and upper positions in turn, and then return to the initial position (right-side position). In another optional scheme, the intelligent planter further includes a motor connected with the driving wheel 12; the motor drives the driving wheel 12 to rotate, and the driving wheel 12 drives the driven wheel 31, thereby driving the support frame 2 to rotate along the track 11.
Furthermore, a traction piece 13 is further arranged on the planter body 1, an internal gear 131 is arranged on the traction piece 13, and the traction piece 13 and the driving wheel 12 are coaxially arranged; the internal gear 131 on the traction element 13 and the first gear 121 on the driving wheel 12 form a track 11 for the rotation of the rotating base 3; the second gear 311 on the driven wheel 31 is also in gear engagement with the internal gear 131 on the traction element 13. The rotating base 3 is driven to rotate along the track 11 by the mutual cooperation among the driving wheel 12, the driven wheel 31 and the traction piece 13.
Wherein, the support frame 2 comprises at least two support arms which can extend along two different directions, the weight detecting sensors 20 are distributed on the support arms which extend along the different directions, so that the positions of the detected fruits can be determined according to the positions of the different weight detecting sensors 20 and the detection result.
In an alternative, as shown in fig. 4, the support frame 2 includes a first support arm 21 and a second support arm 22, and the first support arm 21 is movably connected to the second support arm 22. It can be understood that the angle between the first support arm 21 and the second support arm 22 is adjustable; for example, the angle between them may be 90 degrees. The weight detection sensors 20 are distributed on both the first support arm 21 and the second support arm 22. During the actual analysis, a coordinate system can be established with the first support arm 21 and the second support arm 22 as references, so as to determine the specific position of each fruit.
In an alternative solution, the first support arm 21 and the second support arm 22 are retractable to adjust the length of the first support arm 21 and the second support arm 22 according to the crop production conditions in the smart planter. In practical application scenarios, different crops are different in size, for example, some crops are narrow and long (which may be understood as being laterally shorter and longitudinally higher), some crops are fat and short (which may be understood as being laterally longer and longitudinally shorter), and the lengths of the first supporting arm 21 and the second supporting arm 22 may be adjusted according to the actual growth conditions of the crops, so as to ensure that the weight detecting sensors 20 on the supporting arms can fully cover the fruits on the crops.
In a practical application scenario, when the fruit density is too high, there is a possibility that adjacent fruits are commonly detected by the weight detection sensors 20 at the same position, so that overlapping detection occurs, and the real position of the fruit is misjudged. To solve this problem, in a preferred embodiment, the detection angle of the weight detection sensor 20 is adjustable to re-detect the fruit to be verified by adjusting the detection angle of some of the weight detection sensors 20. Here, the detection angle refers to a light emission angle of the weight detection sensor 20.
In practical application scenarios, the intelligent planting machine of the present embodiment further includes an adjustable support 4, and some climbing crops, such as cucumbers, tomatoes, etc., can grow along the adjustable support 4, wherein the height of the adjustable support 4 can be adjusted to adapt to the growth of the crops.
In another alternative, as shown in figs. 5 to 7, the support frame 2 includes a first support arm 21, a second support arm 22 and a third support arm 23, with the second support arm 22 perpendicular to the first support arm 21 and the third support arm 23; the number of the rotating bases 3 and the driven wheels is two, and the rest of the structure is the same as in the above embodiment and is not described again here. With this structure, the support frame 2 can rotate 180 degrees along the track 11 and still achieve omnidirectional detection.
In a preferred embodiment, the intelligent planter further includes a light supplement strip 23, which can be arranged on the support frame 2. The LED light supplement strip can be an LED strip with a ratio of five red lamps to one blue lamp, forming a high-power, high-efficiency plant-growth supplementary light that provides the spectrum required at different growth stages, prevents the crop from growing only leaves and stems without bearing fruit, greatly improves yield, and automatically supplements light when illumination is insufficient. As shown in fig. 3, there may be several light supplement strips 23, specifically a first light supplement strip 231, a second light supplement strip 232 and a third light supplement strip 233, where the illumination intensity of the first light supplement strip 231 is greater than that of the second light supplement strip 232, and the intensity of the second light supplement strip 232 is greater than that of the third light supplement strip 233. According to the actual growth condition of the crop, the corresponding light supplement strip 23 is selectively switched on, realizing intelligent illumination. Alternatively, the illumination intensities of the first light supplement strip 231, the second light supplement strip 232 and the third light supplement strip 233 can be adjustable, and they can be adjusted according to the actual growth condition of the crop to ensure that the crop grows well.
In order to realize automatic planting, in a preferred scheme the intelligent planter further includes a precision drip irrigation system, an illumination sensor, an air temperature and humidity sensor and a soil moisture sensor. The sensors monitor the illumination, temperature, humidity and other information of the growing plants in real time, and this information is integrated to provide the planter with an accurate information source, so that the LED light supplement strip can be controlled to supplement light for the crop and nutrient solution or water can be supplied to the crop through the precision drip irrigation system.
The precision drip irrigation system comprises a silent water pump whose flow rate can be adjusted steplessly from 0 L/h to 100 L/h; diluted nutrient solution (high-concentration concentrated nutrient solution diluted 1:500) is fed into the planting pot through a conduit connected with the water pump.
In a preferred scheme, the intelligent planter further includes a liquid level sensor that monitors the nutrient solution level in real time, prevents the liquid level from becoming too high or too low, and raises an alarm when water is short.
Further, the intelligent planter includes a main control chip, which can be a high-performance main control chip based on an ARM Cortex-M3 core. The main control chip is connected with the illumination sensor, the air temperature and humidity sensor, the soil moisture sensor and the liquid level sensor respectively, so as to receive the detection information of each sensor and obtain the growth state of the crop. The main control chip is also connected with the LED light supplement strip and the precision drip irrigation system respectively, so that, according to the growth state of the crop, it can control the LED light supplement strip to supplement light for the crop and supply nutrient solution or water to the crop through the precision drip irrigation system.
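As an illustration of the kind of control logic the main control chip could run, the following minimal sketch maps sensor readings to light-supplement, drip-irrigation and alarm actions. All thresholds, function names and the data structure are illustrative assumptions and are not specified in this embodiment.

```python
# Illustrative control-loop sketch (assumed names and thresholds, not from the patent).

from dataclasses import dataclass

@dataclass
class SensorReadings:
    illumination_lux: float
    air_temp_c: float
    air_humidity_pct: float
    soil_moisture_pct: float
    nutrient_level_pct: float

def control_step(readings: SensorReadings,
                 lux_threshold: float = 2000.0,
                 soil_moisture_min: float = 35.0,
                 level_min: float = 10.0) -> dict:
    """Decide which actuators to drive for one control cycle."""
    actions = {"led_strip_on": False, "pump_flow_l_per_h": 0.0, "low_water_alarm": False}

    # Supplement light automatically when illumination is insufficient.
    if readings.illumination_lux < lux_threshold:
        actions["led_strip_on"] = True

    # Drive the stepless pump (0-100 L/h) when the soil is too dry.
    if readings.soil_moisture_pct < soil_moisture_min:
        deficit = soil_moisture_min - readings.soil_moisture_pct
        actions["pump_flow_l_per_h"] = min(100.0, 5.0 * deficit)

    # Raise the water-shortage alarm when the nutrient solution level is too low.
    if readings.nutrient_level_pct < level_min:
        actions["low_water_alarm"] = True
        actions["pump_flow_l_per_h"] = 0.0  # do not run the pump dry

    return actions
```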
In addition, the main control chip is also connected to the weight detection sensors 20 so as to obtain the unit density detection result of each weight detection sensor 20. Or, when the position of the fruit needs to be accurately determined, the main control chip is further used for controlling and adjusting the detection angle of the weight detection sensor 20.
By combining the fruit and vegetable weight calculation method in embodiment 2, the main control chip in this embodiment can establish a connection with the intelligent terminal and/or the server, so as to report the detected unit density information and other information.
With the intelligent planter of this embodiment, the weight of each fruit can be analyzed intelligently through the weight detection sensors. On one hand, this can guide people to pick fruits whose weight has reached the standard and increase production; on the other hand, the factors influencing fruit weight can also be determined through big data analysis, so as to guide people to plant correctly.
Example 2:
This embodiment provides a fruit and vegetable weight calculation method based on an intelligent planter, which is applicable to the intelligent planter of any of the above embodiments. The method can determine the actual position of each fruit on the crop model (that is, on the crop as it actually grows) and calculate the weight of each fruit, so that on one hand people can be guided to pick fruits whose weight has reached the standard, improving yield, and on the other hand the factors influencing fruit weight can be determined through big data analysis, so as to guide people to plant correctly. In addition, a ranking can be generated according to weight, which makes planting more engaging.
Referring to fig. 8, an implementation process of the fruit and vegetable weight calculation method is specifically described, and the fruit and vegetable weight calculation method of the embodiment includes the following steps:
step 100: the method comprises the steps of obtaining a multi-dimensional graph of crops in the intelligent planter, and generating a crop model according to the multi-dimensional graph of the crops, wherein the crop model comprises distribution information of fruits.
The crop in the intelligent planter grows on a culture medium and can bear multiple fruits; for example, the crop can be a vegetable or fruit such as tomato, cucumber, green pepper or cherry.
The multi-dimensional graph of the crop can cover the omni-directional characteristics of the crop, and the multi-dimensional graph can be obtained according to a plurality of photos at different angles or can be obtained by analyzing a video containing omni-directional information of the crop.
The crop model can be a three-dimensional model, is obtained by modeling according to the real shape and size of the crop and the distribution of the fruits, and simulates and shows the actual growth condition of the crop. In an actual application scene, under a preset coordinate system, three-dimensional coordinates of fruits distributed on crops can be determined according to a crop model so as to establish a mapping relation between unit density information and fruit positions in a follow-up manner.
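As a concrete illustration of how a crop model carrying fruit distribution information might be represented in software, the following minimal sketch is given. The class and field names are illustrative assumptions; the patent does not prescribe any particular data format.

```python
# Minimal sketch of a crop model with fruit distribution information
# (class and field names are illustrative assumptions, not from the patent).

from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Fruit:
    fruit_id: int
    position: Tuple[float, float, float]   # (x, y, z) in the preset coordinate system
    unit_density: Optional[float] = None   # filled in later from the mapping parameters
    volume: Optional[float] = None         # estimated from the model and a reference object

@dataclass
class CropModel:
    crop_type: str
    fruits: List[Fruit] = field(default_factory=list)

# Example: a tomato plant model with two fruits at known model coordinates.
model = CropModel(crop_type="tomato",
                  fruits=[Fruit(1, (0.25, 0.0, 0.40)), Fruit(2, (0.10, 0.15, 0.55))])
```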
Step 101: information on the unit density of each fruit detected by the weight detection sensor is acquired.
The weight detection sensor can be a sensor manufactured based on a near infrared spectrum detection technology, and in the actual detection process, the near infrared spectrum detection technology is adopted to detect the unit density information of the fruits. The near-infrared light is an electromagnetic wave between visible light and mid-infrared light, and generally refers to an electromagnetic wave having a wavelength in a range of 780nm to 2526 nm.
When a fruit interacts with near-infrared light, information about its components and internal quality parameters is carried by the light, and those internal quality parameters can then be analyzed and extracted from it. Specifically, when near-infrared light is irradiated onto a fruit, the internal components of different fruits absorb and reflect different wavelengths to different degrees, and the spectral characteristics change according to the internal components of the fruit (for example, molecular bonds such as O-H, N-H and C-H) and the mass fractions of those components. The main components of the fruit and their mass fractions can therefore be analyzed from the near-infrared spectrum, and the unit density information of the fruit determined accordingly.
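In practice, turning a measured near-infrared spectrum into a unit-density estimate is typically done with a calibration model fitted offline against reference measurements. The sketch below assumes a pre-fitted linear (PLS-style) calibration vector; the coefficients, wavelength grid and function names are illustrative assumptions rather than anything specified in this embodiment.

```python
# Sketch of estimating unit density from an NIR reflectance spectrum using a
# pre-fitted linear calibration model (all coefficients are assumed placeholders).

import numpy as np

def estimate_unit_density(reflectance: np.ndarray,
                          calibration_coeffs: np.ndarray,
                          intercept: float) -> float:
    """
    reflectance        : reflectance values sampled on a fixed wavelength grid
                         (e.g. 780-2526 nm, the near-infrared range cited above)
    calibration_coeffs : regression coefficients fitted offline against
                         reference density measurements
    intercept          : regression intercept
    """
    # Convert reflectance to absorbance, the usual input for NIR calibration.
    absorbance = -np.log10(np.clip(reflectance, 1e-6, None))
    return float(absorbance @ calibration_coeffs + intercept)

# Example with a toy 4-point spectrum and made-up coefficients:
spectrum = np.array([0.42, 0.37, 0.55, 0.61])
coeffs = np.array([0.8, -0.3, 0.5, 0.2])
density = estimate_unit_density(spectrum, coeffs, intercept=0.95)  # g/cm^3 (toy value)
```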
Step 102: and establishing a mapping parameter of unit density information and fruit positions according to the initial position and/or the motion track of the weight detection sensor.
In this embodiment, the position of the detected fruit is determined according to the initial position and/or the movement locus of the weight detection sensor, and then the mapping parameter between the unit density information and the fruit position is established according to the position of the detected fruit and the unit density information thereof. Wherein the fruit position is estimated based on the position of the weight detecting sensor in this step, there may be a deviation from the actual position of the fruit, but the deviation is within an acceptable range.
A plurality of weight detection sensors are distributed on the intelligent planter, in at least two different directions, and the position of a fruit can be determined from weight detection sensors that are not in the same direction. In the initial state, the detection angle of each weight detection sensor is fixed towards one direction, and the position of the fruit is determined by combining the fact that two intersecting straight lines in the same plane meet at exactly one point (as shown in fig. 10) with the similarity of the density detected by the two weight detection sensors.
In order to detect the unit density information of all the fruits on the crop from all directions, the support frame on which the weight detection sensors are mounted can rotate along a preset direction, driving the weight detection sensors to change position and move 360 degrees around the crop, so that all the fruits on the crop are detected.
The initial position of a weight detection sensor can be its position before any rotation takes place, and the weight detection sensor returns to the initial position after a 360-degree rotation. Taking the intelligent planter shown in fig. 1 as an example, in fig. 2a the weight detection sensors are at their initial positions when the support frame is at the right-side position.
Step 103: and determining the weight of each fruit in the crop model according to the mapping parameters and the crop model.
The fruit positions in the mapping parameters are positions predicted by a mathematical algorithm, while the positions of the fruits in the crop model match the actual growth of the crop. Coordinate matching is therefore carried out between the fruit positions in the mapping parameters and the fruit distribution in the crop model, so that the unit density information of each fruit corresponds to its real position.
It should be noted that when performing coordinate matching, the reference coordinate system of the crop model must be the same as that of the mapping parameters; if the two differ, coordinate conversion is required first, so that the fruit coordinates are all expressed in the same reference coordinate system.
In practical application scenarios there can be multiple ways of showing people the position of a fruit whose weight has reached the standard so that it can be picked. For example, the fruit can be marked with a highlight or a different color in the crop model; since the crop model matches the actual growth of the crop, people can locate the fruit directly by referring to the model, or an LED lamp on the intelligent planter can be triggered to shine its light onto the fruit.
In step 103, the actual volume of each fruit can be determined by combining the pictures uploaded by the user with a designated reference object, and the weight of the fruit is obtained by multiplying its unit density by its volume. The designated reference object can be part of the support frame, such as the first support arm and the second support arm described in fig. 4. One implementation of step 103 is: determining the unit density of each fruit in the crop model according to the mapping parameters and the crop model; calculating the volume of each fruit according to the crop model and the reference object selected in the crop model; and calculating the weight of each fruit from its unit density and volume.
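The reference-object scaling and the weight = unit density x volume step can be sketched as follows. This is a minimal sketch under simplifying assumptions: the fruit is approximated as an ellipsoid, the real length of the reference object (for example a support arm) is known, and all function names are illustrative rather than part of the patent.

```python
# Sketch: estimate fruit volume from image dimensions using a reference object of
# known length, then weight = unit density * volume. The ellipsoid approximation
# and all names here are illustrative assumptions.

import math

def pixels_to_cm(length_px: float, reference_px: float, reference_cm: float) -> float:
    """Convert a pixel measurement to centimetres using the reference object."""
    return length_px * (reference_cm / reference_px)

def fruit_weight_g(width_px: float, height_px: float, depth_px: float,
                   reference_px: float, reference_cm: float,
                   unit_density_g_per_cm3: float) -> float:
    # Convert the three measured extents of the fruit to real-world units.
    a = pixels_to_cm(width_px, reference_px, reference_cm) / 2.0
    b = pixels_to_cm(height_px, reference_px, reference_cm) / 2.0
    c = pixels_to_cm(depth_px, reference_px, reference_cm) / 2.0
    # Approximate the fruit as an ellipsoid: V = 4/3 * pi * a * b * c.
    volume_cm3 = (4.0 / 3.0) * math.pi * a * b * c
    return unit_density_g_per_cm3 * volume_cm3

# Example: a tomato-like fruit measured against a support arm of known length.
w = fruit_weight_g(width_px=120, height_px=110, depth_px=115,
                   reference_px=400, reference_cm=30.0,
                   unit_density_g_per_cm3=0.95)
```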
In an actual application scenario, in order to protect the privacy of the user, no camera is installed on the intelligent planter; the multi-dimensional pictures of the crop are generally taken by the user and uploaded to a server, and the server generates the crop model according to the multi-dimensional image of the crop.
In step 100, a multi-dimensional map of a crop in an intelligent planter is obtained, and a crop model is generated according to the multi-dimensional map of the crop, wherein at least two implementation manners exist as follows:
the first method is as follows: the intelligent terminal of the user side is loaded with an application program (APP), and the application program is used for monitoring the growth state of crops in the intelligent planting machine. The APP can automatically identify whether the picture contains the intelligent planter object or not according to the picture shot by the user, and if the picture contains the intelligent planter object, the intelligent terminal initiates a picture request to be reported to the server. The server side receives the pictures of the crops in the intelligent planting machines reported by the intelligent terminal according to the reported picture request, and fits the pictures of the crops in the intelligent planting machines to obtain a multi-dimensional graph of the crops in the intelligent planting machines; and generating a crop model according to the multi-dimensional graph of the crop. Or the APP can automatically identify whether the picture contains the intelligent planter object or not according to the picture shot by the user, if so, the intelligent terminal prompts the user to initiate a request for reporting the picture to the server, and after the intelligent terminal receives a picture reporting operation triggered by the user, the intelligent terminal establishes connection with the server and initiates a picture reporting request to the server.
The second method comprises the following steps: in a first mode, the APP loaded on the intelligent terminal can automatically identify the picture taken by the user, and the privacy of the user is invaded to a certain extent. In a preferred mode, after a user shoots a picture of a crop, the intelligent terminal is actively triggered to report the picture to the server, the server performs image recognition, and after the picture is determined to contain an intelligent planter object, the pictures of the crop in multiple intelligent planters are fitted to obtain a multi-dimensional graph of the crop in the intelligent planter; and generating a crop model according to the multi-dimensional graph of the crop.
The fruit density of different crops (the fruit density refers to the interval or distance between adjacent fruits) is different, even the same crop has the situation that the fruit density is greatly different, when the distribution of the weight detection sensors is not matched with the fruit density, the problem of missing detection or overlapping detection is easy to occur, for example, when the number of the weight detection sensors is small and the fruit density is high, a part of fruits can not be detected (missing detection) possibly; when the number of the weight detection sensors is large and the fruit density is small, there is a possibility that adjacent fruits are collectively detected by the weight detection sensors at the same position (overlap detection), resulting in erroneous judgment of the true position of the fruit.
In order to solve the foregoing problems, in a preferred embodiment, before step 101, the density of the fruits or the number and spacing of the fruits in different directions may be determined according to the distribution information of the fruits in the crop model, and then the distribution of the weight detection sensors on the intelligent planter is adjusted according to the actual situation, so that the weight detection sensors and the distribution of the fruits on the crops can achieve the best adaptation degree, the fruits on the crops can be completely covered, and the condition of missing detection can be avoided as much as possible.
In an alternative, with reference to fig. 9 to 11, in step 102, establishing a mapping parameter of the unit density information and the fruit position according to the initial position and/or the movement track of the weight detection sensor may be implemented as follows:
step 102 a: and acquiring initial coordinates of each weight detection sensor in a preset coordinate system, and the rotating angle of each weight detection sensor in the process of detecting the unit density of the fruits.
The preset coordinate system may be set with reference to fig. 10 (for example, the direction of the first support arm is a Z axis, and the direction of the second support arm is an X axis), where in fig. 10, the left side is an actual diagram, and the right side is a diagram that the actual diagram on the left side is mapped to the preset coordinate system. In an alternative embodiment, the support frame rotates along a straight line (a central line extending along the Z-axis direction) where a central point of the second support arm is located during the rotation process, the central point of the second support arm is used as an origin, the direction of the first support arm is the Z-axis, and the direction of the second support arm is the X-axis, so as to establish a preset coordinate system. In other embodiments, a preset coordinate system may be established in other manners, so as to ensure that the preset coordinate system corresponds to the reference object.
The rotation angle of a weight detection sensor can be determined from the rotation angle of the support frame driven by the rotating base. Fig. 11a shows a comparison of the support frame before and after rotating by an angle a, and fig. 11b shows the coordinates of the same weight detection sensor before and after the rotation, where a is the angle through which the rotating base turns the weight detection sensor. As shown in fig. 11b, if the initial coordinate of a weight detection sensor on the first support arm is (x1, 0, z1), then after rotating by the angle a its coordinate is (x1 cos a, x1 sin a, z1); if the initial coordinate of a weight detection sensor on the second support arm is (x0, 0, 0), then after rotating by the angle a its coordinate is (x0 cos a, x0 sin a, 0). Therefore, the weight detection sensors distributed in different directions, combined with the rotation angle, determine the three-dimensional coordinates of the fruits on the crop.
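The coordinate update above is a rotation about the Z axis. The short sketch below reproduces that transform; the function name is an illustrative assumption.

```python
# Rotation of a sensor's initial coordinate about the Z axis by angle a (radians),
# matching (x, y, z) -> (x*cos a - y*sin a, x*sin a + y*cos a, z); with y = 0 at the
# initial position this reduces to (x*cos a, x*sin a, z) as in the text above.

import math
from typing import Tuple

def rotate_about_z(initial: Tuple[float, float, float], a: float) -> Tuple[float, float, float]:
    x, y, z = initial
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

# Example: a sensor on the first support arm at (x1, 0, z1) = (0.3, 0, 0.5),
# after the support frame rotates by 60 degrees.
rotated = rotate_about_z((0.3, 0.0, 0.5), math.radians(60))
```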
In an actual application scenario, the weight detection sensors rotate about the Z axis, and during the 360-degree rotation they achieve omnidirectional three-dimensional detection. Under the preset coordinate system, the coordinate of each weight detection sensor also changes with its rotation angle, so the position of a fruit can be calibrated through the weight detection sensors.
Step 102 b: and determining the coordinates of the fruits distributed on the crops under a preset coordinate system based on the initial coordinates of the weight detection sensors and the rotation angles of the weight detection sensors.
In the initial state, the detection angle of the weight detection sensor is fixed to face one direction, and the position of the fruit is determined by combining the theory that two straight lines in the same plane intersect at one point and the density detection similarity condition of the two weight detection sensors.
Specifically, after unit density information sent by any weight detection sensor is received, the rotation angle of the corresponding weight detection sensor is acquired; the coordinates of the fruits distributed on the crop under the preset coordinate system are then determined by combining the initial coordinates and the corresponding rotation angles of at least two weight detection sensors. In fig. 10 the rotation angle is zero, and the position of a detected fruit can be determined by combining the initial coordinates of two weight detection sensors distributed in different directions.
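As an illustration of this two-line-intersection localization in the plane of the support frame (fig. 10), the following sketch assumes the initial detection directions described later in this embodiment: the sensor on the horizontal arm looks straight down and the sensor on the vertical arm looks horizontally towards the crop. The function name and the example values are assumptions.

```python
# Sketch of locating a fruit in the support-frame plane (y = 0) as the intersection
# of two sensor detection lines, assuming the initial detection directions:
# horizontal-arm sensor at (x0, 0, 0) looking straight down (line x = x0),
# vertical-arm sensor at (x1, 0, z1) looking horizontally (line z = z1).
# The fruit then lies at (x0, 0, z1) before rotation; applying the support frame's
# rotation about the Z axis (cf. the rotation sketch above) maps it into the
# preset coordinate system.

import math
from typing import Tuple

def locate_fruit(x0: float, z1: float, rotation_angle: float) -> Tuple[float, float, float]:
    """Intersection of the two detection lines, rotated by the frame's angle."""
    x, y, z = x0, 0.0, z1
    return (x * math.cos(rotation_angle),
            x * math.sin(rotation_angle),
            z)

# Example: both sensors report similar unit density at rotation angle 0,
# horizontal sensor at x0 = 0.25 m, vertical sensor at height z1 = 0.4 m.
fruit_xyz = locate_fruit(0.25, 0.4, rotation_angle=0.0)   # -> (0.25, 0.0, 0.4)
```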
Step 102 c: and establishing mapping parameters of unit density information and fruit positions according to the coordinates of each fruit in a preset coordinate system and the unit density detection result of the corresponding weight detection sensor.
In a practical application scenario, when the fruit density is too high, adjacent fruits may be detected jointly by the weight detection sensors at the same position, so that overlapping detection occurs and the true position of a fruit is misjudged. To solve this problem, in a preferred embodiment, the fruit to be verified can be re-detected by adjusting the detection angles of some of the weight detection sensors. Here, the detection angle refers to the light-emission angle of a weight detection sensor. As shown in fig. 12, the adjustable detection angle of a weight detection sensor located in the horizontal direction (the direction of the X axis) ranges from 180 degrees to 360 degrees; in the initial state its detection angle is 270 degrees (as shown in fig. 10), that is, its light is emitted vertically downwards. The adjustable detection angle of a weight detection sensor located in the vertical direction (the direction of the Z axis) ranges from 90 degrees to 270 degrees; in the initial state its detection angle is 180 degrees, that is, its light is emitted horizontally to the left (as shown in fig. 10).
Between the step 102b and the step 102c, the fruit to be verified is determined according to the coordinate condition of the fruit, and then the detection angle of the corresponding weight detection sensor is adjusted, so that the position of the fruit is accurately determined. Please refer to fig. 13 for the detailed steps.
Step 102b 1: and judging whether the coordinates of the fruits distributed on the crops under the preset coordinate system are the same or not.
In this embodiment, when the positions of the fruits are calibrated using the weight detection sensors, the presence of at least two identical coordinates indicates that duplicate detection has occurred.
Step 102b 2: if the fruit to be verified exists, determining the fruit to be verified according to the same coordinate, selecting a first weight detection sensor and a second weight detection sensor, and respectively adjusting the detection angles of the first weight detection sensor and the second weight detection sensor so that the detection signals of the first weight detection sensor and the second weight detection sensor are both emitted to the same fruit to be verified.
Taking the coordinate system shown in fig. 11b as an example, assume that the repeated coordinate point is (X, Y, Z); the coordinate contains information in three directions (X, Y and Z), and verification can start from one direction. First, (X, Y, Z) is converted back into (x0, 0, 0), (x1, 0, z1) and the corresponding rotation angle, where x1 is the initial X coordinate of the weight detection sensor on the first support arm before the support frame was rotated. The weight detection sensor with initial coordinate (x0, 0, 0) is selected as the first weight detection sensor, and the sensor closest to it is selected as the second weight detection sensor. The rotating base then turns the weight detection sensors to the corresponding angle according to the rotation angle, and finally the detection angles of the first weight detection sensor and the second weight detection sensor are adjusted respectively, so that the detection signals of both are emitted towards the same fruit to be verified.
Alternatively or additionally, the weight detection sensor with initial coordinate (x1, 0, z1) can be selected as the first weight detection sensor, with the sensor closest to it as the second weight detection sensor.
When the position of the fruit to be verified can be determined only by selecting the weight detecting sensor located in one direction (X direction or Z direction), verification in the other direction is not required. However, when the position of the fruit to be verified cannot be determined only by selecting the weight detecting sensor located in one direction (X direction or Z direction), verification needs to be performed in conjunction with the weight detecting sensor located in the other direction.
Step 102b 3: and correcting the coordinates of the fruit to be verified under a preset coordinate system according to the detection conditions of the first weight detection sensor and the second weight detection sensor.
The distance between the weight detection sensor and the fruit to be verified influences the detection condition, so that the distance between the first weight detection sensor and the fruit to be verified and the distance between the second weight detection sensor and the fruit to be verified can be respectively determined according to the signal intensity or the time difference of received detection signals, and then the target position of the fruit to be verified is determined according to the triangular characteristic.
In this embodiment, a first distance between the first weight detecting sensor and the fruit to be verified and a second distance between the second weight detecting sensor and the fruit to be verified are determined according to the respective received signal strengths of the first weight detecting sensor and the second weight detecting sensor.
And correcting the coordinates of the fruit to be verified under a preset coordinate system according to the first distance, the second distance, the coordinates of the first weight detection sensor and the coordinates of the second weight detection sensor. In this step, the coordinates of the first weight detection sensor and the coordinates of the second weight detection sensor may be initial coordinates, or coordinates obtained after rotation and transformation by a rotation angle, specifically determined according to the position of the fruit to be verified.
In an alternative embodiment, the first distance between the first weight detecting sensor and the fruit to be verified and the second distance between the second weight detecting sensor and the fruit to be verified may also be determined according to the time required for the first weight detecting sensor and the second weight detecting sensor to receive the reflected signal respectively.
When the positions of two vertices of a triangle and the lengths of its three sides are known, the shape of the triangle is determined, and therefore the position of the third vertex can be determined; this third vertex is the position of the fruit to be verified. With reference to fig. 13 or fig. 14, the distance between the first weight detection sensor and the second weight detection sensor can be obtained from their coordinates, and the first distance and the second distance are determined as described above, so the position of the fruit to be verified can be obtained according to triangle geometry.
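A two-dimensional sketch of this triangle-based correction, in the plane containing the two sensors and the fruit, is given below. The circle-intersection formulation and the function names are illustrative; the patent does not prescribe a specific formula.

```python
# 2D sketch: given two sensor positions and their measured distances to the fruit,
# find the fruit position as the intersection of two circles (triangle with two
# known vertices and three known side lengths). Names and the choice of which of
# the two intersection points to keep are illustrative assumptions.

import math
from typing import Optional, Tuple

Point = Tuple[float, float]

def trilaterate_2d(p1: Point, d1: float, p2: Point, d2: float) -> Optional[Tuple[Point, Point]]:
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    d = math.hypot(dx, dy)
    if d == 0 or d > d1 + d2 or d < abs(d1 - d2):
        return None  # circles do not intersect; measurements are inconsistent
    a = (d1 * d1 - d2 * d2 + d * d) / (2 * d)
    h = math.sqrt(max(d1 * d1 - a * a, 0.0))
    # Foot of the perpendicular from the fruit onto the line joining the sensors.
    xm, ym = p1[0] + a * dx / d, p1[1] + a * dy / d
    # Two candidate positions; the one on the crop side would be kept in practice.
    return ((xm + h * dy / d, ym - h * dx / d),
            (xm - h * dy / d, ym + h * dx / d))

# Example: first sensor at (0, 0), second at (0.2, 0); distances inferred from
# signal strength or time of flight.
candidates = trilaterate_2d((0.0, 0.0), 0.25, (0.2, 0.0), 0.22)
```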
In this embodiment, the exact position of the fruit to be verified can be determined by adjusting the detection angles of the weight detection sensors and applying the properties of the triangle, so that fruits that were detected as overlapping are separated and the accuracy of position detection is improved.
In a practical application scenario, in order to make planting more interesting and interactive, the server can rank the fruits planted by each user, for example according to the weight of a single fruit or according to the average weight of the fruits. In a preferred embodiment, the server is connected with a plurality of intelligent terminals, and the fruit and vegetable weight calculation method further comprises: for each intelligent terminal, after the weight of the fruits corresponding to that terminal is calculated, ranking the users according to the weight and displaying the planting situation of each user in ranked order. The weight criterion can be the heaviest fruit planted by each user, or the average weight of the fruits planted by each user. The server can also display the planting method of the top-ranked user for other users to consult, helping them improve their own planting quality.
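A minimal server-side ranking sketch is shown below; the criterion names ("max" versus "mean") and the data shape are assumptions made for illustration only.

```python
# Sketch of ranking users by the weight of their fruits, either by the heaviest
# single fruit or by the average fruit weight (criterion names are assumptions).

from typing import Dict, List, Tuple

def rank_users(weights_by_user: Dict[str, List[float]],
               criterion: str = "max") -> List[Tuple[str, float]]:
    def score(weights: List[float]) -> float:
        if not weights:
            return 0.0
        return max(weights) if criterion == "max" else sum(weights) / len(weights)

    scored = [(user, score(w)) for user, w in weights_by_user.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)

# Example: three terminals reporting calculated fruit weights (grams).
leaderboard = rank_users({"user_a": [182.0, 140.5], "user_b": [201.3], "user_c": []},
                         criterion="max")
```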
Example 3:
referring to fig. 16, fig. 16 is a schematic structural diagram of a device for calculating weight of fruits and vegetables according to an embodiment of the present invention. The fruit and vegetable weight calculating device of the present embodiment includes one or more processors 51 and a memory 52. In fig. 16, one processor 51 is taken as an example.
The processor 51 and the memory 52 may be connected by a bus or other means, and fig. 16 illustrates the connection by a bus.
The memory 52, as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs and modules, such as the program instructions corresponding to the intelligent-planter-based fruit and vegetable weight calculation method of embodiment 2. By running the non-volatile software programs, instructions and modules stored in the memory 52, the processor 51 executes the various functional applications and data processing of the method, thereby implementing the functions of the intelligent-planter-based fruit and vegetable weight calculation method of embodiment 2.
The memory 52 may include, among other things, high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 52 may optionally include memory located remotely from the processor 51, and these remote memories may be connected to the processor 51 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Please refer to fig. 1 to 15 and the related text description for a method for calculating the weight of fruits and vegetables based on the intelligent planting machine, which will not be described again.
It should be noted that, for the information interaction, execution process and other contents between the modules and units in the apparatus and system, the specific contents may refer to the description in the embodiment of the method of the present invention because the same concept is used as the embodiment of the processing method of the present invention, and are not described herein again.
Those of ordinary skill in the art will appreciate that all or part of the steps of the various methods of the embodiments may be implemented by associated hardware as instructed by a program, which may be stored on a computer-readable storage medium, which may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and the like.
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (9)

1. A fruit and vegetable weight calculation method based on an intelligent planter is characterized by comprising the following steps:
acquiring a multi-dimensional graph of a crop in an intelligent planter, and generating a crop model according to the multi-dimensional graph of the crop, wherein the crop model comprises distribution information of fruits;
acquiring unit density information of each fruit detected by a weight detection sensor;
establishing mapping parameters of unit density information and fruit positions according to the initial position and/or the motion track of the weight detection sensor;
determining the weight of each fruit in the crop model according to the mapping parameters and the crop model;
the establishing of the mapping parameters of the unit density information and the fruit position according to the initial position and/or the motion track of the weight detection sensor comprises the following steps:
acquiring an initial coordinate of each weight detection sensor in a preset coordinate system, and a rotation angle of each weight detection sensor in the process of detecting the unit density of the fruits;
determining the coordinates of fruits distributed on the crops under a preset coordinate system based on the initial coordinates of the weight detection sensor and the rotation angle of the weight detection sensor;
and establishing mapping parameters of unit density information and fruit positions according to the coordinates of each fruit in a preset coordinate system and the unit density detection result of the corresponding weight detection sensor.
2. The fruit and vegetable weight calculation method according to claim 1, wherein the determining coordinates of the fruits distributed on the crop under the preset coordinate system based on the initial coordinates of the weight detection sensors and the rotation angles of the weight detection sensors comprises:
after receiving the unit density information sent by any weight detection sensor, acquiring the rotation angle of the corresponding weight detection sensor;
and determining the coordinates of the fruits distributed on the crops under a preset coordinate system by combining the initial coordinates and the corresponding rotation angles of the at least two weight detection sensors.
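Purely as an illustrative sketch of the coordinate determination in claims 1 and 2, the following assumes a two-dimensional preset coordinate system in which each weight detection sensor contributes its initial coordinates and the rotation angle (bearing) it had when detecting a fruit; the fruit position is taken as the intersection of the two bearings, and that position keys the mapping to the reported unit density. The planar geometry, function names, and rounding are assumptions of the sketch, not features fixed by the claims.

```python
# Hypothetical 2D sketch: locate a fruit from two sensors' initial
# coordinates and rotation angles, then map the position to the unit density.
import math

def fruit_coordinates(sensor_a, angle_a, sensor_b, angle_b):
    """Intersect the two sensor bearings (rays) to locate the fruit in 2D."""
    (xa, ya), (xb, yb) = sensor_a, sensor_b
    ca, sa = math.cos(angle_a), math.sin(angle_a)
    cb, sb = math.cos(angle_b), math.sin(angle_b)
    det = cb * sa - ca * sb
    if abs(det) < 1e-9:
        raise ValueError("parallel bearings: position cannot be resolved")
    t = (cb * (yb - ya) - sb * (xb - xa)) / det  # distance along sensor A's bearing
    return (xa + t * ca, ya + t * sa)

def build_mapping(readings):
    """Map (rounded) fruit coordinates to the unit density reported for them."""
    mapping = {}
    for sensor_a, angle_a, sensor_b, angle_b, unit_density in readings:
        x, y = fruit_coordinates(sensor_a, angle_a, sensor_b, angle_b)
        mapping[(round(x, 2), round(y, 2))] = unit_density
    return mapping

if __name__ == "__main__":
    readings = [
        # sensor A position, bearing A, sensor B position, bearing B, unit density (g/cm^3)
        ((0.0, 0.0), math.radians(45), (4.0, 0.0), math.radians(135), 0.94),
    ]
    print(build_mapping(readings))  # {(2.0, 2.0): 0.94}
```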
3. The fruit and vegetable weight calculation method according to claim 1, wherein the establishing of the mapping parameters of the unit density information and the fruit position according to the initial position and/or the movement locus of the weight detection sensor further comprises:
judging whether the coordinates of the fruits distributed on the crops under a preset coordinate system are the same or not;
if identical coordinates exist, determining a fruit to be verified according to the identical coordinates, selecting a first weight detection sensor and a second weight detection sensor, and respectively adjusting the detection angles of the first weight detection sensor and the second weight detection sensor so that the detection signals of both are directed at the same fruit to be verified;
and correcting the coordinates of the fruit to be verified under a preset coordinate system according to the detection conditions of the first weight detection sensor and the second weight detection sensor.
4. The fruit and vegetable weight calculation method according to claim 3, wherein the step of correcting the coordinates of the fruit to be verified in a preset coordinate system according to the detection conditions of the first weight detection sensor and the second weight detection sensor comprises the steps of:
determining a first distance between the first weight detection sensor and the fruit to be verified and a second distance between the second weight detection sensor and the fruit to be verified according to the signal strength received by the first weight detection sensor and the second weight detection sensor respectively;
and correcting the coordinates of the fruit to be verified under a preset coordinate system according to the first distance, the second distance, the coordinates of the first weight detection sensor and the coordinates of the second weight detection sensor.
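As a hedged sketch of the correction in claims 3 and 4, assume a two-dimensional coordinate system: the first and second distances derived from signal strength define two circles centred on the first and second weight detection sensors, and the coordinate of the fruit to be verified is corrected to the circle intersection closest to its previous estimate. The circle-intersection geometry and the tie-breaking rule are illustrative assumptions only.

```python
# Hypothetical 2D sketch of correcting a fruit coordinate from two
# sensor positions and the two distances derived from signal strength.
import math

def correct_fruit_coordinate(p1, r1, p2, r2, previous_estimate):
    """Intersect the two range circles and keep the point nearest the old estimate."""
    (x1, y1), (x2, y2) = p1, p2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return previous_estimate  # ranges are inconsistent; keep the old coordinate
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))
    # Foot of the perpendicular on the line joining the two sensors
    bx = x1 + a * (x2 - x1) / d
    by = y1 + a * (y2 - y1) / d
    candidates = [
        (bx + h * (y2 - y1) / d, by - h * (x2 - x1) / d),
        (bx - h * (y2 - y1) / d, by + h * (x2 - x1) / d),
    ]
    return min(candidates, key=lambda c: math.dist(c, previous_estimate))

if __name__ == "__main__":
    corrected = correct_fruit_coordinate(
        p1=(0.0, 0.0), r1=5.0,
        p2=(6.0, 0.0), r2=5.0,
        previous_estimate=(3.2, 3.9),
    )
    print(corrected)  # expected near (3.0, 4.0)
```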
5. The fruit and vegetable weight calculation method according to claim 1, wherein the acquiring of the multi-dimensional graph of the crops in the intelligent planter and the generating of the crop model according to the multi-dimensional graph of the crops comprises:
receiving a plurality of pictures of crops in the intelligent planter reported by a user;
fitting the pictures of the crops in the intelligent planter to obtain a multi-dimensional graph of the crops in the intelligent planter;
and generating a crop model according to the multi-dimensional graph of the crop.
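Claims 5 and 6 do not fix how the reported pictures are fitted together; purely as an assumption for illustration, the sketch below aligns two overlapping pictures of the same planter by phase correlation, one conventional way to estimate how such photographs line up before a multi-dimensional graph is assembled. The random test images stand in for real user photographs.

```python
# Hypothetical sketch of one picture-fitting building block: estimating the
# translation between two overlapping photographs by phase correlation.
import numpy as np

def estimate_shift(picture_a, picture_b):
    """Estimate the (row, column) translation of picture_b relative to picture_a."""
    fa = np.fft.fft2(picture_a)
    fb = np.fft.fft2(picture_b)
    cross_power = np.conj(fa) * fb
    cross_power /= np.abs(cross_power) + 1e-12      # keep only the phase
    correlation = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Wrap shifts larger than half the image size back to negative offsets
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, correlation.shape))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    picture_a = rng.random((240, 320))                       # stand-in for one reported photo
    picture_b = np.roll(picture_a, (12, -25), axis=(0, 1))   # same scene, camera moved
    print(estimate_shift(picture_a, picture_b))              # expected (12, -25)
```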
6. The fruit and vegetable weight calculation method according to claim 5, wherein, for the pictures of the crops in the intelligent planter reported by the user, the request to report a picture is generated by the user actively triggering the intelligent terminal, or is initiated automatically after the intelligent terminal recognizes that the picture contains an intelligent planter object.
7. The fruit and vegetable weight calculation method according to claim 1, wherein the determining the weight of each fruit in the crop model according to the mapping parameters and the crop model comprises:
determining the unit density of each fruit in the crop model according to the mapping parameters and the crop model;
calculating the volume of each fruit according to the crop model and the selected reference object in the crop model;
the weight of each fruit was calculated from the unit density of each fruit and the volume of each fruit.
8. The fruit and vegetable weight calculation method according to any one of claims 1 to 7, wherein a server is connected with a plurality of intelligent terminals, and the fruit and vegetable weight calculation method further comprises the following steps:
for each intelligent terminal, after the weight of the fruit corresponding to that intelligent terminal is calculated, ranking the users according to a weight criterion, and displaying the planting status of each user in ranked order.
9. A fruit and vegetable weight calculating device, characterized by comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are programmed to perform the fruit and vegetable weight calculation method according to any one of claims 1 to 8.
Priority Applications (1)

Application Number: CN202010691092.8A (published as CN111964760B, legal status: Active)
Priority Date: 2020-07-17
Filing Date: 2020-07-17
Title: Fruit and vegetable weight calculation method and device based on intelligent planter

Publications (2)

CN111964760A (en), published 2020-11-20
CN111964760B (en), published 2022-03-18

Family

ID: 73361708

Family Applications (1)

Application Number: CN202010691092.8A (Active, published as CN111964760B)
Title: Fruit and vegetable weight calculation method and device based on intelligent planter

Country Status (1)

CN (1): CN111964760B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112665698A (en) * 2020-12-15 2021-04-16 重庆电子工程职业学院 Intelligent electronic scale
CN114674407B (en) * 2022-03-28 2024-01-23 稷青科技(上海)有限公司 Automatic weighing device and method for hydroponic leaf vegetables

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7854108B2 (en) * 2003-12-12 2010-12-21 Vision Robotics Corporation Agricultural robot system and method
US20140288850A1 (en) * 2011-10-30 2014-09-25 Paskal Technologies Agriculture Cooperative LTD. Self-learning of plant growth strategy in a greenhouse
US20160019688A1 (en) * 2014-07-18 2016-01-21 University Of Georgia Research Foundation, Inc. Method and system of estimating produce characteristics
CN204854914U (en) * 2015-06-03 2015-12-09 上海飞翼农业科技有限公司 Fruit weight telemetry unit
CN111160450A (en) * 2019-12-27 2020-05-15 中山德著智能科技有限公司 Fruit and vegetable weighing method based on neural network, storage medium and device


Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant