CN115723138A - Control method and device of agricultural robot, electronic equipment and storage medium

Info

Publication number
CN115723138A
Authority
CN
China
Prior art keywords
image
agricultural robot
moving
target
preset area
Prior art date
Legal status
Pending
Application number
CN202211509459.5A
Other languages
Chinese (zh)
Inventor
苏海峰
宋佳音
蔡扬
Current Assignee
Seven Seas Shenzhen Technology Co ltd
Original Assignee
Seven Seas Shenzhen Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Seven Seas Shenzhen Technology Co ltd
Priority to CN202211509459.5A
Publication of CN115723138A
Status: Pending


Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Guiding Agricultural Machines (AREA)

Abstract

The application discloses a control method and device of an agricultural robot, an electronic device, and a storage medium, belonging to the technical field of smart agriculture. The method includes: when a set period is reached, acquiring an image collected by each target camera in the camera assembly whose image acquisition direction matches the traveling direction of the agricultural robot; generating a binarized mask image based on the target pixels in the image that belong to a specified plant; determining the growth track of the specified plant in each preset area based on the target pixels in that area of the mask image; determining the moving line corresponding to the target camera according to the growth tracks of the specified plant in the preset areas; and controlling the agricultural robot to travel on the tractor-ploughing path based on the moving line corresponding to each target camera. In this way, the agricultural robot can be automatically controlled to travel on the tractor-ploughing path without manual participation, so the degree of automation is higher and the operating efficiency is also higher.

Description

Control method and device of agricultural robot, electronic equipment and storage medium
Technical Field
The application relates to the technical field of intelligent agriculture, in particular to a control method and device of an agricultural robot, electronic equipment and a storage medium.
Background
With the rapid development of artificial intelligence technology, the automatic and intelligent operation mode of farmland also becomes a great development trend of agricultural operation.
In the related art, when an agricultural robot fertilizes, weeds, or performs other operations in a farmland, a control end acquires images in the traveling direction of the agricultural robot in real time and displays them to an operator; the operator identifies the tractor-ploughing path in the farmland from the images and then manually controls the agricultural robot to walk on that path so as to avoid crushing the crops in the field. This manual control mode places high demands on the operator's skill, which both limits the application prospects of agricultural robots and reduces operating efficiency.
Disclosure of Invention
The embodiments of the application provide a control method and device of an agricultural robot, an electronic device, and a storage medium, so as to provide a scheme for automatically controlling the agricultural robot to travel on a tractor-ploughing path.
In a first aspect, an embodiment of the present application provides a control method for an agricultural robot, including:
when a set period is reached, acquiring an image acquired by each target camera in the camera assembly, wherein the image acquisition direction of each target camera is matched with the traveling direction of the agricultural robot;
generating a binary mask image based on target pixels belonging to a specified plant in the image;
determining a growth track of the designated plant in each preset area based on a target pixel in each preset area in the mask image;
determining a moving line corresponding to the target camera according to the growth track of the designated plant in each preset area;
and controlling the agricultural robot to move on the tractor-ploughing path based on the moving line corresponding to each target camera.
In some embodiments, the number of the preset areas is two, and each preset area is determined according to the planting information of the designated plant and the image acquisition range of the target camera.
In some embodiments, determining the growth trajectory of the designated plant in each preset region based on the target pixel in the preset region in the mask image comprises:
carrying out contour detection on each preset region in the mask image;
fitting the detected outline of the outermost layer to obtain the outlines of a plurality of designated plants;
determining a center point of each of the designated plants based on the contour of the designated plant;
and fitting the central points of the designated plants to obtain the growth tracks of the designated plants in the preset area.
In some embodiments, determining a moving route corresponding to the target camera according to the growth trajectory of the designated plant in each preset area includes:
determining two intersection points of the growth track of the designated plant in each preset area and the preset area;
fusing the intersection points matched with the positions on different growth tracks to obtain a reference point;
converting the two-dimensional coordinate of each reference point into a three-dimensional coordinate based on the conversion relation from the two-dimensional coordinate system of the image to the three-dimensional coordinate system of the agricultural robot;
and determining a line segment formed by connecting the three-dimensional coordinates of the reference points as a moving line corresponding to the target camera.
In some embodiments, the agricultural robot includes a robot body and at least two motion devices, the number of the target cameras is one, and controlling the agricultural robot to travel on the tractor-ploughing path based on the moving line corresponding to each target camera includes:
taking a moving line corresponding to the target camera as a central moving line of the robot body;
converting the central moving line based on the line conversion relation between the center of the robot body and each moving device to obtain the moving line of the moving device;
and controlling the agricultural robot to move on the tractor-ploughing path according to the moving line of each moving device.
In some embodiments, the agricultural robot includes a robot body and at least two motion devices, the number of the target cameras is two, and controlling the agricultural robot to travel on the tractor-ploughing path based on the moving line corresponding to each target camera includes:
taking the moving line corresponding to each target camera as the moving lines of two moving devices matched with the target camera;
and controlling the agricultural robot to move on the tractor-ploughing path according to the moving line of each moving device.
In some embodiments, the target pixel in the image belonging to the designated plant is determined according to the following steps:
converting the image from an RGB image to an HSV image;
if the HSV component of any pixel in the HSV image meets a preset threshold condition, determining that the pixel is the target pixel belonging to the specified plant.
In a second aspect, an embodiment of the present application provides a control device for an agricultural robot, including:
the acquisition module is used for acquiring images acquired by each target camera, of which the image acquisition direction is matched with the advancing direction of the agricultural robot, in the camera assembly when a set period is reached;
the generation module is used for generating a binary mask image based on target pixels belonging to a specified plant in the image;
the track determining module is used for determining the growth track of the specified plant in each preset area based on the target pixel in the preset area in the mask image;
the line determining module is used for determining a moving line corresponding to the target camera according to the growth track of the designated plant in each preset area;
and the control module is used for controlling the agricultural robot to move on the tractor-ploughing path based on the moving line corresponding to each target camera.
In some embodiments, the number of the preset areas is two, and each preset area is determined according to the planting information of the designated plant and the image acquisition range of the target camera.
In some embodiments, the trajectory determination module is specifically configured to:
carrying out contour detection on each preset region in the mask image;
fitting the detected outline of the outermost layer to obtain the outlines of a plurality of designated plants;
determining a center point of each of the designated plants based on the contour of the designated plant;
and fitting the central points of the designated plants to obtain the growth tracks of the designated plants in the preset area.
In some embodiments, the route determination module is specifically configured to:
determining two intersection points of the growth track of the designated plant in each preset area and the preset area;
fusing the intersection points matched with the positions on different growth tracks to obtain a reference point;
converting the two-dimensional coordinate of each reference point into a three-dimensional coordinate based on the conversion relation from the two-dimensional coordinate system of the image to the three-dimensional coordinate system of the agricultural robot;
and determining a line segment formed by connecting the three-dimensional coordinates of the reference points as a moving line corresponding to the target camera.
In some embodiments, the agricultural robot comprises a robot body and at least two motion devices, the number of target cameras is one, and the control module is specifically configured to:
taking a moving line corresponding to the target camera as a central moving line of the robot body;
converting the central moving line based on the line conversion relation between the center of the robot body and each moving device to obtain a moving line of the moving device;
and controlling the agricultural robot to move on the tractor-ploughing path according to the moving line of each moving device.
In some embodiments, the agricultural robot comprises a robot body and at least two motion devices, the number of target cameras is two, and the control module is specifically configured to:
taking the moving line corresponding to each target camera as the moving lines of two moving devices matched with the target camera;
and controlling the agricultural robot to move on the tractor-ploughing path according to the moving line of each moving device.
In some embodiments, the target pixel in the image belonging to the designated plant is determined according to the following steps:
converting the image from an RGB image to an HSV image;
if the HSV component of any pixel in the HSV image meets a preset threshold condition, determining that the pixel is the target pixel belonging to the specified plant.
In a third aspect, an embodiment of the present application provides an electronic device, including: at least one processor, and a memory communicatively coupled to the at least one processor, wherein:
the memory stores a computer program executable by the at least one processor, and when the computer program is executed by the at least one processor, the at least one processor is enabled to perform the above control method of an agricultural robot.
In a fourth aspect, embodiments of the present application provide a storage medium, where when a computer program in the storage medium is executed by a processor of an electronic device, the electronic device is capable of executing the control method of the agricultural robot.
In the embodiments of the application, when a set period is reached, an image collected by each target camera in the camera assembly whose image acquisition direction matches the traveling direction of the agricultural robot is obtained; a binarized mask image is generated based on the target pixels in the image that belong to a specified plant; the growth track of the specified plant in each preset area is determined based on the target pixels in that area of the mask image; the moving line corresponding to the target camera is determined according to the growth tracks of the specified plant in the preset areas; and the agricultural robot is then controlled to travel on the tractor-ploughing path based on the moving line corresponding to each target camera. In this way, the agricultural robot can be automatically controlled to travel on the tractor-ploughing path without manual participation, so the degree of automation is higher and the operating efficiency is also higher.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a flowchart of a control method for an agricultural robot according to an embodiment of the present application;
fig. 2a is a schematic view of an agricultural robot provided in an embodiment of the present application;
fig. 2b is a schematic view of another agricultural robot provided in the embodiment of the present application;
fig. 3 is a schematic operation diagram of an agricultural robot provided in an embodiment of the present application;
fig. 4a is a schematic view of another agricultural robot provided in the embodiments of the present application;
fig. 4b is a schematic view of another agricultural robot provided in the embodiments of the present application;
fig. 5 is an operation schematic diagram of another agricultural robot provided in the embodiment of the present application;
fig. 6 is a schematic structural diagram of a control device of an agricultural robot according to an embodiment of the application;
fig. 7 is a hardware structure diagram of an electronic device for implementing a control method of an agricultural robot according to an embodiment of the present application.
Detailed Description
In order to provide a scheme for automatically controlling an agricultural robot to travel on a tractor-ploughing path, the embodiments of the application provide a control method and device of an agricultural robot, an electronic device, and a storage medium.
The preferred embodiments of the present application will be described in conjunction with the drawings of the specification, it should be understood that the preferred embodiments described herein are only for illustrating and explaining the present application, and are not intended to limit the present application, and the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
First, it should be noted that the control method for the agricultural robot provided in the embodiments of the present application may be executed entirely at the control end, entirely at the agricultural robot, or with some steps executed at the control end and the remaining steps executed at the agricultural robot end, which is not limited in the embodiments of the present application.
Fig. 1 is a flowchart of a control method for an agricultural robot according to an embodiment of the present application, and includes the following steps.
In step 101, each time a set period is reached, an image captured by each target camera in the camera assembly whose image capturing orientation matches the direction of travel of the agricultural robot is acquired.
Wherein the agricultural robot may comprise a camera assembly, a robot body and at least two movement means, and the movement means may be in the form of a crawler belt or in the form of wheels.
In some embodiments, the camera assembly may include two cameras disposed directly in front of and directly behind the robot body, respectively. When the agricultural robot travels forward, the target camera in the camera assembly whose image capturing orientation matches the traveling direction of the agricultural robot is the camera disposed right in front of the robot body; when the agricultural robot travels backwards, the target camera is the camera disposed right behind the robot body.
In some embodiments, the camera assembly may include four cameras, two cameras distributed on both sides of the front of the robot body, and two cameras distributed on both sides of the rear of the robot body. When the agricultural robot moves forwards, the target cameras in the camera assembly, of which the image acquisition directions are matched with the moving direction of the agricultural robot, are two cameras on the front side of the robot body; when the agricultural robot moves backwards, the target cameras in the camera assembly, of which the image acquisition directions are matched with the moving direction of the agricultural robot, are two cameras on the rear side of the robot body.
In step 102, a binarized mask image is generated based on target pixels in the image that belong to a specified plant.
Wherein the designated plant may be a crop plant.
To facilitate segmenting the designated plants from the image by color, the image may be converted from an RGB image into an HSV image. Then, for each pixel in the HSV image, if the HSV components of the pixel satisfy a preset threshold condition, the pixel is determined to be a target pixel; otherwise, it is determined to be a non-target pixel. A binarized mask image of the HSV image can then be generated by setting the pixel value of each target pixel to 255 and the pixel value of each non-target pixel to 0.
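For illustration only, a minimal sketch of this mask-generation step using OpenCV (the library named in the worked example later in this description) might look as follows; the HSV threshold range shown is a placeholder that would have to be tuned to the specified plant.

```python
import cv2
import numpy as np

def build_plant_mask(bgr_image, lower_hsv=(35, 60, 40), upper_hsv=(90, 255, 255)):
    """Convert a camera frame to HSV and binarize it: pixels whose HSV components
    fall inside the preset threshold range (assumed green crop tones here) become
    255 (target pixels), and all other pixels become 0 (non-target pixels)."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # inRange already yields a 0/255 mask, matching the binarization rule above
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    return mask
```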
In step 103, based on the target pixel in each preset area in the mask image, the growth track of the designated plant in the preset area is determined.
Generally, the number of the preset areas is even, such as 2, 4, etc., each preset area corresponds to a column of designated plants, and one preset area is generally rectangular. Moreover, the preset area can be set by technicians according to experimental data or experience, and can also be predetermined according to planting information of specified plants and an image acquisition range of the camera, wherein the planting information comprises planting row spacing, planting time and the like.
In specific implementation, when determining the growth trajectory of the designated plant in any preset area, contour detection can first be performed on that area of the mask image to obtain the rough outlines of the designated plants in the area. The detected outermost contours are then fitted to obtain the contours of a plurality of designated plants, and the center point of each designated plant is determined from its contour, for example by taking the center of the minimum circumscribed circle of the contour as the center point. Finally, the center points of the designated plants are fitted to obtain the growth trajectory of the designated plants in the preset area.
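A hedged sketch of this per-area step, assuming OpenCV (as in the worked example below) and assuming each preset area is given as an axis-aligned rectangle in image coordinates; the contour-approximation and line-fitting parameters are placeholder values.

```python
import cv2
import numpy as np

def fit_growth_line(mask, region):
    """Detect outer contours in one rectangular preset region of the mask, take the
    minimum-enclosing-circle center of each contour as a plant center point, and fit
    a straight line through those centers as the growth trajectory."""
    x, y, w, h = region                       # region given as (x, y, width, height)
    roi = mask[y:y + h, x:x + w]
    contours, _ = cv2.findContours(roi, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for cnt in contours:
        poly = cv2.approxPolyDP(cnt, 3, True)             # simplify the outer contour
        (cx, cy), _radius = cv2.minEnclosingCircle(poly)  # center of the plant
        centers.append((cx + x, cy + y))                  # back to full-image coords
    if len(centers) < 2:
        return None                                       # not enough plants to fit
    pts = np.array(centers, dtype=np.float32)
    vx, vy, x0, y0 = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).flatten()
    return vx, vy, x0, y0        # direction vector and a point on the growth line
```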
In step 104, a moving route corresponding to the target camera is determined according to the growth track of the designated plant in each preset area.
For example, the moving route corresponding to the target camera is determined according to the following steps:
the first step is to determine two intersection points of the growth track of the designated plant in each preset area and the preset area.
Generally, each preset area is rectangular and the growth track is a straight line, so that two intersections of the growth track of the designated plant in each preset area and the preset area exist.
And secondly, fusing the intersection points matched with the positions on different growth tracks to obtain a reference point.
Based on the first step, each growth track corresponds to two intersection points. Intersection points at matching positions on different growth tracks, for example the upper intersection point of each track or the lower intersection point of each track, are fused to obtain a reference point; a sketch of this fusion is given after the fourth step below. Assuming that there are two preset regions in the image, a total of two reference points are obtained.
And thirdly, converting the two-dimensional coordinate of each reference point into a three-dimensional coordinate based on the conversion relation from the two-dimensional coordinate system of the image to the three-dimensional coordinate system of the agricultural robot.
And fourthly, determining a line segment formed by connecting the three-dimensional coordinates of the reference points as a moving line corresponding to the target camera.
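The fusion rule is not spelled out above; assuming, consistent with the worked example later in this description, that position-matched intersection points are fused by taking their midpoint and that each growth track crosses the top and bottom edges of its rectangular preset area, the first two steps could be sketched as:

```python
def line_rect_intersections(vx, vy, x0, y0, region):
    """Intersect a fitted growth line (direction (vx, vy) through (x0, y0)) with the
    top and bottom edges of a rectangular preset region (x, y, w, h); assumes the
    line runs roughly along the travel direction, i.e. vy != 0."""
    rx, ry, rw, rh = region
    top = (x0 + vx * (ry - y0) / vy, ry)                  # upper intersection point
    bottom = (x0 + vx * (ry + rh - y0) / vy, ry + rh)     # lower intersection point
    return top, bottom

def fuse_reference_points(left_line, right_line, left_region, right_region):
    """Fuse position-matched intersection points of the two growth lines into the two
    reference points (here: midpoints) that define the moving route in the image."""
    lt, lb = line_rect_intersections(*left_line, left_region)
    rt, rb = line_rect_intersections(*right_line, right_region)
    p = ((lt[0] + rt[0]) / 2, (lt[1] + rt[1]) / 2)        # upper reference point
    q = ((lb[0] + rb[0]) / 2, (lb[1] + rb[1]) / 2)        # lower reference point
    return p, q
```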
In step 105, the agricultural robot is controlled to travel on the tractor-ploughing path based on the movement path corresponding to each target camera.
When the target camera is the camera disposed right in front of or right behind the robot body, the moving line obtained in step 104 is the central moving line of the robot body. In this case, the central moving line can be converted, based on the line conversion relationship between the center of the robot body and each motion device, into the moving line of that motion device, and the agricultural robot is then controlled to travel on the tractor-ploughing path according to the moving lines of the motion devices, so that it walks on the tractor-ploughing path automatically without crushing or scratching the designated plants.
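The line conversion relationship between the body center and each motion device is not detailed here; a minimal sketch, under the assumption that each motion device runs parallel to the body center at a fixed lateral offset in the ground plane, might be:

```python
import numpy as np

def device_line_from_center(pw, qw, lateral_offset):
    """Shift the central moving line (3D reference points Pw, Qw) sideways by a fixed
    lateral offset to obtain the moving line of one motion device. The offset is taken
    perpendicular to the line in the ground plane."""
    pw, qw = np.asarray(pw, dtype=float), np.asarray(qw, dtype=float)
    direction = qw - pw
    direction[2] = 0.0                                      # work in the ground plane
    direction /= np.linalg.norm(direction)
    normal = np.array([-direction[1], direction[0], 0.0])   # left-pointing normal
    return pw + lateral_offset * normal, qw + lateral_offset * normal

# Hypothetical usage with a placeholder half track spacing of 0.35 m:
# left_line  = device_line_from_center(Pw, Qw, +0.35)
# right_line = device_line_from_center(Pw, Qw, -0.35)
```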
When the target cameras are two cameras arranged on the front side of the robot body or two cameras arranged on the rear side of the robot body, the moving line corresponding to each target camera can be used as the moving line of the moving device matched with the target camera, for example, the moving line corresponding to the left target camera is used as the moving line of the left moving device, the moving line corresponding to the right target camera is used as the moving line of the right moving device, and then the agricultural robot is controlled to move on the tractor-ploughing path according to the moving lines of the moving devices, so that the agricultural robot can automatically walk on the tractor-ploughing path without crushing or scratching specified plants.
In addition, while the agricultural robot is controlled to travel on the tractor-ploughing path, it can also be controlled to perform operations on the designated plants, such as pesticide spraying and fertilizer application.
In this way, the agricultural robot can be automatically controlled to travel on the tractor-ploughing path and to operate on the designated plants without crushing or scratching them; no manual participation is needed, the degree of automation is higher, and the operating efficiency is also higher.
In addition, in the embodiment of the present application, since each camera in the camera assembly is in an uninterrupted shooting state, and path planning of the agricultural robot is also performed in real time, identification of the tractor-ploughing path is also performed in real time.
The embodiments of the present application will be described with reference to specific examples.
Fig. 2a is a schematic view of an agricultural robot provided in an embodiment of the present application, which includes a camera assembly, a robot body, and four wheels (in this case, the motion device is represented by wheels), wherein the camera assembly includes two cameras (only one camera is shown in fig. 2 a), and the two cameras are respectively installed right in front of and right behind the robot body. When the agricultural robot moves forwards, a target camera in the camera assembly, of which the image acquisition direction is matched with the moving direction of the agricultural robot, is a camera arranged right in front of the robot body; when the agricultural robot moves backwards, the target camera, of which the image acquisition direction is matched with the moving direction of the agricultural robot, in the camera assembly is a camera installed right behind the robot body.
Fig. 2b is a schematic diagram of another agricultural robot provided in the embodiment of the present application, which includes a camera assembly, a robot body, and two crawlers (in this case, the motion device is represented as a crawler), wherein the camera assembly includes two cameras (only one camera is shown in fig. 2 b), and the two cameras are respectively installed right in front of and right behind the robot body. When the agricultural robot moves forwards, a target camera in the camera assembly, of which the image acquisition direction is matched with the moving direction of the agricultural robot, is a camera arranged right in front of the robot body; when the agricultural robot moves backwards, the target camera in the camera assembly, of which the image acquisition orientation is matched with the moving direction of the agricultural robot, is a camera installed right behind the robot body.
The scheme of the embodiments of the application is described below by taking as an example the case where the agricultural robot moves forward and the target camera is the camera installed right in front of the robot body.
Fig. 3 is a schematic operation diagram of the agricultural robot shown in fig. 2a and 2b provided in an embodiment of the present application, where the direction indicated by the arrow is the traveling direction of the agricultural robot, the area from the left rectangular frame to the right rectangular frame is the image capture range of the target camera, each rectangular frame represents a preset area, and each preset area corresponds to a column of designated plants. The two preset areas can be predetermined according to the planting conditions of the designated plants and the field of view of the target camera, where the planting conditions include the planting row spacing, planting column spacing, and the like.
The scheme of the embodiment of the application comprises the following steps:
the method comprises the steps that firstly, a control end obtains an image collected by a target camera, the image is converted into an HSV image from an RGB image, for each pixel in the HSV image, if the HSV component of the pixel exceeds a preset HSV component threshold value, the pixel is determined to be a target pixel belonging to a specified plant, and otherwise, the pixel is determined to be a non-target pixel.
In the second step, the control end generates a binarized mask image of the HSV image from the target and non-target pixels, where pixels with a value of 255 in the mask image belong to the designated crop and pixels with a value of 0 do not belong to the designated crop.
In the third step, for each preset area in the mask image, the control end detects contours with the findContours function and obtains the approximate outlines of the specified plant row in the area, fits a polygonal contour to each outermost contour with the approxPolyDP function, and then computes the minimum circumscribed circle of each polygonal contour with the minEnclosingCircle function; the centers of these minimum circumscribed circles are the center points of the specified plants in the preset area.
In the fourth step, the control end uses the fitLine function to fit a straight line Ll to the center point set Cl of the left preset area (i.e., the growth track of the designated plant in the left preset area) and a straight line Lr to the center point set Cr of the right preset area (i.e., the growth track of the designated plant in the right preset area). It then calculates the intersection points PL1 and PL2 of the line Ll with the left preset area and the intersection points PR1 and PR2 of the line Lr with the right preset area, and computes the midpoint P of PL1 and PR1 and the midpoint Q of PL2 and PR2. Coordinate transformation is then performed using the intrinsic and extrinsic parameters of the target camera to convert the two-dimensional pixel points P and Q in the camera coordinate system into three-dimensional points Pw and Qw in the world coordinate system, and the coordinates of Pw and Qw are sent to the motion control system of the agricultural robot.
The coordinate conversion formula is as follows:

$$ z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \begin{bmatrix} R & T \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} $$

where z_c is the distance between the target camera and the imaging target, which can be determined from the depth map acquired by the target camera; K is the intrinsic parameter matrix of the target camera, which can be obtained by calibration with a general calibration method; R is the rotation matrix and T the translation matrix of the target camera relative to the world coordinate system; (x_w, y_w, z_w) are the coordinates in the world coordinate system; and (u, v) are the pixel coordinates in the target camera image.
That is, the three-dimensional coordinate of Pw can be obtained by substituting the two-dimensional pixel coordinate of P into the above expression as u and v, and the three-dimensional coordinate of Qw can be obtained by substituting the two-dimensional pixel coordinate of Q into the above expression as u and v.
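A minimal sketch of this back-projection, assuming the depth z_c of each of P and Q is read from the target camera's depth map and that K, R, and T come from a prior calibration; the depth_at helper in the usage comment is hypothetical.

```python
import numpy as np

def pixel_to_world(u, v, z_c, K, R, T):
    """Back-project a pixel (u, v) with known depth z_c into the world frame by
    inverting z_c * [u, v, 1]^T = K (R * Xw + T), i.e.
    Xw = R^-1 (z_c * K^-1 * [u, v, 1]^T - T)."""
    uv1 = np.array([u, v, 1.0])
    cam_point = z_c * np.linalg.inv(K) @ uv1           # point in the camera frame
    world_point = np.linalg.inv(R) @ (cam_point - np.asarray(T, dtype=float).reshape(3))
    return world_point                                 # (x_w, y_w, z_w)

# Hypothetical usage for the two midpoints P and Q:
# Pw = pixel_to_world(P[0], P[1], depth_at(P), K, R, T)
# Qw = pixel_to_world(Q[0], Q[1], depth_at(Q), K, R, T)
```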
In the fifth step, the motion control system of the agricultural robot takes the line connecting Pw and Qw as the central moving line of the robot body, converts the central moving line into the moving line of each motion device based on the line conversion relationship between the center of the robot body and that motion device, and then controls the agricultural robot to travel on the tractor-ploughing path according to the moving lines of the motion devices, so that the designated plants are not crushed or scratched while operations such as pesticide spraying and fertilizer application are performed on them.
Fig. 4a is a schematic view of another agricultural robot provided by the embodiment of the present application, which includes a camera assembly, a robot body and four wheels (in this case, the motion device is represented by wheels), wherein the camera assembly includes four cameras (fig. 4a shows only 2 cameras), each camera being mounted on one wheel. When the agricultural robot moves forwards, the target cameras in the camera assembly, of which the image acquisition directions are matched with the moving direction of the agricultural robot, are cameras installed on two wheels in front of the agricultural robot; when the agricultural robot travels backwards, the target cameras in the camera assembly whose image capturing orientation matches the traveling direction of the agricultural robot are cameras mounted on two wheels on the rear of the agricultural robot.
Fig. 4b is a schematic diagram of another agricultural robot provided by the embodiment of the present application, which includes a camera assembly, a robot body, and two crawlers (in this case, the motion device is represented as a crawler), wherein the camera assembly includes four cameras (fig. 4b only shows 2 cameras), two cameras are mounted on each crawler, and the two cameras are distributed on two sides of the robot body. When the agricultural robot moves forwards, the target cameras in the camera assembly, of which the image acquisition directions are matched with the moving direction of the agricultural robot, are two cameras installed in front of the robot body; when the agricultural robot travels backwards, the target cameras in the camera assembly whose image capturing orientation matches the traveling direction of the agricultural robot are two cameras installed behind the robot body.
The scheme of the embodiments of the application is described below by taking as an example the case where the agricultural robot moves forward and the target cameras are mounted on the two motion devices in front of the agricultural robot.
Fig. 5 is an operation schematic diagram of the agricultural robot shown in fig. 4a and 4b according to an embodiment of the present application, in fig. 5, two left rectangular frames are image capturing ranges of a left target camera, two right rectangular frames are image capturing ranges of a right target camera, each rectangular frame represents a preset area, each preset area corresponds to a column of designated plants, and the two preset areas corresponding to each target camera can be predetermined according to planting conditions of the designated plants and a visual field range of the target camera, where the planting conditions include planting row spacing, planting column spacing, and the like.
The scheme of the embodiment of the application comprises the following steps:
for the image obtained by each target camera, the processing procedure of the control end is the same as the first step to the fourth step, which is not described herein again.
And fifthly, the motion control system of the agricultural robot takes the moving line corresponding to each target camera as the moving lines of two motion devices matched with the target cameras, for example, the moving line corresponding to the left target camera is taken as the moving line of the left motion device (one crawler belt or two wheels), the moving line corresponding to the right target camera is taken as the moving line of the right motion device (one crawler belt or two wheels), then, according to the moving lines of the motion devices, the agricultural robot is controlled to move on the tractor-ploughing path, so that the designated plants can be prevented from being crushed or scratched, and operations such as pesticide spraying, fertilizer application and the like can be carried out on the designated plants.
With the scheme of the embodiments of the application, the tractor-ploughing path can be identified autonomously, the operating efficiency of the agricultural robot is improved, and the technical requirements on the operator are reduced, making the agricultural robot more automated and intelligent.
Based on the same technical concept, the embodiment of the application also provides a control device of the agricultural robot, and the principle of the control device of the agricultural robot for solving the problems is similar to the control method of the agricultural robot, so the implementation of the control device of the agricultural robot can refer to the implementation of the control method of the agricultural robot, and repeated parts are not repeated.
Fig. 6 is a schematic structural diagram of a control device of an agricultural robot according to an embodiment of the present application, and includes an obtaining module 601, a generating module 602, a trajectory determining module 603, a route determining module 604, and a control module 605.
The acquisition module 601 is used for acquiring images acquired by each target camera, of which the image acquisition direction is matched with the traveling direction of the agricultural robot, in the camera assembly when a set period is reached;
a generating module 602, configured to generate a binarized mask image based on target pixels in the image that belong to a specified plant;
a track determining module 603, configured to determine a growth track of the designated plant in each preset region based on a target pixel in the preset region in the mask image;
a route determining module 604, configured to determine a moving route corresponding to the target camera according to the growth trajectory of the designated plant in each preset area;
and a control module 605, configured to control the agricultural robot to travel on the tractor-ploughing path based on the moving route corresponding to each target camera.
In some embodiments, the number of the preset areas is two, and each preset area is determined according to the planting information of the designated plant and the image acquisition range of the target camera.
In some embodiments, the trajectory determination module 603 is specifically configured to:
carrying out contour detection on each preset region in the mask image;
fitting the detected outline of the outermost layer to obtain the outlines of a plurality of designated plants;
determining a center point of each of the designated plants based on the contour of the designated plant;
and fitting the central points of the designated plants to obtain the growth tracks of the designated plants in the preset area.
In some embodiments, the route determining module 604 is specifically configured to:
determining two intersection points of the growth track of the designated plant in each preset area and the preset area;
fusing the intersection points matched with the positions on different growth tracks to obtain a reference point;
converting the two-dimensional coordinate of each reference point into a three-dimensional coordinate based on the conversion relation from the two-dimensional coordinate system of the image to the three-dimensional coordinate system of the agricultural robot;
and determining a line segment formed by connecting the three-dimensional coordinates of the reference points as a moving line corresponding to the target camera.
In some embodiments, the agricultural robot comprises a robot body and at least two motion devices, the number of target cameras is one, and the control module 605 is specifically configured to:
taking a moving line corresponding to the target camera as a central moving line of the robot body;
converting the central moving line based on the line conversion relation between the center of the robot body and each moving device to obtain a moving line of the moving device;
and controlling the agricultural robot to move on the tractor-ploughing path according to the moving line of each moving device.
In some embodiments, the agricultural robot comprises a robot body and at least two motion devices, the number of target cameras is two, the control module 605 is specifically configured to:
taking the moving line corresponding to each target camera as the moving lines of two moving devices matched with the target camera;
and controlling the agricultural robot to move on the tractor-ploughing path according to the moving line of each moving device.
In some embodiments, the target pixel in the image belonging to the designated plant is determined according to the following steps:
converting the image from an RGB image to an HSV image;
if the HSV component of any pixel in the HSV image meets a preset threshold condition, determining that the pixel is the target pixel belonging to the specified plant.
The division of the modules in the embodiments of the present application is schematic, and only one logic function division is provided, and in actual implementation, there may be another division manner, and in addition, each function module in each embodiment of the present application may be integrated in one processor, may also exist alone physically, or may also be integrated in one module by two or more modules. The coupling of the various modules to each other may be through interfaces that are typically electrical communication interfaces, but mechanical or other forms of interfaces are not excluded. Thus, modules described as separate components may or may not be physically separate, may be located in one place, or may be distributed in different locations on the same or different devices. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
Having described the control method and apparatus of an agricultural robot according to an exemplary embodiment of the present application, next, an electronic device according to another exemplary embodiment of the present application will be described.
An electronic device 130 implemented according to this embodiment of the present application is described below with reference to fig. 7. The electronic device 130 shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 7, the electronic device 130 is represented in the form of a general electronic device. The components of the electronic device 130 may include, but are not limited to: the at least one processor 131, the at least one memory 132, and a bus 133 that couples various system components including the memory 132 and the processor 131.
Bus 133 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, or a local bus using any of a variety of bus architectures.
The memory 132 may include readable media in the form of volatile memory, such as Random Access Memory (RAM) 1321 and/or cache memory 1322, and may further include Read Only Memory (ROM) 1323.
Memory 132 may also include a program/utility 1325 having a set (at least one) of program modules 1324, such program modules 1324 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which or some combination thereof may comprise an implementation of a network environment.
The electronic device 130 may also communicate with one or more external devices 134 (e.g., keyboard, pointing device, etc.), with one or more devices that enable a user to interact with the electronic device 130, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 130 to communicate with one or more other electronic devices. Such communication may occur via input/output (I/O) interfaces 135. Also, the electronic device 130 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 136. As shown, network adapter 136 communicates with other modules for electronic device 130 over bus 133. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 130, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
In an exemplary embodiment, there is also provided a storage medium in which a computer program is stored, the computer program being executable by a processor of an electronic device, the electronic device being capable of executing the control method of an agricultural robot described above. Alternatively, the storage medium may be a non-transitory computer readable storage medium, which may be, for example, a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, the electronic device of the present application may at least comprise at least one processor, and a memory communicatively connected to the at least one processor, wherein the memory stores a computer program executable by the at least one processor, and the computer program, when executed by the at least one processor, may cause the at least one processor to perform the steps of the control method of any one of the agricultural robots provided by the embodiments of the present application.
In an exemplary embodiment, a computer program product is also provided, which, when executed by an electronic device, enables the electronic device to implement any of the exemplary methods provided herein.
Also, a computer program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable Disk, a hard Disk, a RAM, a ROM, an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a Compact Disk Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The program product for control of an agricultural robot in an embodiment of the present application may be in the form of a CD-ROM and include program code and may be run on a computing device. However, the program product of the present application is not so limited, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, radio Frequency (RF), etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In situations involving remote computing devices, the remote computing devices may be connected to the user computing device over any kind of network, such as a Local Area Network (LAN) or Wide Area Network (WAN), or may be connected to external computing devices (e.g., connected over the internet using an internet service provider).
It should be noted that although in the above detailed description several units or sub-units of the apparatus are mentioned, such a division is merely exemplary and not mandatory. Indeed, the features and functions of two or more of the units described above may be embodied in one unit, according to embodiments of the application. Conversely, the features and functions of one unit described above may be further divided into embodiments by a plurality of units.
Further, while the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application also encompasses such modifications and variations.

Claims (16)

1. A control method of an agricultural robot, characterized by comprising:
when a set period is reached, acquiring an image acquired by each target camera in the camera assembly whose image acquisition direction is matched with the traveling direction of the agricultural robot;
generating a binary mask image based on target pixels belonging to a specified plant in the image;
determining a growth track of the designated plant in each preset area based on a target pixel in each preset area in the mask image;
determining a moving line corresponding to the target camera according to the growth track of the designated plant in each preset area;
and controlling the agricultural robot to move on the tractor-ploughing path based on the moving line corresponding to each target camera.
2. The method of claim 1, wherein the number of the preset areas is two, and each preset area is determined according to the planting information of the designated plant and the image capturing range of the target camera.
3. The method of claim 1, wherein determining the growth trajectory of the designated plant in each preset area based on the target pixel in the preset area in the mask image comprises:
carrying out contour detection on each preset region in the mask image;
fitting the detected outermost layer outline to obtain the outlines of a plurality of designated plants;
determining a center point of each of the designated plants based on the contour of the designated plant;
and fitting the central points of the designated plants to obtain the growth tracks of the designated plants in the preset area.
4. The method of claim 1, wherein determining a moving route corresponding to the target camera according to the growth trajectory of the designated plant in each preset area comprises:
determining two intersection points of the growth track of the designated plant in each preset area and the preset area;
fusing the intersection points matched with the positions on different growth tracks to obtain a reference point;
converting the two-dimensional coordinate of each reference point into a three-dimensional coordinate based on the conversion relation from the two-dimensional coordinate system of the image to the three-dimensional coordinate system of the agricultural robot;
and determining a line segment formed by connecting the three-dimensional coordinates of the reference points as a moving line corresponding to the target camera.
5. The method of any one of claims 1-4, wherein the agricultural robot comprises a robot body and at least two motion devices, the number of the target cameras is one, and controlling the agricultural robot to travel on the tractor-ploughing path based on the moving line corresponding to each target camera comprises:
taking a moving line corresponding to the target camera as a central moving line of the robot body;
converting the central moving line based on the line conversion relation between the center of the robot body and each moving device to obtain a moving line of the moving device;
and controlling the agricultural robot to move on the tractor-ploughing path according to the moving line of each moving device.
6. The method of any one of claims 1-4, wherein the agricultural robot comprises a robot body and at least two motion devices, the number of the target cameras is two, and controlling the agricultural robot to travel on the tractor-ploughing path based on the moving line corresponding to each target camera comprises:
taking the moving line corresponding to each target camera as the moving lines of two moving devices matched with the target camera;
and controlling the agricultural robot to move on the tractor-ploughing path according to the moving line of each moving device.
7. The method of claim 1, wherein the target pixel in the image belonging to the designated plant is determined according to the following steps:
converting the image from an RGB image to an HSV image;
if the HSV component of any pixel in the HSV image meets a preset threshold condition, determining that the pixel is the target pixel belonging to the specified plant.
8. A control device of an agricultural robot, comprising:
an acquisition module configured to, when a set period is reached, acquire an image captured by each target camera in the camera assembly whose image acquisition direction matches the advancing direction of the agricultural robot;
a generation module configured to generate a binary mask image based on target pixels belonging to the designated plant in the image;
a track determining module configured to determine the growth track of the designated plant in each preset area based on the target pixel in the preset area in the mask image;
a line determining module configured to determine a moving line corresponding to the target camera according to the growth track of the designated plant in each preset area;
and a control module configured to control the agricultural robot to move on the tractor-ploughing path based on the moving line corresponding to each target camera.
9. The device of claim 8, wherein the number of the preset areas is two, and each preset area is determined according to the planting information of the designated plant and the image capturing range of the target camera.
10. The device of claim 8, wherein the track determining module is specifically configured to:
perform contour detection on each preset area in the mask image;
fit the detected outermost contours to obtain the contours of a plurality of designated plants;
determine a center point of each of the designated plants based on the contour of the designated plant;
and fit the center points of the designated plants to obtain the growth track of the designated plants in the preset area.
11. The device of claim 8, wherein the line determining module is specifically configured to:
determine two intersection points of the growth track of the designated plant in each preset area with the preset area;
fuse position-matched intersection points on different growth tracks to obtain a reference point;
convert the two-dimensional coordinate of each reference point into a three-dimensional coordinate based on the conversion relation from the two-dimensional coordinate system of the image to the three-dimensional coordinate system of the agricultural robot;
and determine a line segment formed by connecting the three-dimensional coordinates of the reference points as a moving line corresponding to the target camera.
12. The device of any one of claims 8-11, wherein the agricultural robot comprises a robot body and at least two moving devices, the number of target cameras is one, and the control module is specifically configured to:
take the moving line corresponding to the target camera as a central moving line of the robot body;
convert the central moving line based on the line conversion relation between the center of the robot body and each moving device to obtain the moving line of the moving device;
and control the agricultural robot to move on the tractor-ploughing path according to the moving line of each moving device.
13. The device of any one of claims 8-11, wherein the agricultural robot comprises a robot body and at least two moving devices, the number of target cameras is two, and the control module is specifically configured to:
take the moving line corresponding to each target camera as the moving lines of the two moving devices matched with the target camera;
and control the agricultural robot to move on the tractor-ploughing path according to the moving line of each moving device.
14. The device of claim 8, wherein the target pixel in the image belonging to the designated plant is determined according to the following steps:
converting the image from an RGB image to an HSV image;
and if the HSV components of any pixel in the HSV image meet a preset threshold condition, determining that the pixel is the target pixel belonging to the designated plant.
15. An electronic device, comprising: at least one processor, and a memory communicatively coupled to the at least one processor, wherein:
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
16. A storage medium storing a computer program, wherein, when the computer program is executed by a processor of an electronic device, the electronic device is enabled to perform the method of any one of claims 1-7.
CN202211509459.5A 2022-11-29 2022-11-29 Control method and device of agricultural robot, electronic equipment and storage medium Pending CN115723138A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211509459.5A CN115723138A (en) 2022-11-29 2022-11-29 Control method and device of agricultural robot, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211509459.5A CN115723138A (en) 2022-11-29 2022-11-29 Control method and device of agricultural robot, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115723138A true CN115723138A (en) 2023-03-03

Family

ID=85298958

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211509459.5A Pending CN115723138A (en) 2022-11-29 2022-11-29 Control method and device of agricultural robot, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115723138A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116267044A (en) * 2023-04-04 2023-06-23 南京农业大学 Paddy field weeding robot motion control system
CN116892944A (en) * 2023-09-11 2023-10-17 黑龙江惠达科技股份有限公司 Agricultural machinery navigation line generation method and device, and navigation method and device
CN116892944B (en) * 2023-09-11 2023-12-08 黑龙江惠达科技股份有限公司 Agricultural machinery navigation line generation method and device, and navigation method and device

Similar Documents

Publication Publication Date Title
Bai et al. Vision-based navigation and guidance for agricultural autonomous vehicles and robots: A review
CN115723138A (en) Control method and device of agricultural robot, electronic equipment and storage medium
Gai et al. Using a depth camera for crop row detection and mapping for under-canopy navigation of agricultural robotic vehicle
Shalal et al. A review of autonomous navigation systems in agricultural environments
Mousazadeh A technical review on navigation systems of agricultural autonomous off-road vehicles
Beloev et al. Artificial intelligence-driven autonomous robot for precision agriculture
Ulloa et al. Robotic fertilization in strip cropping using a CNN vegetables detection-characterization method
Matsuzaki et al. 3D semantic mapping in greenhouses for agricultural mobile robots with robust object recognition using robots' trajectory
Ma et al. Rice row tracking control of crawler tractor based on the satellite and visual integrated navigation
CN115900726A (en) Navigation path generation method based on crop geographic coordinate positioning
Matsuzaki et al. Image-based scene recognition for robot navigation considering traversable plants and its manual annotation-free training
Wang et al. The seedling line extraction of automatic weeding machinery in paddy field
Huang et al. An End‐to‐End Learning‐Based Row‐Following System for an Agricultural Robot in Structured Apple Orchards
Wang et al. The identification of straight-curved rice seedling rows for automatic row avoidance and weeding system
Chatzisavvas et al. Autonomous Unmanned Ground Vehicle in Precision Agriculture–The VELOS project
Wang et al. Intelligent robotic lawn mower design
CN116576859A (en) Path navigation method, operation control method and related device
US12001221B2 (en) Methods for managing coordinated autonomous teams of under-canopy robotic systems for an agricultural field and devices
Martini et al. Enhancing navigation benchmarking and perception data generation for row-based crops in simulation
Peebles et al. Robotic Harvesting of Asparagus using Machine Learning and Time-of-Flight Imaging–Overview of Development and Field Trials
CN110125945B (en) Plant row following method of harvesting robot
Matsuzaki et al. Semantic-aware plant traversability estimation in plant-rich environments for agricultural mobile robots
Korthals et al. Towards inverse sensor mapping in agriculture
CN110807425A (en) Intelligent weeding system and weeding method
Shamshiri et al. An overview of visual servoing for robotic manipulators in digital agriculture

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination