WO2022233307A1 - Weeding robot based on crop stalk positioning, and weeding method - Google Patents

Weeding robot based on crop stalk positioning, and weeding method

Info

Publication number
WO2022233307A1
WO2022233307A1 (PCT/CN2022/091056)
Authority
WO
WIPO (PCT)
Prior art keywords
weeding
image
crop
target
pixel
Prior art date
Application number
PCT/CN2022/091056
Other languages
English (en)
French (fr)
Inventor
吴艳娟
王健
王云亮
卢秀玲
Original Assignee
天津理工大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 天津理工大学
Publication of WO2022233307A1

Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01B: SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B 39/00: Other machines specially adapted for working soil on which crops are growing
    • A01B 39/12: Other machines specially adapted for working soil on which crops are growing for special purposes, e.g. for special culture
    • A01B 39/18: Other machines specially adapted for working soil on which crops are growing for special purposes, e.g. for special culture, for weeding
    • A01B 39/19: Rod weeders, i.e. weeders with rotary rods propelled beneath the soil surface
    • A01B 39/20: Tools; Details
    • A01B 39/22: Tools; Mounting tools
    • A01B 39/24: Undercarriages
    • A01B 39/28: Other machines specially adapted for working soil on which crops are growing, with special additional arrangements
    • A01B 79/00: Methods for working soil
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42: Determining position
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02: Control of position or course in two dimensions
    • G05D 1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246: Control of position or course in two dimensions specially adapted to land vehicles using a video camera in combination with image processing means

Definitions

  • The invention relates to the field of intelligent technology, and in particular to a weeding robot and a weeding method based on crop stalk positioning.
  • Although the weeding machines in common use today can significantly improve the efficiency of weeding operations, they can only remove weeds between rows and can do nothing about weeds between plants.
  • Intelligent weeding is a high-tech approach that can autonomously identify field crops and weeds and remove the weeds around crops in a targeted manner.
  • An intelligent inter-plant weeding robot is an efficient, safe and environmentally friendly weeding machine that can completely solve the problem of weeding both between rows and between plants, and can thus fully replace chemical and manual weeding.
  • The purpose of the present invention is to provide a weeding robot and a weeding method based on crop stalk positioning, so as to make prior-art weeding robots more intelligent, improve their weeding efficiency and thereby replace manual operations.
  • To achieve this purpose, the present invention provides the following scheme:
  • the present invention provides a weeding robot based on crop stalk positioning, comprising:
  • a controller, a GPS positioning module, an inter-ridge vision sensor, a crop vision sensor, a motor drive module, a crawler motor, an inter-ridge weeding actuator and an inter-plant weeding actuator;
  • the controller is connected to the inter-ridge vision sensor and to the crop vision sensor;
  • the controller is connected to the GPS positioning module;
  • the controller is connected, through the motor drive module, to the crawler motor, the inter-ridge weeding actuator and the inter-plant weeding actuator.
  • The present invention further provides a weeding method using the weeding robot based on crop stalk positioning, which specifically includes the following steps:
  • S2: perform target recognition and root positioning based on the images obtained by the inter-ridge vision sensor and the crop vision sensor;
  • the step of performing target recognition and root positioning based on the images obtained by the inter-ridge vision sensor and the crop vision sensor includes:
  • S21, image space processing: convert the acquired image into the HSV cone colour space and set a threshold on H to segment out the green part of the image, obtaining a second image;
  • after the image space processing is completed, the steps include:
  • the region with the largest connected area in the fourth image is taken as the fifth image.
  • The method further includes:
  • S23, skeleton extraction: define the value of a white pixel as 0 and the value of a black pixel as 1;
  • the selected pixel P1 is black and has 8 adjacent pixels, arranged in order as P2, P3, P4, P5, P6, P7, P8, P9, P2;
  • the pixel is black and has 8 adjacent pixels.
  • The method further includes:
  • S4: the step of obtaining the distance between the target root and the machine body based on the target ranging result includes:
  • obtaining the distances the weeding robot needs to move in the front-back, left-right and up-down directions.
  • R represents the red value of the pixel;
  • G represents the green value of the pixel;
  • B represents the blue value of the pixel;
  • R, G and B range over the natural numbers 0 to 255;
  • R', G' and B' are the decimal values between 0 and 1 obtained by dividing R, G and B by the maximum chromaticity value 255;
  • H represents the angle of rotation around the vertical central axis OV of the cone in HSV space, measured from the horizontal central axis OS;
  • S represents the distance from the origin O along the horizontal central axis OS;
  • V represents the distance from the origin O along the vertical central axis OH.
  • The present invention discloses the following technical effects:
  • The invention provides a weeding robot and a weeding method based on crop stalk positioning, comprising: a controller, a GPS positioning module, an inter-ridge vision sensor, a crop vision sensor, a motor drive module, a crawler motor, an inter-ridge weeding actuator and an inter-plant weeding actuator;
  • the controller is connected to the inter-ridge vision sensor and to the crop vision sensor; the controller is connected to the GPS positioning module;
  • the controller is connected, through the motor drive module, to the crawler motor, the inter-ridge weeding actuator and the inter-plant weeding actuator.
  • The method provided by the invention makes prior-art weeding robots more intelligent and improves their weeding efficiency, thereby replacing manual work.
  • FIG. 1 is a schematic diagram of the complete weeding machine based on crop stalk positioning according to an embodiment of the present invention;
  • FIG. 2 is a schematic structural diagram of a weeding robot based on crop stalk positioning according to an embodiment of the present invention;
  • FIG. 3 is a schematic diagram of a weeding method based on crop stalk positioning according to an embodiment of the present invention;
  • FIG. 4 is a schematic diagram of the RGB-to-HSV conversion of the weeding method according to an embodiment of the present invention;
  • FIG. 5 is a schematic diagram of the erosion algorithm of the weeding method according to an embodiment of the present invention;
  • FIG. 6 is a schematic diagram of the skeleton extraction algorithm of the weeding method according to an embodiment of the present invention;
  • FIG. 7 is a left view of a crawler wheel of the weeding robot according to an embodiment of the present invention.
  • 1: trolley bottom plate
  • 2: crawler tracks
  • 3: inter-ridge vision sensor and crop vision sensor
  • 4: first telescopic rod
  • 5: second telescopic rod
  • 6: battery
  • 7: GPS positioning module
  • 8: transmission module
  • 9: controller
  • 10: connecting shaft with three degrees of freedom (X, Y and Z)
  • 11: inter-plant weeding actuator
  • 12: motion vision sensor
  • 13: third telescopic rod
  • 14: inter-ridge actuator
  • 15: garbage pickup device
  • 16: garbage collection box
  • An embodiment of the present invention provides a weeding robot and a weeding method based on crop stalk positioning, which make prior-art weeding robots more intelligent and more efficient, thereby replacing manual operations.
  • FIG. 1 is a schematic diagram of the whole weeding machine of the present invention:
  • 1 is the trolley bottom plate;
  • 2 is the crawler track;
  • 3 is the inter-ridge vision sensor and crop vision sensor;
  • 4 is the first telescopic rod, used for the up-and-down movement of the inter-ridge vision sensor and the crop vision sensor;
  • 5 is the second telescopic rod, used for the left-and-right movement of the inter-ridge vision sensor and the crop vision sensor;
  • 6 is the battery;
  • 7 is the GPS positioning module;
  • 8 is the transmission module;
  • 9 is the controller;
  • 10 is the connecting shaft with three degrees of freedom (X, Y and Z);
  • 11 is the inter-plant weeding actuator;
  • 12 is the motion vision sensor;
  • 13 is the third telescopic rod;
  • 14 is the inter-ridge actuator;
  • 15 is the garbage pickup device;
  • 16 is the garbage collection box (fitted with a weight sensor; it can automatically tip over to dump the garbage).
  • The present invention provides a weeding robot based on crop stalk positioning, comprising:
  • a controller, a GPS positioning module, an inter-ridge vision sensor, a crop vision sensor, a motor drive module, a crawler motor, an inter-ridge weeding actuator and an inter-plant weeding actuator.
  • The controller is connected to the inter-ridge vision sensor and to the crop vision sensor;
  • the controller is connected to the GPS positioning module;
  • the controller is connected, through the motor drive module, to the crawler motor, the inter-ridge weeding actuator and the inter-plant weeding actuator.
  • The inter-ridge vision sensor and the crop vision sensor collect image information from directly above the crops on both sides of the trolley, and transmit the collected images through the wired transmission module to the host-computer controller for analysis of crop and weed positions.
  • The GPS positioning module mainly locates the field to be weeded: when a field is weeded for the first time, it records the inter-ridge positions, the ridge length and even the seedling positions at the start and end of the operation, and transmits these data to the host computer for storage, so that the host computer can position the weeder for that field, plan and analyse its travel route, and store and memorise the analysis results.
  • The next weeding run can then follow the memorised route, speeding up the operation and improving weeding efficiency.
  • The inter-ridge weeding actuator consists of two rows of high-speed rotating blades at the front of the trolley that cut into the soil at an angle. So that weeding and soil loosening leave no dead spots, the rear row of blades is staggered relative to the front row.
  • As the weeding trolley advances, it cuts the inter-ridge weeds off at the root and loosens the soil. The actuator carries a lifting rod: during operation it is lowered to hoe and loosen the soil, and when idle it is raised to more than 15 cm above the ground.
  • The inter-plant weeding actuators on the ridges to the left and right are likewise serrated, knife-shaped blades that cut into the soil at an angle and rotate at high speed.
  • For more precise weeding that does not harm the crop seedlings, the rotating blades are controlled by motorised telescopic rods with three degrees of freedom (X, Y and Z).
  • During operation, the vision sensor first collects real-time images of the crops and the surrounding weeds from directly above the crops; the captured RGB colour images are converted into the HSV cone colour space, and background segmentation is performed in HSV to retain the green seedling and weed information.
  • An erosion algorithm is then used to separate weeds connected over large areas, and the region with the largest connected area in the image is retained as the crop seedling. The seedling's skeleton is extracted, the intersection point is taken as the stalk position, and the stalk centre point is located by coordinate transformation. Finally, the relevant general-purpose input/output (GPIO) ports are configured to control the robot body and the stepper motors that drive the actuators, removing the weeds.
  • Embodiment 2:
  • The invention provides a weeding method using the weeding robot based on crop stalk positioning, which specifically includes the following steps:
  • S1: acquire images using the inter-ridge vision sensor and the crop vision sensor.
  • S2: perform target recognition and root positioning based on the images obtained by the inter-ridge vision sensor and the crop vision sensor.
  • The step of performing target recognition and root positioning based on the images obtained by the inter-ridge vision sensor and the crop vision sensor includes:
  • S21, image space processing: convert the acquired image into the HSV cone colour space and set a threshold on H to segment out the green part of the image, obtaining a second image;
  • the HSV cone colour-space conversion is performed as follows:
  • R represents the red value of the pixel;
  • G represents the green value of the pixel;
  • B represents the blue value of the pixel;
  • R, G and B range over the natural numbers 0 to 255;
  • R', G' and B' are the decimal values between 0 and 1 obtained by dividing R, G and B by the maximum chromaticity value 255;
  • H represents the angle of rotation around the vertical central axis OV of the cone in HSV space, measured from the horizontal central axis OS;
  • S represents the distance from the origin O along the horizontal central axis OS;
  • V represents the distance from the origin O along the vertical central axis OH.
  • The method includes:
  • taking the region with the largest connected area in the fourth image as the fifth image.
  • The method further includes:
  • S23, skeleton extraction: define the value of a white pixel as 0 and the value of a black pixel as 1.
  • The selected pixel P1 is black and has 8 adjacent pixels, arranged in order as P2, P3, P4, P5, P6, P7, P8, P9, P2;
  • the pixel is black and has 8 adjacent pixels.
  • The method further includes:
  • S4: the step of obtaining the distance between the target root and the machine body based on the target ranging result includes:
  • obtaining the distances the weeding robot needs to move in the front-back, left-right and up-down directions.
  • Walking on crawler-type wheels enhances the stability of the robot's movement.
  • Inter-plant weeding uses a connecting rod with three degrees of freedom (X, Y and Z) to connect the actuator.
  • The X and Y directions are controlled by guide rails, and the Z direction by a telescopic rod.
  • The weeds around the crops are removed thoroughly, solving the difficulty of removing weeds between plants.
  • The invention greatly increases the efficiency of weeding work, improves its quality and reduces the consumption of manpower.
  • The robot has a memory function: during the first weeding operation it records the field area, ridge direction, ridge length, distance between ridges and seedling positions, so that the second and third weeding passes over the same field can recall these records and speed up the job.

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Soil Sciences (AREA)
  • Environmental Sciences (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Soil Working Implements (AREA)

Abstract

A weeding robot based on crop stalk positioning, relating to the technical field of automatic weeding, comprising: a controller (9), a GPS positioning module (7), an inter-ridge vision sensor, a crop vision sensor, a motor drive module, a crawler motor, an inter-ridge weeding actuator (14) and an inter-plant weeding actuator (11). The controller (9) is connected to the inter-ridge vision sensor and the crop vision sensor (3); the controller (9) is connected to the GPS positioning module (7); and the controller (9) is connected, through the motor drive module, to the crawler motor, the inter-ridge weeding actuator (14) and the inter-plant weeding actuator (11). Also disclosed is a weeding method. The method makes prior-art weeding robots more intelligent and improves their weeding efficiency, thereby replacing manual operations.

Description

Weeding robot based on crop stalk positioning, and weeding method
This application claims priority to the Chinese patent application filed with the China National Intellectual Property Administration on 7 May 2021, with application number 202110496132.8 and invention title "Weeding robot based on crop stalk positioning, and weeding method", the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of intelligent technology, and in particular to a weeding robot and a weeding method based on crop stalk positioning.
Background Art
During the seedling stage of crop growth, inter-plant weeds cause considerable damage. At present the main inter-plant weeding methods are chemical herbicides or manual hoeing. Herbicide-based weeding has brought agriculture many new problems, such as accelerated succession of farmland weed populations, increased herbicide resistance, environmental pollution of arable land caused by herbicide abuse, and food-safety problems caused by chemical residues in crops. Manual hoeing, affected by the outflow of rural labour and the year-on-year rise in labour costs, is not feasible for large-scale planting. Removing field weeds mechanically not only achieves a good weeding effect, but in particular avoids the use of chemical herbicides, eliminating the risk of pesticide contamination of crops and safeguarding food safety. Although the weeding machines in common use today can significantly improve the efficiency of weeding operations, they can only remove weeds between rows and can do nothing about weeds between plants. Intelligent weeding is a high-tech approach that can autonomously identify field crops and weeds and remove the weeds around crops in a targeted manner. An intelligent inter-plant weeding robot is an efficient, safe and environmentally friendly weeding machine that can completely solve the problem of weeding between rows and between plants, and can thus fully replace chemical and manual weeding.
Summary of the Invention
The purpose of the present invention is to provide a weeding robot and a weeding method based on crop stalk positioning, so as to make prior-art weeding robots more intelligent, improve their weeding efficiency and thereby replace manual operations.
To achieve the above purpose, the present invention provides the following scheme:
In a first aspect, the present invention provides a weeding robot based on crop stalk positioning, comprising:
a controller, a GPS positioning module, an inter-ridge vision sensor, a crop vision sensor, a motor drive module, a crawler motor, an inter-ridge weeding actuator and an inter-plant weeding actuator;
the controller is connected to the inter-ridge vision sensor and to the crop vision sensor;
the controller is connected to the GPS positioning module;
the controller is connected, through the motor drive module, to the crawler motor, the inter-ridge weeding actuator and the inter-plant weeding actuator.
In a second aspect, the present invention provides a weeding method using the weeding robot based on crop stalk positioning, which specifically includes the following steps:
S1: acquire images using the inter-ridge vision sensor and the crop vision sensor;
S2: perform target recognition and root positioning based on the images obtained by the inter-ridge vision sensor and the crop vision sensor;
S3: perform target ranging based on the target root positioning;
S4: obtain the distance between the target root and the machine body based on the target ranging result.
Preferably, the step of performing target recognition and root positioning based on the images obtained by the inter-ridge vision sensor and the crop vision sensor includes:
S21, image space processing: convert the acquired image into the HSV cone colour space and set a threshold on H to segment out the green part of the image, obtaining a second image;
convert the second image back into an RGB image and perform grey-scale binarisation to obtain a third image A.
Preferably, the steps after S21 (image space processing) is completed include:
S22, erosion operation: define a structuring matrix B and slide B over A;
determine whether the structuring matrix B matches A:
if they match, delete the points on the third image A covered by the structuring matrix B, until B has traversed all points of A, to obtain a fourth image;
take the region with the largest connected area in the fourth image as the fifth image.
Preferably, after the step S22 of performing the erosion operation, the method further includes:
S23, skeleton extraction: define the value of a white pixel as 0 and the value of a black pixel as 1;
1): traverse all pixels of the fifth image, find all pixels satisfying the following conditions and turn all such pixels white:
the selected pixel P1 is black and has 8 adjacent pixels, arranged in order as P2, P3, P4, P5, P6, P7, P8, P9, P2;
2<=B(P1)<=6;
A(P1)=1;
P2×P4×P6=0;
P4×P6×P8=0;
2): test all pixels of the crop image again, search for pixels satisfying all of the following conditions simultaneously and turn all such pixels white:
the pixel is black and has 8 adjacent pixels;
2<=B(P1)<=6;
A(P1)=1;
P2×P4×P8=0;
P2×P6×P8=0;
repeat 1) and 2) until no pixel of the fifth image changes any more.
Preferably, after the step S23 of skeleton extraction, the method further includes:
S24: obtain the pixel coordinates of the root by finding the intersection point of the skeleton.
Preferably, the step S4 of obtaining the distance between the target root and the machine body based on the target ranging result includes:
obtaining, based on the target pixel, the actual distance of the target root relative to the machine body;
obtaining, based on the actual distance of the target root relative to the weeding robot, the distances the weeding robot needs to move in the front-back, left-right and up-down directions.
Preferably, the step S21 of image space processing specifically uses the following formulas:
R' = R/255
G' = G/255
B' = B/255
Cmax = max(R', G', B')
Cmin = min(R', G', B')
Δ = Cmax - Cmin
H = 0 (if Δ = 0); H = 60° × (((G' - B')/Δ) mod 6) (if Cmax = R'); H = 60° × ((B' - R')/Δ + 2) (if Cmax = G'); H = 60° × ((R' - G')/Δ + 4) (if Cmax = B')
S = 0 (if Cmax = 0); S = Δ/Cmax (otherwise)
V = Cmax
where R represents the red value of the pixel, G the green value and B the blue value (R, G and B range over the natural numbers 0 to 255); R', G' and B' are the decimal values between 0 and 1 obtained by dividing R, G and B by the maximum chromaticity value 255; H represents the angle of rotation around the vertical central axis OV of the cone in HSV space, measured from the horizontal central axis OS; S represents the distance from the origin O along the horizontal central axis OS; and V represents the distance from the origin O along the vertical central axis OH.
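The conversion above can be sketched in Python. This is an illustrative re-implementation of the standard RGB-to-HSV formulas the patent relies on, not code from the patent itself; the green hue band in `is_green` is an assumed range, since the patent does not give a numeric value for the H threshold:

```python
def rgb_to_hsv(r, g, b):
    """Convert one pixel from RGB (0-255) to HSV:
    H in degrees [0, 360), S and V as fractions in [0, 1]."""
    rp, gp, bp = r / 255.0, g / 255.0, b / 255.0   # R', G', B'
    c_max = max(rp, gp, bp)
    c_min = min(rp, gp, bp)
    delta = c_max - c_min
    if delta == 0:
        h = 0.0
    elif c_max == rp:
        h = 60.0 * (((gp - bp) / delta) % 6)
    elif c_max == gp:
        h = 60.0 * ((bp - rp) / delta + 2)
    else:
        h = 60.0 * ((rp - gp) / delta + 4)
    s = 0.0 if c_max == 0 else delta / c_max
    v = c_max
    return h, s, v

def is_green(r, g, b, h_low=60.0, h_high=180.0):
    """Keep a pixel if its hue falls inside an assumed green band,
    mimicking the H threshold of step S21."""
    h, _, _ = rgb_to_hsv(r, g, b)
    return h_low <= h <= h_high
```

Segmenting the second image then amounts to applying `is_green` to every pixel and keeping only the pixels for which it returns true.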
According to the specific embodiments provided, the present invention discloses the following technical effects:
The invention provides a weeding robot and a weeding method based on crop stalk positioning, comprising: a controller, a GPS positioning module, an inter-ridge vision sensor, a crop vision sensor, a motor drive module, a crawler motor, an inter-ridge weeding actuator and an inter-plant weeding actuator; the controller is connected to the inter-ridge vision sensor and to the crop vision sensor; the controller is connected to the GPS positioning module; and the controller is connected, through the motor drive module, to the crawler motor, the inter-ridge weeding actuator and the inter-plant weeding actuator. The method provided by the invention makes prior-art weeding robots more intelligent and improves their weeding efficiency, thereby replacing manual operations.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings needed for the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of the complete weeding machine based on crop stalk positioning according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a weeding robot based on crop stalk positioning according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a weeding method based on crop stalk positioning according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of the RGB-to-HSV conversion of the weeding method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of the erosion algorithm of the weeding method according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of the skeleton extraction algorithm of the weeding method according to an embodiment of the present invention;
FIG. 7 is a left view of a crawler wheel of the weeding robot according to an embodiment of the present invention.
Reference numerals:
1: trolley bottom plate; 2: crawler tracks; 3: inter-ridge vision sensor and crop vision sensor; 4: first telescopic rod; 5: second telescopic rod; 6: battery; 7: GPS positioning module; 8: transmission module; 9: controller; 10: connecting shaft with three degrees of freedom (X, Y and Z); 11: inter-plant weeding actuator; 12: motion vision sensor; 13: third telescopic rod; 14: inter-ridge actuator; 15: garbage pickup device; 16: garbage collection box.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention will now be described clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
To make the above purpose, features and advantages of the present invention clearer and easier to understand, the present invention is described in further detail below with reference to the drawings and specific embodiments.
At present, the main inter-plant weeding methods are chemical herbicides or manual hoeing. Herbicide-based weeding has brought agriculture many new problems, such as accelerated succession of farmland weed populations, increased herbicide resistance, environmental pollution of arable land caused by herbicide abuse, and food-safety problems caused by chemical residues in crops; manual hoeing, affected by the outflow of rural labour and the year-on-year rise in labour costs, is not feasible for large-scale planting. Removing field weeds mechanically not only achieves a good weeding effect, but in particular avoids the use of chemical herbicides, eliminating the risk of pesticide contamination of crops and safeguarding food safety. Although the weeding machines in common use today can significantly improve the efficiency of weeding operations, they can only remove weeds between rows and can do nothing about weeds between plants. On this basis, the weeding robot and weeding method based on crop stalk positioning provided by the embodiments of the present invention make prior-art weeding robots more intelligent and more efficient, thereby replacing manual operations.
FIG. 1 is a schematic diagram of the complete weeding machine of the present invention: 1 is the trolley bottom plate; 2 is the crawler track; 3 is the inter-ridge vision sensor and crop vision sensor; 4 is the first telescopic rod, used for the up-and-down movement of the inter-ridge vision sensor and the crop vision sensor; 5 is the second telescopic rod, used for the left-and-right movement of the inter-ridge vision sensor and the crop vision sensor; 6 is the battery; 7 is the GPS positioning module; 8 is the transmission module; 9 is the controller; 10 is the connecting shaft with three degrees of freedom (X, Y and Z); 11 is the inter-plant weeding actuator; 12 is the motion vision sensor; 13 is the third telescopic rod; 14 is the inter-ridge actuator; 15 is the garbage pickup device; and 16 is the garbage collection box (fitted with a weight sensor; it can automatically tip over to dump the garbage).
To facilitate understanding of the embodiments, the weeding robot and weeding method based on crop stalk positioning disclosed in the embodiments of the present invention are first introduced in detail.
Embodiment 1
In a first aspect, the present invention provides a weeding robot based on crop stalk positioning, comprising:
a controller, a GPS positioning module, an inter-ridge vision sensor, a crop vision sensor, a motor drive module, a crawler motor, an inter-ridge weeding actuator and an inter-plant weeding actuator.
The controller is connected to the inter-ridge vision sensor and to the crop vision sensor;
the controller is connected to the GPS positioning module;
the controller is connected, through the motor drive module, to the crawler motor, the inter-ridge weeding actuator and the inter-plant weeding actuator.
Further, the inter-ridge vision sensor and the crop vision sensor collect image information from directly above the crops on both sides of the trolley, and transmit the collected images through the wired transmission module to the host-computer controller for analysis of crop and weed positions.
The GPS positioning module mainly locates the field to be weeded: when a field is weeded for the first time, it records the inter-ridge positions, the ridge length and even the seedling positions at the start and end of the operation, and transmits these data to the host computer for storage, so that the host computer can position the weeder for that field, plan and analyse its travel route, and store and memorise the analysis results; the next weeding run can then follow the memorised route, speeding up the operation and improving weeding efficiency.
The inter-ridge weeding actuator consists of two rows of high-speed rotating blades at the front of the trolley that cut into the soil at an angle. So that weeding and soil loosening leave no dead spots, the rear row of blades is staggered relative to the front row; as the weeding trolley advances, it cuts the inter-ridge weeds off at the root and loosens the soil. The actuator carries a lifting rod: during operation it is lowered to hoe and loosen the soil, and when idle it is raised to more than 15 cm above the ground.
The inter-plant weeding actuators on the ridges to the left and right are likewise serrated, knife-shaped blades that cut into the soil at an angle and rotate at high speed. For more precise weeding that does not harm the crop seedlings, the rotating blades are controlled by motorised telescopic rods with three degrees of freedom (X, Y and Z). During operation, the vision sensor first collects real-time images of the crops and the surrounding weeds from directly above the crops; the captured RGB colour images are converted into the HSV cone colour space, and background segmentation is performed in HSV to retain the green seedling and weed information. An erosion algorithm is then used to separate weeds connected over large areas, and the region with the largest connected area in the image is retained as the crop seedling. The seedling's skeleton is then extracted, the intersection point is taken as the stalk position, and the stalk centre point is located by coordinate transformation. Finally, the relevant general-purpose input/output (GPIO) ports are configured to control the robot body and the stepper motors that drive the actuators, removing the weeds.
Embodiment 2
The present invention provides a weeding method using the weeding robot based on crop stalk positioning, which specifically includes the following steps:
S1: acquire images using the inter-ridge vision sensor and the crop vision sensor.
S2: perform target recognition and root positioning based on the images obtained by the inter-ridge vision sensor and the crop vision sensor.
S3: perform target ranging based on the target root positioning.
S4: obtain the distance between the target root and the machine body based on the target ranging result.
Preferably, the step of performing target recognition and root positioning based on the images obtained by the inter-ridge vision sensor and the crop vision sensor includes:
S21, image space processing: convert the acquired image into the HSV cone colour space and set a threshold on H to segment out the green part of the image, obtaining a second image.
Further, the HSV cone colour-space conversion is performed as follows:
R' = R/255
G' = G/255
B' = B/255
Cmax = max(R', G', B')
Cmin = min(R', G', B')
Δ = Cmax - Cmin
H = 0 (if Δ = 0); H = 60° × (((G' - B')/Δ) mod 6) (if Cmax = R'); H = 60° × ((B' - R')/Δ + 2) (if Cmax = G'); H = 60° × ((R' - G')/Δ + 4) (if Cmax = B')
S = 0 (if Cmax = 0); S = Δ/Cmax (otherwise)
V = Cmax
where R represents the red value of the pixel, G the green value and B the blue value (R, G and B range over the natural numbers 0 to 255); R', G' and B' are the decimal values between 0 and 1 obtained by dividing R, G and B by the maximum chromaticity value 255; H represents the angle of rotation around the vertical central axis OV of the cone in HSV space, measured from the horizontal central axis OS; S represents the distance from the origin O along the horizontal central axis OS; and V represents the distance from the origin O along the vertical central axis OH.
Convert the second image back into an RGB image and perform grey-scale binarisation to obtain a third image A.
Preferably, after S21 (image space processing), the method includes:
S22, erosion operation: define a structuring matrix B and slide B over A;
determine whether the structuring matrix B matches A;
if they match, delete the points on the third image A covered by the structuring matrix B, until B has traversed all points of A, to obtain a fourth image;
take the region with the largest connected area in the fourth image as the fifth image.
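As an illustration of this erosion step and the largest-connected-area selection, here is a minimal pure-NumPy sketch. The patent does not specify the structuring matrix B or the connectivity used, so a small all-ones matrix and 4-connectivity are assumed:

```python
import numpy as np

def erode(img, b):
    """Binary erosion: keep a pixel only where the structuring matrix `b`
    fits entirely inside the foreground of `img` (values 0/1)."""
    bh, bw = b.shape
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h - bh + 1):
        for x in range(w - bw + 1):
            # B "slides" over A; the anchor survives only where A covers B
            if np.all(img[y:y + bh, x:x + bw] >= b):
                out[y + bh // 2, x + bw // 2] = 1
    return out

def largest_component(img):
    """Return a mask of the largest 4-connected foreground region
    (the region kept as the crop seedling, i.e. the 'fifth image')."""
    h, w = img.shape
    labels = np.zeros((h, w), dtype=int)
    best, best_size, cur = 0, 0, 0
    for sy in range(h):
        for sx in range(w):
            if img[sy, sx] == 1 and labels[sy, sx] == 0:
                cur += 1
                stack, size = [(sy, sx)], 0
                labels[sy, sx] = cur
                while stack:           # iterative flood fill
                    y, x = stack.pop()
                    size += 1
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and img[ny, nx] == 1 and labels[ny, nx] == 0:
                            labels[ny, nx] = cur
                            stack.append((ny, nx))
                if size > best_size:
                    best, best_size = cur, size
    return (labels == best).astype(int)
```

Erosion shrinks every connected blob, which breaks the thin bridges between touching weeds; keeping only the largest surviving component then isolates the seedling.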
Preferably, after the step S22 of performing the erosion operation, the method further includes:
S23, skeleton extraction: define the value of a white pixel as 0 and the value of a black pixel as 1.
1): traverse all pixels of the fifth image, find all pixels satisfying the following conditions and turn all such pixels white:
(1) the selected pixel P1 is black and has 8 adjacent pixels, arranged in order as P2, P3, P4, P5, P6, P7, P8, P9, P2;
(2) 2<=B(P1)<=6;
(3) A(P1)=1;
(4) P2×P4×P6=0;
(5) P4×P6×P8=0;
2): test all pixels of the crop image again, search for pixels satisfying all of the following conditions simultaneously and turn all such pixels white:
(1) the pixel is black and has 8 adjacent pixels;
(2) 2<=B(P1)<=6;
(3) A(P1)=1;
(4) P2×P4×P8=0;
(5) P2×P6×P8=0;
repeat 1) and 2) until no pixel of the fifth image changes any more.
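The two sub-iterations above are the Zhang-Suen thinning algorithm, with B(P1) the number of black neighbours and A(P1) the number of 0-to-1 transitions in the cyclic sequence P2...P9, P2. A minimal sketch, assuming the image is a 0/1 NumPy array with black (foreground) = 1 as defined in S23:

```python
import numpy as np

def zhang_suen_thin(img):
    """Two-sub-iteration skeletonisation (Zhang-Suen thinning).
    Foreground (black) = 1, background (white) = 0."""
    img = img.copy()

    def neighbours(y, x):
        # P2..P9, clockwise, starting from the pixel directly above P1
        return [img[y - 1, x], img[y - 1, x + 1], img[y, x + 1], img[y + 1, x + 1],
                img[y + 1, x], img[y + 1, x - 1], img[y, x - 1], img[y - 1, x - 1]]

    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_white = []
            for y in range(1, img.shape[0] - 1):
                for x in range(1, img.shape[1] - 1):
                    if img[y, x] != 1:
                        continue
                    p = neighbours(y, x)
                    b = sum(p)                                   # B(P1)
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1
                            for i in range(8))                   # A(P1)
                    p2, p4, p6, p8 = p[0], p[2], p[4], p[6]
                    if step == 0:
                        cond = p2 * p4 * p6 == 0 and p4 * p6 * p8 == 0
                    else:
                        cond = p2 * p4 * p8 == 0 and p2 * p6 * p8 == 0
                    if 2 <= b <= 6 and a == 1 and cond:
                        to_white.append((y, x))
            for y, x in to_white:        # delete marked pixels together
                img[y, x] = 0
                changed = True
    return img
```

The marked pixels of each sub-iteration are deleted in a batch, so each pass uses a consistent snapshot, and the loop stops when neither sub-iteration changes anything.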
Preferably, after the step S23 of skeleton extraction, the method further includes:
S24: obtain the pixel coordinates of the root by finding the intersection point of the skeleton.
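One way to realise S24 is to look for skeleton pixels with three or more skeleton neighbours, i.e. points where branches cross. This is a sketch under that assumption; the patent itself only states that the skeleton intersection is taken as the stalk position:

```python
import numpy as np

def skeleton_intersections(sk):
    """Candidate stalk/root positions: skeleton pixels (value 1) that have
    three or more skeleton pixels among their 8 neighbours."""
    pts = []
    h, w = sk.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if sk[y, x] == 1:
                n = sk[y - 1:y + 2, x - 1:x + 2].sum() - 1   # 8-neighbour count
                if n >= 3:
                    pts.append((y, x))
    return pts
```

On a clean skeleton, ordinary line pixels have two neighbours and endpoints have one, so only branch points survive the test; with 8-connectivity the pixels immediately adjacent to a crossing may also qualify, so in practice the candidates would be clustered into a single coordinate.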
Preferably, the step S4 of obtaining the distance between the target root and the machine body based on the target ranging result includes:
obtaining, based on the target pixel, the actual distance of the target root relative to the machine body;
obtaining, based on the actual distance of the target root relative to the weeding robot, the distances the weeding robot needs to move in the front-back, left-right and up-down directions.
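The ranging step can be sketched as a simple pixel-to-millimetre mapping for a downward-facing camera. Both `mm_per_px` (ground resolution) and `cam_offset_mm` (camera position relative to the actuator) are hypothetical calibration constants: the patent does not disclose how the camera is calibrated, so this is only an illustration of converting a located root pixel into required movements:

```python
def pixel_to_body_offset(px, py, img_w, img_h, mm_per_px, cam_offset_mm=(0.0, 0.0)):
    """Map the located root pixel (px, py) in an img_w x img_h image to the
    left/right and forward/back distances (in mm) the robot must move.
    mm_per_px and cam_offset_mm are assumed calibration values."""
    dx = (px - img_w / 2.0) * mm_per_px + cam_offset_mm[0]   # left/right
    dy = (py - img_h / 2.0) * mm_per_px + cam_offset_mm[1]   # forward/back
    return dx, dy
```

A root imaged at the optical centre needs no lateral motion; every pixel of offset from the centre scales linearly into millimetres of required travel.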
The present invention has the following advantages:
(1) Walking on crawler-type wheels enhances the stability of the robot's movement.
(2) Through the wireless transmission module, the robot's position and battery level can be monitored remotely, and the operator can start and stop the robot remotely from a terminal.
(3) While weeding between plants, the robot also removes inter-ridge weeds and inter-ridge garbage; at the same time, the voice alarm module and the remote control module allow better control of the robot.
(4) Inter-plant weeding uses a connecting rod with three degrees of freedom (X, Y and Z) to connect the actuator: the X and Y directions are controlled by guide rails and the Z direction by a telescopic rod. The weeds around the crops are removed thoroughly, solving the difficulty of removing weeds between plants. Compared with traditional manual hoeing, the invention greatly increases the efficiency of weeding work, improves its quality and reduces the consumption of manpower.
(5) The robot has a memory function: during the first weeding operation it records the field area, ridge direction, ridge length, distance between ridges and seedling positions, so that the second and third weeding passes over the same field can recall these records and speed up the job.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the others, and for the parts that are the same or similar between embodiments, reference may be made from one to another.
Specific examples have been used herein to explain the principles and implementation of the present invention; the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. Meanwhile, those of ordinary skill in the art will, based on the idea of the present invention, make changes to the specific implementation and scope of application. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (8)

  1. A weeding robot based on crop stalk positioning, characterised by comprising:
    a controller, a GPS positioning module, an inter-ridge vision sensor, a crop vision sensor, a motor drive module, a crawler motor, an inter-ridge weeding actuator and an inter-plant weeding actuator;
    the controller is connected to the inter-ridge vision sensor and to the crop vision sensor;
    the controller is connected to the GPS positioning module;
    the controller is connected, through the motor drive module, to the crawler motor, the inter-ridge weeding actuator and the inter-plant weeding actuator.
  2. A weeding method using a weeding robot based on crop stalk positioning, characterised by specifically comprising the following steps:
    S1: acquiring images using the inter-ridge vision sensor and the crop vision sensor;
    S2: performing target recognition and root positioning based on the images obtained by the inter-ridge vision sensor and the crop vision sensor;
    S3: performing target ranging based on the target root positioning;
    S4: obtaining the distance between the target root and the machine body based on the target ranging result.
  3. The method according to claim 2, characterised in that the step of performing target recognition and root positioning based on the images obtained by the inter-ridge vision sensor and the crop vision sensor comprises:
    S21, image space processing: converting the acquired image into the HSV cone colour space and setting a threshold on H to segment out the green part of the image, obtaining a second image;
    converting the second image back into an RGB image and performing grey-scale binarisation to obtain a third image A.
  4. The method according to claim 3, characterised in that the steps after S21 (image space processing) is completed comprise:
    S22, erosion operation: defining a structuring matrix B and sliding B over A;
    determining whether the structuring matrix B matches A:
    if they match, deleting the points on the third image A covered by the structuring matrix B, until B has traversed all points of A, to obtain a fourth image;
    taking the region with the largest connected area in the fourth image as the fifth image.
  5. The method according to claim 4, characterised in that after the step S22 of performing the erosion operation, the method further comprises:
    S23, skeleton extraction: defining the value of a white pixel as 0 and the value of a black pixel as 1;
    1): traversing all pixels of the fifth image, finding all pixels satisfying the following conditions and turning all such pixels white:
    the selected pixel P1 is black and has 8 adjacent pixels, arranged in order as P2, P3, P4, P5, P6, P7, P8, P9, P2;
    2<=B(P1)<=6;
    A(P1)=1;
    P2×P4×P6=0;
    P4×P6×P8=0;
    2): testing all pixels of the crop image again, searching for pixels satisfying all of the following conditions simultaneously and turning all such pixels white:
    the pixel is black and has 8 adjacent pixels;
    2<=B(P1)<=6;
    A(P1)=1;
    P2×P4×P8=0;
    P2×P6×P8=0;
    repeating 1) and 2) until no pixel of the fifth image changes any more.
  6. The method according to claim 5, characterised in that after the step S23 of skeleton extraction, the method further comprises:
    S24: obtaining the pixel coordinates of the root by finding the intersection point of the skeleton.
  7. The method according to claim 2, characterised in that the step S4 of obtaining the distance between the target root and the machine body based on the target ranging result comprises:
    obtaining, based on the target pixel, the actual distance of the target root relative to the machine body;
    obtaining, based on the actual distance of the target root relative to the weeding robot, the distances the weeding robot needs to move in the front-back, left-right and up-down directions.
  8. The method according to claim 3, characterised in that the step S21 of image space processing specifically uses the following formulas:
    R' = R/255
    G' = G/255
    B' = B/255
    Cmax = max(R', G', B')
    Cmin = min(R', G', B')
    Δ = Cmax - Cmin
    H = 0 (if Δ = 0); H = 60° × (((G' - B')/Δ) mod 6) (if Cmax = R'); H = 60° × ((B' - R')/Δ + 2) (if Cmax = G'); H = 60° × ((R' - G')/Δ + 4) (if Cmax = B')
    S = 0 (if Cmax = 0); S = Δ/Cmax (otherwise)
    V = Cmax
    where R represents the red value of the pixel, G the green value and B the blue value (R, G and B range over the natural numbers 0 to 255); R', G' and B' are the decimal values between 0 and 1 obtained by dividing R, G and B by the maximum chromaticity value 255; H represents the angle of rotation around the vertical central axis OV of the cone in HSV space, measured from the horizontal central axis OS; S represents the distance from the origin O along the horizontal central axis OS; and V represents the distance from the origin O along the vertical central axis OH.
PCT/CN2022/091056 2021-05-07 2022-05-06 Weeding robot based on crop stalk positioning, and weeding method WO2022233307A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110496132.8A CN113647212A (zh) 2021-05-07 2021-05-07 Weeding robot based on crop stalk positioning, and weeding method
CN202110496132.8 2021-05-07

Publications (1)

Publication Number Publication Date
WO2022233307A1 true WO2022233307A1 (zh) 2022-11-10

Family

ID=78489101

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/091056 WO2022233307A1 (zh) 2021-05-07 2022-05-06 Weeding robot based on crop stalk positioning, and weeding method

Country Status (2)

Country Link
CN (1) CN113647212A (zh)
WO (1) WO2022233307A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116250523A * 2023-04-03 2023-06-13 哈尔滨理工大学 Machine-vision-based intelligent laser weeding device and weeding method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113647212A (zh) * 2021-05-07 2021-11-16 天津理工大学 Weeding robot based on crop stalk positioning, and weeding method

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6199000B1 (en) * 1998-07-15 2001-03-06 Trimble Navigation Limited Methods and apparatus for precision agriculture operations utilizing real time kinematic global positioning system systems
CN101990796A * 2010-09-13 2011-03-30 中国农业大学 Machine-vision-based weeding robot system and method
CN102428770A * 2011-09-23 2012-05-02 中国农业大学 Weeding robot system and weeding method thereof
ES2423106A1 * 2011-11-24 2013-09-17 Universidad De Sevilla Automatic GPS-controlled lateral-displacement device for weed control in row crops
CN104156730A * 2014-07-25 2014-11-19 山东大学 Skeleton-based noise-resistant Chinese character feature extraction method
CN106717166A * 2015-11-23 2017-05-31 璧典凯 Inter-plant electrically driven weeding control system
CN109005693A * 2018-08-16 2018-12-18 中国农业大学 Open-close intelligent weeding device acquiring seedling and weed information based on binocular vision
CN109769419A * 2017-11-15 2019-05-21 韩振磊 Intelligent weeding robot
CN112400372A * 2020-11-06 2021-02-26 东北农业大学 Profiling mechanical weeding device based on machine vision technology
CN113647212A * 2021-05-07 2021-11-16 天津理工大学 Weeding robot based on crop stalk positioning, and weeding method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101707992B * 2009-10-15 2011-01-05 南京林业大学 High-efficiency weeding robot
CN207201237U * 2017-09-27 2018-04-10 湖北工业大学 Crawler-type remote-controlled electric micro-tiller for greenhouses
CN107600202A * 2017-10-17 2018-01-19 中国科学院合肥物质科学研究院 Intelligent weeding robot suitable for complex terrain
CN109405757B * 2018-12-28 2020-04-17 华南农业大学 Multi-photosensor rice seedling plant-spacing measuring device
CN109511356B * 2019-01-08 2023-10-24 安徽农业大学 Deep-vision-based intelligent weeding robot system and control method
CN110765927B * 2019-10-21 2022-11-25 广西科技大学 Method for identifying companion weeds in a vegetation community
JP6824377B2 * 2019-12-27 2021-02-03 ヤンマーパワーテクノロジー株式会社 Work vehicle


Also Published As

Publication number Publication date
CN113647212A (zh) 2021-11-16

Similar Documents

Publication Publication Date Title
WO2022233307A1 (zh) Weeding robot based on crop stalk positioning, and weeding method
CN103081597B (zh) Swing-type intelligent inter-seedling weeding implement unit
JP6737535B2 (ja) Robotic vehicle for automatically treating plant organisms, and method of using a robot
CN206118329U (zh) Weeding and soil-loosening device
US20200020093A1 (en) Object identification and collection system and method
CN101990796B (zh) Machine-vision-based weeding robot system and method
CN105230224A (zh) Intelligent weeding machine and weed removal method thereof
CN113115598B (zh) Automatic obstacle-avoiding weeding machine for orchards
CN106258028B (zh) Machine-vision-servoed intelligent thinning and weeding implement
CN201600330U (zh) Ripe pineapple recognition and positioning system
CN102428770A (zh) Weeding robot system and weeding method thereof
WO2022038363A1 (en) Agricultural machine
CN205284056U (zh) Field weeding machine based on machine-vision weed recognition
CN104904700A (zh) Intelligent omnidirectional crop weeding device and method
CN113273376A (zh) Modular weeding device
US20220207852A1 (en) Generating a ground plane for obstruction detection
CN107509399A (zh) Green intelligent weeding robot
Visentin et al. A mixed-autonomous robotic platform for intra-row and inter-row weed removal for precision agriculture
CN105874932A (zh) Machine-vision-based electro-hydraulically controlled cultivating and weeding machine
CN106912208B (zh) Weeding robot based on image recognition technology
Zhang et al. Research status of agricultural robot technology
Kushwaha Robotic and mechatronic application in agriculture
CN204104302U (zh) Multi-ridge weed removal machine
DK201600573A1 (en) A method for determining placement of new obstacles in an agricultural field
CN112712534B (zh) Maize root-stalk navigation baseline extraction method based on navigation trend line

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22798647

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22798647

Country of ref document: EP

Kind code of ref document: A1