CN115157257A - Intelligent plant management robot and system based on UWB navigation and visual identification - Google Patents


Info

Publication number
CN115157257A
CN115157257A (application number CN202210868189.0A)
Authority
CN
China
Prior art keywords
center
mechanical arm
relative
uwb
pose
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210868189.0A
Other languages
Chinese (zh)
Inventor
周军
石少杰
丁忠
皇攀凌
高新彪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN202210868189.0A
Publication of CN115157257A
Legal status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01CPLANTING; SOWING; FERTILISING
    • A01C23/00Distributing devices specially adapted for liquid manure or other fertilising liquid, including ammonia, e.g. transport tanks or sprinkling wagons
    • A01C23/04Distributing under pressure; Distributing mud; Adaptation of watering systems for fertilising-liquids
    • A01C23/047Spraying of liquid fertilisers
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01GHORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G25/00Watering gardens, fields, sports grounds or the like
    • A01G25/09Watering arrangements making use of movable installations on wheels or the like
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/163Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1408Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K7/14172D bar codes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/24Aligning, centring, orientation detection or correction of the image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/64Three-dimensional objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Water Supply & Treatment (AREA)
  • Multimedia (AREA)
  • Environmental Sciences (AREA)
  • Electromagnetism (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Toxicology (AREA)
  • General Health & Medical Sciences (AREA)
  • Soil Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides an intelligent plant management robot and system based on UWB navigation and visual identification. The control terminal and the shell are both mounted on the upper side of the bottom plate; a liquid storage tank is arranged inside the shell; a spray head is mounted at the first end of the mechanical arm and communicates with the liquid storage tank through a pipeline. The second end of the mechanical arm is fixed to the upper side of the bottom plate within the shell, the shell is provided with an opening through which the mechanical arm extends and retracts, and the wheel set is mounted at the bottom of the bottom plate. The visual recognition module is mounted on the upper side of the bottom plate and is in communication connection with the control terminal. The control terminal performs preliminary plant positioning from the received UWB observation signals of the plant position, controls robot movement according to the preliminary positioning result, relocates the plant according to the identification-code information at the plant position, and controls the mechanical arm according to the relocation result to spray liquid. The invention improves the efficiency and precision of plant management.

Description

Intelligent plant management robot and system based on UWB navigation and visual identification
Technical Field
The invention relates to the technical field of intelligent robots, in particular to an intelligent plant management robot and system based on UWB navigation and visual identification.
Background
The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.
At present, in rare-plant planting sites, office halls, schools, office buildings and other public places where plants are placed, watering is mostly done manually. This is time-consuming and labor-intensive, and when there are many plants it is often difficult to tell which ones need watering; moreover, when special circumstances leave no one available, plants and flowers often lack water and may even die.
The inventors have found that the intelligent flower-watering devices on the market also need their water changed periodically and cannot supply fertilizer to plants; moreover, users cannot fully know the state of the plants' environment, such as illumination, soil humidity, soil pH and temperature, making quantitative management difficult.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides an intelligent plant management robot and system based on UWB navigation and visual identification, which can supply water and fertilizer to plants, feed back the parameters and state of the plants' surrounding environment in real time, and irrigate the plants accurately with the aid of visual identification. A wireless information transmission device can be inserted into the soil of a plant to generate a UWB signal; after receiving the signal, the robot navigates to the vicinity of the plant through its internal algorithm and waters the plant with visual assistance, greatly improving plant management efficiency.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention provides an intelligent plant management robot based on UWB navigation and visual identification.
An intelligent plant management robot based on UWB navigation and visual identification, comprising:
a mechanical arm, a shell, a bottom plate, a wheel set, a visual identification module and a control terminal;
the control terminal and the shell are both mounted on the upper side of the bottom plate; a liquid storage tank is arranged inside the shell; a spray head is mounted at the first end of the mechanical arm and communicates with the liquid storage tank through a pipeline;
the second end of the mechanical arm is fixed to the upper side of the bottom plate within the shell; the shell is provided with an opening through which the mechanical arm extends and retracts; the wheel set is mounted at the bottom of the bottom plate;
the visual recognition module is mounted on the upper side of the bottom plate and is in communication connection with the control terminal; the control terminal performs preliminary plant positioning from received UWB observation signals of the plant position, controls robot movement according to the preliminary positioning result, relocates the plant according to identification-code information at the plant position, and controls the mechanical arm according to the relocation result to spray liquid.
In some optional implementations, the wheel set comprises at least one pair of differential wheels and at least two pairs of universal wheels; the differential wheels are arranged between the universal wheels, and the axis of each differential wheel pair is parallel to the axis of each universal wheel pair.
The invention provides a working method of an intelligent plant management robot based on UWB navigation and visual identification.
A working method of the intelligent plant management robot based on UWB navigation and visual identification, comprising: filtering the UWB observation distances with a median-mean filter and checking the number of distance observations; if the number of distance observations does not meet the number of inputs required by the Taylor positioning algorithm, returning to continue observing; otherwise, solving the position information by the Taylor positioning algorithm.
As some optional implementations, the identification code is recognized by the April Tag based target detection method, comprising:
in April Tag target detection, the threshold for threshold segmentation is determined as follows:
find the minimum and maximum values of the region around each pixel by dividing the image into blocks of 4 × 4 pixels and computing the extrema within each block, then assign each pixel a value of white or black, using the average (max + min)/2 as the threshold.
Further, acquiring the center coordinate of the camera and combining the first coordinate transformation matrix to obtain the pose of the two-dimensional code center relative to the camera;
acquiring the center coordinate of the mechanical arm and combining the second coordinate transformation matrix to obtain the pose of the camera center relative to the mechanical arm center;
acquiring the center coordinate of the target point to be irrigated and combining the third coordinate transformation matrix to obtain the pose of the target point center relative to the two-dimensional code tag center;
and combining the pose of the two-dimensional code center relative to the camera, the pose of the camera center relative to the mechanical arm center and the pose of the target point center relative to the two-dimensional code tag center to obtain the pose of the target point to be irrigated relative to the mechanical arm center.
Further, according to the obtained pose of the target point to be watered relative to the mechanical arm center, the motion of the mechanical arm is linearly interpolated so that the end of the arm waters the target point along a straight line; while advancing along the straight line, a sliding-mode control strategy is adopted so that the mechanical arm tracks the target trajectory in real time.
The invention provides an intelligent plant management system based on UWB navigation and visual identification.
An intelligent plant management system based on UWB navigation and visual identification, comprising the intelligent plant management robot based on UWB navigation and visual identification of the first aspect and a wireless information transmission device, the wireless information transmission device comprising:
a UWB radio frequency module, a device shell, an illumination sensor, a two-dimensional code display module, a soil humidity sensing element, a temperature sensing element and a pH detection element; the UWB radio frequency module, the illumination sensor and the two-dimensional code display module are all arranged on the device shell; the bottom of the device shell is provided with an insert for insertion into the soil at the plant position, and the soil humidity sensing element, the temperature sensing element and the pH detection element are all arranged on the insert.
As some optional implementations, the UWB observation distances are filtered with a median-mean filter and the number of distance observations is checked; if it does not meet the number of inputs required by the Taylor positioning algorithm, the method returns to continue observing; otherwise the position information is solved by the Taylor positioning algorithm.
As some optional implementations, the identification code is recognized by an April Tag based target detection method, in which the threshold for threshold segmentation is determined as follows: find the minimum and maximum values of the region around each pixel by dividing the image into blocks of 4 × 4 pixels and computing the extrema within each block, then assign each pixel a value of white or black, using the average (max + min)/2 as the threshold.
Further, acquiring the center coordinate of the camera and combining the first coordinate transformation matrix to obtain the pose of the two-dimensional code center relative to the camera;
acquiring the center coordinate of the mechanical arm and combining the second coordinate transformation matrix to obtain the pose of the camera center relative to the mechanical arm center;
acquiring the center coordinate of the target point to be irrigated and combining the third coordinate transformation matrix to obtain the pose of the target point center relative to the two-dimensional code tag center;
and combining the pose of the two-dimensional code center relative to the camera, the pose of the camera center relative to the mechanical arm center and the pose of the target point center relative to the two-dimensional code tag center to obtain the pose of the target point to be irrigated relative to the mechanical arm center.
Further, according to the obtained pose of the target point to be watered relative to the mechanical arm center, the motion of the mechanical arm is linearly interpolated so that the end of the arm waters the target point along a straight line; while advancing along the straight line, a sliding-mode control strategy is adopted so that the mechanical arm tracks the target trajectory in real time.
Compared with the prior art, the invention has the beneficial effects that:
1. The intelligent plant management robot and system based on UWB navigation and visual identification can supply water and fertilizer to plants, feed back the parameters and state of the plants' surrounding environment in real time, and irrigate the plants accurately with the aid of visual identification. A wireless information transmission device can be inserted into the soil of a plant to generate a UWB signal; after receiving the signal, the robot navigates to the vicinity of the plant through its internal algorithm and waters the plant with visual assistance, greatly improving plant management efficiency.
2. The intelligent plant management robot and system based on UWB navigation and visual recognition can manage large potted plants in halls, schools, companies and similar places. Compared with manual care, the robot supplies water and nutrients at fixed times and in fixed quantities, without forgetting or mixing them up, and it can still work during holidays or on other occasions when no one is available to irrigate. Compared with a traditional irrigator, the robot can autonomously complete charging and water replenishment without affecting aesthetics; besides irrigating plants, it can feed back basic conditions such as soil humidity, temperature and illumination, which is more favorable to plant growth. Moreover, deployment in shopping malls and similar venues can help attract customers.
Advantages of additional aspects of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, provide a further understanding of the invention; they illustrate exemplary embodiments of the invention and, together with the description, serve to explain the invention without limiting it.
Fig. 1 is a perspective view of a plant management robot according to embodiment 1 of the present invention.
Fig. 2 is a logic diagram of robot positioning provided in embodiment 2 of the present invention.
Fig. 3 is a schematic diagram of an April Tag detection process according to embodiment 2 of the present invention.
Fig. 4 is a schematic diagram of a relative pose between a center of a two-dimensional code and a center of a camera provided in embodiment 2 of the present invention.
Fig. 5 shows a robot arm control strategy provided in embodiment 2 of the present invention.
Fig. 6 is a flowchart of visual recognition provided in embodiment 2 of the present invention.
Fig. 7 is a schematic diagram of a wireless information transmission apparatus according to embodiment 3 of the present invention.
Fig. 8 is a schematic diagram of a control system according to embodiment 3 of the present invention.
Wherein: 1-differential wheel; 2-universal wheel; 3-bottom plate; 4-control terminal; 5-mechanical arm shell; 6-mechanical arm; 7-visual recognition module; 8-UWB radio frequency module; 9-device shell; 10-illumination sensor; 11-two-dimensional code display module; 12-soil humidity sensing element; 13-temperature sensing element; 14-pH detection element.
Detailed Description
The invention is further described with reference to the following figures and examples.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the invention. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the invention. As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
The embodiments and features of the embodiments of the present invention may be combined with each other without conflict.
Example 1:
As shown in fig. 1, embodiment 1 of the present invention provides an intelligent plant management robot based on UWB navigation and visual identification, comprising: a differential wheel 1, a universal wheel 2, a bottom plate 3, a control terminal 4, a mechanical arm shell 5, a mechanical arm 6 and a visual recognition module 7. The differential wheel 1 and the universal wheel 2 are mounted on the underside of the bottom plate and mainly provide locomotion and steering.
The control terminal 4 is arranged on the upper part of the bottom plate and houses an industrial personal computer, an STM32-series single-chip microcomputer and other basic electric control elements. The mechanical arm 6 is mounted on the bottom plate 3 inside the mechanical arm shell 5 and can retract into the shell when not working. A liquid tank inside the mechanical arm shell 5 can hold water or plant nutrient solution, and the front end of the mechanical arm 6 carries a watering sprinkler that sprays more evenly when watering the plants.
The mechanical arm is telescopic: when the system is not irrigating, the arm retracts into the mechanical arm shell to reduce the space occupied; when performing an irrigation task, it extends out of the shell to irrigate the plants.
As shown in fig. 2, the control terminal of the robot is located in a control cabinet on the bottom plate and mainly consists of an industrial personal computer, an STM32-series single-chip microcomputer and a UWB positioning module. The industrial personal computer mainly controls the mechanical arm and handles positioning, vision and other functions, while the STM32-series single-chip microcomputer mainly receives data transmitted by the external signal transmitter and various switching values.
In other embodiments, eight ultrasonic ranging sensors are mounted around the bottom plate 3, two on each side.
Example 2:
Embodiment 2 of the invention provides a working method of the intelligent plant management robot based on UWB navigation and visual identification, comprising the following steps:
As shown in fig. 3, in order to overcome non-line-of-sight errors, random errors and similar problems, this embodiment filters the UWB observation distances with a median-mean filter and then checks the number of distance observations: if it does not meet the number of inputs required by the Taylor positioning algorithm, observation continues; otherwise the position information is solved by the Taylor positioning algorithm. This effectively guarantees the positioning accuracy of the robot under UWB positioning, so that after receiving the UWB signal sent by the wireless information transmission device the robot can autonomously move to the vicinity of the plant to be irrigated.
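The patent does not spell out the window size or trimming depth of the median-mean filter; the Python sketch below is a minimal illustration of the idea (sort a window of recent UWB ranges, drop the extremes, average the remainder), with the window length and `discard` count chosen arbitrarily:

```python
def median_mean_filter(window, discard=2):
    """Sort the range window, drop the `discard` smallest and largest
    samples, and average the rest to suppress NLOS / random outliers."""
    s = sorted(window)
    core = s[discard:-discard] if len(s) > 2 * discard else s
    return sum(core) / len(core)

# eight recent UWB range samples in metres, two of them outliers
ranges = [3.12, 3.08, 3.95, 3.10, 3.11, 3.09, 2.40, 3.13]
print(median_mean_filter(ranges))  # averages the four central samples
```

The two outliers (2.40 m and 3.95 m) are discarded before averaging, so the filtered range stays near the 3.1 m cluster.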
The Taylor positioning algorithm is a recursive weighted least-squares method: the first iteration needs a rough initial position, and each subsequent iteration solves a local least-squares (LS) problem on the measurement error to correct the position estimate of the target.

The choice of the initial value in the Taylor positioning algorithm is very important: a poorly chosen initial value can prevent the iterations from converging. Taking the three-dimensional case as an example, let the coordinates of the mobile tag be (x, y, z) and the initial estimate be (x_0, y_0, z_0), with error amounts Δx, Δy, Δz, so that

x = x_0 + Δx,  y = y_0 + Δy,  z = z_0 + Δz.

In three dimensions, d_n^2 = (x_n − x)^2 + (y_n − y)^2 + (z_n − z)^2, where (x_n, y_n, z_n) are the coordinates of base station n and d_n is the measured distance between the target and base station n. Expanding d_n as a Taylor series about (x_0, y_0, z_0) and neglecting components of second order and above gives

d_n ≈ d_n^(0) + a_n1·Δx + a_n2·Δy + a_n3·Δz,

where

d_n^(0) = sqrt((x_n − x_0)^2 + (y_n − y_0)^2 + (z_n − z_0)^2),
a_n1 = (x_0 − x_n)/d_n^(0),  a_n2 = (y_0 − y_n)/d_n^(0),  a_n3 = (z_0 − z_n)/d_n^(0).

Here (x_0, y_0, z_0) is the result of the previous iteration; after each iteration the error amounts Δx, Δy, Δz are computed by least squares, the coordinate values are updated, and the next iteration begins. The iteration result when |Δx| + |Δy| + |Δz| < K is taken as the final positioning result, where K is a preset threshold, generally chosen as 10^-5. In this embodiment the initial value is selected by the nearest-neighbor method, i.e. the coordinates of the closest base station are used as the initial value.
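Under the same notation, the iteration can be sketched in pure Python. The base-station layout and the starting nudge below are illustrative assumptions (the patent's nearest-neighbor method would start exactly at the closest base station; here the start is offset slightly so the initial predicted range is non-zero):

```python
import math

def solve3(M, v):
    """Solve the 3x3 linear system M x = v by Gauss-Jordan elimination."""
    A = [row[:] + [v[i]] for i, row in enumerate(M)]
    for c in range(3):
        piv = max(range(c, 3), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        for r in range(3):
            if r != c:
                f = A[r][c] / A[c][c]
                A[r] = [a - f * b for a, b in zip(A[r], A[c])]
    return [A[i][3] / A[i][i] for i in range(3)]

def taylor_locate(anchors, dists, p0, k=1e-5, max_iter=100):
    """Refine (x, y, z) by linearising each range about the current estimate
    and solving the normal equations for (dx, dy, dz) until |dx|+|dy|+|dz| < k."""
    p = list(p0)
    for _ in range(max_iter):
        rows, res = [], []
        for (ax, ay, az), d in zip(anchors, dists):
            d0 = math.dist(p, (ax, ay, az))            # predicted range d_n^(0)
            rows.append([(p[0] - ax) / d0, (p[1] - ay) / d0, (p[2] - az) / d0])
            res.append(d - d0)                          # range residual
        AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
        Atb = [sum(r[i] * e for r, e in zip(rows, res)) for i in range(3)]
        dx, dy, dz = solve3(AtA, Atb)
        p = [p[0] + dx, p[1] + dy, p[2] + dz]
        if abs(dx) + abs(dy) + abs(dz) < k:             # |Δx|+|Δy|+|Δz| < K
            break
    return p

# four UWB base stations (illustrative layout) and noiseless ranges to the tag
anchors = [(0.0, 0.0, 0.0), (5.0, 0.0, 0.0), (0.0, 5.0, 0.0), (0.0, 0.0, 3.0)]
true_p = (1.0, 2.0, 0.5)
dists = [math.dist(true_p, a) for a in anchors]
# nearest-neighbor start: closest base station, nudged into the workspace
est = taylor_locate(anchors, dists, p0=(0.5, 0.5, 0.5))
```

With noiseless ranges the estimate converges to the true tag position within a few iterations.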
The visual identification of the robot adopts two-dimensional code positioning. When the robot is close to the plant, its visual recognition device recognizes the two-dimensional code, which defines a three-dimensional coordinate system. Since the robot also has its own base coordinate system, the relative coordinates of the transmitter, and hence the relative position the robot must move to, can be determined.
As shown in fig. 4 and 5, the detection process of the April Tag based target detection method involves feature points and lines with a specific physical scale relationship (such as the anchor points of a QRCode or the right-angle sides of a Data Matrix), and the position and posture of the two-dimensional code in the camera coordinate system can be estimated by combining the camera's intrinsic parameters. The method further improves the detection speed and robustness of April Tag.
An April Tag consists of alternating black and white blocks; its ID is determined by matching key corner points in the tag against a tag library, and a high-precision six-degree-of-freedom relative pose between the April Tag and the camera can be estimated by combining the camera's intrinsic parameters. The basic detection flow of April Tag is shown in fig. 4. Whereas the conventional thresholding method finds a single global threshold for the whole image, this embodiment uses an adaptive thresholding method: find the minimum and maximum grey values of the region around each pixel by dividing the image into blocks of 4 × 4 pixels and computing the extrema within each block, then assign each pixel a value of white or black, using the average (max + min)/2 as the threshold. Threshold segmentation is thus computed faster, further improving the detection speed and robustness of April Tag.
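As a rough sketch, the per-block thresholding described above can be written in a few lines of Python. Note this is a simplification of the patent text: the full AprilTag detector also consults neighboring tiles when taking the extrema, which this version omits:

```python
def adaptive_threshold(img, tile=4):
    """Binarise img (list of rows of grey values): per 4x4 tile, threshold
    every pixel against (min + max) / 2 of that tile."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for ty in range(0, h, tile):
        for tx in range(0, w, tile):
            ys = range(ty, min(ty + tile, h))
            xs = range(tx, min(tx + tile, w))
            block = [img[y][x] for y in ys for x in xs]
            thr = (min(block) + max(block)) / 2
            for y in ys:
                for x in xs:
                    out[y][x] = 255 if img[y][x] > thr else 0
    return out

# a tiny 4x8 test image: dark left half, bright right half
img = [[30, 40, 35, 45, 200, 210, 220, 205]] * 4
binary = adaptive_threshold(img)
print(binary[0])
```

Each 4 × 4 tile gets its own threshold, so the dark and bright halves are binarized independently of each other.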
As shown in FIG. 5, let the center of the camera be O_C. The wireless information transmission device is inserted into the flowerpot of the plant to be managed and displays an April Tag two-dimensional code whose center is O_A. The pose of the two-dimensional code center relative to the camera is given by the transformation matrix of formula (1):

T_A^C   (1)

Let the center of the mechanical arm be O_M; the pose of the camera center relative to the mechanical arm center is then obtained from formula (2):

T_C^M   (2)

Let the center of the target point to be irrigated be O_T; the pose of the target point center relative to the two-dimensional code tag center is obtained from formula (3):

T_T^A   (3)

Finally, combining the three transformations gives the pose of the target point to be irrigated relative to the center of the mechanical arm:

T_T^M = T_C^M · T_A^C · T_T^A   (4)
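The pose combination above is ordinary homogeneous-transform composition. A small Python sketch with made-up translations (the numbers are illustrative only, not from the patent) shows how the three poses chain together:

```python
import math

def homog(rz, t):
    """4x4 homogeneous transform: rotation rz (radians, about z), translation t."""
    c, s = math.cos(rz), math.sin(rz)
    return [[c, -s, 0, t[0]],
            [s,  c, 0, t[1]],
            [0,  0, 1, t[2]],
            [0,  0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# illustrative numbers: tag 0.6 m in front of the camera, camera 0.2 m above
# the arm base, watering point 0.1 m below the tag center
T_cam_tag = homog(0.0, (0.0, 0.0, 0.6))    # tag pose in the camera frame
T_arm_cam = homog(0.0, (0.0, 0.0, 0.2))    # camera pose in the arm-base frame
T_tag_tgt = homog(0.0, (0.0, -0.1, 0.0))   # target pose in the tag frame

# chain the three transforms: target pose in the arm-base frame
T_arm_tgt = matmul(matmul(T_arm_cam, T_cam_tag), T_tag_tgt)
target_xyz = (T_arm_tgt[0][3], T_arm_tgt[1][3], T_arm_tgt[2][3])
print(target_xyz)
```

With identity rotations the translations simply add, so the watering point ends up at (0.0, −0.1, 0.8) in the arm-base frame.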
After the relative pose between the mechanical arm and the target watering point is calculated, the motion of the mechanical arm is linearly interpolated so that the end of the arm waters the target point along a straight line. While advancing along the straight line, a sliding-mode control strategy is adopted so that the mechanical arm tracks the target trajectory in real time; the closed-loop control system shown in fig. 6 is established, with the specific control shown in fig. 7.
The sliding mode controller takes as input the error between the target pose and the actual pose, and outputs an angle increment that continuously eliminates this deviation. The sliding mode controller is designed as follows:

Design the sliding surface:

s = ce    (5)

e = x_d − x    (6)

(1) Choose the constant-rate reaching law:

ṡ = −ε·sgn(s), ε > 0    (7)

(2) Derive the controller output u. Differentiating the sliding surface gives

ṡ = c·ė = c(ẋ_d − ẋ)    (8)

and taking the joint-space velocity as the control input, ẋ = u, yields

c(ẋ_d − u) = −ε·sgn(s)    (9)

so that

u = ẋ_d + (ε/c)·sgn(s)    (10)

The angle increment output by the sliding mode controller controls the joint rotation angles of the mechanical arm, continuously reducing the error between the actual motion trajectory and the target trajectory, so that the motion of the mechanical arm becomes more accurate.
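A minimal single-joint sketch of this controller, assuming the sliding surface s = ce with e = x_d − x and the constant-rate reaching law ṡ = −ε·sgn(s), which gives u = ẋ_d + (ε/c)·sgn(s). The gains, time step and sine reference are illustrative:

```python
import math

def sgn(v):
    """Sign function used by the reaching law."""
    return (v > 0) - (v < 0)

def smc_track(x0, x_d, x_d_dot, c=5.0, eps=0.8, dt=0.01, steps=2000):
    """Track the reference x_d(t) with a constant-rate reaching law:
    s = c*e, s_dot = -eps*sgn(s)  =>  u = x_d_dot + (eps/c)*sgn(s)."""
    x = x0
    for k in range(steps):
        t = k * dt
        e = x_d(t) - x                       # pose error, eq. (6)
        s = c * e                            # sliding surface, eq. (5)
        u = x_d_dot(t) + (eps / c) * sgn(s)  # controller output, eq. (10)
        x += u * dt                          # apply the angle increment
    return x

final_x = smc_track(0.5, math.sin, math.cos)
final_ref = math.sin(2000 * 0.01)
```

Starting 0.5 rad off the reference, the error shrinks at the constant rate ε/c until the state reaches the sliding surface, after which it chatters in a narrow band around the trajectory; the hard sgn() is what produces that chattering, which practical designs soften with a boundary layer.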
Embodiment 3:
Embodiment 3 of the present invention provides an intelligent plant management system based on UWB navigation and visual identification, including the intelligent plant management robot based on UWB navigation and visual identification described in embodiment 1 and a wireless information transmission device, where the wireless information transmission device, as shown in FIG. 8, includes:
a UWB radio frequency module 8, a device shell 9, an illumination sensor 10, a two-dimensional code display module 11, a soil humidity sensing element 12, a temperature sensing element 13 and a pH value detection element 14, wherein the UWB radio frequency module, the illumination sensor and the two-dimensional code display module are all arranged on the device shell; the bottom of the device shell is provided with an insert for insertion into the soil at the position of a plant, and the soil humidity sensing element, the temperature sensing element and the pH value detection element are all arranged on the insert.
After an external base station is established and the wireless information transmission device is inserted into the soil of the flowers to be managed, the positions of the transmitter and the robot can be determined, guiding the robot to the vicinity of the flowers to be watered. The visual identification module of the robot then identifies the two-dimensional code generated by the two-dimensional code display module, the relative position of the two-dimensional code with respect to the mechanical arm coordinate system is calculated from the center of the two-dimensional code, and the mechanical arm is controlled to move to the watering position to apply water or liquid fertilizer. Meanwhile, information on the plant can be sent to the robot through the soil humidity sensing element 12, the temperature sensing element 13 and the pH value detection element 14, so that the various states of the plant can be known.
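The telemetry exchange described above can be sketched as a simple message structure; the field names, example values and JSON encoding are illustrative assumptions, since the patent does not specify a wire format:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class PlantTelemetry:
    """One reading sent by the wireless information transmission device (field names assumed)."""
    device_id: str
    soil_moisture_pct: float  # from the soil humidity sensing element
    temperature_c: float      # from the temperature sensing element
    soil_ph: float            # from the pH value detection element
    illuminance_lux: float    # from the illumination sensor

def encode(reading: PlantTelemetry) -> bytes:
    """Serialize a reading for transmission to the robot's control terminal."""
    return json.dumps(asdict(reading)).encode("utf-8")

reading = PlantTelemetry("pot-01", 23.5, 21.0, 6.4, 850.0)
packet = encode(reading)
decoded = json.loads(packet)  # what the robot side would recover
```

On the robot side, a low moisture or out-of-range pH reading would then trigger the watering or liquid-fertilizer routine described in the embodiment.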
The specific operation method of the intelligent plant management robot is the same as that of embodiment 2, and is not described herein again.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An intelligent plant management robot based on UWB navigation and visual identification, characterized by comprising:
a mechanical arm, a shell, a bottom plate, a wheel set, a visual identification module and a control terminal;
the control terminal and the shell are both arranged at the upper part of the bottom plate, a liquid storage box body is arranged in the shell, a spray head is arranged at the first end of the mechanical arm, and the spray head is communicated with the liquid storage box body through a pipeline;
the second end of the mechanical arm is fixed to the upper part of the bottom plate within the range of the shell, the shell is provided with an opening through which the mechanical arm extends and retracts, and the wheel set is arranged at the bottom of the bottom plate;
the visual identification module is arranged on the upper portion of the bottom plate and is in communication connection with the control terminal; the control terminal carries out preliminary plant positioning according to received UWB observation signals of the plant position, controls robot movement according to the preliminary positioning result, relocates the plant according to identification code information at the plant position, and controls mechanical arm movement according to the relocation result so as to carry out liquid spraying control.
2. An intelligent plant management robot based on UWB navigation and visual recognition as defined in claim 1, wherein:
the wheel set comprises at least one group of differential wheels and at least two groups of universal wheels, the differential wheel set is arranged between the universal wheel sets, and the axes of the differential wheel sets are parallel to the axes of the universal wheel sets.
3. An operation method of the intelligent plant management robot based on UWB navigation and visual identification of claim 1 or 2, characterized in that:
the UWB observation distances are filtered by a median mean filtering method and the number of distance observations is checked; if the number of distance observations does not meet the number of inputs required by the Taylor positioning algorithm, observation continues, otherwise the position information is solved by the Taylor positioning algorithm.
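The filtering and input-count check in this claim can be sketched as below; dropping the extreme readings and averaging the remainder is one common reading of "median mean filtering", and the window contents and the Taylor input count of 3 are illustrative assumptions:

```python
def median_mean_filter(samples):
    """Median mean filter: discard the minimum and maximum readings, average the rest."""
    if len(samples) < 3:
        raise ValueError("need at least 3 distance samples")
    trimmed = sorted(samples)[1:-1]
    return sum(trimmed) / len(trimmed)

def enough_for_taylor(distances, required=3):
    """The Taylor positioning algorithm needs a minimum number of anchor distances
    as input (3 is an assumed value for 2-D positioning); if the count is not met,
    the robot returns to continue observation."""
    return len(distances) >= required

# One noisy UWB ranging window for a single anchor, in metres; one multipath outlier
window = [2.31, 2.29, 2.95, 2.30, 2.28]
filtered = median_mean_filter(window)
```

The trimmed mean rejects the 2.95 m multipath spike that a plain average would let through, which is why this style of filter is popular for UWB ranging front ends.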
4. A method of operation as claimed in claim 3, characterized by:
the identification code is identified by an AprilTag-based target detection method, wherein:
in AprilTag target detection, the threshold for threshold segmentation is determined as follows:
the image is divided into blocks of 4 × 4 pixels and the extreme values within each block are calculated, so as to find the minimum and maximum values of the area around each pixel; each pixel is then assigned white or black, using the average value (max + min)/2 as the threshold.
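The block threshold described in this claim can be sketched as below. Note this minimal version thresholds each tile against only its own extrema; the real AprilTag detector also shares extrema with neighbouring tiles and skips low-contrast tiles:

```python
import numpy as np

def adaptive_binarize(gray, tile=4):
    """Per 4x4 block, set pixels above (max + min) / 2 of the block extrema to
    white (255) and the rest to black (0)."""
    h, w = gray.shape
    out = np.zeros_like(gray)
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            block = gray[y:y + tile, x:x + tile]
            thresh = (int(block.max()) + int(block.min())) / 2.0
            out[y:y + tile, x:x + tile] = np.where(block > thresh, 255, 0)
    return out

# Toy 8x8 grayscale image with alternating bright and dark rows,
# so every tile contains both extremes
img = np.zeros((8, 8), dtype=np.uint8)
img[::2, :] = 200
binary = adaptive_binarize(img)
```

Each tile sees min = 0 and max = 200, so the threshold is 100 and the bright rows binarize to 255 while the dark rows stay 0, regardless of any global brightness offset added to the whole image.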
5. The method of operation of claim 4, wherein:
acquiring the center coordinate of the camera, and combining the first coordinate transformation matrix to obtain the pose of the center of the two-dimensional code relative to the camera;
acquiring the central coordinate of the mechanical arm, and combining a second coordinate transformation matrix to obtain the pose of the camera center relative to the mechanical arm center;
acquiring the central coordinate of a target point to be irrigated, and combining a third coordinate transformation matrix to obtain the pose of the center of the target point relative to the center of the two-dimensional code label;
and integrating the position and pose of the center of the two-dimensional code relative to the camera, the position and pose of the center of the camera relative to the center of the mechanical arm and the position and pose of the center of the target point relative to the center of the two-dimensional code label to obtain the position and pose of the center of the target point to be irrigated relative to the center of the mechanical arm.
6. The method of operation of claim 5, wherein:
according to the obtained pose of the center of the target point to be watered relative to the center of the mechanical arm, linear interpolation is carried out on the motion of the mechanical arm, so that the tail end of the mechanical arm waters the target point along a straight line, and in the process of advancing along the straight line, a sliding mode control strategy is adopted, so that the mechanical arm can track the target track in real time.
7. An intelligent plant management system based on UWB navigation and visual identification, characterized in that:
it comprises the intelligent plant management robot based on UWB navigation and visual identification of claim 1 or 2 and a wireless information transmission device, the wireless information transmission device comprising:
a UWB radio frequency module, a device shell, an illumination sensor, a two-dimensional code display module, a soil humidity sensing element, a temperature sensing element and a pH value detection element, wherein the UWB radio frequency module, the illumination sensor and the two-dimensional code display module are all arranged on the device shell; the bottom of the device shell is provided with an insert for insertion into the soil at the position of a plant, and the soil humidity sensing element, the temperature sensing element and the pH value detection element are all arranged on the insert.
8. The intelligent UWB navigation and visual identification based plant management system of claim 7, wherein:
the UWB observation distances are filtered by a median mean filtering method and the number of distance observations is checked; if the number of distance observations does not meet the number of inputs required by the Taylor positioning algorithm, observation continues, otherwise the position information is solved by the Taylor positioning algorithm;
or,
the identification code is identified by an AprilTag-based target detection method, wherein in AprilTag target detection the threshold for threshold segmentation is determined as follows: the image is divided into blocks of 4 × 4 pixels and the extreme values within each block are calculated, so as to find the minimum and maximum values of the area around each pixel; each pixel is then assigned white or black, using the average value (max + min)/2 as the threshold.
9. The intelligent UWB navigation and visual identification based plant management system of claim 7, wherein:
acquiring the center coordinate of the camera, and combining the first coordinate transformation matrix to obtain the pose of the center of the two-dimensional code relative to the camera;
acquiring the central coordinate of the mechanical arm, and combining a second coordinate transformation matrix to obtain the pose of the camera center relative to the mechanical arm center;
acquiring the central coordinate of a target point to be irrigated, and combining a third coordinate transformation matrix to obtain the pose of the center of the target point relative to the center of the two-dimensional code label;
and synthesizing the pose of the center of the two-dimensional code relative to the camera, the pose of the center of the camera relative to the center of the mechanical arm and the pose of the center of the target point relative to the center of the two-dimensional code label to obtain the pose of the center of the target point to be irrigated relative to the center of the mechanical arm.
10. The intelligent UWB navigation and visual identification based plant management system of claim 9, wherein:
according to the obtained pose of the center of the target point to be watered relative to the center of the mechanical arm, linear interpolation is carried out on the motion of the mechanical arm, so that the tail end of the mechanical arm waters the target point along a straight line, and in the process of advancing along the straight line, a sliding mode control strategy is adopted, so that the mechanical arm can track the target track in real time.
CN202210868189.0A 2022-07-22 2022-07-22 Intelligent plant management robot and system based on UWB navigation and visual identification Pending CN115157257A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210868189.0A CN115157257A (en) 2022-07-22 2022-07-22 Intelligent plant management robot and system based on UWB navigation and visual identification

Publications (1)

Publication Number Publication Date
CN115157257A true CN115157257A (en) 2022-10-11

Family

ID=83497504

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210868189.0A Pending CN115157257A (en) 2022-07-22 2022-07-22 Intelligent plant management robot and system based on UWB navigation and visual identification

Country Status (1)

Country Link
CN (1) CN115157257A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170248971A1 (en) * 2014-11-12 2017-08-31 SZ DJI Technology Co., Ltd. Method for detecting target object, detection apparatus and robot
CN108271765A (en) * 2018-01-05 2018-07-13 湘潭大学 A kind of multi-functional pawl head monitoring environment robot and its plants identification method
CN109895121A (en) * 2017-12-07 2019-06-18 泰科电子(上海)有限公司 Mechanical arm control system and method
CN111381209A (en) * 2018-12-29 2020-07-07 深圳市优必选科技有限公司 Distance measurement positioning method and device
CN113160075A (en) * 2021-03-30 2021-07-23 武汉数字化设计与制造创新中心有限公司 Processing method and system for Apriltag visual positioning, wall-climbing robot and storage medium
WO2021207787A1 (en) * 2020-04-14 2021-10-21 Iotrees Pty Ltd Mobile apparatus for treating plants
CN114266326A (en) * 2022-01-21 2022-04-01 北京微链道爱科技有限公司 Object identification method based on robot binocular three-dimensional vision

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MIGLIORELLI, M.; CIATTAGLIA, M.; MARROCCO, G.: "Efficient modeling of ultra-wideband antennas by two-dimensional Hermite processing", 2005 IEEE Antennas and Propagation Society International Symposium, 1 January 2005 (2005-01-01), page 710 *
YUAN Peng; ZHOU Jun; YANG Zibing; WU Di; HUANG Panling: "Research on AGV path planning and deviation correction", Modern Manufacturing Engineering, 18 April 2021 (2021-04-18), pages 26-32 *

Similar Documents

Publication Publication Date Title
EP2884364B1 (en) Autonomous gardening vehicle with camera
CN105425791B (en) A kind of the group robot control system and method for view-based access control model positioning
CA2922711C (en) Sprinkling control system
CN109566359B (en) Flower watering robot system based on Beidou satellite positioning
EP3384243A1 (en) Path planning for area coverage
Masuzawa et al. Development of a mobile robot for harvest support in greenhouse horticulture—Person following and mapping
Edan Design of an autonomous agricultural robot
CN111326003A (en) Intelligent car tracking driving method, system and storage medium
WO2020182146A1 (en) Robotic system, mapping system and method for robotic navigation map
CN106327561A (en) Intelligent spraying method and system based on machine vision technology
CN107949768A (en) Vehicle location estimating device, vehicle location presumption method
CN109708644A (en) Mobile Robotics Navigation method, apparatus and mobile robot
CN109753075B (en) Agriculture and forestry park robot navigation method based on vision
CN107291072B (en) Mobile robot path planning system and method
KR20190080208A (en) Crop planting system
CN106527438A (en) Robot navigation control method and device
Choi et al. Localization and map-building of mobile robot based on RFID sensor fusion system
CN111348161A (en) Resource environment monitoring system applied to marine ranch and operation method thereof
CN112965481A (en) Orchard operation robot unmanned driving method based on point cloud map
CN206348666U (en) Unmanned plane is intelligently trimmed in a kind of 3D gardening moulding
CN110702134A (en) Garage autonomous navigation device and method based on SLAM technology
CN106873583A (en) Autonomous type implement
CN115157257A (en) Intelligent plant management robot and system based on UWB navigation and visual identification
CN113934225A (en) Plant protection unmanned aerial vehicle route planning method based on full coverage path
CN114564008A (en) Mobile robot path planning method based on improved A-Star algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination