CN113093743A - Navigation control method based on virtual radar model and deep neural network - Google Patents

Navigation control method based on virtual radar model and deep neural network

Info

Publication number: CN113093743A
Authority: CN (China)
Prior art keywords: virtual, radar model, path, vehicle, neural network
Legal status: Granted; currently active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202110342701.3A
Other languages: Chinese (zh)
Other versions: CN113093743B
Inventors: 刘志杰, 任志刚, 杨福增, 刘恒
Current Assignee: Northwest A&F University (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original Assignee: Northwest A&F University
Application filed by Northwest A&F University
Priority to CN202110342701.3A
Publication of CN113093743A; application granted; publication of CN113093743B

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D 1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 Road transport of goods or passengers
    • Y02T 10/10 Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to a navigation control method based on a virtual radar model and a deep neural network, which specifically comprises the following steps: S1, initializing the key parameters of the virtual radar model; S2, acquiring the pose and motion state data of the vehicle at the current moment through a navigation sensor; S3, generating virtual path boundaries by offsetting the planned straight path to both sides, scanning the virtual path boundaries, and generating a virtual radar model detection map; S4, inputting the generated virtual radar model detection map into the trained deep neural network to generate a driving instruction, realizing tracking of the planned path by the vehicle. The navigation control method based on the virtual radar model and the deep neural network offers good stability, fast response, and high precision during control, achieves good path tracking, and controls vehicle steering well to reduce the path tracking error.

Description

Navigation control method based on virtual radar model and deep neural network
Technical Field
The invention relates to the field of orchard agricultural machinery navigation, in particular to an orchard agricultural machinery navigation control method based on a virtual radar model and a deep neural network, and belongs to the field of orchard machinery control.
Background
China is the world's largest fruit-producing country, and the fruit industry is one of the important industries for increasing rural incomes. With China's urbanization and the aging of its population, the labor force available for agricultural production is shrinking. Facing an increasingly severe industry situation, large-scale development and standardized, mechanized management of the fruit-planting industry is an inevitable trend. Meanwhile, according to the National Science and Technology Innovation Plan, efficient, safe, and ecological modern agricultural technologies will be developed over the next five years, with key technologies and products such as agricultural biological manufacturing, intelligent agricultural production, intelligent agricultural equipment, and facility agriculture as development priorities. Therefore, research on intelligent orchard machinery and equipment is of great significance to the comprehensive development of agricultural modernization in China.
In the traditional navigation method, the lateral deviation and heading deviation of the robot's motion relative to the track are used as inputs, and a path tracking algorithm processes them to output a steering instruction, thereby controlling the robot to move along the desired path. Although the lateral deviation and heading deviation reflect the vehicle's deviation relative to the path, they only weakly reflect changes in the shape and trend of the path.
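For illustration only, the following Python sketch computes these two classical inputs for a straight path from S(x_s, y_s) to E(x_e, y_e); the function name and sign conventions are assumptions and are not taken from the patent.

```python
import math

def path_deviations(xs, ys, xe, ye, xp, yp, theta_p):
    """Lateral (cross-track) and heading deviation of the vehicle pose
    (xp, yp, theta_p) relative to the straight path S(xs, ys) -> E(xe, ye)."""
    # Path line in the form A*x + B*y + C = 0 (same coefficients as eq. (2))
    A, B, C = ys - ye, xe - xs, xs * ye - xe * ys
    # Signed point-to-line distance = lateral deviation
    lateral = (A * xp + B * yp + C) / math.hypot(A, B)
    # Heading deviation, wrapped into [-pi, pi]
    path_heading = math.atan2(ye - ys, xe - xs)
    heading = math.atan2(math.sin(theta_p - path_heading),
                         math.cos(theta_p - path_heading))
    return lateral, heading
```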
Disclosure of Invention
The invention aims to provide a navigation method based on a virtual radar model and deep neural network technology that overcomes the large path tracking error of fuzzy-control-based approaches and can accurately guide orchard vehicles along a set path in an orchard; a navigation system based on a virtual radar model and a deep neural network is also provided.
The technical scheme adopted by the invention for solving the technical problem is as follows:
the navigation control method based on the virtual radar model and the deep neural network is characterized by comprising the following steps:
s1, initializing virtual radar model parameters;
the virtual radar model refers to an abstract radar of a hypothetical special scanning path boundary, and defines the maximum detection distance l of the virtual radar modelmaxAngular resolution alpharWherein the maximum detection distance only has an effect on the path virtual boundary, the angular resolution alpharDetermining the number of scans in 360 degrees around each vehicle position pair, and if the total number is n, determining that
n = 360° / α_r    (1)
S2, acquiring the pose and motion state data of the vehicle at the current moment through a navigation sensor;
obtaining vehicle pose information, including position information (x)p,yp) And heading angle thetapPosition and attitude information of vehicle P (x)p,ypp) The description is given.
S3, generating virtual path boundaries by offsetting the planned straight path to both sides, scanning the virtual path boundaries, and generating a virtual radar model detection map;
and determining the starting point and the end point of the planned path according to the actual orchard planting environment. Function f describing planned path0(x, y) 0, and a virtual path boundary function f1(x,y)=0、f2Each of (x, y) ═ 0 is as follows:
A_0·x + B_0·y + C_0 = 0    (2)

A_1·x + B_1·y + C_1 = 0    (3)

A_2·x + B_2·y + C_2 = 0    (4)

where A_0 = y_s - y_e; B_0 = x_e - x_s; C_0 = x_s·y_e - x_e·y_s; A_2 = A_1 = A_0; B_2 = B_1 = B_0; and C_1,2 = C_0 ± d·sqrt(A_0^2 + B_0^2), the two signs corresponding to the two sides of the planned path and d being the lateral offset distance of the virtual path boundaries.
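A minimal Python sketch of this boundary construction is given below; the function name is hypothetical, and the offset distance d is assumed to be a design parameter chosen from the orchard row spacing.

```python
import math

def planned_path_lines(xs, ys, xe, ye, d):
    """Return the coefficients (A, B, C) of the planned path f0 and of the
    two virtual path boundaries f1, f2 obtained by offsetting f0 by the
    distance d to either side (equations (2)-(4))."""
    A0, B0 = ys - ye, xe - xs
    C0 = xs * ye - xe * ys
    shift = d * math.hypot(A0, B0)     # parallel-line offset applied to C0
    f0 = (A0, B0, C0)
    f1 = (A0, B0, C0 + shift)          # boundary on one side of the path
    f2 = (A0, B0, C0 - shift)          # boundary on the other side
    return f0, f1, f2
```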
The process of calculating and generating the virtual radar model is as follows:
l_1,2^(i) = -(A_1,2·x_p + B_1,2·y_p + C_1,2) / (A_1,2·cos(θ_p + i·α_r) + B_1,2·sin(θ_p + i·α_r))    (5)
wherein l1,2 (i)The scanning distance of the current position of the vehicle to the virtual path boundary of the planned path is represented, and subscripts 1 and 2 respectively represent two sides; i represents the ith of the total number n of 360 ° scans of the radar to the surrounding in step s 1; alpha is alpharFor the angle of the virtual radar model determined in s1Resolution ratio; a. the1,2、B1,2、C1,2Is a parameter of the boundary function of the virtual paths on both sides; x is the number ofp、yp、θpThe vehicle pose information is the vehicle pose information under the rectangular plane coordinate system.
When A_1,2·cos(θ_p + i·α_r) + B_1,2·sin(θ_p + i·α_r) = 0, or the detection distance l_1,2 < 0, or the detection distance exceeds the maximum detection distance l_max of the virtual radar model, set l_1,2^(i) = l_max. If the i-th scan yields results on both sides, the detection distance of the virtual radar model is l^(i) = min(l_1^(i), l_2^(i)). The detection distance l^(i) is then normalized to obtain the i-th scan result of the virtual radar model.
This step is repeated until the number of scans i equals the total number n specified in step S1, and the virtual radar model detection map is generated.
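The scanning procedure described above can be sketched in Python as follows. The clamping to l_max follows the conditions just stated; dividing by l_max for the normalization is an assumption, since the patent gives the normalization formula only as an image.

```python
import math

def virtual_radar_scan(f1, f2, xp, yp, theta_p, l_max, n):
    """Compute the n scan values of the virtual radar model detection map
    for the vehicle pose (xp, yp, theta_p); f1 and f2 are the (A, B, C)
    coefficients of the two virtual path boundaries."""
    alpha_r = 2.0 * math.pi / n                 # angular resolution (eq. (1)), in radians
    scan = []
    for i in range(n):
        phi = theta_p + i * alpha_r             # direction of the i-th ray
        hits = []
        for A, B, C in (f1, f2):
            denom = A * math.cos(phi) + B * math.sin(phi)
            l = -(A * xp + B * yp + C) / denom if denom != 0.0 else l_max
            if l < 0.0 or l > l_max:            # boundary behind the ray or out of range
                l = l_max
            hits.append(l)
        l_i = min(hits)                         # keep the nearer of the two boundary hits
        scan.append(l_i / l_max)                # assumed normalization to [0, 1]
    return scan
```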
Specifically, several representative results of the virtual radar model detection map in step S3 are shown in Fig. 3. The green figure in each panel consists of 4 line segments: the straight segments are the detected virtual path boundaries, and the arc segments correspond to the maximum detection distance of the radar. Panel (a): neither lateral nor heading deviation; (b): heading deviation only, no lateral deviation; (c): lateral deviation only, not exceeding the virtual path boundary set in step S3, and no heading deviation; (d): lateral deviation not exceeding the virtual path boundary, with heading deviation; (e): lateral deviation exceeding the virtual path boundary, without heading deviation; (f): lateral deviation exceeding the virtual path boundary, with heading deviation.
S4, inputting the generated virtual radar model detection map into the trained deep neural network to generate a driving instruction, realizing tracking of the planned path by the vehicle;
the deep neural network is built by adopting Python3.7 and Keras modules, a simple Sequential model is adopted, the deep neural network comprises an input layer, two fully-connected hidden layers and an output layer, and a data set is divided into a training set, a verification set and a test set according to the ratio of 6:2: 2. According to the size of the vehicle and a path planning method, the transverse deviation is divided into a plurality of intervals according to the distance, and steering is carried out according to a steering rule corresponding to the position information of the vehicle per se in each interval, so that the planned path is tracked.
The invention provides an orchard navigation control method based on a virtual radar model and a deep neural network. With the above technical scheme, its beneficial effects are: the virtual radar model can effectively describe the position of the orchard robot relative to the ideal path as well as the shape and trend of the current path segment; the controller based on the virtual radar model and the deep neural network offers good stability, fast response, and high precision during control, achieves good path tracking, and controls vehicle steering well to reduce the path tracking error.
The method is written from the reader's perspective so that it can be clearly understood; its implementation steps are detailed, and it provides a useful reference for the actual control process.
Drawings
The invention is further described with reference to the following figures and detailed description:
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a schematic diagram of virtual radar scanning in step S3 according to the present invention;
FIG. 3 shows several representative results of the virtual radar detection map in step S3 according to the present invention.
Detailed Description
The present invention will be described in detail with reference to the accompanying drawings. Fig. 1 is a simplified schematic diagram that illustrates only the basic structure of the invention and therefore shows only the components relevant to the invention.
The invention discloses an orchard navigation control method based on a virtual radar model and a deep neural network, used for guiding an orchard vehicle along an orchard path. As shown in Fig. 1, the specific steps are as follows:
step S1: defining the maximum detection distance l of the virtual radar modelmaxAngular resolution alpharThe angular resolution determines the number of scans within 360 ° around each vehicle position pair, and the total number of scans is calculated by equation (1), assuming the total number as n.
Step S2: determine the starting point S(x_s, y_s) and the end point E(x_e, y_e) of the planned straight path. Specifically, the starting point and end point of the straight-line path are determined according to the actual orchard planting environment.
Step S3: obtain the vehicle pose information from a GNSS navigation device, including the position (x_p, y_p) and the heading angle θ_p; the vehicle pose is described as P(x_p, y_p, θ_p).
Step S4: from the starting point S(x_s, y_s) and end point E(x_e, y_e) of the straight path in step S2, described by the function f_0(x, y) = 0, offset the planned path to both sides by the boundary offset distance d to generate the virtual path boundaries, described by the functions f_1(x, y) = 0 and f_2(x, y) = 0, as shown in Fig. 2.
In particular, the function f_0(x, y) = 0 describing the planned path and the virtual path boundary functions f_1(x, y) = 0 and f_2(x, y) = 0 are given by expressions (2), (3), and (4), respectively.
Step S5: using the vehicle pose P(x_p, y_p, θ_p) from step S3 and the virtual radar model parameters defined in step S1, scan the virtual path boundaries from step S4 and compute the virtual radar model detection map.
Specifically, the calculation by which the virtual radar model is generated is shown in equation (5).
Here l_1,2^(i) denotes the scan distance from the current vehicle position to the virtual path boundaries of the planned path, the subscripts 1 and 2 denoting the two sides; i denotes the i-th of the total of n scans over 360° around the vehicle defined in step S1; α_r is the angular resolution of the virtual radar model determined in S1; A_1,2, B_1,2, C_1,2 are the parameters of the virtual path boundary functions on the two sides, generated in step S4; and x_p, y_p, θ_p are the vehicle pose in the planar rectangular coordinate system obtained in step S3.
When A_1,2·cos(θ_p + i·α_r) + B_1,2·sin(θ_p + i·α_r) = 0, or the detection distance l_1,2 < 0, or the detection distance exceeds the maximum detection distance l_max of the virtual radar model, set l_1,2^(i) = l_max. If the i-th scan yields results on both sides, the detection distance of the virtual radar model is l^(i) = min(l_1^(i), l_2^(i)). The detection distance l^(i) is then normalized to obtain the i-th scan result of the virtual radar.
This step is repeated until the number of scans i equals the total number n specified in step S1, and the virtual radar model detection map is generated.
Representative results of the virtual radar model detection map are shown in Fig. 3. The green figure in each panel consists of 4 line segments: the straight segments are the detected virtual path boundaries, and the arc segments correspond to the maximum detection distance of the radar. Panel (a): neither lateral nor heading deviation; (b): heading deviation only, no lateral deviation; (c): lateral deviation only, not exceeding the virtual path boundary set in step S4, and no heading deviation; (d): lateral deviation not exceeding the virtual path boundary, with heading deviation; (e): lateral deviation exceeding the virtual path boundary, without heading deviation; (f): lateral deviation exceeding the virtual path boundary, with heading deviation.
Step S6: build the neural network controller and train the neural network.
The network is built with Python 3.7 and the Keras module, using a simple Sequential model; the network structure consists of an input layer, two fully connected hidden layers, and an output layer. A training data set is designed and divided into training, validation, and test sets at a ratio of 6:2:2. The training results show that the deep neural network controller reaches an accuracy of 97.82% on the test set.
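One possible way to realize the 6:2:2 split and train the controller is sketched below, assuming X and y are NumPy arrays holding the detection maps and steering labels; the shuffling, random seed, epoch count, and batch size are assumptions.

```python
import numpy as np

def split_6_2_2(X, y, seed=0):
    """Shuffle and divide a data set into training, validation and test
    subsets at the 6:2:2 ratio described above."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_train = int(0.6 * len(X))
    n_val = int(0.2 * len(X))
    train, val, test = np.split(idx, [n_train, n_train + n_val])
    return (X[train], y[train]), (X[val], y[val]), (X[test], y[test])

# (X_tr, y_tr), (X_va, y_va), (X_te, y_te) = split_6_2_2(X, y)
# model.fit(X_tr, y_tr, validation_data=(X_va, y_va), epochs=50, batch_size=32)
# model.evaluate(X_te, y_te)   # test-set accuracy reported above: 97.82%
```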
Step S7: the virtual radar model detection map generated in step S5 is input to the deep neural network controller from step S6, which outputs a steering control command for the vehicle.
Step S8: control the motion of the orchard vehicle according to the control command output by the deep neural network controller in step S7.
Step S9: repeat step S3 and judge whether the vehicle has reached the end point of the straight path; if so, stop navigation; if not, continue with steps S5, S7, and S8 until the vehicle reaches the end point E(x_e, y_e).
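The overall loop of steps S3 through S9 could be organized as in the following sketch, which reuses the virtual_radar_scan function from the earlier sketch; get_pose, send_steering, the goal tolerance, and the argmax command decoding are hypothetical placeholders for the vehicle's GNSS interface and steering actuator.

```python
import math
import numpy as np

def navigate(get_pose, send_steering, model, f1, f2, end, l_max, n,
             goal_tolerance=0.2):
    """Drive the vehicle along the planned straight path until the end
    point 'end' = (xe, ye) is reached (steps S3-S9)."""
    while True:
        xp, yp, theta_p = get_pose()                          # step S3: GNSS pose
        if math.hypot(end[0] - xp, end[1] - yp) < goal_tolerance:
            break                                             # step S9: end point reached
        scan = virtual_radar_scan(f1, f2, xp, yp, theta_p, l_max, n)       # step S5
        command = int(np.argmax(model.predict(np.array([scan]), verbose=0)))  # step S7
        send_steering(command)                                # step S8: actuate the vehicle
```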
The mechanism and operating principle of the present invention are described in the above embodiments. The present invention is not limited to these embodiments; any modification, replacement, or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (5)

1. An orchard navigation control method based on a virtual radar model and a deep neural network is characterized by comprising the following steps:
s1, initializing key parameters of the virtual radar model;
s2, acquiring the pose and motion state data of the vehicle at the current moment through a navigation sensor;
s3, generating virtual path boundaries by offsetting the planned straight path to both sides, scanning the virtual path boundaries, and generating a virtual radar model detection map;
and S4, inputting the generated virtual radar model detection map into the trained deep neural network to generate a driving instruction, realizing tracking of the planned path by the vehicle.
2. The orchard navigation control method based on the virtual radar model and deep neural network according to claim 1, wherein in step S1 the maximum detection distance l_max and the angular resolution α_r of the virtual radar model are defined; the maximum detection distance acts only on the virtual path boundaries, and the angular resolution α_r determines the number of scans over 360° around each vehicle position; denoting the total number of scans by n,

n = 360° / α_r    (1)
3. The orchard navigation control method based on the virtual radar model and deep neural network according to claim 1, wherein in step S2 the vehicle pose information is obtained, including the position (x_p, y_p) and the heading angle θ_p, and the vehicle pose is described as P(x_p, y_p, θ_p).
4. The orchard navigation control method based on the virtual radar model and deep neural network according to claim 1, wherein in step S3 the starting point and end point of the planned path are determined according to the actual orchard planting environment; the function f_0(x, y) = 0 describing the planned path and the virtual path boundary functions f_1(x, y) = 0 and f_2(x, y) = 0 are as follows:

A_0·x + B_0·y + C_0 = 0    (2)

A_1·x + B_1·y + C_1 = 0    (3)

A_2·x + B_2·y + C_2 = 0    (4)

where A_0 = y_s - y_e; B_0 = x_e - x_s; C_0 = x_s·y_e - x_e·y_s; A_2 = A_1 = A_0; B_2 = B_1 = B_0; and C_1,2 = C_0 ± d·sqrt(A_0^2 + B_0^2), the two signs corresponding to the two sides of the planned path and d being the lateral offset distance of the virtual path boundaries;
the process of calculating and generating the virtual radar model is as follows:

l_1,2^(i) = -(A_1,2·x_p + B_1,2·y_p + C_1,2) / (A_1,2·cos(θ_p + i·α_r) + B_1,2·sin(θ_p + i·α_r))    (5)
wherein l_1,2^(i) denotes the scan distance from the current vehicle position to the virtual path boundaries of the planned path, the subscripts 1 and 2 denoting the two sides; i denotes the i-th of the total of n scans over 360° around the vehicle defined in step S1; α_r is the angular resolution of the virtual radar model determined in S1; A_1,2, B_1,2, C_1,2 are the parameters of the virtual path boundary functions on the two sides; and x_p, y_p, θ_p are the vehicle pose in the planar rectangular coordinate system;
when A_1,2·cos(θ_p + i·α_r) + B_1,2·sin(θ_p + i·α_r) = 0, or the detection distance l_1,2 < 0, or the detection distance exceeds the maximum detection distance l_max of the virtual radar model, set l_1,2^(i) = l_max; if the i-th scan yields results on both sides, the detection distance of the virtual radar model is l^(i) = min(l_1^(i), l_2^(i)), and the detection distance l^(i) is normalized to obtain the i-th scan result of the virtual radar model.
This step is repeated until the number of scans i equals the total number n specified in step S1, and the virtual radar model detection map is generated.
5. The orchard navigation control method based on the virtual radar model and deep neural network according to claim 1, wherein in step S4 the deep neural network is built with Python 3.7 and the Keras module, using a simple Sequential model that comprises an input layer, two fully connected hidden layers, and an output layer, and the data set is divided into training, validation, and test sets at a ratio of 6:2:2; according to the vehicle size and the path planning method, the lateral deviation is divided into several intervals by distance, and steering is performed according to the steering rule corresponding to the vehicle's own position information within each interval, thereby tracking the planned path.
CN202110342701.3A 2021-03-30 2021-03-30 Navigation control method based on virtual radar model and deep neural network Active CN113093743B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110342701.3A CN113093743B (en) 2021-03-30 2021-03-30 Navigation control method based on virtual radar model and deep neural network

Publications (2)

Publication Number Publication Date
CN113093743A true CN113093743A (en) 2021-07-09
CN113093743B CN113093743B (en) 2022-08-30

Family

ID=76671296

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110342701.3A Active CN113093743B (en) 2021-03-30 2021-03-30 Navigation control method based on virtual radar model and deep neural network

Country Status (1)

Country Link
CN (1) CN113093743B (en)

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5548512A (en) * 1994-10-04 1996-08-20 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Autonomous navigation apparatus with neural network for a mobile vehicle
KR101273245B1 (en) * 2013-02-26 2013-06-11 국방과학연구소 Autonomous vehicle system and path decision method for the same
CN105549597A (en) * 2016-02-04 2016-05-04 同济大学 Unmanned vehicle dynamic path programming method based on environment uncertainty
KR20180056322A (en) * 2016-11-18 2018-05-28 국민대학교산학협력단 Method for producing virtual lane based on short-range radar sensor
WO2019023628A1 (en) * 2017-07-27 2019-01-31 Waymo Llc Neural networks for vehicle trajectory planning
CN107390691A (en) * 2017-07-28 2017-11-24 广东嘉腾机器人自动化有限公司 A kind of AGV path following methods
CN107562060A (en) * 2017-10-17 2018-01-09 南京农业大学 A kind of crawler-type traveling united reaper navigation system
US20190250624A1 (en) * 2018-02-15 2019-08-15 Wipro Limited Method and system for real-time generation of reference navigation path for navigation of vehicle
CN108919792A (en) * 2018-05-30 2018-11-30 华南农业大学 A kind of automated navigation system path planning control method
US20200133274A1 (en) * 2018-10-17 2020-04-30 Mando Corporation Control method of determining virtual vehicle boundary and vehicle providing the control method
WO2020131687A2 (en) * 2018-12-17 2020-06-25 Diversey, Inc. Methods and systems for defining virtual boundaries for a robotic device
CN111429716A (en) * 2019-01-08 2020-07-17 威斯通全球技术公司 Method for determining position of own vehicle
CN109717175A (en) * 2019-03-06 2019-05-07 山东交通学院 Orchard intelligence self-travel type spraying system and its control method
CN110275536A (en) * 2019-06-16 2019-09-24 宁波祈禧智能科技股份有限公司 A kind of dual-beam fence and its recognition methods
CN110426045A (en) * 2019-08-12 2019-11-08 西北农林科技大学 A kind of farmland spray machine device people vision guided navigation parameter acquiring method
CN110851966A (en) * 2019-10-30 2020-02-28 同济大学 Digital twin model correction method based on deep neural network
CN111201896A (en) * 2019-12-28 2020-05-29 苏州博田自动化技术有限公司 Picking robot based on visual navigation and control method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
崔维 et al., "Research on path planning of a mobile picking robot based on visual navigation and RBF", Journal of Agricultural Mechanization Research (《农机化研究》) *
李文洋, "Research on visual navigation path generation for a kiwifruit picking robot", China Master's Theses Full-text Database, Agricultural Science and Technology (《中国优秀硕士学位论文全文数据库 农业科技辑》) *
王辉 et al., "Path tracking control method for agricultural machinery navigation based on a preview tracking model", Transactions of the Chinese Society of Agricultural Engineering (《农业工程学报》) *

Also Published As

Publication number Publication date
CN113093743B (en) 2022-08-30

Similar Documents

Publication Publication Date Title
CN114384920B (en) Dynamic obstacle avoidance method based on real-time construction of local grid map
CN108319293B (en) UUV real-time collision avoidance planning method based on LSTM network
CN111695299B (en) Mesoscale vortex trajectory prediction method
EP3336489A1 (en) Method and system for automatically establishing map indoors by mobile robot
CN108334677B (en) UUV real-time collision avoidance planning method based on GRU network
CN108320051B (en) Mobile robot dynamic collision avoidance planning method based on GRU network model
CN105045260A (en) Mobile robot path planning method in unknown dynamic environment
CN113625702B (en) Unmanned vehicle simultaneous path tracking and obstacle avoidance method based on quadratic programming
CN111271071A (en) Shield tunneling machine attitude control method based on fuzzy adaptive neural network
CN109976189A (en) A kind of intelligence naval vessels automatic cruising analog simulation method
CN117606490B (en) Collaborative search path planning method for autonomous underwater vehicle
CN116972854B (en) Agricultural machinery navigation path planning method and system based on GPS positioning
CN111811503B (en) Unscented Kalman filtering fusion positioning method based on ultra wide band and two-dimensional code
CN113093743A (en) Navigation control method based on virtual radar model and deep neural network
CN108459614B (en) UUV real-time collision avoidance planning method based on CW-RNN network
Xie et al. Random patrol path planning for unmanned surface vehicles in shallow waters
CN114489036B (en) Indoor robot navigation control method based on SLAM
CN116628836A (en) Data-driven supercritical airfoil performance prediction method, system and medium
CN115373383A (en) Autonomous obstacle avoidance method and device for garbage recovery unmanned boat and related equipment
CN107861501A (en) Underground sewage treatment works intelligent robot automatic positioning navigation system
CN111174790A (en) Method for forming topographic profile tracking path
Meng et al. Automatic control method of automobile steering-by-wire based on fuzzy PID
CN117195567B (en) Ship multivariable response model construction and parameter identification method oriented to maneuvering motion
Null et al. Automatically-Tuned Model Predictive Control for an Underwater Soft Robot
CN112578389B (en) Multi-source fusion ROV real-time path planning method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant