CN111624990A - Automatic navigation method, server and storage medium

Info

Publication number
CN111624990A
Authority
CN
China
Prior art keywords
guided vehicle
automatic guided
time
automated
distance
Prior art date
Legal status
Pending
Application number
CN201910153571.1A
Other languages
Chinese (zh)
Inventor
占兆武
周严伟
谢恺
罗为
Current Assignee
Shenzhen Fulian Fugui Precision Industry Co Ltd
Original Assignee
Fuhuake Precision Industry Shenzhen Co ltd
Priority date: 2019-02-28
Filing date: 2019-02-28
Publication date: 2020-09-04
Application filed by Fuhuake Precision Industry Shenzhen Co ltd
Priority to CN201910153571.1A
Publication of CN111624990A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0272 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an automatic navigation method, a server and a storage medium. The method comprises the following steps: receiving the number of pulses fed back by wheel encoders arranged on the wheels of an automatic guided vehicle while the vehicle is travelling; calculating the pose information of the automatic guided vehicle at time k+1 according to the number of pulses and the pose information of the automatic guided vehicle at time k, the pose information comprising heading angle information and position information; and generating a control instruction according to the pose information of the automatic guided vehicle at time k+1 and reference point information on a virtual path pre-stored in the server, so as to control the automatic guided vehicle to walk on the corresponding virtual path in a navigation space. By implementing the method, the automatic guided vehicle can still be controlled to travel along the virtual path when it enters a camera blind area in the navigation space.

Description

Automatic navigation method, server and storage medium
Technical Field
The present invention relates to the field of navigation, and in particular, to an automatic navigation method, a server, and a storage medium.
Background
Automated Guided Vehicles (AGVs) have become an important infrastructure in smart logistics, smart manufacturing and digital factories. AGVs facilitate in-plant transport and improve a factory's production and operating efficiency. Common AGV navigation methods include magnetic stripe navigation, laser navigation and two-dimensional code navigation. Magnetic-stripe-guided AGVs are relatively cheap and run reliably and stably. However, magnetic stripes must be laid in the application scene, the laying workload is large, and the site has to be modified before the stripes can be laid. If the navigation path changes, the stripes must be laid again on site. In addition, the adhesive backing of the stripes weakens with long-term use, which is unfavorable for repeated use. A laser-navigation AGV builds a complete indoor map with a laser radar, obtains complete information about the surrounding environment, scans the surroundings in real time, and navigates algorithmically. Laser-navigation AGVs do not need a fixed track laid in the application site, but when the laser scan encounters a large object or a wall, the positioning may become inaccurate and the vehicle may depart from its preset path.
With the development of digital and intelligent factories, enterprises place higher demands on factory automation. At present, most logistics robots operate under rail guidance, while a small number use relatively expensive laser-radar trackless navigation or technically complex inertial navigation. Rail-guided navigation cannot adapt to path changes, and laser navigation requires additional reflecting plates. A mobile robot based on visual navigation can overcome these problems: only cameras need to be installed on the ceiling, and accurate navigation is achieved within the coverage of the cameras. However, visual navigation suffers from camera blind areas, i.e., the pose information of the AGV cannot be obtained where no camera provides coverage. When the AGV runs into a camera blind area, the server cannot acquire the pose information of the AGV and therefore cannot provide it with control instructions. Additional sensors would then be needed to supply the pose information of the AGV, which in turn increases hardware cost.
Disclosure of Invention
In view of the above, it is desirable to provide an automatic navigation method, a server and a storage medium that can determine the heading angle and position information of an automatic guided vehicle when it travels into a camera blind area in the navigation space, and control the automatic guided vehicle to travel along a virtual path according to that heading angle and position information.
A first aspect of the invention provides an automatic navigation method applied to a server, the server being communicatively connected with at least one automatic guided vehicle. The method comprises the following steps:
receiving the number of pulses fed back by wheel encoders arranged on the wheels of the automatic guided vehicle while the automatic guided vehicle is travelling;
calculating the pose information of the automatic guided vehicle at time k+1 according to the number of pulses and the pose information of the automatic guided vehicle at time k, wherein the pose information comprises heading angle information and position information; and
generating a control instruction according to the pose information of the automatic guided vehicle at time k+1 and reference point information on a virtual path pre-stored in the server, so as to control the automatic guided vehicle to walk on the corresponding virtual path in a navigation space.
Preferably, calculating the pose information of the automatic guided vehicle at time k+1 according to the number of pulses and the pose information of the automatic guided vehicle at time k comprises:
calculating, from the number of pulses, the distance d_R traveled by the right wheel and the distance d_L traveled by the left wheel after the automatic guided vehicle has traveled a certain distance;
obtaining the heading angle offset of the automatic guided vehicle according to the difference between the distance d_R traveled by the right wheel and the distance d_L traveled by the left wheel,
Δθ = d_R / (R + l/2) = d_L / (R − l/2) = (d_R − d_L) / l
wherein R is the instantaneous center of curvature radius and l is the lateral distance between the left and right wheels;
calculating the heading angle offset Δθ_k of the automatic guided vehicle at time k according to the heading angle offset, and obtaining the heading angle of the automatic guided vehicle at time k+1 as θ_{k+1} = θ_k + Δθ_k;
obtaining the position information (x_{k+1}, y_{k+1}) of the automatic guided vehicle at time k+1 according to the Euclidean distance D_{k,k+1} between the position of the automatic guided vehicle at time k and its position at time k+1 and the position information (x_k, y_k) of the automatic guided vehicle at time k, wherein
x_{k+1} = x_k + D_{k,k+1}·cos(θ_k + Δθ_k/2)
y_{k+1} = y_k + D_{k,k+1}·sin(θ_k + Δθ_k/2)
wherein Δθ_k is the heading angle offset of the automatic guided vehicle at time k and θ_k is the heading angle of the automatic guided vehicle at time k.
Preferably, the distance traveled by the right wheel is d_R = d_e·N_R and the distance traveled by the left wheel is d_L = d_e·N_L, wherein N_R is the number of pulses fed back by the wheel encoder mounted on the right wheel, N_L is the number of pulses fed back by the wheel encoder mounted on the left wheel, and d_e is the path distance represented by each pulse of the wheel encoder.
Preferably, the path distance d_e represented by each pulse of the wheel encoder is calculated by the following formula:
d_e = 2πr / N_0
wherein N_0 is the total number of pulses of the wheel encoder and r is the radius of the left and right wheels of the automatic guided vehicle.
Preferably, the Euclidean distance D_{k,k+1} between the position of the automatic guided vehicle at time k and its position at time k+1 is calculated by the following formula:
D_{k,k+1} = 2·R_k·sin(Δθ_k / 2)
wherein R_k is the instantaneous radius of curvature of the vehicle at time k and Δθ_k is the heading angle offset of the automatic guided vehicle at time k.
Preferably, the heading angle offset Δθ_k of the automatic guided vehicle at time k is calculated by the following formula:
Δθ_k = (d_{R,k} − d_{L,k}) / l
wherein d_{R,k} = d_e·N_{R,k} is the distance traveled by the right wheel of the automatic guided vehicle at time k and d_{L,k} = d_e·N_{L,k} is the distance traveled by the left wheel of the automatic guided vehicle at time k.
Preferably, the instantaneous center of curvature radius R_k of the automatic guided vehicle at time k is calculated by the following formula:
R_k = (d_{R,k} + d_{L,k}) / (2·Δθ_k)
preferably, the pose information of the automatic guided vehicle at the time k is position information of the automatic guided vehicle at the time k and heading angle information of the automatic guided vehicle at the time k calculated according to the pulse number, wherein the position information of the automatic guided vehicle at the time k is position information determined according to an image shot by a camera installed in a navigation space or initial position information of the automatic guided vehicle on the virtual path.
A second aspect of the present invention provides a server comprising:
a processor; and
a memory having stored therein a plurality of program modules that are loaded by the processor and execute the automated navigation method.
A third aspect of the present invention provides a storage medium having stored thereon at least one computer instruction, which is loaded by a processor to perform the automatic navigation method.
Compared with the prior art, the automatic navigation method, server and storage medium of this scheme can calculate the heading angle offset of the automatic guided vehicle and its position information at time k+1 from the number of pulses fed back by the encoders on its left and right wheels and its position information at time k, and then generate a control instruction according to the pose information at time k+1 and the reference point information pre-stored on the virtual path in the server, so as to control the automatic guided vehicle to walk on the corresponding virtual path in the navigation space. Therefore, when the automatic guided vehicle runs into a camera blind area in the navigation space, the server can still obtain the pose information of the automatic guided vehicle and control it to travel on the virtual path.
Drawings
FIG. 1 is a diagram of an automatic navigation system according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a server in the automatic navigation system according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of an automatic guided vehicle in the automatic navigation system according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating an automatic navigation method according to an embodiment of the present invention.
Description of the main elements
Automatic navigation system 10
Server 1
Automatic guided vehicle 2
Network unit 11
Memory 12
Processor 13
Battery 20
Walking unit 21
Left wheel 210
Right wheel 211
Wheel encoder 3
Controller 23
Wireless unit 22
Steps S41~S43
The following detailed description will further illustrate the invention in conjunction with the above-described figures.
Detailed Description
Referring to fig. 1, a schematic diagram of an automatic navigation system 10 according to an embodiment of the invention is shown. The automatic navigation system 10 includes, but is not limited to, a server 1 and an automatic guided vehicle 2. In the present embodiment, the server 1 and the automatic guided vehicle 2 are connected by wireless communication. In the present embodiment, the server 1 controls the automatic guided vehicle 2 to navigate in the navigation space by estimating pose information of the automatic guided vehicle 2, which will be described in detail later. The pose information comprises position information and heading angle information. The navigation space can be an indoor space (such as a warehouse) or an outdoor space.
Referring to fig. 2, a schematic diagram of the server 1 in the automatic navigation system 10 according to an embodiment of the invention is shown. In the present embodiment, the server 1 includes, but is not limited to, a network unit 11, a memory 12, and a processor 13. The network unit 11, the memory 12 and the processor 13 are electrically connected to each other.
In this embodiment, the network unit 11 is configured to provide a network communication function for the server 1 through a wired or wireless network transmission manner. So that the server 1 can be network communicatively connected to the automated guided vehicle 2.
In the present embodiment, the memory 12 is used to store software programs and data installed in the server 1. In the present embodiment, the storage 12 may be an internal storage unit of the server 1, such as a hard disk or a memory of the server 1. In other embodiments, the memory 12 may also be an external storage device of the server 1, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like provided on the server 1.
In the present embodiment, the memory 12 further stores a virtual map (e.g., an electronic map) of the navigation space, where the virtual map includes a plurality of virtual paths. A virtual path is composed of reference points and the connection relations between them. The virtual path defines a plurality of reference points for the automatic guided vehicle 2 along the whole path, the position coordinates corresponding to each reference point, and the order in which each automatic guided vehicle 2 passes through these reference points.
In the present embodiment, the memory 12 stores in advance the position coordinates of each reference point on the virtual path. The position coordinates of the respective reference points on the virtual path may refer to coordinates in a coordinate system (XOY) established based on an area where the entire navigation space is located.
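For illustration only, the virtual path described above can be represented as an ordered list of reference points with coordinates in the navigation-space coordinate system XOY. The sketch below is a hypothetical data structure assumed for this description, not the patent's actual implementation; the class and field names are assumptions.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ReferencePoint:
        # A reference point on a virtual path, in navigation-space (XOY) coordinates.
        x: float
        y: float

    @dataclass
    class VirtualPath:
        # Ordered sequence of reference points; the order defines the travel sequence.
        points: List[ReferencePoint]

        def nearest(self, x: float, y: float) -> ReferencePoint:
            # Reference point closest to a given position (used in step S43 below).
            return min(self.points, key=lambda p: (p.x - x) ** 2 + (p.y - y) ** 2)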
In this embodiment, the processor 13 may be a Central Processing Unit (CPU), or other microprocessor or other data processing chip capable of executing control functions. The processor 13 is used to execute software program codes, arithmetic data, and the like.
Referring to fig. 3, a schematic diagram of an automatic guided vehicle in the automatic navigation system according to an embodiment of the invention is shown. In the present embodiment, the number of the automatic guided vehicle 2 may be one or more. The automatic guided vehicle 2 includes, but is not limited to, a battery 20, a traveling unit 21, a wireless unit 22, and a controller 23. The battery 20, the traveling unit 21, the wireless unit 22, and the controller 23 are electrically connected to each other.
In the present embodiment, the battery 20 is used to supply power to the traveling unit 21, the wireless unit 22, and the controller 23. The walking unit 21 is configured to walk according to the movement instruction received by the automatic guided vehicle 2. The walking unit 21 is a wheel type, and includes a left wheel 210 and a right wheel 211. The left wheel 210 and the right wheel 211 are respectively provided with a wheel encoder 3.
The wheel encoder 3 is used for returning the number of pulses according to the number of wheel revolutions during the travel of the automatic guided vehicle 2. Typically, the wheel encoder 3 returns a predetermined number of pulses, for example 280,000, when the wheel makes one revolution.
The wireless unit 22 is used to provide a network connection between the automated guided vehicle 2 and the server 1. The controller 23 is configured to receive a control instruction sent by the server 1 to control the walking unit 21 to walk along a virtual path.
In this embodiment, the automatic guided vehicle 2 may further include a charging unit (not shown) for providing power to the battery 20.
In the prior art, the server 1 can assist in controlling the automatic guided vehicle 2 to travel on the virtual path according to video or image information captured by a camera installed in the navigation space. However, when the automatic guided vehicle 2 travels into a blind area of the camera, the server 1 cannot confirm whether the vehicle is still travelling on the virtual path. In this scheme, the server 1 therefore collects the pulse information sent by the wheel encoders 3 mounted on the wheels of the automatic guided vehicle 2 and estimates the pose information of the vehicle at the next moment from that pulse information and the pose information at the current moment. In this way, even when the automatic guided vehicle 2 travels into an area the cameras in the navigation space cannot cover, its pose information can still be obtained, and a control instruction can be generated from that pose information and the reference point information on the virtual path to control the vehicle to travel on the virtual path. In this embodiment, the pose information includes position information and heading angle information.
For example, after the pose information of the automatic guided vehicle 2 at time k is obtained, the server 1 receives the number of pulses fed back by the wheel encoders 3 on the vehicle, calculates the heading offset angle of the vehicle from the number of pulses, and obtains the pose information of the vehicle at time k+1 from the heading offset angle and the position information contained in the pose information at time k. The position information of the automatic guided vehicle at time k is either position information determined according to an image shot by a camera installed in the navigation space, or the initial position information of the vehicle on the virtual path; determining position from such an image is prior art and is not described again here. In this embodiment, the pose information of the automatic guided vehicle 2 at time k thus consists of this position information and the heading angle information at time k calculated from the number of pulses.
Specifically, assume that the total number of pulses of the wheel encoder 3 (i.e. the number of pulses per wheel revolution) is N_0 and that the radius of the left wheel 210 and the right wheel 211 of the automatic guided vehicle 2 is r. The path distance represented by each pulse can then be calculated as:
d_e = 2πr / N_0
let d be the distance traveled by the right wheel 211 of the automatic guided vehicle 2RThe left wheel 210 travels a distance dLThen, one can get: dR=de·NR,dL=de·NLWherein N isRIndicates the pulse number N fed back by the wheel encoder 3 of the right wheel 211 in the running process of the automatic guided vehicle 2LIndicating the number of pulses fed back by the wheel encoder 3 of the left wheel 210 during the driving of the automatic guided vehicle 2.
The heading angle offset Δθ of the automatic guided vehicle 2 can be calculated from the difference between the distances traveled by the left wheel 210 and the right wheel 211:
Δθ = d_R / (R + l/2) = d_L / (R − l/2) = (d_R − d_L) / l
where R is the instantaneous center of curvature radius and l is the lateral distance between the left and right wheels.
Based on this derivation, the heading angle offset of the automatic guided vehicle 2 at time k can be calculated as:
Δθ_k = (d_{R,k} − d_{L,k}) / l
where d_{R,k} = d_e·N_{R,k} is the distance traveled by the right wheel 211 of the automatic guided vehicle 2 at time k and d_{L,k} = d_e·N_{L,k} is the distance traveled by the left wheel 210 at time k.
At the next moment, i.e. at time k+1, the heading angle of the automatic guided vehicle 2 is θ_{k+1} = θ_k + Δθ_k.
Based on the already-calculated Δθ_k, d_{R,k} and d_{L,k}, the instantaneous center of curvature radius R_k at time k can be expressed as:
R_k = (d_{R,k} + d_{L,k}) / (2·Δθ_k)
suppose that the position information of the automatic guided vehicle 2 at the time k is (x)k,yk) And the position information at the time k +1 is (x)k+1,yk+1). The euclidean distance between the position at time k and the position at time k +1 is then:
Figure BDA0001982172540000112
based on the distance D between two pointsk,k+1And position information (x) of the automated guided vehicle 2 at time kk,yk) The position information (x) of the automated guided vehicle 2 at the time k +1 can be calculatedk+1,yk+1),
Figure BDA0001982172540000113
Figure BDA0001982172540000114
And finally, the server 1 generates a control instruction according to the position information at the moment of k +1 and the reference point information on the virtual path so as to control the automatic guided vehicle 2 to walk on the virtual path.
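The dead-reckoning update derived above can be sketched as follows. This is a simplified illustration rather than the patent's implementation: it assumes the per-interval wheel distances d_{R,k} and d_{L,k} have already been obtained, and it handles the straight-line case (zero heading offset) separately to avoid division by zero, a detail the text does not spell out.

    import math

    def update_pose(x_k, y_k, theta_k, d_r_k, d_l_k, wheel_track_l):
        # One odometry step: returns (x_{k+1}, y_{k+1}, theta_{k+1}).
        delta_theta_k = (d_r_k - d_l_k) / wheel_track_l       # heading angle offset at time k
        if abs(delta_theta_k) < 1e-9:
            # Straight-line motion: travel the mean wheel distance along theta_k.
            d = (d_r_k + d_l_k) / 2.0
            return x_k + d * math.cos(theta_k), y_k + d * math.sin(theta_k), theta_k
        r_k = (d_r_k + d_l_k) / (2.0 * delta_theta_k)         # instantaneous radius of curvature R_k
        d_chord = 2.0 * r_k * math.sin(delta_theta_k / 2.0)   # Euclidean distance D_{k,k+1}
        x_next = x_k + d_chord * math.cos(theta_k + delta_theta_k / 2.0)
        y_next = y_k + d_chord * math.sin(theta_k + delta_theta_k / 2.0)
        return x_next, y_next, theta_k + delta_theta_k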
Referring to fig. 4, a flowchart of an automatic navigation method according to an embodiment of the invention is shown. The order of the steps in the flow chart may be changed, and some steps may be omitted or combined according to different requirements.
In step S41, the server 1 receives the pulse number fed back by the wheel encoder 3 during the driving of the automatic guided vehicle 2.
In the present embodiment, a wheel encoder 3 is attached to each of the left wheel 210 and the right wheel 211 of the automatic guided vehicle 2. The wheel encoder 3 returns a number of pulses according to the number of wheel revolutions during the travel of the automatic guided vehicle 2. After the automatic guided vehicle 2 has traveled a certain distance, the server 1 receives the numbers of pulses fed back by the wheel encoders 3 on the left wheel 210 and the right wheel 211 respectively. Let N_R be the number of pulses fed back by the wheel encoder 3 mounted on the right wheel 211 and N_L be the number of pulses fed back by the wheel encoder 3 mounted on the left wheel 210.
Step S42, the server 1 calculates the pose information of the automated guided vehicle 2 at the time k +1 according to the number of pulses and the pose information of the automated guided vehicle 2 at the time k, where the pose information includes course angle information and position information.
In this embodiment, the pose information of the automatic guided vehicle 2 at time k consists of the position information of the vehicle at time k and the heading angle information of the vehicle at time k calculated from the number of pulses, where the position information at time k is either position information determined according to an image shot by a camera installed in the navigation space or the initial position information of the vehicle on the virtual path. Specifically, the distance d_R traveled by the right wheel 211 and the distance d_L traveled by the left wheel 210 after the automatic guided vehicle 2 has traveled a certain distance are calculated separately.
Assuming that the total number of pulses of the wheel encoder 3 (per wheel revolution) is N_0 and that the radius of the left wheel 210 and the right wheel 211 of the automatic guided vehicle 2 is r, the path distance represented by each pulse can be calculated as:
d_e = 2πr / N_0
Then, after the automatic guided vehicle 2 has traveled a certain distance, the distance d_R traveled by the right wheel 211 and the distance d_L traveled by the left wheel 210 can be calculated from the number of pulses fed back by the wheel encoder 3 mounted on the right wheel 211, the number of pulses fed back by the wheel encoder 3 mounted on the left wheel 210, and the path distance represented by each pulse of the wheel encoder 3, where d_R = d_e·N_R and d_L = d_e·N_L.
The heading angle offset of the automatic guided vehicle 2 can be calculated from the difference between the distances traveled by the left wheel 210 and the right wheel 211:
Δθ = d_R / (R + l/2) = d_L / (R − l/2) = (d_R − d_L) / l
where R is the instantaneous center of curvature radius and l is the lateral distance between the left and right wheels.
Based on this derivation, the heading angle offset of the automatic guided vehicle 2 at time k can be calculated as:
Δθ_k = (d_{R,k} − d_{L,k}) / l
where d_{R,k} = d_e·N_{R,k} is the distance traveled by the right wheel 211 of the automatic guided vehicle 2 at time k and d_{L,k} = d_e·N_{L,k} is the distance traveled by the left wheel 210 at time k.
At the next moment, i.e. at time k+1, the heading angle of the automatic guided vehicle 2 is θ_{k+1} = θ_k + Δθ_k.
Based on the already-calculated Δθ_k, d_{R,k} and d_{L,k}, the instantaneous center of curvature radius R_k at time k can be expressed as:
R_k = (d_{R,k} + d_{L,k}) / (2·Δθ_k)
suppose that the position information of the automatic guided vehicle 2 at the time k is (x)k,yk) And the position information at the time k +1 is (x)k+1,yk+1). The euclidean distance between the position at time k and the position at time k +1 is then:
Figure BDA0001982172540000142
based on the distance D between two pointsk,k+1And position information (x) of the automated guided vehicle 2 at time kk,yk) The position information (x) of the automated guided vehicle 2 at the time k +1 can be calculatedk+1,yk+1),
Figure BDA0001982172540000143
Figure BDA0001982172540000144
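As a numeric check of the update described above, the short script below plugs in assumed values. N_0 = 280,000 pulses per revolution matches the example mentioned earlier, while the wheel radius r = 0.1 m, wheel track l = 0.5 m, per-interval pulse counts 150,000 and 140,000, and θ_k = 0 are illustrative assumptions, not values from the patent.

    import math

    # Assumed values for illustration only.
    d_e = 2 * math.pi * 0.1 / 280000           # ≈ 2.244e-6 m per pulse
    d_r, d_l = d_e * 150000, d_e * 140000      # ≈ 0.3366 m and 0.3142 m
    dtheta = (d_r - d_l) / 0.5                 # ≈ 0.0449 rad (about 2.6 degrees)
    r_k = (d_r + d_l) / (2 * dtheta)           # = 7.25 m
    d_chord = 2 * r_k * math.sin(dtheta / 2)   # ≈ 0.3254 m
    dx = d_chord * math.cos(0 + dtheta / 2)    # ≈ 0.3253 m
    dy = d_chord * math.sin(0 + dtheta / 2)    # ≈ 0.0073 m
    print(dtheta, r_k, d_chord, dx, dy)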
Step S43, the server 1 generates a control instruction according to the pose information of the automatic guided vehicle 2 at the time k +1 and the reference point information on the virtual path prestored in the server 1, so as to control the automatic guided vehicle 2 to walk on the corresponding virtual path in the navigation space.
For example, when the automatic guided vehicle 2 has traveled into a camera blind area of the navigation space at time k+1, the server 1 calculates the heading angle θ_{k+1} and the position coordinates (x_{k+1}, y_{k+1}) of the automatic guided vehicle 2 at time k+1, generates a travel control command according to the position coordinates (x_{k+1}, y_{k+1}) and the reference point coordinates (a, b) on the navigation path, and then sends the travel control command to the automatic guided vehicle 2, so that the vehicle travels from the position coordinate point (x_{k+1}, y_{k+1}) to the reference point coordinates (a, b) according to the heading angle. In this way, the automatic guided vehicle 2 can continue to be controlled to travel on the virtual path.
It is understood that the reference point on the virtual path is the reference point closest to the position coordinates of the automatic guided vehicle at time k+1.
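A minimal sketch of this step is given below. The command format, the way the nearest reference point is selected, and the function names are assumptions for illustration; the description above only requires that a control instruction be generated from the pose at time k+1 and the reference point coordinates (a, b).

    import math

    def generate_control_command(x, y, theta, reference_points):
        # Pick the nearest reference point (a, b) and compute the heading correction toward it.
        a, b = min(reference_points, key=lambda p: (p[0] - x) ** 2 + (p[1] - y) ** 2)
        target_heading = math.atan2(b - y, a - x)
        heading_error = math.atan2(math.sin(target_heading - theta),
                                   math.cos(target_heading - theta))  # wrapped to (-pi, pi]
        return {"target": (a, b),
                "distance": math.hypot(a - x, b - y),
                "heading_correction": heading_error}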
Although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the spirit and scope of the invention.

Claims (10)

1. An automatic navigation method is applied to a server, the server is in communication connection with at least one automatic guided vehicle, and the method is characterized by comprising the following steps:
receiving the pulse quantity fed back by a wheel encoder arranged on the wheel of the automatic guided vehicle in the running process of the automatic guided vehicle;
calculating the pose information of the automatic guided vehicle at the moment k +1 according to the pulse number and the pose information of the automatic guided vehicle at the moment k, wherein the pose information comprises course angle information and position information;
and generating a control instruction according to the pose information of the automatic guided vehicle at the moment of k +1 and reference point information on a virtual path prestored in the server so as to control the automatic guided vehicle to walk on the corresponding virtual path in a navigation space.
2. The automated navigation method of claim 1, wherein the calculating pose information for the automated guided vehicle at time k +1 based on the number of pulses and the pose information for the automated guided vehicle at time k comprises:
calculating, from the number of pulses, the distance d_R traveled by the right wheel and the distance d_L traveled by the left wheel after the automatic guided vehicle has traveled a certain distance;
obtaining the heading angle offset of the automatic guided vehicle according to the difference between the distance d_R traveled by the right wheel and the distance d_L traveled by the left wheel,
Δθ = d_R / (R + l/2) = d_L / (R − l/2) = (d_R − d_L) / l
wherein R is the instantaneous center of curvature radius and l is the lateral distance between the left and right wheels;
calculating the heading angle offset Δθ_k of the automatic guided vehicle at time k according to the heading angle offset, and obtaining the heading angle of the automatic guided vehicle at time k+1 as θ_{k+1} = θ_k + Δθ_k;
obtaining the position information (x_{k+1}, y_{k+1}) of the automatic guided vehicle at time k+1 according to the Euclidean distance D_{k,k+1} between the position of the automatic guided vehicle at time k and its position at time k+1 and the position information (x_k, y_k) of the automatic guided vehicle at time k, wherein
x_{k+1} = x_k + D_{k,k+1}·cos(θ_k + Δθ_k/2)
y_{k+1} = y_k + D_{k,k+1}·sin(θ_k + Δθ_k/2)
wherein Δθ_k is the heading angle offset of the automatic guided vehicle at time k and θ_k is the heading angle of the automatic guided vehicle at time k.
3. The automated navigation method of claim 2, wherein the distance traveled by the right wheel is d_R = d_e·N_R and the distance traveled by the left wheel is d_L = d_e·N_L, wherein N_R is the number of pulses fed back by the wheel encoder mounted on the right wheel, N_L is the number of pulses fed back by the wheel encoder mounted on the left wheel, and d_e is the path distance represented by each pulse of the wheel encoder.
4. The automated navigation method according to claim 3, wherein the path distance d_e represented by each pulse of the wheel encoder is calculated by the following formula:
d_e = 2πr / N_0
wherein N_0 is the total number of pulses of the wheel encoder and r is the radius of the left and right wheels of the automatic guided vehicle.
5. The automated navigation method of claim 2, wherein the Euclidean distance D_{k,k+1} between the position of the automatic guided vehicle at time k and its position at time k+1 is calculated by the following formula:
D_{k,k+1} = 2·R_k·sin(Δθ_k / 2)
wherein R_k is the instantaneous radius of curvature of the vehicle at time k and Δθ_k is the heading angle offset of the automatic guided vehicle at time k.
6. The automated navigation method of claim 5, wherein the heading angle offset Δθ_k of the automatic guided vehicle at time k is calculated by the following formula:
Δθ_k = (d_{R,k} − d_{L,k}) / l
wherein d_{R,k} = d_e·N_{R,k} is the distance traveled by the right wheel of the automatic guided vehicle at time k and d_{L,k} = d_e·N_{L,k} is the distance traveled by the left wheel of the automatic guided vehicle at time k.
7. The automated navigation method of claim 6, wherein the instantaneous center of curvature radius R_k of the automatic guided vehicle at time k is calculated by the following formula:
R_k = (d_{R,k} + d_{L,k}) / (2·Δθ_k)
8. the automated navigation method according to claim 1, wherein the pose information of the automated guided vehicle at the time k is position information of the automated guided vehicle at the time k and heading angle information of the automated guided vehicle at the time k calculated from the number of pulses, wherein the position information of the automated guided vehicle at the time k is position information determined from an image taken by a camera installed in a navigation space or initial position information of the automated guided vehicle on the virtual path.
9. A server, characterized in that the server comprises:
a processor; and
a memory having stored therein a plurality of program modules that are loaded by the processor and execute the automated navigation method of any one of claims 1-8.
10. A storage medium having stored thereon at least one computer instruction, wherein the instruction is loaded by a processor to perform the automated navigation method of any one of claims 1-8.
CN201910153571.1A 2019-02-28 2019-02-28 Automatic navigation method, server and storage medium Pending CN111624990A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910153571.1A CN111624990A (en) 2019-02-28 2019-02-28 Automatic navigation method, server and storage medium

Publications (1)

Publication Number Publication Date
CN111624990A true CN111624990A (en) 2020-09-04

Family

ID=72269914

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910153571.1A Pending CN111624990A (en) 2019-02-28 2019-02-28 Automatic navigation method, server and storage medium

Country Status (1)

Country Link
CN (1) CN111624990A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102520723A (en) * 2011-12-28 2012-06-27 天津理工大学 Wheelchair indoor global video monitor navigation system based on suspended wireless transmission camera
CN106933223A (en) * 2015-12-30 2017-07-07 深圳市朗驰欣创科技股份有限公司 A kind of autonomous navigation method of robot and system
CN107272008A (en) * 2017-06-14 2017-10-20 上海大学 A kind of AGV Laser navigation systems with inertia compensation
CN107390686A (en) * 2017-07-17 2017-11-24 深圳拓邦股份有限公司 A kind of grass-removing robot control method and automatically control mowing system
CN107478214A (en) * 2017-07-24 2017-12-15 杨华军 A kind of indoor orientation method and system based on Multi-sensor Fusion
CN108955688A (en) * 2018-07-12 2018-12-07 苏州大学 Two-wheel differential method for positioning mobile robot and system
CN108919810A (en) * 2018-07-26 2018-11-30 东北大学 The localization for Mobile Robot and navigation system of view-based access control model teaching
CN109374003A (en) * 2018-11-06 2019-02-22 山东科技大学 A kind of mobile robot visual positioning and air navigation aid based on ArUco code

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114322978A (en) * 2020-10-10 2022-04-12 广州汽车集团股份有限公司 Vehicle positioning method, computer equipment and computer readable storage medium
CN114322978B (en) * 2020-10-10 2024-03-15 广州汽车集团股份有限公司 Vehicle positioning method, computer equipment and computer readable storage medium
CN114061566A (en) * 2021-11-04 2022-02-18 北京特种机械研究所 Arc navigation method, device, equipment and computer readable storage medium
CN114061566B (en) * 2021-11-04 2024-05-28 北京特种机械研究所 Arc navigation method, device, equipment and computer readable storage medium

Similar Documents

Publication Publication Date Title
US11373395B2 (en) Methods and systems for simultaneous localization and calibration
CN109508021B (en) Guiding method, device and system of automatic guided vehicle
US10281922B2 (en) Method and system for mobile work system confinement and localization
Boniardi et al. Robust LiDAR-based localization in architectural floor plans
US8972095B2 (en) Automatic guided vehicle and method for drive control of the same
CN106338991A (en) Robot based on inertial navigation and two-dimensional code and positioning and navigation method thereof
US20180306587A1 (en) Methods and Systems for Map Generation and Alignment
CN111158355A (en) Automatic navigation cloud server and automatic navigation control method
US20090312871A1 (en) System and method for calculating location using a combination of odometry and landmarks
Sprunk et al. Lidar-based teach-and-repeat of mobile robot trajectories
CN205121338U (en) AGV navigation based on image recognition and wireless network
CN104750115A (en) Laser active type navigation system and method of mobile equipment
CN111624990A (en) Automatic navigation method, server and storage medium
US11537140B2 (en) Mobile body, location estimation device, and computer program
CN204883363U (en) AGV transport robot navigation system that laser guidance map found
CN110907891A (en) AGV positioning method and device
CN112462762B (en) Robot outdoor autonomous moving system and method based on roadside two-dimensional code unit
KR20110105926A (en) Driving path plan and path control method of the synchronous control mobile robot using rfid tagging maps
CN111580530A (en) Positioning method, positioning device, autonomous mobile equipment and medium
CN109491374B (en) Track adjusting method and device for automatic guided vehicle
CN112346446A (en) Code-shedding recovery method and device for automatic guided transport vehicle and electronic equipment
Shioya et al. Minimal Autonomous Mover-MG-11 for Tsukuba Challenge–
CN111796589A (en) Navigation control method, intelligent warehousing system and automatic guide vehicle
CN112947487B (en) Automatic guided vehicle and curve path tracking method and control device thereof
Yamada et al. Autonomous path travel control of mobile robot using internal and external camera images in gps-denied environments

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20221201
Address after: The first floor, the second floor, the third floor and the fourth floor of the factory building No.1, f8d District, Foxconn science and Technology Industrial Park, east side of Minqing Road, Longhua street, Shenzhen City, Guangdong Province
Applicant after: Shenzhen Fulian Fugui Precision Industry Co.,Ltd.
Address before: 518109 3rd floor, building 1, F8B, Foxconn Science Park, No.2, Donghuan 2nd Road, Longhua street, Longhua District, Shenzhen City, Guangdong Province
Applicant before: FUHUAKE PRECISION INDUSTRY (SHENZHEN) Co.,Ltd.
WD01 Invention patent application deemed withdrawn after publication
Application publication date: 20200904