CN111830955A - Automatic navigation method, server and storage medium - Google Patents

Automatic navigation method, server and storage medium

Info

Publication number
CN111830955A
CN111830955A
Authority
CN
China
Prior art keywords
guided vehicle
information
automatic guided
pose information
moment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910300909.1A
Other languages
Chinese (zh)
Inventor
周严伟
占兆武
谢恺
罗为
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Fulian Fugui Precision Industry Co Ltd
Original Assignee
Fuhuake Precision Industry Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuhuake Precision Industry Shenzhen Co ltd
Priority to CN201910300909.1A
Publication of CN111830955A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/027 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising inertial navigation means, e.g. azimuth detector

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Navigation (AREA)

Abstract

The invention provides an automatic navigation method, which comprises: controlling a camera unit to locate an automated guided vehicle; controlling an inertia detection unit to detect pose information of the automated guided vehicle, the pose information including at least acceleration, speed, and yaw angle information; determining whether the obtained positioning information is valid; when the positioning information is judged valid, updating the pose information currently output by the inertia detection unit and by a motion prediction model with the positioning information obtained by the camera unit; when the positioning information is judged invalid, calculating the pose of the automated guided vehicle at time k+1 by a Kalman filtering algorithm from the pose currently detected by the inertia detection unit, the pose output by the motion prediction model, and the pose of the vehicle at time k; and generating a control instruction from the pose at time k+1 and reference point information on a prestored virtual path, so as to control the automated guided vehicle to travel in a navigation space.

Description

Automatic navigation method, server and storage medium
Technical Field
The present invention relates to the field of navigation, and in particular, to an automatic navigation method, a server, and a storage medium.
Background
Automated Guided Vehicles (AGVs) have become an important infrastructure in smart logistics, smart manufacturing, and digital factories. AGVs facilitate transport within a factory and improve its production and operating efficiency. Common AGV navigation methods include magnetic stripe navigation, laser navigation, and two-dimensional code navigation. Magnetic stripe navigation is relatively cheap and runs reliably and stably. However, magnetic stripes must be laid throughout the application scene: laying them is labor-intensive and requires modifying the site, and if the navigation path changes, the stripes must be laid again. Moreover, the adhesive backing of the stripes weakens with long-term use, which hinders reuse. Laser-navigation AGVs construct a complete indoor map with a lidar, obtain full information about the surrounding environment, and scan the surroundings in real time to navigate algorithmically. A laser-navigated AGV needs no fixed track on site, but when the lidar faces a large object or a wall, positioning may become inaccurate and the vehicle may depart from its preset path.
With the development of digital and intelligent factories, enterprises place ever higher demands on factory automation. At present most logistics robots run under rail guidance, and a small number adopt relatively expensive lidar trackless navigation or technically complex inertial navigation. Rail guidance cannot adapt to path changes, and laser navigation needs additional reflector plates. A mobile robot based on visual navigation overcomes these problems: cameras only need to be installed on the ceiling, and accurate navigation is achieved within their coverage. However, a purely vision-based scheme suffers from camera blind areas, that is, the pose of the AGV cannot be obtained where no camera provides coverage. When the AGV travels into a blind area, the server cannot acquire its pose and therefore cannot issue control instructions to it. Additional sensors are then needed to supply the pose of the AGV, which in turn increases hardware cost.
Disclosure of Invention
In view of the above, it is desirable to provide an automatic navigation method, a server, and a storage medium that can determine the yaw angle and pose of an automated guided vehicle while it travels through a camera blind area of the navigation space, and control the vehicle to travel along a virtual path accordingly.
A first aspect of the invention provides an automatic navigation method applied to a server, the server being communicatively connected to at least one automated guided vehicle and at least one camera unit, the method comprising:
controlling the camera unit to locate the automated guided vehicle;
controlling an inertia detection unit mounted on the automated guided vehicle to detect pose information of the vehicle, the pose information including at least speed information and yaw angle information;
determining whether the positioning information obtained by the camera unit is valid;
when the positioning information is judged valid, updating the pose information currently output by the inertia detection unit and by a motion prediction model with the positioning information currently obtained by the camera unit;
when the positioning information is judged invalid, calculating the pose of the automated guided vehicle at time k+1 by a Kalman filtering algorithm from the pose currently detected by the inertia detection unit, the pose output by the motion prediction model, and the pose of the vehicle at time k; and
generating a control instruction from the pose of the vehicle at time k+1 and reference point information on a virtual path prestored in the server, so as to control the vehicle to travel on the corresponding virtual path in a navigation space.
A second aspect of the present invention provides a server comprising:
a processor; and
a memory, in which a plurality of program modules are stored, the program modules being loaded by the processor to execute the automatic navigation method described above.
A third aspect of the present invention provides a storage medium storing at least one computer instruction, the instruction being loaded and executed by a processor to perform the automatic navigation method described above.
With the above automatic navigation method, when the automated guided vehicle travels in a blind area where the camera unit cannot assist navigation, its route can be estimated by combining the yaw angle and pose detected by the inertia detection unit with the motion prediction model through a Kalman filtering algorithm, preventing the vehicle from deviating from the preset route.
Drawings
FIG. 1 is a diagram of an automatic navigation system according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a server in the automatic navigation system according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of an automatic guided vehicle in the automatic navigation system according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating an automatic navigation method according to an embodiment of the present invention.
Description of the main elements
Automatic navigation system 10
Server 1
Automated guided vehicle 2
Network unit 11
Memory 12
Processor 13
Battery 20
Wireless unit 21
Controller 22
Inertia detection unit 23
Camera unit 3
Steps S41~S49
The following detailed description will further illustrate the invention in conjunction with the above-described figures.
Detailed Description
Referring to FIG. 1, a schematic diagram of an automatic navigation system 10 according to an embodiment of the invention is shown. The automatic navigation system 10 includes, but is not limited to, a server 1, an automated guided vehicle 2, and at least one camera unit 3. In this embodiment, the server 1 and the automated guided vehicle 2 are connected by wireless communication, while the server 1 is electrically connected to the at least one camera unit 3 in a wired manner. The camera unit 3 is a camera installed in the navigation space.
In this embodiment, the server 1 controls the automated guided vehicle 2 to navigate in the navigation space by estimating its pose information, as described in detail later. The pose information comprises position information and heading (yaw) angle information. The navigation space can be an indoor space (such as a warehouse) or an outdoor space.
Referring to fig. 2, a schematic diagram of the server 1 in the automatic navigation system 10 according to an embodiment of the invention is shown. In the present embodiment, the server 1 includes, but is not limited to, a network unit 11, a memory 12, and a processor 13. The network unit 11, the memory 12 and the processor 13 are electrically connected to each other.
In this embodiment, the network unit 11 provides the server 1 with network communication through wired or wireless transmission, so that the server 1 can communicate with the automated guided vehicle 2 over the network.
In this embodiment, the memory 12 stores the software programs and data installed in the server 1. The memory 12 may be an internal storage unit of the server 1, such as a hard disk or internal memory. In other embodiments, the memory 12 may be an external storage device of the server 1, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card.
In this embodiment, the memory 12 further stores a virtual map (e.g., an electronic map) of the navigation space, and the virtual map includes a plurality of virtual paths. A virtual path is composed of reference points and the connection relations between them: it defines a plurality of reference points for the automated guided vehicle 2 along the entire path, the position coordinates corresponding to each reference point, and the order in which each vehicle 2 passes through the reference points.
In this embodiment, the memory 12 stores in advance the position coordinates of each reference point on the virtual path. These coordinates refer to a coordinate system (XOY) established over the area where the entire navigation space is located.
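For illustration only (the patent does not prescribe a storage format, so the names below are hypothetical), such a virtual path could be kept as an ordered list of reference points in XOY coordinates:

    from dataclasses import dataclass

    @dataclass
    class ReferencePoint:
        """A reference point on a virtual path, in navigation-space XOY coordinates."""
        x: float
        y: float

    # The list order encodes the sequence in which the AGV must pass the points.
    virtual_path = [
        ReferencePoint(0.0, 0.0),
        ReferencePoint(2.0, 0.0),
        ReferencePoint(2.0, 3.5),
    ]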
In this embodiment, the processor 13 may be a Central Processing Unit (CPU), or another microprocessor or data processing chip capable of performing control functions. The processor 13 executes software program code and performs calculations.
Referring to FIG. 3, a schematic diagram of an automated guided vehicle in the automatic navigation system according to an embodiment of the invention is shown. In this embodiment, there may be one or more automated guided vehicles 2. Each automated guided vehicle 2 includes, but is not limited to, a battery 20, a wireless unit 21, a controller 22, and an inertia detection unit 23, which are electrically connected to one another.
In this embodiment, the battery 20 supplies power to the wireless unit 21, the controller 22, and the inertia detection unit 23.
The wireless unit 21 provides the network connection between the automated guided vehicle 2 and the server 1. The controller 22 receives control instructions sent by the server 1 and controls the automated guided vehicle 2 to travel along a virtual path.
In this embodiment, the automated guided vehicle 2 may further include a charging unit (not shown) for charging the battery 20.
The inertia detection unit 23 is an Inertial Measurement Unit (IMU), which measures the three-axis attitude angles, angular velocities, and accelerations of an object. It includes a plurality of single-axis accelerometers and single-axis gyroscopes: the accelerometers detect acceleration along three independent axes of the carrier coordinate system, and the gyroscopes detect the angular velocity of the carrier relative to the navigation coordinate system. From the measured angular velocities and accelerations in three-dimensional space, the attitude of the object is calculated.
In the prior art, the server 1 can assist in controlling the automated guided vehicle 2 to travel on the virtual path according to the video or image information captured by the camera unit 3 installed in the navigation space. However, when the vehicle travels into a blind area of the camera unit 3, the server 1 cannot confirm whether the vehicle is still on the virtual path. In the present scheme, therefore, the server 1 collects the pose information detected by the inertia detection unit 23 mounted on the automated guided vehicle 2 and, from that information and a motion prediction model, estimates the pose of the vehicle at the next moment through a Kalman filtering algorithm. In this way, when the vehicle travels into a blind area that the camera unit 3 cannot cover, its pose can still be obtained, and a control instruction can be generated from the pose and the reference point information on the virtual path to keep the vehicle traveling on the virtual path. In this embodiment, the pose information includes position information and yaw angle information.
For example, the server 1 first sends a control command to the camera unit 3 to locate the automated guided vehicle 2, and controls the inertia detection unit 23 mounted on the vehicle to detect its pose information; here the pose information further includes acceleration information. Determining positioning information from images captured by a camera installed in the navigation space is prior art and is not described again here.
Next, it is determined whether the positioning information obtained by the camera unit 3 is valid. When the positioning information is judged valid, the pose information currently output by the inertia detection unit 23 and by the motion prediction model is updated with the positioning information currently obtained by the camera unit 3. When the positioning information is judged invalid, the pose of the vehicle at time k+1 is calculated by the Kalman filtering algorithm from the pose currently detected by the inertia detection unit 23, the pose output by the motion prediction model, and the pose of the vehicle at time k. The pose of the vehicle at time k is either determined from an image captured by the camera unit 3 or is the initial pose of the vehicle on the virtual path.
Finally, a control instruction is generated from the pose of the vehicle at time k+1 and the reference point information on the virtual path prestored in the server 1, so as to control the vehicle to travel on the corresponding virtual path in the navigation space.
Referring to FIG. 4, a flowchart of an automatic navigation method according to an embodiment of the invention is shown. The order of the steps in the flowchart may be changed, and some steps may be omitted or combined, according to different needs.
Step S41: the camera unit 3 is controlled to locate the automated guided vehicle 2.
Step S42: the inertia detection unit 23 mounted on the automated guided vehicle 2 is controlled to detect the pose information of the vehicle; the pose information includes at least acceleration information, speed information, and yaw angle information.
Step S43: it is determined whether the positioning information obtained by the camera unit 3 is valid.
Specifically, step S43 determines whether the image captured by the camera unit 3 contains the complete automated guided vehicle 2: if the image contains the complete vehicle, the positioning information obtained by the camera unit 3 is judged valid; if not, it is judged invalid. A minimal check of this kind is sketched below.
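As a sketch only (the patent gives no implementation, and the vehicle detector producing the bounding box is assumed to exist), validity can be decided by testing whether the detected bounding box of the vehicle lies entirely inside the image frame:

    def positioning_valid(bbox, frame_w, frame_h):
        """Return True when the detected AGV bounding box (x_min, y_min, x_max, y_max)
        lies entirely inside the image, i.e. the complete vehicle is visible."""
        if bbox is None:  # the vehicle was not detected at all
            return False
        x_min, y_min, x_max, y_max = bbox
        return x_min >= 0 and y_min >= 0 and x_max <= frame_w and y_max <= frame_h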
Step S44: when the positioning information obtained by the camera unit 3 is judged valid, the pose information currently output by the inertia detection unit 23 and by the motion prediction model is updated with the positioning information currently obtained by the camera unit 3.
Step S45: when the positioning information obtained by the camera unit 3 is judged invalid, the pose of the automated guided vehicle 2 at time k+1 is calculated by a Kalman filtering algorithm from the pose currently detected by the inertia detection unit 23, the pose output by the motion prediction model, and the pose of the vehicle at time k.
Specifically, step S45 includes the following. The yaw angle variation Δθ of the automated guided vehicle 2 from time k to time k+1 is calculated through the motion prediction model as Δθ = θ′ × Δt, where θ′ is the yaw rate of the vehicle currently detected by the inertia detection unit 23 and Δt is the time interval from k to k+1.
The displacement variation Δs of the vehicle over the same interval is calculated through the motion prediction model as Δs = Δt × v, where v is the speed of the vehicle currently detected by the inertia detection unit 23.
The pose of the vehicle at time k+1 is then calculated through the motion prediction model as pose_predict(x_{k+1}, y_{k+1}, θ_{k+1}) = (x_k + Δs·cos(θ_k), y_k + Δs·sin(θ_k), θ_k + Δθ), where the pose pose1(x_k, y_k, θ_k) of the vehicle at time k is detected and determined by the camera unit 3.
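Read as a sketch (the variable names are mine, and taking θ′ as a yaw rate is an assumption about the garbled published formula), the motion-model prediction amounts to:

    import math

    def motion_model_predict(pose_k, v, yaw_rate, dt):
        """Predict the pose at time k+1 from the pose (x_k, y_k, theta_k) at time k,
        the current speed v, the current yaw rate, and the sampling interval dt."""
        x_k, y_k, theta_k = pose_k
        d_theta = yaw_rate * dt  # yaw angle variation over [k, k+1]
        d_s = v * dt             # displacement variation over [k, k+1]
        return (x_k + d_s * math.cos(theta_k),
                y_k + d_s * math.sin(theta_k),
                theta_k + d_theta)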
The pose pose_IMU of the automated guided vehicle 2 is also calculated from the information detected by the inertia detection unit 23. Specifically, the displacement ΔS_x of the vehicle along the x axis between time k and time k+1 is calculated as
ΔS_x = v_x·Δt + (1/2)·accX·Δt²,
where accX is the acceleration of the vehicle along the x axis currently detected by the inertia detection unit 23 and v_x is the corresponding speed component.
The displacement ΔS_y of the vehicle along the y axis between time k and time k+1 is calculated as
ΔS_y = v_y·Δt + (1/2)·accY·Δt²,
where accY is the acceleration of the vehicle along the y axis currently detected by the inertia detection unit 23 and v_y is the corresponding speed component.
The total displacement of the vehicle between time k and time k+1 is
ΔS = √(ΔS_x² + ΔS_y²),
and the yaw angle change of the vehicle between time k and time k+1 is
Δθ = arctan(ΔS_y / ΔS_x).
The pose of the vehicle at time k+1 is then pose_IMU(x_{k+1}, y_{k+1}, θ_{k+1}) = (x_k + ΔS·cos(θ_k + Δθ), y_k + ΔS·sin(θ_k + Δθ), θ_k + Δθ), where the pose pose1(x_k, y_k, θ_k) of the vehicle at time k is detected and determined by the camera unit 3.
The pose pose_kalman(x_{k+1}, y_{k+1}, θ_{k+1}) of the automated guided vehicle 2 at time k+1 is then calculated by the Kalman filtering algorithm in combination with the motion prediction model. Specifically:
pose_kalman(x_{k+1}, y_{k+1}, θ_{k+1})^T = pose_predict(x_{k+1}, y_{k+1}, θ_{k+1})^T + K_k·(pose_IMU(x_{k+1}, y_{k+1}, θ_{k+1})^T - H_k·pose_predict(x_{k+1}, y_{k+1}, θ_{k+1})^T),
where pose_IMU is the pose calculated from the information detected by the inertia detection unit 23 and pose_predict is the pose calculated from the motion prediction model.
The predicted estimate covariance matrix is computed as
P_k = F_k·P_{k-1}·F_k^T + Q,
the optimal Kalman gain as
K_k = P_k·H_k^T·(H_k·P_k·H_k^T + R)^{-1},
where K_k is a 3×3 matrix, and the estimate covariance matrix is then updated as
P_k = (I - K_k·H_k)·P_k.
When k = 1, P_{k-1} is initialized to the zero matrix; F_k is the state transition matrix and H_k is the observation matrix.
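Putting these update equations into code (a sketch: the patent gives no numerical values for Q, R, F_k, or H_k, so the identity and diagonal choices below are placeholders):

    import numpy as np

    def kalman_fuse(pose_pred, pose_imu, P_prev, F, H, Q, R):
        """One fusion step: the motion-model pose is the prediction and the
        IMU-derived pose is the observation; both are 3-vectors (x, y, theta)."""
        P = F @ P_prev @ F.T + Q                           # predicted estimate covariance
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)       # optimal Kalman gain (3x3)
        pose = pose_pred + K @ (pose_imu - H @ pose_pred)  # fused pose at time k+1
        P = (np.eye(3) - K @ H) @ P                        # updated estimate covariance
        return pose, P

    # Placeholder matrices: identity transition/observation, small diagonal noise.
    F = H = np.eye(3)
    Q = np.diag([0.01, 0.01, 0.005])
    R = np.diag([0.05, 0.05, 0.02])
    P0 = np.zeros((3, 3))  # P_{k-1} is initialized to the zero matrix when k = 1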
Step S46: a control instruction is generated from the pose of the automated guided vehicle 2 at time k+1 and the reference point information on the virtual path prestored in the server 1, so as to control the vehicle to travel on the corresponding virtual path in the navigation space.
Step S47: it is determined whether the positioning information obtained by the camera unit 3 has changed from invalid to valid.
Step S48: when the positioning information obtained by the camera unit 3 has changed from invalid to valid, the vehicle is navigated according to the positioning information obtained by the camera unit 3.
Step S49: when the positioning information obtained by the camera unit 3 is still invalid, navigation of the vehicle continues according to its pose at time k+1. The whole S41~S49 flow is summarized in the sketch below.
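One iteration of steps S41~S49 could then look as follows (a sketch reusing the helpers above; camera.locate_vehicle, imu.read, make_control_command, and the state object are hypothetical interfaces, and the lateral speed is assumed zero):

    import numpy as np

    def navigation_step(camera, imu, state):
        """Run one S41~S49 iteration for a single AGV (hypothetical interfaces)."""
        bbox, cam_pose = camera.locate_vehicle()                  # S41: camera positioning
        v, yaw_rate, acc_x, acc_y = imu.read()                    # S42: IMU pose data
        if positioning_valid(bbox, camera.width, camera.height):  # S43: validity check
            state.pose = cam_pose                                 # S44/S48: trust the camera fix
        else:                                                     # S45/S49: camera blind area
            pred = motion_model_predict(state.pose, v, yaw_rate, state.dt)
            meas = imu_dead_reckon(state.pose, v, 0.0, acc_x, acc_y, state.dt)
            state.pose, state.P = kalman_fuse(np.asarray(pred), np.asarray(meas),
                                              state.P, F, H, Q, R)
        return make_control_command(state.pose, state.path)       # S46: steer to next reference point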
Although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the spirit and scope of the invention.

Claims (10)

1. An automatic navigation method applied to a server, the server being communicatively connected to at least one automated guided vehicle and at least one camera unit, characterized in that the method comprises:
controlling the camera unit to locate the automated guided vehicle;
controlling an inertia detection unit mounted on the automated guided vehicle to detect pose information of the vehicle, the pose information including at least speed information and yaw angle information;
determining whether the positioning information obtained by the camera unit is valid;
when the positioning information is judged valid, updating the pose information currently output by the inertia detection unit and by a motion prediction model with the positioning information currently obtained by the camera unit;
when the positioning information is judged invalid, calculating the pose of the automated guided vehicle at time k+1 by a Kalman filtering algorithm from the pose currently detected by the inertia detection unit, the pose output by the motion prediction model, and the pose of the vehicle at time k; and
generating a control instruction from the pose of the vehicle at time k+1 and reference point information on a virtual path prestored in the server, so as to control the vehicle to travel on the corresponding virtual path in a navigation space.
2. The automatic navigation method of claim 1, wherein the pose information of the automated guided vehicle at time k comprises the position information and yaw angle information of the vehicle at time k, which are determined from an image captured by the camera unit or taken as the initial position information of the vehicle on the virtual path.
3. The automatic navigation method of claim 2, wherein determining whether the positioning information obtained by the camera unit is valid comprises:
determining whether the image captured by the camera unit contains the complete automated guided vehicle;
when the image contains the complete vehicle, judging that the positioning information obtained by the camera unit is valid; and
when the image does not contain the complete vehicle, judging that the positioning information obtained by the camera unit is invalid.
4. The automatic navigation method of claim 1, wherein calculating the pose of the automated guided vehicle at time k+1 by a Kalman filtering algorithm from the pose currently detected by the inertia detection unit, the pose output by the motion prediction model, and the pose of the vehicle at time k comprises:
calculating the yaw angle variation Δθ of the vehicle from time k to time k+1 through the motion prediction model;
calculating the displacement variation Δs of the vehicle from time k to time k+1 through the motion prediction model;
calculating the pose of the vehicle at time k+1 through the motion prediction model as pose_predict(x_{k+1}, y_{k+1}, θ_{k+1}) = (x_k + Δs·cos(θ_k), y_k + Δs·sin(θ_k), θ_k + Δθ), wherein the pose pose1(x_k, y_k, θ_k) of the vehicle at time k is detected and determined by the camera unit;
calculating the pose pose_IMU of the vehicle from the information detected by the inertia detection unit; and
calculating the pose pose_kalman(x_{k+1}, y_{k+1}, θ_{k+1}) of the vehicle at time k+1 by the Kalman filtering algorithm.
5. The automatic navigation method of claim 4, wherein the yaw angle variation is calculated through the motion prediction model as Δθ = θ′ × Δt, where θ′ is the yaw rate of the automated guided vehicle currently detected by the inertia detection unit and Δt is the time interval from k to k+1; and
the displacement variation is calculated through the motion prediction model as Δs = Δt × v, where v is the speed of the vehicle currently detected by the inertia detection unit.
6. The automatic navigation method of claim 4, wherein calculating the pose pose_IMU of the automated guided vehicle from the information detected by the inertia detection unit comprises:
calculating the displacement ΔS_x of the vehicle along the x axis between time k and time k+1 as
ΔS_x = v_x·Δt + (1/2)·accX·Δt²,
where accX is the acceleration of the vehicle along the x axis currently detected by the inertia detection unit;
calculating the displacement ΔS_y of the vehicle along the y axis between time k and time k+1 as
ΔS_y = v_y·Δt + (1/2)·accY·Δt²,
where accY is the acceleration of the vehicle along the y axis currently detected by the inertia detection unit;
calculating the total displacement of the vehicle between time k and time k+1 as
ΔS = √(ΔS_x² + ΔS_y²);
calculating the yaw angle change of the vehicle between time k and time k+1 as
Δθ = arctan(ΔS_y / ΔS_x); and
calculating the pose of the vehicle at time k+1 as pose_IMU(x_{k+1}, y_{k+1}, θ_{k+1}) = (x_k + ΔS·cos(θ_k + Δθ), y_k + ΔS·sin(θ_k + Δθ), θ_k + Δθ), wherein the pose pose1(x_k, y_k, θ_k) of the vehicle at time k is detected and determined by the camera unit.
7. The automatic navigation method of claim 6, wherein calculating the pose pose_kalman(x_{k+1}, y_{k+1}, θ_{k+1}) of the automated guided vehicle at time k+1 by the Kalman filtering algorithm comprises:
calculating
pose_kalman(x_{k+1}, y_{k+1}, θ_{k+1})^T = pose_predict(x_{k+1}, y_{k+1}, θ_{k+1})^T + K_k·(pose_IMU(x_{k+1}, y_{k+1}, θ_{k+1})^T - H_k·pose_predict(x_{k+1}, y_{k+1}, θ_{k+1})^T),
where pose_IMU represents the pose calculated from the information detected by the inertia detection unit and pose_predict represents the pose calculated from the motion prediction model;
computing the predicted estimate covariance matrix
P_k = F_k·P_{k-1}·F_k^T + Q;
computing the optimal Kalman gain
K_k = P_k·H_k^T·(H_k·P_k·H_k^T + R)^{-1}, wherein K_k is a 3×3 matrix; and
updating the estimate covariance matrix as
P_k = (I - K_k·H_k)·P_k,
wherein, when k = 1, P_{k-1} is initialized to the zero matrix, F_k is the state transition matrix, and H_k is the observation matrix.
8. The automatic navigation method of claim 1, further comprising, while a control instruction generated from the pose of the automated guided vehicle at time k+1 and reference point information on the virtual path prestored in the server controls the vehicle to travel on the corresponding virtual path in the navigation space:
determining whether the positioning information obtained by the camera unit has changed from invalid to valid;
when the positioning information has changed from invalid to valid, navigating the vehicle according to the positioning information obtained by the camera unit; and
when the positioning information is still invalid, continuing to navigate the vehicle according to its pose at time k+1.
9. A server, characterized in that the server comprises:
a processor; and
a memory having stored therein a plurality of program modules, which are loaded by the processor to perform the automatic navigation method of any one of claims 1-8.
10. A storage medium having stored thereon at least one computer instruction, wherein the instruction is loaded by a processor to perform the automatic navigation method of any one of claims 1-8.
CN201910300909.1A 2019-04-15 2019-04-15 Automatic navigation method, server and storage medium Pending CN111830955A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910300909.1A CN111830955A (en) 2019-04-15 2019-04-15 Automatic navigation method, server and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910300909.1A CN111830955A (en) 2019-04-15 2019-04-15 Automatic navigation method, server and storage medium

Publications (1)

Publication Number Publication Date
CN111830955A (en) 2020-10-27

Family

ID=72914198

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910300909.1A Pending CN111830955A (en) 2019-04-15 2019-04-15 Automatic navigation method, server and storage medium

Country Status (1)

Country Link
CN (1) CN111830955A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101576384A (en) * 2009-06-18 2009-11-11 北京航空航天大学 Indoor movable robot real-time navigation method based on visual information correction
CN106123890A (en) * 2016-06-14 2016-11-16 中国科学院合肥物质科学研究院 A kind of robot localization method of Fusion
CN106918830A (en) * 2017-03-23 2017-07-04 安科机器人有限公司 A kind of localization method and mobile robot based on many navigation modules
CN108759834A (en) * 2018-04-28 2018-11-06 温州大学激光与光电智能制造研究院 A kind of localization method based on overall Vision

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
胡小平 (Hu Xiaoping): "导航技术基础" [Fundamentals of Navigation Technology], 国防工业出版社 (National Defense Industry Press), pages 44-46 *

Similar Documents

Publication Publication Date Title
CN110312912B (en) Automatic vehicle parking system and method
Ohya et al. Vision-based navigation by a mobile robot with obstacle avoidance using single-camera vision and ultrasonic sensing
CN112183133B (en) Aruco code guidance-based mobile robot autonomous charging method
CN111457929A (en) Logistics vehicle autonomous path planning and navigation method based on geographic information system
CN110146909A (en) A kind of location data processing method
CN105486311A (en) Indoor robot positioning navigation method and device
CN112740274A (en) System and method for VSLAM scale estimation on robotic devices using optical flow sensors
KR101341204B1 (en) Device and method for estimating location of mobile robot using raiser scanner and structure
KR102006291B1 (en) Method for estimating pose of moving object of electronic apparatus
CN110763224A (en) Navigation method and navigation system for automatic guided transport vehicle
WO2020248210A1 (en) Roadmodel manifold for 2d trajectory planner
Zheng et al. An optimization-based UWB-IMU fusion framework for UGV
CN110988949A (en) Positioning method, positioning device, computer readable storage medium and mobile device
CN111263308A (en) Positioning data acquisition method and system
CN110162038A (en) Control method for movement, device, storage medium and processor
Motlagh et al. Position Estimation for Drones based on Visual SLAM and IMU in GPS-denied Environment
Smith et al. PiPS: Planning in perception space
CN114714357A (en) Sorting and carrying method, sorting and carrying robot and storage medium
CN103328907B (en) Robotic heliostat calibaration system and method
CN111830955A (en) Automatic navigation method, server and storage medium
CN111580530A (en) Positioning method, positioning device, autonomous mobile equipment and medium
CN111829510A (en) Automatic navigation method, server and storage medium
CN111624990A (en) Automatic navigation method, server and storage medium
CN113632029B (en) Information processing device, program, and information processing method
Yamada et al. Autonomous path travel control of mobile robot using internal and external camera images in gps-denied environments

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221205

Address after: Floors 1-4 of factory building No. 1, F8D District, Foxconn Science and Technology Industrial Park, east side of Minqing Road, Longhua Street, Shenzhen City, Guangdong Province

Applicant after: Shenzhen Fulian Fugui Precision Industry Co.,Ltd.

Address before: 518109 3rd floor, building 1, F8B, Foxconn Science Park, No.2, Donghuan 2nd Road, Longhua street, Longhua District, Shenzhen City, Guangdong Province

Applicant before: FUHUAKE PRECISION INDUSTRY (SHENZHEN) Co.,Ltd.

WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20201027