Automatic navigation method, server and storage medium
Technical Field
The present invention relates to the field of navigation, and in particular, to an automatic navigation method, a server, and a storage medium.
Background
Automated Guided Vehicles (AGVs) have become important infrastructure in smart logistics, smart manufacturing, and digital factories. AGVs facilitate the transportation of materials within a factory and improve production and operating efficiency. Common AGV navigation methods include magnetic stripe navigation, laser navigation, and two-dimensional code navigation. AGVs using magnetic stripe navigation are relatively inexpensive and operate reliably and stably. However, magnetic stripes must be additionally laid in the application scene; laying them is labor-intensive and requires modifying the application site. If the navigation path is changed, the stripes must be laid again. Moreover, the adhesive backing of the stripes weakens with long-term use, which hinders repeated use. An AGV based on laser navigation constructs a complete indoor map with a lidar, obtains complete surrounding environment information, acquires surrounding information in real time by laser scanning, and navigates algorithmically. A laser-navigation AGV needs no fixed track in the application site, but when the lidar encounters a large object or a wall, positioning may become inaccurate and the vehicle may depart from the preset path.
With the development of digital and intelligent factories, enterprises place higher demands on factory automation. At present, most logistics robots operate under rail guidance, while a small number adopt relatively expensive lidar trackless navigation or technically complex inertial navigation. Rail-guided navigation cannot adapt to path changes, and laser navigation requires additional reflectors. A mobile robot based on visual navigation can overcome these problems: only cameras need to be installed on the ceiling, and accurate navigation is achieved within their coverage. However, the visual-navigation scheme suffers from camera blind areas; that is, the pose information of the AGV cannot be obtained in places not covered by a camera. When the AGV travels into a blind area, the server cannot acquire the pose information of the AGV and therefore cannot issue control instructions to it. Additional sensors are then needed to assist in providing the pose information of the AGV, which in turn increases hardware costs.
Disclosure of Invention
In view of the above, it is desirable to provide an automatic navigation method, a server, and a storage medium that can determine the yaw angle and pose information of an automatic guided vehicle when it travels in a blind area of a camera in a navigation space, and control the automatic guided vehicle to travel along a virtual path according to that yaw angle and pose information.
A first aspect of the present invention provides an automatic navigation method that estimates pose through a Kalman filtering algorithm from the information of an inertia detection unit and a motion prediction model. The method is applied to a server in communication connection with at least one automatic guided vehicle and at least one camera unit, and comprises the following steps:
controlling the camera unit to position the automatic guided vehicle;
controlling an inertia detection unit arranged on the automatic guided vehicle to detect pose information of the automatic guided vehicle, wherein the pose information at least comprises speed information and yaw angle information;
judging whether the positioning information obtained by the camera unit is valid;
when the positioning information obtained by the camera unit is judged to be valid, updating the pose information currently output by the inertia detection unit and the pose information output by the motion prediction model with the positioning information currently obtained by the camera unit;
when the positioning information obtained by the camera unit is judged to be invalid, calculating the pose information of the automatic guided vehicle at time k+1 by a Kalman filtering algorithm according to the pose information currently detected by the inertia detection unit, the pose information output by the motion prediction model, and the pose information of the automatic guided vehicle at time k;
and generating a control instruction according to the pose information of the automatic guided vehicle at time k+1 and reference point information on a virtual path prestored in the server, so as to control the automatic guided vehicle to travel on the corresponding virtual path in a navigation space.
A second aspect of the present invention provides a server comprising:
a processor; and
a memory in which a plurality of program modules are stored, the program modules being loaded by the processor to execute the automatic navigation method described above.
A third aspect of the present invention provides a storage medium storing at least one computer instruction, the instruction being loaded and executed by a processor to perform the automatic navigation method described above.
According to the automatic navigation method, when the automatic guided vehicle travels in a blind area where the camera unit cannot assist navigation, the travel route of the automatic guided vehicle can be estimated by combining the yaw angle and pose information detected by the inertia detection unit with the motion prediction model through the Kalman filtering algorithm, so that the automatic guided vehicle is prevented from deviating from the preset route.
Drawings
FIG. 1 is a diagram of an automatic navigation system according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a server in the automatic navigation system according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of an automatic guided vehicle in the automatic navigation system according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating an automatic navigation method according to an embodiment of the present invention.
Description of the main elements
Automatic navigation system 10
Server 1
Automatic guided vehicle 2
Network unit 11
Memory 12
Processor 13
Battery 20
Wireless unit 21
Controller 22
Inertia detection unit 23
Camera unit 3
Steps S41~S49
The following detailed description will further illustrate the invention in conjunction with the above-described figures.
Detailed Description
Referring to fig. 1, a schematic diagram of an automatic navigation system 10 according to an embodiment of the invention is shown. The automatic navigation system 10 includes, but is not limited to, a server 1, an automatic guided vehicle 2, and at least one camera unit 3. In the present embodiment, the server 1 and the automatic guided vehicle 2 are connected by wireless communication, and the server 1 is electrically connected to the at least one camera unit 3 in a wired manner. The camera unit 3 is a camera and is installed in a navigation space.
In the present embodiment, the server 1 controls the automatic guided vehicle 2 to navigate in the navigation space by estimating the pose information of the automatic guided vehicle 2, as described in detail later. The pose information includes position information and heading angle information. The navigation space can be an indoor space (such as a warehouse) or an outdoor space.
Referring to fig. 2, a schematic diagram of the server 1 in the automatic navigation system 10 according to an embodiment of the invention is shown. In the present embodiment, the server 1 includes, but is not limited to, a network unit 11, a memory 12, and a processor 13. The network unit 11, the memory 12 and the processor 13 are electrically connected to each other.
In this embodiment, the network unit 11 is configured to provide a network communication function for the server 1 through a wired or wireless network transmission manner, so that the server 1 can be communicatively connected to the automatic guided vehicle 2 over the network.
In the present embodiment, the memory 12 is used to store the software programs and data installed in the server 1. The memory 12 may be an internal storage unit of the server 1, for example, a hard disk or internal memory of the server 1. In other embodiments, the memory 12 may also be an external storage device of the server 1, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash memory card (Flash Card) provided on the server 1.
In the present embodiment, the memory 12 further stores a virtual map (e.g., an electronic map) of the navigation space, where the virtual map includes a plurality of virtual paths. A virtual path is composed of reference points and the connection relations between them. The virtual path defines a plurality of reference points for the automatic guided vehicle 2 along the whole path, the position coordinates corresponding to each reference point, and the order in which each automatic guided vehicle 2 passes through the reference points.
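For illustration, the virtual-path structure described above can be sketched as follows; `ReferencePoint`, `virtual_path`, and `next_reference_point` are hypothetical names for illustration only, not identifiers from the patent.

```python
from dataclasses import dataclass

@dataclass
class ReferencePoint:
    """A reference point on a virtual path, with coordinates in the
    navigation-space frame XOY (illustrative names)."""
    point_id: int
    x: float
    y: float

# A virtual path is an ordered sequence of reference points; the order
# defines the sequence in which the vehicle must pass through them.
virtual_path = [
    ReferencePoint(0, 0.0, 0.0),
    ReferencePoint(1, 2.0, 0.0),
    ReferencePoint(2, 2.0, 3.0),
]

def next_reference_point(path, current_index):
    """Return the next reference point to drive toward, or None at the end."""
    return path[current_index + 1] if current_index + 1 < len(path) else None
```

A navigation loop would repeatedly query `next_reference_point` to obtain the target coordinates for the next control instruction.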
In the present embodiment, the memory 12 stores in advance the position coordinates of each reference point on the virtual path. The position coordinates of the respective reference points on the virtual path may refer to coordinates in a coordinate system (XOY) established based on an area where the entire navigation space is located.
In this embodiment, the processor 13 may be a Central Processing Unit (CPU), or other microprocessor or other data processing chip capable of executing control functions. The processor 13 is used for executing software program codes, calculating data, and the like.
Referring to fig. 3, a schematic diagram of an automatic guided vehicle in the automatic navigation system according to an embodiment of the invention is shown. In the present embodiment, the number of automatic guided vehicles 2 may be one or more. The automatic guided vehicle 2 includes, but is not limited to, a battery 20, a wireless unit 21, a controller 22, and an inertia detection unit 23, which are electrically connected to one another.
In the present embodiment, the battery 20 is used for supplying power to the wireless unit 21, the controller 22 and the inertia detection unit 23.
The wireless unit 21 is used to provide a network connection between the automatic guided vehicle 2 and the server 1. The controller 22 is configured to receive a control instruction sent by the server 1 to control the automatic guided vehicle 2 to travel along a virtual path.
In this embodiment, the automatic guided vehicle 2 may further include a charging unit (not shown) for providing power to the battery 20.
The inertia detection unit 23 is an Inertial Measurement Unit (IMU) used for measuring the three-axis attitude angle, angular velocity, and acceleration of an object. It includes a plurality of single-axis accelerometers and a plurality of single-axis gyroscopes. The accelerometers detect acceleration signals of the object along three independent axes of the carrier coordinate system, and the gyroscopes detect angular velocity signals of the carrier relative to the navigation coordinate system. From the measured angular velocities and accelerations of the object in three-dimensional space, the attitude of the object is calculated.
In the prior art, the server 1 can assist in controlling the automatic guided vehicle 2 to travel on the virtual path according to video or image information captured by the camera unit 3 installed in the navigation space. However, when the automatic guided vehicle 2 travels into the blind area of the camera unit 3, the server 1 cannot confirm whether the automatic guided vehicle 2 is still on the virtual path. Therefore, in the present scheme, the server 1 collects the pose information of the automatic guided vehicle 2 detected by the inertia detection unit 23 installed on the vehicle, and estimates the pose information of the automatic guided vehicle 2 at the next time through the Kalman filtering algorithm according to that pose information and the motion prediction model. In this way, when the automatic guided vehicle 2 travels into a blind area that the camera unit 3 cannot cover in the navigation space, its pose information can still be obtained, and a control instruction can be generated according to the pose information and the reference point information on the virtual path to control the automatic guided vehicle 2 to travel on the virtual path. In the present embodiment, the pose information includes position information and yaw angle information.
For example, the server 1 first sends a control command to the camera unit 3 to control the camera unit 3 to position the automatic guided vehicle 2, and controls the inertia detection unit 23 installed on the automatic guided vehicle 2 to detect the pose information of the automatic guided vehicle 2, where the pose information further includes acceleration information. Determining positioning information from the image captured by the camera unit 3 installed in the navigation space is prior art and is not described herein again.
Further, it is determined whether the positioning information obtained by the camera unit 3 is valid. When the positioning information obtained by the camera unit 3 is judged to be valid, the pose information currently output by the inertia detection unit 23 and the pose information output by the motion prediction model are updated with the positioning information currently obtained by the camera unit 3. When the positioning information obtained by the camera unit 3 is judged to be invalid, the pose information of the automatic guided vehicle 2 at time k+1 is calculated through the Kalman filtering algorithm according to the pose information currently detected by the inertia detection unit 23, the pose information output by the motion prediction model, and the pose information of the automatic guided vehicle 2 at time k. The pose information of the automatic guided vehicle 2 at time k is the position information determined from an image captured by the camera unit 3 installed in the navigation space, or the initial position information of the automatic guided vehicle 2 on the virtual path.
Further, a control instruction is generated according to the pose information of the automatic guided vehicle 2 at time k+1 and the reference point information on a virtual path prestored in the server 1, so as to control the automatic guided vehicle 2 to travel on the corresponding virtual path in the navigation space.
Referring to fig. 4, a flowchart of an automatic navigation method according to an embodiment of the invention is shown. The order of the steps in the flow diagrams may be changed, and some steps may be omitted or combined, according to different needs.
In step S41, the camera unit 3 is controlled to position the automatic guided vehicle 2.
In step S42, the inertia detection unit 23 installed on the automatic guided vehicle 2 is controlled to detect the pose information of the automatic guided vehicle 2, where the pose information at least includes acceleration information, speed information, and yaw angle information.
In step S43, it is determined whether the positioning information obtained by the camera unit 3 is valid.
Specifically, step S43 determines whether the image captured by the camera unit 3 contains the complete automatic guided vehicle 2: when the image contains the complete automatic guided vehicle 2, the positioning information obtained by the camera unit 3 is judged to be valid; when it does not, the positioning information obtained by the camera unit 3 is judged to be invalid.
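The validity test of step S43 can be sketched as follows, under the assumption that the vehicle is detected in the image as a bounding box; `positioning_valid` and its parameters are illustrative names, not part of the patent.

```python
def positioning_valid(vehicle_box, image_size):
    """Positioning is treated as valid only when the detected bounding box of
    the guided vehicle lies entirely inside the image frame, i.e. the image
    contains the complete vehicle.
    vehicle_box: (x_min, y_min, x_max, y_max) in pixels, or None if the
    vehicle was not detected at all; image_size: (width, height)."""
    if vehicle_box is None:
        return False
    x_min, y_min, x_max, y_max = vehicle_box
    width, height = image_size
    # Every edge of the box must be within the image bounds.
    return 0 <= x_min and 0 <= y_min and x_max <= width and y_max <= height
```

When the box touches or crosses the image border, part of the vehicle is cut off, so the pose derived from the image would be unreliable and the method falls through to the estimation branch.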
In step S44, when it is determined that the positioning information obtained by the camera unit 3 is valid, the pose information currently output by the inertia detection unit 23 and the pose information output by the motion prediction model are updated with the positioning information currently obtained by the camera unit 3.
In step S45, when it is determined that the positioning information obtained by the camera unit 3 is invalid, the pose information of the automatic guided vehicle 2 at time k+1 is calculated by the Kalman filtering algorithm according to the pose information currently detected by the inertia detection unit 23, the pose information output by the motion prediction model, and the pose information of the automatic guided vehicle 2 at time k.
Specifically, step S45 includes: calculating the yaw angle variation Δθ of the automatic guided vehicle 2 at time k+1 through the motion prediction model. The calculation formula is Δθ = θ′ × Δt, where θ′ is the yaw rate of the automatic guided vehicle 2 currently detected by the inertia detection unit 23, and Δt is the time interval from time k to time k+1.
The displacement variation Δs of the automatic guided vehicle 2 at time k+1 is calculated through the motion prediction model. The calculation formula is Δs = v × Δt, where v is the speed of the automatic guided vehicle 2 currently detected by the inertia detection unit 23, and Δt is the time interval from time k to time k+1.
The pose information of the automatic guided vehicle 2 at time k+1 is then calculated by the motion prediction model as pose_predict(x_{k+1}, y_{k+1}, θ_{k+1}) = (x_k + Δs·cos(θ_k), y_k + Δs·sin(θ_k), θ_k + Δθ), where the pose information pose1(x_k, y_k, θ_k) of the automatic guided vehicle 2 at time k is detected and determined by the camera unit 3.
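The motion-prediction update can be sketched as follows; the function and variable names are illustrative, with the yaw change taken as yaw rate times the time interval and the displacement projected along the current heading.

```python
import math

def motion_predict(x_k, y_k, theta_k, v, yaw_rate, dt):
    """One motion-prediction step (illustrative sketch):
    Δθ = yaw_rate·Δt, Δs = v·Δt, then propagate the pose along the
    current heading θ_k as in pose_predict above."""
    d_theta = yaw_rate * dt          # yaw angle variation Δθ
    d_s = v * dt                     # displacement variation Δs
    x_next = x_k + d_s * math.cos(theta_k)
    y_next = y_k + d_s * math.sin(theta_k)
    theta_next = theta_k + d_theta
    return x_next, y_next, theta_next
```

For example, a vehicle at the origin heading along the x axis at 1 m/s with zero yaw rate advances one meter along x per one-second step.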
Pose information pose_IMU of the automatic guided vehicle 2 is also calculated from the information detected by the inertia detection unit 23. Specifically, the displacement change value ΔS_x of the automatic guided vehicle 2 in the x-axis direction between time k and time k+1 is calculated as ΔS_x = v_x × Δt + ½ × accX × Δt², where accX is the acceleration of the automatic guided vehicle 2 on the x axis currently detected by the inertia detection unit 23 and v_x is the corresponding speed component. The displacement change value ΔS_y in the y-axis direction between time k and time k+1 is calculated as ΔS_y = v_y × Δt + ½ × accY × Δt², where accY is the acceleration of the automatic guided vehicle 2 on the y axis currently detected by the inertia detection unit 23. The displacement change value of the automatic guided vehicle 2 between time k and time k+1 is then ΔS = √(ΔS_x² + ΔS_y²), and the yaw angle change value Δθ between time k and time k+1 is obtained from the angular velocity detected by the gyroscope. The pose information of the automatic guided vehicle 2 at time k+1 is then pose_IMU(x_{k+1}, y_{k+1}, θ_{k+1}) = (x_k + ΔS·cos(θ_k + Δθ), y_k + ΔS·sin(θ_k + Δθ), θ_k + Δθ), where the pose information pose1(x_k, y_k, θ_k) of the automatic guided vehicle 2 at time k is detected and determined by the camera unit 3.
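A minimal sketch of the IMU-based pose propagation follows. The patent text does not fix the exact integration formulas, so this assumes standard kinematic integration of the detected accelerations and yaw rate; all names are illustrative.

```python
import math

def imu_propagate(x_k, y_k, theta_k, vx, vy, acc_x, acc_y, yaw_rate, dt):
    """One IMU dead-reckoning step (illustrative sketch):
    ΔS_x ≈ vx·Δt + ½·accX·Δt², ΔS_y ≈ vy·Δt + ½·accY·Δt²,
    ΔS = sqrt(ΔS_x² + ΔS_y²), Δθ = yaw_rate·Δt,
    and ΔS is projected along the new heading θ_k + Δθ."""
    d_sx = vx * dt + 0.5 * acc_x * dt ** 2   # x-axis displacement change ΔS_x
    d_sy = vy * dt + 0.5 * acc_y * dt ** 2   # y-axis displacement change ΔS_y
    d_s = math.hypot(d_sx, d_sy)             # total displacement ΔS
    d_theta = yaw_rate * dt                  # yaw angle change Δθ
    x_next = x_k + d_s * math.cos(theta_k + d_theta)
    y_next = y_k + d_s * math.sin(theta_k + d_theta)
    return x_next, y_next, theta_k + d_theta
```

In practice the IMU pose drifts over time, which is why the scheme fuses it with the motion-model prediction rather than using it alone.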
The pose information pose_kalman(x_{k+1}, y_{k+1}, θ_{k+1}) of the automatic guided vehicle 2 at time k+1 is then calculated by the Kalman filtering algorithm in combination with the motion prediction model. Specifically, the fused state (x_{k+1}, y_{k+1}, θ_{k+1})^T = pose_kalman(x_{k+1}, y_{k+1}, θ_{k+1})^T is obtained by correcting the pose information calculated according to the motion prediction model with the pose information calculated from the information detected by the inertia detection unit 23, weighted by the Kalman gain.
The prediction estimation covariance matrix is computed as P_k = F_k P_{k−1} F_k^T + Q.
The optimal Kalman gain is computed as K_k = P_k H_k^T (H_k P_k H_k^T + R)^{−1}, where K_k is a 3×3 matrix.
The prediction estimation covariance matrix is updated according to the calculation result: P_k = (I − K_k H_k) P_k.
When k = 1, P_{k−1} is initialized to the zero matrix; F_k is the state transition matrix, H_k is the observation matrix, Q is the process noise covariance, and R is the measurement noise covariance.
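The fusion step can be illustrated with a deliberately simplified sketch that assumes F = H = I and diagonal Q and R, so each pose component (x, y, θ) is fused independently as a scalar. This is a toy model of the 3×3 equations above, not the patent's exact filter; all names are illustrative.

```python
def kalman_fuse(pose_pred, pose_imu, p_prev, q, r):
    """One simplified Kalman step per pose component, with F = H = I and
    diagonal Q, R so the matrix equations
        P_k = F P_{k-1} F^T + Q
        K_k = P_k H^T (H P_k H^T + R)^(-1)
        P_k = (I - K_k H) P_k
    reduce to scalars. pose_pred is the motion-model pose and pose_imu the
    IMU-derived pose, treated here as the measurement."""
    fused, p_new = [], []
    for pred, meas, p in zip(pose_pred, pose_imu, p_prev):
        p_pri = p + q                            # predicted covariance
        k_gain = p_pri / (p_pri + r)             # Kalman gain
        fused.append(pred + k_gain * (meas - pred))  # state update
        p_new.append((1.0 - k_gain) * p_pri)     # covariance update
    return fused, p_new
```

With equal process and measurement noise (q = r) and zero prior covariance, the first step splits the difference between prediction and measurement, which matches the intuition that neither source is trusted more than the other.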
In step S46, a control instruction is generated according to the pose information of the automatic guided vehicle 2 at time k+1 and the reference point information on a virtual path prestored in the server 1, so as to control the automatic guided vehicle 2 to travel on the corresponding virtual path in the navigation space.
In step S47, it is determined whether the positioning information obtained by the camera unit 3 has changed from invalid to valid.
In step S48, when it is determined that the positioning information obtained by the camera unit 3 has changed from invalid to valid, the automatic guided vehicle 2 is navigated according to the positioning information obtained by the camera unit 3.
In step S49, when it is determined that the positioning information obtained by the camera unit 3 is still invalid, navigation of the automatic guided vehicle 2 continues according to the pose information of the automatic guided vehicle 2 at time k+1.
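Steps S47 to S49 amount to selecting the pose source on each cycle. A minimal sketch, with illustrative names and `camera_pose` set to None whenever the camera positioning is invalid:

```python
def navigation_step(camera_pose, estimated_pose):
    """Select the pose used for navigation (sketch of steps S47-S49):
    when camera positioning is valid, navigate from the camera pose;
    otherwise keep navigating from the Kalman-estimated pose at time k+1.
    Returns the chosen pose and a label naming its source."""
    if camera_pose is not None:
        return camera_pose, "camera"
    return estimated_pose, "estimated"
```

The control instruction for the vehicle would then be generated from the returned pose and the next reference point, regardless of which source supplied it.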
Although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the spirit and scope of the invention.