CN114330832A - Intelligent express package distribution system and working method thereof - Google Patents


Info

Publication number
CN114330832A
Authority
CN
China
Prior art keywords: unmanned aerial vehicle, module, express, parcel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111453753.4A
Other languages
Chinese (zh)
Inventor
亚德
周佳欢
唐亮
Current Assignee
Aidi Uav Technology Nanjing Co ltd
Original Assignee
Aidi Uav Technology Nanjing Co ltd
Priority date
Filing date
Publication date
Application filed by Aidi Uav Technology Nanjing Co ltd
Priority to CN202111453753.4A
Publication of CN114330832A
Legal status: Pending

Classifications

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an intelligent express package delivery system and a working method thereof. The invention replaces the traditional express package delivery mode with a complete, multifunctional delivery system; it uses an autonomous mobile vehicle and unmanned aerial vehicles in multi-machine cooperation to deliver express packages efficiently, meeting future demands in the field of cargo delivery; it applies target detection and tracking, visual servo control, and path planning algorithms to ensure the reliability of the whole delivery system; in addition, it provides a remote task scheduling system and a monitoring system that keep the whole system running stably over long periods.

Description

Intelligent express package distribution system and working method thereof
Technical Field
The invention relates to an intelligent express delivery package distribution system and a working method thereof, and belongs to the technical field of cargo distribution.
Background
The express industry has grown rapidly over the last 10 years, with hundreds of millions of express packages handled nationwide every day. At present, however, package delivery still depends mainly on couriers delivering packages one by one or depositing them in dedicated express lockers; as package volumes grow, especially at delivery peaks, this traditional mode severely reduces delivery efficiency. In some areas unmanned aerial vehicles are already used for the last-kilometer delivery task, but they still rely on manual control, so the low efficiency of express package delivery is not fundamentally solved. As living standards improve, demand for express packages grows daily; the traditional delivery mode will no longer satisfy this demand, and a more efficient, intelligent express package delivery mode is needed.
Disclosure of Invention
The invention aims to overcome the above technical defects in the prior art by providing an intelligent express package delivery system and a working method thereof, solving the following technical problems: changing the traditional express package delivery mode; achieving intelligent control of the unmanned aerial vehicle and the autonomous mobile vehicle, and grasping of express packages, by combining visual servo control, target detection, and path planning algorithms; and creating a more efficient and intelligent express package delivery mode through a coordinated task scheduling system and monitoring system.
The invention specifically adopts the following technical scheme: an intelligent express package delivery system comprising an autonomous mobile vehicle, an unmanned aerial vehicle, a main control room, and a parcel delivery terminal; the autonomous mobile vehicle and the unmanned aerial vehicle are each communicatively connected to the main control room, and the main control room is communicatively connected to the parcel delivery terminal.
As a preferred embodiment, the autonomous mobile vehicle comprises a first controller, a first communication module, a navigation module, a parcel storage and retrieval module, and an unmanned aerial vehicle charging module; the first controller is communicatively connected to the navigation module, the parcel storage and retrieval module, and the charging module respectively, and is communicatively connected to the main control room through the first communication module.
As a preferred embodiment, the unmanned aerial vehicle comprises a second controller, a navigation module, a visual sensor module, a second communication module and a parcel grabbing module, wherein the second controller is in communication connection with the navigation module, the visual sensor module and the parcel grabbing module respectively, and the second controller is in communication connection with the main control room through the second communication module.
As a preferred embodiment, the main control room includes a task scheduling system and a monitoring system, and the task scheduling system is configured to perform: express delivery scheduling and unmanned aerial vehicle charging scheduling; the monitoring system is configured to perform: communication state monitoring, unmanned aerial vehicle flight state monitoring, autonomous mobile vehicle running state monitoring.
As a preferred embodiment, the parcel delivery terminal includes a landing module and a third communication module, and the landing module is in communication connection with the main control room through the third communication module.
As a preferred embodiment, the parcel delivery terminal is fixed outside a balcony and comprises a horizontal placement platform and a vertical landing landmark that can rotate by 90 degrees; the horizontal placement platform is where the unmanned aerial vehicle places the parcel, and the vertical landing landmark serves as the hover reference for the unmanned aerial vehicle.
The invention also provides a working method of the intelligent express delivery package distribution system, which comprises the following steps:
step SS1: the autonomous mobile vehicle places the express parcel to be delivered on a lifting stage via the parcel storage and retrieval module, and the lifting stage raises the parcel to the roof of the vehicle; go to step SS2;
step SS2: the autonomous mobile vehicle sends a parcel delivery request to the task scheduling system over the MAVLINK communication protocol, together with the parcel's size, weight, and target position; go to step SS3;
step SS3: the task scheduling system selects an unmanned aerial vehicle with a matching payload capacity according to the parcel's size, weight, and target position, and sends a task instruction to the nearby unmanned aerial vehicles; if an unmanned aerial vehicle receives the instruction and acknowledges acceptance, go to step SS4; otherwise the unmanned aerial vehicles are judged to be on a task or charging, and go to step SS5;
step SS4: the unmanned aerial vehicle establishes communication with the autonomous mobile vehicle and switches to the task state; the vehicle streams its real-time GPS position to the unmanned aerial vehicle, which flies above it and holds a flight attitude stationary relative to it;
step SS5: the task scheduling system sends a wait instruction to the autonomous mobile vehicle; as soon as a nearby unmanned aerial vehicle enters the waiting state, the task scheduling system sends it a task instruction and goes to step SS4;
step SS6: the unmanned aerial vehicle detects and tracks the landing landmark on the autonomous mobile vehicle, computing the control quantity with the visual servo algorithm; once the two-dimensional code on top of the parcel to be delivered is detected and parsed, the unmanned aerial vehicle adjusts its grasping attitude using the AprilTag on the parcel and holds itself stationary relative to the parcel; go to step SS7;
step SS7: the unmanned aerial vehicle grasps the express parcel with the parcel grasping module and rapidly plans an optimal flight path from the delivery position parsed out of the two-dimensional code; go to step SS8;
step SS8: near the parcel delivery terminal, the unmanned aerial vehicle detects and tracks the vertical landing landmark, hovers 0.5 m in front of it under the visual servo control algorithm, and places the parcel on the horizontal placement platform; go to step SS9;
step SS9: the unmanned aerial vehicle sends a task-completion signal to the task scheduling center, which replies with the position of a nearby autonomous mobile vehicle whose automatic charging plate is idle; the unmanned aerial vehicle flies above that vehicle and uses the visual sensor module to track, in turn, the landing landmark on the roof and the charging landmark on the charging plate, adjusting its flight attitude in real time until it lands; once the charging plate detects a safe landing, it activates the charging-pose correction device to adjust the unmanned aerial vehicle's position and begins charging.
As a preferred embodiment, landing-landmark detection in steps SS6, SS8, and SS9 uses the lightweight NanoDet algorithm: image features are extracted by a ShuffleNet multilayer convolutional backbone, multi-scale features are fused by a feature pyramid network, and landmark identification and localization are completed by shallow classification and regression heads; the landmark position is corrected by an extended Kalman filter.
As a preferred embodiment, the visual servo control law in steps SS6, SS8, and SS9 is:

$v_c = -\lambda L_x^+ e(t)$

where $v_c$ is the desired linear and angular velocity vector of the camera; $L_x^+$ is the Moore–Penrose pseudo-inverse of the image Jacobian (interaction matrix) $L_x$, which is determined by the camera intrinsics and the pixel coordinates and maps velocities between the pixel coordinate frame and the camera coordinate frame; and $\lambda$ is the visual servo gain, which sets the strength of the control action.
As a preferred embodiment, the optimal flight path in step SS7 is obtained by Q-learning reinforcement learning. The learning process is: select and execute the current flight action from the initialized Q table, compute the corresponding return under that action, and update the Q table with the return until the learning process converges. The Q-function iteration for updating the Q table is:

$Q(s,a) \leftarrow Q(s,a) + \alpha\,[\,R(s,a) + \gamma \max_{a'} Q(s',a') - Q(s,a)\,]$

where $s$ is the current state, $a$ the action taken in that state, $R$ the reward obtained by taking action $a$ in state $s$, $\alpha$ the learning rate, and $\gamma$ the discount factor.
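The Q-value iteration above can be sketched in pure Python on a toy environment; the 1-D corridor, the reward scheme, and the α, γ, ε values below are illustrative assumptions, not taken from the patent:

```python
import random

def q_learning_corridor(n_states=4, episodes=400, alpha=0.5, gamma=0.9, eps=0.2):
    """Tabular Q-learning on a 1-D corridor: states 0..n-1, goal at the right end.

    Actions 'L' (left) and 'R' (right); reward 1 on reaching the goal, else 0.
    Implements Q(s,a) <- Q(s,a) + alpha*[R + gamma*max_a' Q(s',a') - Q(s,a)].
    """
    random.seed(0)
    Q = {s: {'L': 0.0, 'R': 0.0} for s in range(n_states)}
    goal = n_states - 1
    for _ in range(episodes):
        s = 0
        while s != goal:
            # epsilon-greedy action selection over the current Q table
            if random.random() < eps:
                a = random.choice(['L', 'R'])
            else:
                a = max(Q[s], key=Q[s].get)
            s_next = max(0, s - 1) if a == 'L' else min(goal, s + 1)
            r = 1.0 if s_next == goal else 0.0
            # Q-value update with learning rate alpha and discount gamma
            Q[s][a] += alpha * (r + gamma * max(Q[s_next].values()) - Q[s][a])
            s = s_next
    return Q

Q = q_learning_corridor()
```

With γ = 0.9 the converged values are Q(2,R) ≈ 1, Q(1,R) ≈ 0.9, Q(0,R) ≈ 0.81, so the greedy policy flies toward the goal from every state.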
The invention achieves the following beneficial effects: it replaces the traditional express package delivery mode with a complete, multifunctional delivery system; it uses an autonomous mobile vehicle and unmanned aerial vehicles in multi-machine cooperation to deliver express packages efficiently, meeting future demands in the field of cargo delivery; it applies target detection and tracking, visual servo control, and path planning algorithms to ensure the reliability of the whole delivery system; and it provides a remote task scheduling system and a monitoring system that keep the whole system running stably over long periods.
Drawings
FIG. 1 is a schematic view of the structural topology of an intelligent express package delivery system of the present invention;
FIG. 2 is a flow chart of a method of operation of an intelligent express package delivery system of the present invention;
FIG. 3 is a schematic diagram of the master control room architecture topology of the present invention;
FIG. 4 is a flow chart of a landmark tracking control method for visual landing of an unmanned aerial vehicle according to the present invention;
FIG. 5 is a schematic topological diagram of a landmark tracking control system for visual landing of an unmanned aerial vehicle according to the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings. The following examples are only for illustrating the technical solutions of the present invention more clearly, and the protection scope of the present invention is not limited thereby.
Example 1: as shown in fig. 1, which is a schematic structural topology diagram of the intelligent express package delivery system of the present invention, the invention provides an intelligent express package delivery system comprising an autonomous mobile vehicle, an unmanned aerial vehicle, a main control room, and a parcel delivery terminal; the autonomous mobile vehicle comprises a first controller, a first communication module, a navigation module, a parcel storage and retrieval module, and an unmanned aerial vehicle charging module; the unmanned aerial vehicle comprises a second controller, a navigation module, a visual sensor module, a second communication module, and a parcel grasping module; the main control room comprises a task scheduling system and a monitoring system; the parcel delivery terminal comprises a landing module and a third communication module.
Optionally, the parcel storage and retrieval module is located at the rear of the autonomous mobile vehicle and comprises a parcel storage box and a lifting stage; the storage box holds parcels awaiting delivery, while the current parcel to be delivered sits on the lifting stage. A two-dimensional code and an AprilTag are affixed to the top of each parcel, used respectively by the unmanned aerial vehicle to parse parcel information and to adjust its hover-and-grasp pose. The unmanned aerial vehicle charging module comprises an automatic charging plate, on which a visual landing landmark and a charging-pose correction device are mounted; above the autonomous mobile vehicle, the unmanned aerial vehicle uses its visual sensor module to track the landmark on the charging plate and adjust its flight attitude in real time.
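The two-dimensional code on each parcel is what the unmanned aerial vehicle parses for parcel information; the patent does not specify the payload encoding, so the JSON layout below (size, weight, destination coordinates) is purely an assumed example:

```python
import json

def parse_parcel_code(payload: str) -> dict:
    """Parse a parcel QR payload into the fields the scheduler needs.

    Assumed format: JSON with size (cm), weight (kg), and delivery target
    latitude/longitude -- the real encoding is not given in the patent.
    """
    info = json.loads(payload)
    required = {"size_cm", "weight_kg", "lat", "lon"}
    missing = required - info.keys()
    if missing:
        raise ValueError(f"parcel code missing fields: {sorted(missing)}")
    return {
        "size_cm": tuple(info["size_cm"]),
        "weight_kg": float(info["weight_kg"]),
        "target": (float(info["lat"]), float(info["lon"])),
    }

code = '{"size_cm": [30, 20, 15], "weight_kg": 1.2, "lat": 32.06, "lon": 118.79}'
parcel = parse_parcel_code(code)
```

A malformed code raises early, which matters here because the parsed target position drives the path planner in step SS7.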
Optionally, the parcel distribution terminal is fixed outside the balcony and includes a horizontal placement platform and a vertical landing landmark which can rotate by 90 degrees; the horizontal object placing platform is used for placing packages by the unmanned aerial vehicle, and the vertical landing landmarks are used for accurate hovering of the unmanned aerial vehicle.
Optionally, the visual sensor module comprises RGB cameras mounted at the head and at the bottom of the unmanned aerial vehicle. The bottom camera is used to hover at a fixed point above the autonomous mobile vehicle for picking up parcels or landing to charge; the head camera is used, after the unmanned aerial vehicle reaches the parcel delivery terminal, for accurate parcel placement guided by the vertical landing landmark.
Optionally, the main control room is built within 1 km of the autonomous mobile vehicle. The task scheduling system communicates with the unmanned aerial vehicle over the MAVLINK protocol and is responsible for issuing delivery tasks; the monitoring system monitors the unmanned aerial vehicle's current flight state and the parcel delivery state, raises an alarm promptly when either is abnormal, and can remotely command the unmanned aerial vehicle to land safely or return when necessary.
Optionally, the unmanned aerial vehicle's control module is a Raspberry Pi 4B.
Optionally, the unmanned aerial vehicle's navigation module comprises an inertial measurement unit and an odometer.
Example 2: fig. 2 is a flow chart of the working method of the intelligent express package delivery system of the present invention; the steps are as follows:
step SS1: the autonomous mobile vehicle places the express parcel to be delivered on a lifting stage via the parcel storage and retrieval module, and the lifting stage raises the parcel to the roof of the vehicle; go to step SS2;
step SS2: the autonomous mobile vehicle sends a parcel delivery request to the task scheduling system over the MAVLINK communication protocol, together with the parcel's size, weight, and target position; go to step SS3;
step SS3: the task scheduling system selects an unmanned aerial vehicle with a matching payload capacity according to the parcel's size, weight, and target position, and sends a task instruction to the nearby unmanned aerial vehicles; if an unmanned aerial vehicle receives the instruction and acknowledges acceptance, go to step SS4; otherwise the unmanned aerial vehicles are judged to be on a task or charging, and go to step SS5;
step SS4: the unmanned aerial vehicle establishes communication with the autonomous mobile vehicle and switches to the task state; the vehicle streams its real-time GPS position to the unmanned aerial vehicle, which flies above it and holds a flight attitude stationary relative to it;
step SS5: the task scheduling system sends a wait instruction to the autonomous mobile vehicle; as soon as a nearby unmanned aerial vehicle enters the waiting state, the task scheduling system sends it a task instruction and goes to step SS4;
step SS6: the unmanned aerial vehicle detects and tracks the landing landmark on the autonomous mobile vehicle, computing the control quantity with the visual servo algorithm; once the two-dimensional code on top of the parcel to be delivered is detected and parsed, the unmanned aerial vehicle adjusts its grasping attitude using the AprilTag on the parcel and holds itself stationary relative to the parcel; go to step SS7;
step SS7: the unmanned aerial vehicle grasps the express parcel with the parcel grasping module and rapidly plans an optimal flight path from the delivery position parsed out of the two-dimensional code; go to step SS8;
step SS8: near the parcel delivery terminal, the unmanned aerial vehicle detects and tracks the vertical landing landmark, hovers 0.5 m in front of it under the visual servo control algorithm, and places the parcel on the horizontal placement platform; go to step SS9;
step SS9: the unmanned aerial vehicle sends a task-completion signal to the task scheduling center, which replies with the position of a nearby autonomous mobile vehicle whose automatic charging plate is idle; the unmanned aerial vehicle flies above that vehicle and uses the visual sensor module to track, in turn, the landing landmark on the roof and the charging landmark on the charging plate, adjusting its flight attitude in real time until it lands; once the charging plate detects a safe landing, it activates the charging-pose correction device to adjust the unmanned aerial vehicle's position and begins charging.
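Steps SS3–SS5 amount to a matching rule: assign the nearest idle unmanned aerial vehicle whose payload covers the parcel, otherwise make the cart wait. A minimal sketch, with the drone record fields and the squared-distance metric as assumptions:

```python
def assign_drone(parcel_weight, parcel_pos, drones):
    """Return the id of the nearest idle drone that can lift the parcel (SS3),
    or None if every suitable drone is busy or charging (SS5: cart waits)."""
    candidates = [d for d in drones
                  if d["state"] == "idle" and d["capacity_kg"] >= parcel_weight]
    if not candidates:
        return None
    best = min(candidates, key=lambda d: (d["pos"][0] - parcel_pos[0]) ** 2
                                       + (d["pos"][1] - parcel_pos[1]) ** 2)
    best["state"] = "task"          # SS4: the chosen drone switches to task state
    return best["id"]

drones = [
    {"id": "uav1", "state": "charging", "capacity_kg": 5.0, "pos": (0.0, 0.0)},
    {"id": "uav2", "state": "idle",     "capacity_kg": 1.0, "pos": (0.1, 0.1)},
    {"id": "uav3", "state": "idle",     "capacity_kg": 5.0, "pos": (0.5, 0.5)},
]
chosen = assign_drone(2.0, (0.0, 0.0), drones)  # uav2 too weak, uav1 charging
```

Here uav3 is the only feasible candidate, so it is chosen despite being farther away than uav2.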
Optionally, landing-landmark detection in steps SS6, SS8, and SS9 uses the lightweight NanoDet algorithm: image features are extracted by a ShuffleNet multilayer convolutional backbone, multi-scale features are fused by a feature pyramid network, and landmark identification and localization are completed by shallow classification and regression heads; the landmark position is simultaneously corrected by an extended Kalman filter.
The correction of the landing-landmark position by the NanoDet algorithm and the extended Kalman filter proceeds as follows. As shown in fig. 4, the invention further provides a landmark tracking control method for visual landing of an unmanned aerial vehicle, comprising the following six steps.
1. The unmanned aerial vehicle hovers near the landing site, and the 3-axis brushless gimbal at its bottom is controlled to search a fan-shaped area for the landing landmark. The search area is gradually enlarged by continuously adjusting the gimbal's pitch and roll angles. Because the landmark is still far from the target and therefore very small in the field of view, detection uses the lightweight NanoDet model, which automatically extracts image features with a ShuffleNet multilayer convolutional backbone, fuses multi-scale features through a feature pyramid network, and completes landmark identification and localization with shallow classification and regression heads.
2. Once the landing landmark is detected in the field of view, it is tracked with an extended Kalman filter. Because the yaw angle changes while the unmanned aerial vehicle tracks the target, the target's heading angle and angular velocity are also placed in the filter's state space: the state vector comprises the target's pixel position, speed, heading angle, and angular velocity in the image, and the measurement is the pixel position obtained from NanoDet detection. Taking the target's top-left corner as an example, the state is

$x_k = (p_x, p_y, v, \psi, \dot\psi)^T$

where $p_x$ and $p_y$ are the pixel position of the target's top-left corner in the horizontal and vertical image directions at time $k$, $v$ is the current linear velocity, $\psi$ is the angle between the target's direction of motion and the horizontal, and $\dot\psi$ is the current angular velocity. The state at time $k+1$ is predicted from the state at time $k$ by

$x_{k+1|k} = f(x_{k|k})$

where, with sampling interval $\Delta t$, the constant-turn-rate motion model gives

$p_{x,k+1} = p_{x,k} + \frac{v_k}{\dot\psi_k}\left[\sin(\psi_k + \dot\psi_k \Delta t) - \sin\psi_k\right]$

$p_{y,k+1} = p_{y,k} + \frac{v_k}{\dot\psi_k}\left[\cos\psi_k - \cos(\psi_k + \dot\psi_k \Delta t)\right]$

$\psi_{k+1} = \psi_k + \dot\psi_k \Delta t$
To simplify the model, changes in the target's speed and angular velocity in the image between times $k$ and $k+1$ are absorbed into the process noise, and the target's acceleration and angular acceleration are taken to be 0 in the state transition equation, i.e.:

$v_{k+1} = v_k,\qquad \dot\psi_{k+1} = \dot\psi_k,\qquad a_k = \ddot\psi_k = 0$
In the process noise, the acceleration noise $w_{a,k}$ and the angular-acceleration noise $w_{\ddot\psi,k}$ of the target's motion in the image are assumed to be zero-mean Gaussian:

$w_{a,k} \sim \mathcal{N}(0, \sigma_a^2),\qquad w_{\ddot\psi,k} \sim \mathcal{N}(0, \sigma_{\ddot\psi}^2)$
Since the NanoDet detection result is the pixel coordinates of the target's top-left and bottom-right corners, the corresponding measurement equation is:

$z_k = H x_k + v_k$

where $z_k$ is the NanoDet detection result, $H = \begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 \end{pmatrix}$ selects the pixel position from the state vector, and $v_k$ is the measurement noise.
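The predict/update cycle above reduces, in one dimension with a constant-velocity model, to a few lines. This sketch tracks a single pixel coordinate; the patent's filter is an extended one with a 5-dimensional state carrying heading and turn rate, and the noise variances here are assumptions:

```python
def kf_pixel_track(measurements, dt=1.0, q=0.01, r=1.0):
    """Linear Kalman filter over one pixel coordinate, constant-velocity model.

    State [p, v]; predict p <- p + v*dt; measurement z = p + noise.
    q and r are assumed process/measurement noise variances.
    """
    p, v = measurements[0], 0.0
    P = [[1.0, 0.0], [0.0, 1.0]]            # state covariance
    for z in measurements[1:]:
        # predict: x <- F x, P <- F P F^T + Q, with F = [[1, dt], [0, 1]]
        p = p + v * dt
        P00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q
        P01 = P[0][1] + dt * P[1][1]
        P10 = P[1][0] + dt * P[1][1]
        P11 = P[1][1] + q
        # update with H = [1, 0]: gain K = P H^T / (H P H^T + r)
        S = P00 + r
        K0, K1 = P00 / S, P10 / S
        y = z - p                            # innovation
        p, v = p + K0 * y, v + K1 * y
        P = [[(1 - K0) * P00, (1 - K0) * P01],
             [P10 - K1 * P00, P11 - K1 * P01]]
    return p, v

# a corner moving 2 px per frame; the filter should recover that velocity
p_est, v_est = kf_pixel_track([2.0 * i for i in range(20)])
```

On a noiseless track the estimate converges quickly to the true position and velocity, which is what makes the filter a useful smoother between NanoDet detections.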
3. To ensure the landmark is not lost during tracking, once it is being tracked by the NanoDet detection model and the extended Kalman filter, the gimbal at the bottom of the unmanned aerial vehicle must be adjusted so the landmark stays centered in the field of view. Taking a common 3-axis brushless gimbal as an example, a horizontal offset of the landmark in the image corresponds to a change in the gimbal's roll angle, and a vertical offset corresponds to a change in its pitch angle. Let the image width and height be $W$ and $H$, and let $\theta_x$ and $\theta_y$ be the half-angles of the camera's field of view in the horizontal and vertical directions. Suppose the target's center point in the image at some instant, measured from the image center, is $(x_t, y_t)$, the straight-line distance from the target to the camera is $R$, and the target's deviation angles from the camera in the horizontal and vertical directions are $\alpha$ and $\beta$. Trigonometry gives:

$W = 2kR\tan\theta_x$

$H = 2kR\tan\theta_y$

$x_t = kR\tan\alpha$

$y_t = kR\tan\beta$

where $k$ is the scaling factor from three-dimensional space to the pixel plane, determined by the camera parameters. From these:

$\alpha = \arctan\frac{2 x_t \tan\theta_x}{W},\qquad \beta = \arctan\frac{2 y_t \tan\theta_y}{H}$
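The offset-to-angle relations can be checked numerically; the camera parameters below are arbitrary example values, with θx, θy the half-angles of the field of view and (xt, yt) measured from the image center:

```python
import math

def pixel_offset_to_angles(xt, yt, W, H, theta_x, theta_y):
    """alpha = atan(2*xt*tan(theta_x)/W), beta = atan(2*yt*tan(theta_y)/H):
    the target's angular deviation from the optical axis, from its pixel offset."""
    alpha = math.atan(2.0 * xt * math.tan(theta_x) / W)
    beta = math.atan(2.0 * yt * math.tan(theta_y) / H)
    return alpha, beta

# A target at the right image edge (xt = W/2) lies at the edge of the field
# of view, so alpha equals the half-angle theta_x exactly.
a, b = pixel_offset_to_angles(320, 0, 640, 480, math.radians(30), math.radians(24))
```

This sanity check (edge pixel maps back to the half field-of-view angle) follows directly from substituting $x_t = W/2$ into the formula.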
If the current gimbal tracking angle error is $e = (\alpha, \beta)$, the following PID controller can be designed for gimbal control:

$u = K_p e + K_i \int_0^t e\,d\tau + K_d \frac{de}{dt}$

where $K_p$, $K_i$, $K_d$ are the proportional, integral, and derivative coefficients of the gimbal control, and $u$ is the final gimbal angle-change control quantity.
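A minimal single-axis sketch of the gimbal PID above; the gains, time step, and the pure-integrator gimbal model are assumptions for illustration:

```python
class GimbalPID:
    """One-axis PID: u = Kp*e + Ki*integral(e) + Kd*de/dt (discrete form)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = None

    def step(self, err):
        self.integral += err * self.dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a gimbal axis (modelled as a pure integrator: angle' = u) toward a
# landmark offset of 0.4 rad; the error should decay to essentially zero.
pid = GimbalPID(kp=2.0, ki=0.0, kd=0.1, dt=0.05)
angle, target = 0.0, 0.4
for _ in range(200):
    angle += pid.step(target - angle) * 0.05
```

The derivative term adds damping against overshoot when the landmark moves quickly across the frame; with these gains the loop is a simple geometric decay.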
4. From the gimbal's current attitude information and the target's position in the camera image, the target's position relative to the unmanned aerial vehicle is computed. Taking a ground landmark as an example, let the target's position relative to the unmanned aerial vehicle be $(x, y, z)$; then

$x = z\tan\alpha$

$y = z\tan\beta$

$z = \frac{d}{\sqrt{1 + \tan^2\alpha + \tan^2\beta}}$

where $\alpha$ and $\beta$ are the target's deflection angles in the camera in the horizontal and vertical directions, and $d$ is the distance from the target to the camera, computed from the target's pose relative to the camera.
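The back-projection above implies $x^2 + y^2 + z^2 = d^2$, which a short sketch can verify (the angle and distance values are arbitrary):

```python
import math

def landmark_position(alpha, beta, d):
    """Target position (x, y, z) relative to the drone from its deflection
    angles and camera distance: x = z*tan(alpha), y = z*tan(beta), with z
    chosen so that x^2 + y^2 + z^2 = d^2."""
    ta, tb = math.tan(alpha), math.tan(beta)
    z = d / math.sqrt(1.0 + ta * ta + tb * tb)
    return z * ta, z * tb, z

x, y, z = landmark_position(0.3, -0.2, 5.0)
```

The returned vector always has length d by construction, so the only inputs that matter are the two angles from step 3 and the range estimate.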
5. After the unmanned aerial vehicle acquires the target's position information, the gimbal begins tracking the target, and the specific control mode of the unmanned aerial vehicle depends on the gimbal camera's current attitude angle.
When the gimbal camera is not pointing vertically downward, only PID position control is applied to the unmanned aerial vehicle; the input error signal is the deviation between the target's three-dimensional position relative to the vehicle and the desired position, namely:

$e(t) = (x, y, z) - (x^*, y^*, z^*)$
To further smooth the control curve and limit the response range of the output, the control quantity is mapped through a saturating function:

$u_{out} = U_{max}\tanh\!\left(\frac{u_{in}}{\lambda}\right)$

where $u_{in}$ is the output of the PID position controller; $u_{out}$ is the final position control quantity after mapping; $U_{max}$ is the maximum control quantity allowed for the unmanned aerial vehicle; and $\lambda$ is a damping factor that determines how quickly the control quantity saturates.

As the unmanned aerial vehicle gradually approaches the landing mark, it enters the effective detection range of the AprilTag. The extended Kalman filter then tracks the tag's 4 vertices in the image, and the tag's three-dimensional position relative to the unmanned aerial vehicle can be computed accurately from the camera intrinsics and the tag's pixel positions in the image. Because the vehicle's speed is nearly steady at close range, AprilTag detection is robust at this stage and does not rely on deep learning. Visual servo control is then used to apply position and angle control to the unmanned aerial vehicle; the input error signal is the deviation between the landmark's current pixel coordinates in the image and the pixel coordinates obtained by projecting the landmark's desired three-dimensional position into the image, namely:
$e(t) = s(m(t), a) - s^*$
where $s$ denotes the coordinates of the landmark's 4 vertices on the image plane, $s^*$ their desired positions on the image plane, $m$ the pixel coordinates of the target in the image, and $a$ the camera intrinsics used to map pixel coordinates onto the image plane. The visual servo control law for this error input is computed as:
$v_c = -\lambda L_x^+ e(t)$

where $v_c$ is the desired linear and angular velocity vector of the camera; $L_x^+$ is the Moore–Penrose pseudo-inverse of the image Jacobian (interaction matrix) $L_x$, which is determined by the camera intrinsics and the pixel coordinates and maps velocities between the pixel coordinate frame and the camera coordinate frame; and $\lambda$ is the visual servo gain, which sets the strength of the control action.
6. The unmanned aerial vehicle rapidly reaches the target position under the visual servo control law, and is commanded to land once the error falls below a specified threshold.
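Steps 5–6 can be sketched as a toy closed loop around the servo law v_c = -λ·Lx⁺·e: a single point feature with the camera restricted to x/y translation at fixed depth Z, so that Lx = diag(-1/Z, -1/Z) and Lx⁺ = -Z·I. The depth, gain, time step, and landing threshold are assumed values:

```python
def servo_step(e, Z, lam):
    """v_c = -lam * Lx_pinv * e; with Lx = diag(-1/Z, -1/Z), Lx_pinv = -Z*I,
    so v_c = lam * Z * e (a camera velocity that shrinks the pixel error)."""
    return (lam * Z * e[0], lam * Z * e[1])

def servo_until_landing(e, Z=2.0, lam=0.4, dt=0.1, threshold=1e-3, max_iter=500):
    """Integrate the closed loop until the feature error drops below the
    landing threshold (step 6); returns the number of iterations taken."""
    for i in range(max_iter):
        if (e[0] ** 2 + e[1] ** 2) ** 0.5 < threshold:
            return i
        vx, vy = servo_step(e, Z, lam)
        # moving the camera by v*dt changes the feature error by -v*dt/Z,
        # so the loop behaves as e' = -lam*e (exponential error decay)
        e = (e[0] - vx * dt / Z, e[1] - vy * dt / Z)
    return max_iter

iters = servo_until_landing((0.2, -0.1))
```

The error norm shrinks by a factor (1 - λ·dt) per step, so the landing condition is met in roughly ln(|e₀|/threshold)/(λ·dt) iterations.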
As shown in fig. 5, the present invention further provides a landmark tracking control system for visual landing of an unmanned aerial vehicle, including:
the landmark gesture acquisition module specifically executes: obtaining the attitude information of the landmark relative to the camera by a Nanodet target detection algorithm and an Apriltag detection algorithm, wherein the attitude information of the landmark relative to the camera comprises three-dimensional position information p of the landmark relative to the cameratargetAnd angle information wtarget
the gimbal attitude acquisition module, which specifically executes: obtaining the gimbal attitude angle w_gimbal from the inertial measurement unit of the gimbal, wherein the gimbal attitude angle w_gimbal comprises the pitch angle, yaw angle and roll angle of the gimbal;
the unmanned aerial vehicle state acquisition module, which specifically executes: measuring the unmanned aerial vehicle state information s_drone with onboard sensors, wherein the unmanned aerial vehicle state information s_drone comprises the real-time attitude and altitude flight data of the unmanned aerial vehicle;
the control information generation module, which specifically executes: generating, from the information acquired by the landmark attitude acquisition module, the gimbal attitude acquisition module and the unmanned aerial vehicle state acquisition module, control quantity information comprising the gimbal attitude control quantity u_gimbal, the unmanned aerial vehicle attitude control quantity u_attitude and the unmanned aerial vehicle velocity control quantity u_velocity. The control quantity information is expressed as:

u = F(p_target, w_target, w_gimbal, s_drone)

where u = (u_gimbal, u_attitude, u_velocity), and F represents the control algorithm, comprising an extended Kalman filter algorithm, a PID control algorithm and a visual servo algorithm.
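A minimal sketch of the generation function F, using proportional terms only along one axis. The gain value, the scalar state layout and the field names are illustrative assumptions; the patent's F combines an extended Kalman filter, PID control and visual servoing:

```python
from dataclasses import dataclass

@dataclass
class ControlQuantities:
    u_gimbal: float    # gimbal attitude command
    u_attitude: float  # UAV attitude command
    u_velocity: float  # UAV velocity command

def generate_control(p_target, w_target, w_gimbal, drone_yaw, kp=0.8):
    """Hypothetical u = F(p_target, w_target, w_gimbal, s_drone) along one
    axis: steer the gimbal toward the landmark angle, align the UAV heading,
    and command a velocity proportional to the relative position error."""
    return ControlQuantities(
        u_gimbal=kp * (w_target - w_gimbal),
        u_attitude=kp * (w_target - drone_yaw),
        u_velocity=kp * p_target,
    )
```

When the landmark is centred and all angle errors are zero, every control quantity is zero, which corresponds to the hover-and-track equilibrium the module maintains.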
Optionally, the visual servo control algorithm in steps SS6, SS8 and SS9 is computed by the following formula:

v_c = -λ L_x^+ e(t)

where v_c represents the desired linear and angular velocity vector of the camera; L_x^+ denotes the Moore-Penrose pseudo-inverse of the image Jacobian matrix L_x, which is determined by the camera intrinsics and the pixel coordinates and maps velocity transformations from the pixel coordinate system into the camera coordinate system; and λ represents the visual servo gain, which determines the magnitude of the control effort.
Optionally, the optimal flight path in step SS7 is obtained by Q-learning reinforcement learning. The learning process is as follows: select and execute the current flight action according to the initialized Q-value table, compute the corresponding return for that action, and update the Q-value table with the current return until the learning process converges. The iterative update of the Q function corresponding to the Q-value table is: Q(s, a) ← Q(s, a) + α[R(s, a) + γ max_a' Q(s', a') − Q(s, a)], where s is the state at the current time, a is the action taken at the current time, R is the reward obtained by taking action a in the current state s, α represents the learning rate, and γ represents the discount factor.
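The iterative update above can be sketched as one tabular Q-learning step (the state/action encoding and the hyperparameter values are illustrative, not from the patent):

```python
def q_update(Q, s, a, r, s_next, actions, alpha=0.1, gamma=0.9):
    """One Q-learning update:
    Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)).

    Q is a dict keyed by (state, action); missing entries default to 0,
    matching the initialized Q-value table described in the text."""
    best_next = max(Q.get((s_next, b), 0.0) for b in actions)
    old = Q.get((s, a), 0.0)
    Q[(s, a)] = old + alpha * (r + gamma * best_next - old)
    return Q[(s, a)]
```

Repeating this update while the drone explores candidate waypoints makes the table converge toward the returns of the optimal flight path.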
Fig. 3 is a schematic structural topology diagram of a main control room of the intelligent express package distribution system according to the present invention. The main control room specifically comprises a task scheduling system and a monitoring system. The task scheduling system is responsible for express delivery scheduling of goods and unmanned aerial vehicle charging scheduling; the monitoring system is responsible for monitoring the communication state, the flight state of the unmanned aerial vehicle and the running state of the autonomous mobile trolley.
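A minimal sketch of one monitoring primitive, a heartbeat timeout check over the communication links between the main control room, the drones and the carts (the agent names, timeout value and data layout are assumptions, not from the patent):

```python
def stale_agents(last_seen, now, timeout=5.0):
    """Return the agents (drones or carts) whose last heartbeat timestamp
    in `last_seen` is older than `timeout` seconds at time `now`; these
    would be flagged by the monitoring system as communication faults."""
    return sorted(agent for agent, t in last_seen.items() if now - t > timeout)
```

In practice the timestamps would be refreshed from each incoming telemetry message, and the returned list would drive alerts in the monitoring system.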
The above description is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, several modifications and variations can be made without departing from the technical principle of the present invention, and these modifications and variations should also be regarded as the protection scope of the present invention.

Claims (10)

1. An intelligent express package distribution system, characterized by comprising an autonomous mobile vehicle, an unmanned aerial vehicle, a main control room and a parcel delivery terminal, wherein the autonomous mobile vehicle and the unmanned aerial vehicle are each in communication connection with the main control room, and the main control room is in communication connection with the parcel delivery terminal.
2. The system of claim 1, wherein the autonomous mobile cart comprises a first controller, a first communication module, a navigation express delivery module, a parcel storage and taking module and an unmanned aerial vehicle charging module, the first controller is in communication connection with the navigation express delivery module, the parcel storage and taking module and the unmanned aerial vehicle charging module respectively, and the first controller is in communication connection with the main control room through the first communication module.
3. The system of claim 1, wherein the unmanned aerial vehicle comprises a second controller, a navigation module, a visual sensor module, a second communication module, and a parcel grasping module, the second controller is in communication connection with the navigation module, the visual sensor module, and the parcel grasping module, respectively, and the second controller is in communication connection with the main control room through the second communication module.
4. The intelligent delivery package distribution system of claim 1, wherein the main control room comprises a task scheduling system and a monitoring system, and the task scheduling system is configured to perform: express delivery scheduling and unmanned aerial vehicle charging scheduling; the monitoring system is configured to perform: communication state monitoring, unmanned aerial vehicle flight state monitoring, autonomous mobile vehicle running state monitoring.
5. The system of claim 1, wherein the parcel delivery terminal comprises a landing module and a third communication module, and the landing module is in communication connection with the main control room through the third communication module.
6. The intelligent express package distribution system of claim 1, wherein the parcel delivery terminal is fixedly disposed outside a balcony and comprises a horizontal object placing table and a vertical landing landmark capable of rotating by 90 degrees; the horizontal object placing table is used by the unmanned aerial vehicle for placing parcels, and the vertical landing landmark serves as the hovering reference for the unmanned aerial vehicle.
7. A working method of the intelligent express package distribution system according to claim 1, characterized by comprising the following steps:
step SS1: the autonomous mobile vehicle places the express parcel to be delivered on a lifting object stage through the parcel storage and taking module, and the lifting object stage raises the express parcel to the roof of the autonomous mobile vehicle; proceed to step SS2;
step SS2: the autonomous mobile vehicle sends a parcel delivery request to the task scheduling system through the MAVLink communication protocol, together with the size, weight and target position information of the express parcel; proceed to step SS3;
step SS3: the task scheduling system selects an unmanned aerial vehicle with a suitable payload according to the size, weight and target position information of the express parcel and sends a task instruction to the nearby unmanned aerial vehicle; if the unmanned aerial vehicle successfully receives the task instruction and returns an acceptance state, proceed to step SS4; otherwise the unmanned aerial vehicle is judged to be busy on a task or charging, and the method proceeds to step SS5;
step SS4: the unmanned aerial vehicle establishes communication with the autonomous mobile vehicle and switches to the task state; the autonomous mobile vehicle sends real-time GPS information to the unmanned aerial vehicle, and the unmanned aerial vehicle flies above the autonomous mobile vehicle and maintains a flight attitude that is stationary relative to it;
step SS5: the task scheduling system sends a waiting instruction to the autonomous mobile vehicle; when an unmanned aerial vehicle nearby enters the waiting state, the task scheduling system sends it a task instruction and the method proceeds to step SS4;
step SS6: the unmanned aerial vehicle starts to detect and track the landing landmark on the autonomous mobile vehicle, and the control quantity is calculated by the visual servo algorithm; after the two-dimensional code above the express parcel to be delivered is detected and decoded, the unmanned aerial vehicle adjusts its grasping attitude with the aid of the AprilTag above the express parcel and keeps stationary relative to the express parcel; proceed to step SS7;
step SS7: the unmanned aerial vehicle grasps the express parcel with the parcel grasping module and rapidly plans an optimal flight path according to the delivery position information decoded from the two-dimensional code above the express parcel; proceed to step SS8;
step SS8: the unmanned aerial vehicle detects and tracks the vertical landing landmark near the parcel delivery terminal, hovers 0.5 m in front of the vertical landing landmark using the visual servo control algorithm, and places the express parcel on the horizontal object placing table; proceed to step SS9;
step SS9: the unmanned aerial vehicle sends a task completion signal to the task scheduling center; the task scheduling center sends the unmanned aerial vehicle the information of a nearby autonomous mobile vehicle whose automatic charging plate is idle; the unmanned aerial vehicle flies to a position above that autonomous mobile vehicle and uses the visual sensor module to track, in turn, the landing landmark on the roof and the charging landmark on the automatic charging plate, adjusting its flight attitude in real time until landing; after detecting that the unmanned aerial vehicle has landed safely, the automatic charging plate activates its charging pose correction device to adjust the position of the unmanned aerial vehicle and begins charging.
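Steps SS2 to SS5 amount to a small dispatch decision in the task scheduling system; a sketch of the drone-selection step follows (the state encoding and payload field are assumptions for illustration):

```python
from enum import Enum, auto

class DroneState(Enum):
    IDLE = auto()
    ON_TASK = auto()
    CHARGING = auto()

def assign_drone(drones, parcel_weight):
    """Pick an idle drone whose payload capacity covers the parcel (SS3);
    return None when every drone is busy or too small, in which case the
    scheduler sends the cart a waiting instruction (SS5).

    `drones` maps drone_id -> (DroneState, max_payload_kg)."""
    for drone_id, (state, payload) in sorted(drones.items()):
        if state is DroneState.IDLE and payload >= parcel_weight:
            return drone_id
    return None
```

A real scheduler would also weight candidates by distance to the cart and remaining battery, but the accept-or-wait branch above mirrors the SS3/SS5 flow.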
8. The operating method of the intelligent express delivery package distribution system according to claim 7, wherein the landing landmark detection tracking algorithm in steps SS6, SS8 and SS9 detects landmarks in the field of view using a NanoDet target detection algorithm, and simultaneously predicts and corrects the landing landmark position using an extended Kalman filter algorithm.
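The predict/correct cycle in claim 8 can be sketched for a single landmark vertex under a constant-velocity pixel-motion model; in this linear special case the extended Kalman filter reduces to a plain Kalman filter (the noise values and time step are assumptions, not from the patent):

```python
import numpy as np

def kf_step(x, P, z, dt=0.033, q=1e-2, r=4.0):
    """One predict/correct cycle on a single landmark vertex with state
    (px, py, vx, vy) in pixels; z is the detected pixel position from the
    NanoDet/AprilTag detector."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)       # constant-velocity motion model
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)       # we observe only the position
    Q = q * np.eye(4)                         # assumed process noise
    R = r * np.eye(2)                         # assumed measurement noise
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # correct with the detected pixel position z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

Running this per vertex smooths detector jitter and supplies a position prediction for frames where detection momentarily fails, which is the role the claim assigns to the filter.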
9. The method of claim 7, wherein the visual servo control algorithm in steps SS6, SS8 and SS9 is computed by the following formula:

v_c = -λ L_x^+ e(t)

where v_c represents the desired linear and angular velocity vector of the camera; L_x^+ denotes the Moore-Penrose pseudo-inverse of the image Jacobian matrix L_x, which is determined by the camera intrinsics and the pixel coordinates and maps velocity transformations from the pixel coordinate system into the camera coordinate system; and λ represents the visual servo gain, which determines the magnitude of the control effort.
10. The working method of the intelligent express package distribution system according to claim 7, wherein the optimal flight path in step SS7 is obtained by Q-learning reinforcement learning, and the corresponding iterative update of the Q function is: Q(s, a) ← Q(s, a) + α[R(s, a) + γ max_a' Q(s', a') − Q(s, a)], where s is the state at the current time, a is the action taken at the current time, R is the reward obtained by taking action a in the current state s, α represents the learning rate, and γ represents the discount factor.
CN202111453753.4A 2021-12-01 2021-12-01 Intelligent express package distribution system and working method thereof Pending CN114330832A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111453753.4A CN114330832A (en) 2021-12-01 2021-12-01 Intelligent express package distribution system and working method thereof


Publications (1)

Publication Number Publication Date
CN114330832A true CN114330832A (en) 2022-04-12


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114906326A (en) * 2022-05-06 2022-08-16 大连理工大学 Closed living area intelligent material distribution robot based on unmanned aerial vehicle and mobile ground platform


Similar Documents

Publication Publication Date Title
US11673650B2 (en) Adaptive thrust vector unmanned aerial vehicle
CN110062919B (en) Drop-off location planning for delivery vehicles
US20200207474A1 (en) Unmanned aerial vehicle and payload delivery system
CN111158355A (en) Automatic navigation cloud server and automatic navigation control method
CN110162103A (en) A kind of unmanned plane independently cooperates with transportation system and method with intelligent vehicle group
JP7492718B2 (en) System, method, program, and storage medium for storing the program for identifying a safe landing area
CN113156998B (en) Control method of unmanned aerial vehicle flight control system
CN114415731B (en) Multi-flying robot cooperative operation method and device, electronic equipment and storage medium
CN114330832A (en) Intelligent express package distribution system and working method thereof
CN112859923B (en) Unmanned aerial vehicle vision formation flight control system
CN113568427B (en) Unmanned aerial vehicle autonomous landing mobile platform method and system
CN114714357A (en) Sorting and carrying method, sorting and carrying robot and storage medium
Laiacker et al. Automatic aerial retrieval of a mobile robot using optical target tracking and localization
CN107783542A (en) The control method and control system of unmanned plane
CN114326765B (en) Landmark tracking control system and method for unmanned aerial vehicle visual landing
CN115755575A (en) ROS-based double-tripod-head unmanned aerial vehicle autonomous landing method
JP7031997B2 (en) Aircraft system, air vehicle, position measurement method, program
JP2022067672A (en) Landing control device of flying object
CN115775356A (en) Unmanned distribution method for express packages
US20240124137A1 (en) Obstacle avoidance for aircraft from shadow analysis
CN113371180A (en) Operation type flying robot system, landing control method, landing control device, and electronic device
CN117284676A (en) Suction type double-arm transfer robot cluster system and control method
Seth et al. AeroBridge: Autonomous Drone Handoff System for Emergency Battery Service
CN116449873A (en) Tethered plant protection unmanned aerial vehicle system and air-ground cooperative automatic operation method
CN115097504A (en) Multi-sensor fusion perception unmanned patrol car system and working method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination