CN109934871A - A system and method for intelligent UAV grasping of targets in high-risk environments - Google Patents
- Publication number: CN109934871A (application CN201910123394.2A)
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The present invention provides a system and method for intelligent UAV grasping of targets in high-risk environments. By combining UAV technology, image processing, and techniques from mechanics, the invention applies a UAV to foreign-object removal work in high-risk environments. Using multiple sensors together with simultaneous localization and mapping, target detection, and target localization algorithms, it achieves accurate detection, precise localization, and stable grasping of targets of interest. Compared with the traditional method of removing foreign objects manually, the invention offers a high degree of automation, high efficiency, and a high safety factor, and can be widely applied to foreign-object removal work in high-risk environments.
Description
Technical field
The invention belongs to the fields of image processing and UAV technology, and relates to UAV localization and navigation in high-risk environments, as well as target detection, localization, and precise grasping.
Background art
In recent years, with the rapid development of tourism, the sanitary cleaning of scenic areas has become tied to the balanced development of their economic and environmental benefits, and is one of the urgent problems scenic areas need to solve. In some precipitous areas, however, for example on overhanging cliffs, cleaning work becomes extremely dangerous. In addition, removing foreign objects from overhead power transmission lines is also a very difficult task.
With continuous progress in UAV technology, the flexibility of UAV operation and the extensibility of UAV functions have gradually won public favor; compared with having people take risks to collect refuse, using a UAV is undoubtedly the better choice. A traditional UAV system executes actions according to pre-written programs and fixed procedures; its autonomy is limited, and it can hardly perceive changes in the external environment and respond to them, or interact gracefully with people and the working environment. With the rapid development of computer and artificial-intelligence technology, and with improvements in processing power and sensor accuracy, UAVs have advanced greatly in both function and technology, and intelligent UAV systems capable of perception, reasoning, decision-making, and action have become a research hotspot. Such a system can autonomously perceive, reason, plan, and control in complex environments, and in applications such as power-line inspection and disaster relief can perform target detection and tracking, obstacle perception, localization and path planning, and motion with respect to the surrounding environment. One feature that sharply distinguishes it from a general UAV platform is its ability to choose the best angle at which to grasp an object, which lets such a UAV system play a significant role in applications such as picking up refuse in precipitous scenic areas and removing foreign objects from overhead power transmission lines.
However, many problems remain when a UAV performs tasks in high-risk environments, such as loss of the GPS signal and inaccurate positioning. The UAV therefore needs to localize itself with other sensors, and must also improve its autonomous flight capability to avoid hitting obstacles or crashing when flying in complex high-risk environments. Furthermore, because grasped targets lie at different attitudes and the manipulator's grasping affects the UAV's balance, a better scheme must be designed to achieve stable grasping.
Summary of the invention
The purpose of the present invention is to provide a system and method for intelligent UAV grasping of targets in high-risk environments, intended to replace manual work in today's high-risk environments, reduce labor cost, and improve operating efficiency and safety.

To achieve the above objectives, the present invention adopts the following technical scheme:
A system for intelligent UAV grasping of targets comprises a multi-rotor UAV and, mounted on it, an onboard processor, a depth camera, a monocular camera, an inertial measurement unit, an onboard controller, a lidar, a mechanical arm, and a center-of-gravity compensation unit.

The onboard controller controls the UAV's take-off, movement, and landing.

The onboard processor is connected to the monocular camera, the inertial measurement unit, and the lidar, and localizes the UAV itself.

The depth camera is connected to the onboard processor, and is used to detect the target and its placement angle and to compute the target's coordinate position relative to the UAV.

The mechanical arm and the center-of-gravity compensation unit are connected to the onboard processor, and realize grasping of targets with multiple degrees of freedom.

The mechanical arm is a hinged three-segment structure with three degrees of freedom; one end is fixed at the UAV's center point, and its end effector is a rotatable gripper for grasping the target.

The center-of-gravity compensation unit eliminates effects such as unstable flight and unstable grasping when the arm grasps a target. It is mounted at the middle of the UAV and comprises a traction guide rail, a slide, a counterweight, and a fourth servo; the midpoint of the rail coincides with the UAV's center, the fourth servo controls the slide's movement along the rail, the counterweight sits on the slide, and the counterweight's weight balances the position of the UAV's center of gravity.
Further, the three segments of the mechanical arm are connected by three servos. The first servo connects the arm's first segment to the UAV body and controls rotation angle θ1; the second servo connects the first and second segments and controls rotation angle θ2; the third servo connects the second segment to the end effector and controls the end effector's rotation angle θ3. The first two degrees of freedom guarantee that the arm can move in a 2D plane, while the last one realizes precise rotational grasping of the target. The kinematics of the three-degree-of-freedom arm are:

x0 = L1 cosθ1 + L2 cos(θ1 + θ2)
y0 = L1 sinθ1 + L2 sin(θ1 + θ2)
θ3 = θ

where L1 and L2 are the lengths of the first and second segments, θ1, θ2, θ3 are the rotation angles of the joints, (x0, y0) is the end-effector coordinate in the coordinate system whose origin is the arm's fixed mounting point, and θ is the placement angle of the target.
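The forward kinematics above can be sketched as a few lines of Python (a minimal illustration; the function name, argument order, and use of radians are our own conventions, not the patent's):

```python
import math

def forward_kinematics(theta1, theta2, L1, L2):
    """Planar forward kinematics of the arm's first two joints.

    Returns the end-effector coordinate (x0, y0) in the frame whose
    origin is the arm's fixed mounting point, per the formulas:
        x0 = L1*cos(theta1) + L2*cos(theta1 + theta2)
        y0 = L1*sin(theta1) + L2*sin(theta1 + theta2)
    Angles are in radians; lengths in any consistent unit.
    """
    x0 = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y0 = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x0, y0
```

The third joint is independent of this planar chain: θ3 is simply set to the target's detected placement angle θ.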
Further, the counterweight in the center-of-gravity compensation unit is the battery that supplies the UAV with power.
The present invention also provides a method for intelligent UAV grasping of targets in high-risk environments, applied in the system of the above technical scheme. The method comprises the following steps:

Step 100: localize the UAV itself in the complex environment with the onboard monocular camera and inertial measurement unit to achieve accurate navigation flight, and measure the UAV's flight altitude with the lidar so that the UAV keeps a safe altitude above the ground;

Step 200: capture pictures of the current environment in real time with the onboard depth camera, obtaining an RGB image and the corresponding depth image; detect, recognize, and localize the target from the RGB image, including the target's placement angle θ, and combine the depth image to obtain the target's position coordinates relative to the UAV;

Step 300: use the onboard controller to fly the UAV to a position near the target, and detect the target's position coordinate and the end effector's coordinate in real time; when the distance between the two falls below a threshold, the onboard processor rotates the arm and drives the center-of-gravity compensation unit to adjust the UAV's center of gravity, and when the end effector has rotated to match the target's placement angle θ, it grasps the target;

Step 400: carry the target back to the recovery point with the UAV's mechanical arm, where the arm releases the target.
Further, the UAV localizes itself in step S100 as follows.

Step S101: the onboard controller flies the UAV while the monocular camera captures images of the surroundings; after the camera moves, pixel distances are measured by triangulation, yielding three-dimensional point-cloud data of the environment around the UAV, which is passed to the onboard processor over a serial port or USB.

Step S102: at each instant the onboard processor obtains the three-dimensional coordinates of the point cloud relative to the UAV, matches the point-cloud data of adjacent instants, computes the corresponding coordinate differences, and estimates the UAV's displacement and attitude change between the two adjacent instants with an optimization algorithm.

Step S103: a three-dimensional coordinate system is established with the UAV's initial position as origin; the displacement and attitude changes between adjacent instants are accumulated to estimate the UAV's movement and attitude relative to the coordinate origin, and the UAV's own IMU data are used to correct the accumulated error.
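The pose accumulation of steps S102 and S103 can be sketched as follows (a minimal illustration; the frame-to-frame point-cloud matching itself, the optimization, and the IMU correction are assumed to happen elsewhere, and all names here are our own):

```python
import numpy as np

def accumulate_pose(poses_rel):
    """Chain per-frame relative motions (R_i, t_i) into a global pose.

    Each element of `poses_rel` is the estimated rotation matrix and
    translation vector of the UAV between two adjacent instants.  The
    global pose starts at the take-off origin, as in step S103.
    """
    R_g = np.eye(3)            # global orientation
    t_g = np.zeros(3)          # global position (origin = initial position)
    trajectory = [t_g.copy()]
    for R_rel, t_rel in poses_rel:
        t_g = t_g + R_g @ t_rel    # translate in the current orientation
        R_g = R_g @ R_rel          # then compose the orientation change
        trajectory.append(t_g.copy())
    return R_g, t_g, trajectory
```

Because each step multiplies in a small estimation error, the accumulated drift grows over time, which is exactly why the patent fuses IMU data to correct the estimate.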
Further, in step 200 the target is detected, recognized, and localized with Rotation-SqueezeDet, an improved SqueezeDet network model that marks the target with a rotated bounding box, expressed as R' = (cx, cy, h, w, θ), where cx and cy are the pixel coordinates of the bounding box's top-left corner, h and w are the bounding box's height and width, and θ is the rotated bounding box's rotation angle.
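The rotated-box representation R' = (cx, cy, h, w, θ) can be captured in a small data structure (an illustrative sketch; treating θ as radians and rotating about the top-left corner are our assumptions, since the patent does not fix these conventions):

```python
import math
from dataclasses import dataclass

@dataclass
class RotatedBox:
    """Rotated bounding box R' = (cx, cy, h, w, theta).

    (cx, cy) is the top-left pixel coordinate, h and w the box height
    and width, theta the rotation angle (assumed in radians here).
    """
    cx: float
    cy: float
    h: float
    w: float
    theta: float

    def corners(self):
        """Four corner pixels, rotating the axis-aligned box about (cx, cy)."""
        c, s = math.cos(self.theta), math.sin(self.theta)
        pts = [(0.0, 0.0), (self.w, 0.0), (self.w, self.h), (0.0, self.h)]
        return [(self.cx + x * c - y * s, self.cy + x * s + y * c)
                for x, y in pts]
```

With θ = 0 the box reduces to the ordinary axis-aligned SqueezeDet box R = (cx, cy, h, w).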
Further, the target's position coordinates relative to the UAV are obtained in step S200 as follows.

After the target is detected, its distance information is obtained from the depth image. The RGB image and the depth image are aligned in the same coordinate system, giving an RGB image with three-dimensional point-cloud data. Then, using the detection result, the point-cloud sub-region containing the target is extracted from the point cloud of the whole image. The target's coordinate (xt, yt, zt) in the depth-camera frame is then computed as the center of that sub-region:

xt = (1/K) Σ_{i=1}^{K} Xp_i
yt = (1/K) Σ_{i=1}^{K} Yp_i
zt = (1/K) Σ_{i=1}^{K} Zp_i

where (Xp, Yp, Zp) is the set of point-cloud coordinates inside the rotated bounding box, the index i denotes the i-th valid point counted from the box's top-left corner, and K is the number of valid points in the rotated bounding box. Finally, the target's position coordinates relative to the UAV are found with the known intrinsics of the depth camera.
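The centroid computation over the K valid points can be sketched in a few lines (an illustrative sketch; the extraction of the points inside the rotated box and the intrinsics step are assumed to be done elsewhere):

```python
import numpy as np

def target_centroid(points):
    """Center of the target's point-cloud sub-region.

    `points` is a (K, 3) array of the valid (X, Y, Z) points that fall
    inside the rotated bounding box, in the depth-camera frame.  The
    target coordinate (xt, yt, zt) is the mean over the K valid points.
    """
    pts = np.asarray(points, dtype=float)
    if pts.size == 0:
        raise ValueError("no valid points inside the bounding box")
    return pts.mean(axis=0)   # (xt, yt, zt)
```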
Further, the onboard processor rotates the arm and drives the center-of-gravity compensation unit to adjust the UAV's center of gravity as follows.

Set the end-effector coordinate (x0, y0) equal to the target coordinate resolved in step S200, and invert the kinematics to compute the angles θ1 and θ2 that the first and second servos should rotate; θ3 equals the detected target rotation angle θ. Four proportional-integral-derivative controllers (PID controllers) in the onboard processor regulate the servo rotation speeds: the returned actual angles θ1cur, θ2cur, θ3cur are compared with the desired angles θ1, θ2, θ3 to determine each servo's rotation speed v1, v2, v3, ensuring that the first, second, and third servos finally reach the desired angle values. In addition, the actual angles θ1cur and θ2cur are used with the center-of-gravity balance principle to solve the counterweight's desired displacement Pb; the fourth PID controller then controls the fourth servo's rotation speed from the counterweight's detected actual displacement Pbcur.
The system for intelligent UAV grasping of targets in high-risk environments of the present invention works as follows. First, the UAV takes off and processes the images captured in real time during flight, detecting whether a target of interest is present. If one is, the system localizes the target, the UAV flies close, and the arm grasps it; after a successful grasp, the UAV carries the target back to the recovery point, releases it, and continues its patrol. By combining UAV technology with image processing, the invention applies a UAV to foreign-object removal work in high-risk environments; with multiple sensors, simultaneous localization and mapping, path planning, and deep-learning-based target-detection algorithms, it realizes detection, recognition, and localization of the target, and a complete mechanical system is designed to realize stable grasping of the target. Compared with traditional manual cleaning, the invention offers a high degree of automation, high efficiency, and a high safety factor, and can be widely applied to foreign-object removal work in high-risk environments.
Brief description of the drawings
Fig. 1 is a schematic diagram of the modules of the system of the present invention.
Fig. 2 is the workflow of the system of the present invention.
Fig. 3 is a schematic diagram of the mechanical arm and its coordinate system in an embodiment of the present invention.
Fig. 4 is a schematic diagram of the airflow rate over the arm's working space in an embodiment of the present invention.
Fig. 5 is the control flow chart of the mechanical arm and the center-of-gravity compensation unit in an embodiment of the present invention.
Specific embodiment
To make the purpose, technical scheme, and effect of the present invention clearer and more definite, the invention is described in further detail below with reference to the drawings.
The present invention belongs mainly to the field of UAV technology and concerns basic problems of aerial robotics, including UAV self-localization and path planning. It also involves image-processing techniques, including target detection and target localization; in addition, the mechanical arm used to grasp the target involves some mechanical dynamics.
As shown in Fig. 1, the system for intelligent UAV grasping of targets provided by the present invention comprises a multi-rotor UAV and, mounted on it, an onboard processor, a depth camera, a monocular camera, an inertial measurement unit, an onboard controller, a lidar, a mechanical arm, and a center-of-gravity compensation unit.

The onboard controller controls the UAV's take-off, movement, and landing.

The onboard processor is connected to the monocular camera, the inertial measurement unit, and the lidar, and localizes the UAV itself.

The depth camera is connected to the onboard processor, and is used to detect the target and its placement angle and to compute the target's coordinate position relative to the UAV.

The mechanical arm and the center-of-gravity compensation unit are connected to the onboard processor, and realize grasping of targets with multiple degrees of freedom.

The mechanical arm is a hinged three-segment structure with three degrees of freedom; one end is fixed at the UAV's center point, and its end effector is a rotatable gripper for grasping the target.

The center-of-gravity compensation unit eliminates effects such as unstable flight and unstable grasping when the arm grasps a target. It is mounted at the middle of the UAV and comprises a traction guide rail, a slide, a counterweight, and a fourth servo; the midpoint of the rail coincides with the UAV's center, the fourth servo controls the slide's movement along the rail, the counterweight sits on the slide, and the counterweight's weight balances the position of the UAV's center of gravity.
As shown in Fig. 2, the method for intelligent UAV grasping of targets in high-risk environments provided by the present invention comprises the following steps:
Step S100: localize the UAV itself in the complex environment with the onboard monocular camera and inertial measurement unit (IMU) to achieve accurate navigation flight, and measure the UAV's flight altitude with the lidar so that the UAV keeps a safe altitude above the ground.

In this step, the monocular camera collects data about the surroundings, and the onboard processor processes it to compute an incremental odometry estimate of the UAV's position. Because the accumulated error of the incremental method grows gradually, the IMU data are fused with the point cloud computed from monocular vision to obtain the odometry and thereby localize the UAV. Specifically:
Step S101: the UAV flies while the monocular camera captures images of the surroundings; after the camera moves, pixel distances can be measured by triangulation, yielding three-dimensional point-cloud data of the environment around the UAV, which is passed to the onboard processor over a serial port or USB.

Step S102: at each instant the onboard processor obtains the three-dimensional coordinates of the point cloud relative to the UAV, matches the point-cloud data of adjacent instants, computes the corresponding coordinate differences, and estimates the UAV's displacement and attitude change between the two adjacent instants with an optimization algorithm.

Step S103: a three-dimensional coordinate system is established with the UAV's initial position as origin; the displacement and attitude changes between adjacent instants are accumulated to estimate the UAV's movement and attitude relative to the coordinate origin, and the UAV's own IMU data are used to correct the accumulated error.
It should be noted that the UAV's altitude is not measured with the monocular camera and inertial measurement unit, but with the onboard TFmini lidar, which keeps the UAV at a safe altitude above the ground.
Step S200: capture pictures of the current environment in real time with the onboard depth camera, obtaining an RGB image and the corresponding depth image; detect, recognize, and localize the target from the RGB image, including the target's placement angle θ, and combine the depth image to obtain the target's position coordinates relative to the UAV.

In this step, the onboard depth camera photographs the current environment, and the onboard processor runs a target-detection algorithm in real time to decide whether a target is present. If a target is detected, it is localized and its coordinate relative to the UAV is resolved. Specifically:
Step S201: the depth camera collects images, yielding an RGB image and the corresponding depth image, while the onboard processor processes the collected color (RGB) image with a deep-learning-based target-detection algorithm. This system uses Rotation-SqueezeDet as the detection model. Rotation-SqueezeDet is an improved SqueezeDet network model that incorporates the rotation anchors (R-anchors) of Rotation Region Proposal Networks (RRPN), that is, it marks the target with a rotated bounding box. SqueezeDet is a single-stage detection model that combines bounding-box localization and classification in a single network, the bounding box being the target's bounding box. In the SqueezeDet network model, the bounding box is a rectangle parallel to the image boundary, expressed as R = (cx, cy, h, w), where cx and cy are the pixel coordinates of the box's top-left corner and h and w are the box's height and width. Such a rectangle parallel to the image boundary easily frames a large amount of background and carries no target-angle information; therefore, following the rotated-bounding-box method adopted in RRPN, we modify the SqueezeDet bounding box and express the rotated bounding box as R' = (cx, cy, h, w, θ), where θ is the rotation angle of the rotated bounding box. With the improved detection model, target boundaries carrying rotation-angle information can be detected.
Step S202: once a target is detected in step S201, its distance information is obtained from the depth image, after which the onboard processor computes the target's specific position relative to the UAV. First, the color (RGB) image and the depth image are aligned in the same coordinate system, giving an RGB image with three-dimensional point-cloud data. Then, using the detection result, the point-cloud sub-region containing the target is extracted from the point cloud of the whole image. The target's coordinate (xt, yt, zt) in the depth-camera frame is then computed as the center of that sub-region:

xt = (1/K) Σ_{i=1}^{K} Xp_i
yt = (1/K) Σ_{i=1}^{K} Yp_i
zt = (1/K) Σ_{i=1}^{K} Zp_i

where (Xp, Yp, Zp) is the set of point-cloud coordinates inside the target's bounding box, the index i denotes the i-th valid point counted from the box's top-left corner, and K is the number of valid points in the target's bounding box.
After the target's coordinate in the camera frame is obtained, its position coordinates relative to the UAV can be found with the known intrinsics of the depth camera; this process is prior art and is not described here.
Step S300: use the onboard controller to fly the UAV to a position near the target, and detect the target's position coordinate and the end effector's coordinate in real time; when the distance between the two falls below a threshold, the onboard processor rotates the arm and drives the center-of-gravity compensation unit to adjust the UAV's center of gravity, and when the end effector has rotated to match the target's placement angle θ, it grasps the target.

In this step, on the basis of the target detected and precisely localized in step S200, the UAV flies to a position near the target, at which point the onboard processor rotates the arm and grasps the target. Specifically:
Step S301: the onboard processor computes visual odometry from the monocular camera's data, aided by the inertial measurement unit, and steers the UAV toward the target while detecting the target's position coordinate and the arm's end coordinate in real time; when the UAV has flown to within 3 cm between the two coordinates, it hovers and prepares to grasp the target.

Step S302: when the UAV has approached the target in step S301 and the end effector is within 3 cm of the target, the onboard processor moves the arm until the end-effector coordinate coincides with the target's, rotates the end effector to match the target's placement angle, and then grasps the target.
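The 3 cm approach threshold of steps S301 and S302 amounts to a simple distance check (an illustrative sketch; the function name and the metric-unit convention are our own):

```python
import math

def ready_to_grasp(effector_xyz, target_xyz, threshold_m=0.03):
    """True when the end effector is within the 3 cm grasp threshold.

    Only once this distance is reached does the UAV hover and the
    gripper rotate to the target's placement angle before grasping.
    """
    return math.dist(effector_xyz, target_xyz) < threshold_m
```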
The mechanical arm has three segments, all made of 3D-printed PLA. Like a human arm, the first two segments extend the reach, and the last is the end effector, a two-jaw gripper that grasps and releases by opening and closing. The end effector can be rotated to any angle, so the grasp angle can be adjusted to the object's placement angle for a more stable grasp. The three segments are connected by servos: the first servo connects the arm's first segment to the UAV body, is fixed at the UAV's center, and controls rotation angle θ1; the second servo connects the first and second segments and controls rotation angle θ2; the third servo connects the second segment to the end effector and controls the end effector's rotation angle θ3. The arm therefore has three degrees of freedom: the first two guarantee that the arm can move in a 2D plane, and on the last, the target rotation angle given by target detection enables precise rotational grasping, significantly improving grasp stability and accuracy. The system's three-degree-of-freedom arm is shown in Fig. 3, and its kinematics are:
x0 = L1 cosθ1 + L2 cos(θ1 + θ2)
y0 = L1 sinθ1 + L2 sin(θ1 + θ2)
θ3 = θ

where L1 and L2 are the lengths of the first and second segments, θ1, θ2, θ3 are the rotation angles of the joints, and (x0, y0) is the end-effector coordinate in the coordinate system whose origin is the arm's fixed mounting point. Because θ3 depends only on the target rotation angle θ detected in step S200, the arm is effectively a two-degree-of-freedom mechanism.
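Since the arm is effectively a two-link planar mechanism, θ1 and θ2 can be recovered from a desired end-effector position by the standard closed-form two-link inverse kinematics (a sketch under that assumption; the patent does not specify which solution branch it uses, and the elbow-down convention here is our choice):

```python
import math

def inverse_kinematics(x0, y0, L1, L2):
    """Solve theta1, theta2 so the effector reaches (x0, y0).

    Uses the law-of-cosines closed form for a planar two-link arm
    (elbow-down branch).  Raises if the point is outside the arm's reach.
    """
    r2 = x0 * x0 + y0 * y0
    c2 = (r2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    if abs(c2) > 1.0:
        raise ValueError("target out of the arm's reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y0, x0) - math.atan2(L2 * math.sin(theta2),
                                             L1 + L2 * math.cos(theta2))
    return theta1, theta2
```

Feeding the result back through the forward-kinematics formulas above returns the original (x0, y0), which is the consistency the control loop relies on.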
Because of the wind generated by the UAV's rotors, flying too close to the target may cause the target to be moved by the airflow. This system takes the airflow produced by the rotors into account and limits the arm's working space accordingly. From a survey of the literature we obtained high-fidelity computational fluid dynamics (CFD) simulation results for many different UAVs. These results show that the flow produced by each rotor decays rapidly in the surrounding region, as illustrated in Fig. 4, which shows the airflow rate in the sub-regions below the UAV with the UAV's center point as origin; light-gray regions denote weak or no flow influence, and dark-gray regions denote strong flow influence. Accordingly, this system stipulates that while the arm is grasping or discarding a target, the end effector must stay in the weak-flow region, reducing the influence of the airflow. The concrete scheme is as follows. First, a three-dimensional coordinate system is established. The system uses a quadrotor with two front rotors (right-front and left-front) and two rear rotors (right-rear and left-rear). With the UAV's center as origin, the X axis is the line from the origin to the midpoint of the line between the two front rotors, with the positive direction pointing toward that midpoint; the Y axis is the line from the origin to the midpoint of the line between the right-front and right-rear rotors, with the positive direction pointing to the right; and the Z axis is perpendicular to the UAV's plane, positive upward. While grasping or discarding a target, the end effector stays in the weak-flow region at a perpendicular distance greater than 30 cm from the ZOY plane; while the UAV is setting the target down, the end effector stays in the weak-flow region at a perpendicular distance less than 5 cm from the ZOY plane.
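The workspace limit just described reduces to a check on the X coordinate, since |x| is the perpendicular distance from the ZOY plane (an illustrative sketch; the function name and metre units are our own):

```python
def in_weak_airflow_zone(effector_xyz, grasping):
    """Check the workspace limit derived from the rotor-airflow CFD data.

    While grasping or discarding a target the end effector must stay
    more than 30 cm from the ZOY plane; while setting the target down
    it stays less than 5 cm from that plane.
    """
    x, _, _ = effector_xyz
    return abs(x) > 0.30 if grasping else abs(x) < 0.05
```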
It should further be noted that as the arm moves to grasp a target, the center of gravity of the whole system changes, causing effects such as unstable flight and unstable grasping. The system therefore includes a center-of-gravity compensation unit to balance the shift of the UAV's center of gravity caused by the arm's motion. The unit is mounted at the middle of the UAV, is made of 3D-printed PLA, and comprises a traction guide rail, a slide, a counterweight, and a fourth servo. The 60 cm rail is fixed on the X axis of the coordinate system above, with its midpoint at the UAV's center, where the fourth servo is placed; the fourth servo's rotation controls the movement of the slide, the counterweight sits on the slide, and the counterweight's weight balances the position of the center of gravity. The slide's center starts at the -15 cm point on the X axis. The working principle is that when the arm moves, the onboard processor commands the fourth servo to drive the slide the corresponding distance along the rail, thereby balancing the system's center of gravity. The UAV's battery is a 5200 mAh 4S-35C battery weighing 0.525 kg, which can serve as the counterweight of the compensation unit.
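The counterweight's desired displacement Pb follows from a static-moment balance about the UAV's center (a sketch under that interpretation of the "center-of-gravity balance principle"; the arm's mass and its center-of-gravity position are illustrative inputs, since the patent specifies only the 0.525 kg battery counterweight and the 60 cm rail):

```python
def counterweight_displacement(arm_mass, arm_cog_x, cw_mass,
                               rail_half_len=0.30):
    """Desired counterweight position P_b along the rail, in metres.

    The counterweight's moment cancels the arm's about the UAV centre:
        cw_mass * P_b + arm_mass * arm_cog_x = 0.
    The rail is 60 cm long and centred on the UAV, so the result is
    clamped to +/- 30 cm.
    """
    p_b = -arm_mass * arm_cog_x / cw_mass
    return max(-rail_half_len, min(rail_half_len, p_b))
```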
The mechanical arm and the center-of-gravity compensation unit are controlled by the onboard processor, and the entire grasping process is regulated by four proportional-integral-derivative controllers (Proportion Integration Differentiation Controllers, abbreviated PID controllers); the control scheme is shown in Fig. 5. To make the coordinates of the end effector coincide with those of the target, the end-effector coordinate (x0, y0) is set equal to the target coordinate resolved in step S202, from which the angles θ1, θ2 that the first two servos (i.e. the first and second servos) should rotate are computed by inverse kinematics; θ3 is set equal to the detected target rotation angle θ. During servo rotation, a servo may overshoot or fall short of the preset angle, so three PID controllers (the first, second, and third PID controllers) regulate the servo rotation speeds: the returned actual angles θ1cur, θ2cur, θ3cur are compared with the desired angles θ1, θ2, θ3 to determine each servo's rotation speed v1, v2, v3, ensuring that the first three servos finally reach their ideal angle values. Meanwhile, using the actual angles θ1cur, θ2cur and a center-of-gravity calculation, the desired displacement Pb of the counterweight is solved; the actual displacement Pbcur of the counterweight from the center is then measured, and a fourth PID controller regulates the rotation speed of the fourth servo. The parameters of all PID controllers have been well tuned, ensuring the stability of the control.
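The four-controller scheme above can be sketched as follows. The gains, time step, and set-points are illustrative assumptions, not values from the patent; three PID loops drive the servo speeds toward the desired joint angles and a fourth drives the counterweight toward its desired displacement.

```python
# Minimal PID sketch of the four-controller scheme (all gains illustrative).

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def step(self, desired, actual, dt):
        err = desired - actual
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# One controller per servo angle, one for the counterweight displacement.
pids = [PID(2.0, 0.1, 0.05) for _ in range(3)]
pid_cw = PID(1.5, 0.05, 0.02)

theta_des = [0.6, -0.4, 0.2]       # desired theta_1..theta_3 (rad)
theta_cur = [0.0, 0.0, 0.0]        # returned actual angles
p_b, p_bcur = -10.0, -15.0         # desired / actual counterweight offset (cm)

dt = 0.02
speeds = [pid.step(d, a, dt) for pid, d, a in zip(pids, theta_des, theta_cur)]
v4 = pid_cw.step(p_b, p_bcur, dt)
print(speeds, v4)
```

Each call returns a commanded speed proportional to the remaining angle (or displacement) error, which is the behavior the text describes for v1, v2, v3 and the fourth servo.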
S400: the UAV carries the target back to the recovery point, and the mechanical arm releases it.
On the basis of the successful grasp in step S300, this step carries the target back to the preset recovery point and releases it there with the mechanical arm, completing the whole task. It specifically includes:
Step S401: the monocular camera and the inertial measurement unit are combined to compute visual odometry and plan the UAV's path back to the recovery point; the onboard controller then flies the UAV back to the recovery point;
Step S402: the onboard processor points the mechanical arm vertically downward and opens the end effector, releasing the grasped target.
In conclusion a kind of Intelligent unattended machine grasping system towards high-risk environment of the present invention, mainly passes through
Monocular camera and Inertial Measurement Unit computation vision odometer position unmanned plane with this, control the flight of unmanned plane
Track;In flight course, using the image under airborne depth camera captured in real-time current environment, meanwhile, pass through airborne place
Device is managed, the detection of target is carried out using the algorithm of target detection based on deep learning;It is determining there are after target, target is being carried out
Precise positioning obtains position coordinates of the target relative to unmanned plane;Later, unmanned plane leans on close-target, using mechanical arm, in conjunction with it
The target placement angle obtained when preceding target detection, steadily grabs target;It finally carries target and returns to recovery point, leave mesh behind
Mark.The present invention by combine unmanned air vehicle technique, image processing techniques and some mechanics in terms of technology, by unmanned plane application
Foreign matter cleaning work under high-risk environment.Meanwhile using multiple sensors, advised using positioning simultaneously and drafting algorithm, path
Cost-effective method, algorithm of target detection and target location algorithm are realized and are grabbed to accurate detection, precise positioning and the stabilization of interesting target
It takes.Relative to the method for traditional manual cleaning foreign matter, high degree of automation of the present invention is high-efficient, and safety coefficient is high.It can be extensive
Applied in the foreign matter cleaning work under high-risk environment.
Claims (8)
1. A system for intelligent UAV target grasping, characterized by comprising: a multi-rotor UAV and, mounted on the UAV, an onboard processor, a depth camera, a monocular camera, an inertial measurement unit, an onboard controller, a laser radar, a mechanical arm, and a center-of-gravity compensation unit;
the onboard controller is used to control the takeoff, movement, and landing of the UAV;
the onboard processor is connected with the monocular camera, the inertial measurement unit, and the laser radar, and is used to localize the UAV itself;
the depth camera is connected with the onboard processor and is used to detect the target and its placement angle and to calculate the coordinate position of the target relative to the UAV;
the mechanical arm and the center-of-gravity compensation unit are connected with the onboard processor and are used to realize multi-degree-of-freedom grasping of the target;
the mechanical arm is a hinged three-segment structure with three degrees of freedom, one end of which is fixed at the central point of the UAV, and its end effector is a rotatable gripper structure for grasping the target;
the center-of-gravity compensation unit is used to eliminate effects such as unstable UAV flight and unstable grasping when the mechanical arm grasps the target; it is mounted at the middle of the UAV and comprises a traction guide rail, a sliding table, a counterweight, and a fourth servo, wherein the midpoint of the traction guide rail coincides with the center of the UAV, the fourth servo controls the movement of the sliding table along the traction guide rail, and the counterweight is placed on the sliding table so that its weight balances the position of the UAV's center of gravity.
2. The system for intelligent UAV target grasping of claim 1, characterized in that: the three segments of the mechanical arm are connected by three servos; the first servo connects the first segment of the arm to the UAV body and controls rotation angle θ1; the second servo connects the first and second segments of the arm and controls rotation angle θ2; the third servo connects the second segment to the end and controls the rotation angle θ3 of the end effector. The first two degrees of freedom ensure that the arm can move in a 2D plane, and the last degree of freedom realizes an accurately rotated grasp of the target. The kinematics of the three-degree-of-freedom arm are as follows:
x0 = L1 cos θ1 + L2 cos(θ1 + θ2)
y0 = L1 sin θ1 + L2 sin(θ1 + θ2)
θ3 = θ
where L1 and L2 are the lengths of the first and second segments, θ1, θ2, θ3 are the rotation angles of the joints, (x0, y0) is the coordinate of the end effector in a coordinate system whose origin is the arm's fixed point, and θ is the placement angle of the target.
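By way of illustration only (not part of the claimed subject matter), the claim-2 kinematics can be transcribed directly; the segment lengths L1 and L2 below are assumed values, since the claim does not fix them.

```python
import math

# Direct transcription of the claim-2 forward kinematics.
# L1, L2 are illustrative segment lengths (cm), not values from the claim.

def forward_kinematics(theta1, theta2, L1=20.0, L2=15.0):
    x0 = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y0 = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x0, y0

# With both joints at zero the arm is fully stretched along +X:
print(forward_kinematics(0.0, 0.0))   # (35.0, 0.0)
```

θ3 needs no computation: the claim sets it equal to the detected placement angle θ.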
3. The system for intelligent UAV target grasping of claim 1, characterized in that: the counterweight in the center-of-gravity compensation unit is the battery that supplies electric power to the UAV.
4. A method for intelligent UAV target grasping in high-risk environments, applied to the system of any one of claims 1-3, characterized in that the method comprises the following steps:
Step 100: the UAV localizes itself in the complex environment through the onboard monocular camera and inertial measurement unit, realizing accurate navigation flight, and measures its flight altitude with the laser radar to ensure that the UAV maintains a certain safe height above the ground;
Step 200: the onboard depth camera captures images of the current environment in real time, obtaining an RGB image and a corresponding depth image; detection, recognition, and localization of the target are performed on the RGB image and the placement angle θ of the target is detected, while the depth image is used to obtain the position coordinates of the target relative to the UAV;
Step 300: the onboard controller flies the UAV toward the position of the target; the position coordinates of the target and the coordinates of the arm's end effector are detected in real time, and when the distance between the two falls below a certain threshold, the onboard processor controls the rotation of the mechanical arm and commands the center-of-gravity compensation unit to adjust the UAV's center of gravity; when the end effector has rotated to the target placement angle θ, the target is grasped;
Step 400: the UAV carries the target back to the recovery point, and the mechanical arm releases it.
5. The method for intelligent UAV target grasping in high-risk environments of claim 4, characterized in that the localization of the UAV in step 100 is implemented as follows:
Step S101: the onboard controller flies the UAV while the monocular camera captures images of the surrounding environment; after the camera moves, the distances of pixels are measured by triangulation, yielding three-dimensional point cloud data of the environment around the UAV, which is delivered to the onboard processor over a serial port or USB;
Step S102: the onboard processor obtains, at each instant, the three-dimensional coordinates of the point cloud relative to the UAV, matches the point cloud data of adjacent instants, calculates the corresponding coordinate differences, and estimates the UAV's displacement and attitude change between the two adjacent instants via an optimization algorithm;
Step S103: a three-dimensional coordinate system is established with the UAV's initial position as origin; the displacement and attitude changes of the UAV between adjacent instants are accumulated to estimate its position and attitude relative to the coordinate origin, and the IMU data carried by the UAV itself are used to correct the accumulated error.
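By way of illustration only, the accumulation in step S103 can be sketched in a simplified planar form: relative motions estimated between adjacent instants (step S102) are chained to obtain the pose relative to the origin frame. The real system works in 3D and fuses IMU data; the relative motions below are made-up example values.

```python
import math

# Planar sketch of pose accumulation for visual odometry (step S103).
# Each relative motion is (dx, dy, dyaw) expressed in the previous body frame.

def accumulate(rel_motions):
    x = y = yaw = 0.0
    for dx, dy, dyaw in rel_motions:
        # Rotate the body-frame displacement into the world frame, then add.
        x += dx * math.cos(yaw) - dy * math.sin(yaw)
        y += dx * math.sin(yaw) + dy * math.cos(yaw)
        yaw += dyaw
    return x, y, yaw

# 1 m forward, turn 90 degrees, 1 m forward again -> ends near (1, 1).
pose = accumulate([(1.0, 0.0, math.pi / 2), (1.0, 0.0, 0.0)])
print(pose)
```

Because errors in each relative motion also accumulate, a correction source such as the IMU (as the claim specifies) is needed to bound the drift.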
6. The method for intelligent UAV target grasping in high-risk environments of claim 4 or 5, characterized in that: in step 200, detection, recognition, and localization of the target use Rotation-SqueezeDet, an improved SqueezeDet network model that localizes the target with a rotated bounding box whose expression is R' = (cx, cy, h, w, θ), where cx, cy are the pixel coordinates of the top-left corner of the bounding box, h and w are the height and width of the bounding box, and θ is the rotation angle of the rotated bounding box.
7. The method for intelligent UAV target grasping in high-risk environments of claim 6, characterized in that the position coordinates of the target relative to the UAV in step S200 are obtained as follows:
After the target is detected, the range information of the target is obtained from the depth image; the RGB image and the depth image are aligned in the same coordinate system, yielding the RGB image and three-dimensional point cloud data. Then, using the detected target result, the point-cloud subregion containing the target is extracted from the point cloud of the whole image. The coordinates (xt, yt, zt) of the target in the depth camera's coordinate system are then calculated as the center of the target point-cloud subregion, with the following formulas:
xt = (1/K) Σi=1..K Xp(i)
yt = (1/K) Σi=1..K Yp(i)
zt = (1/K) Σi=1..K Zp(i)
where (Xp, Yp, Zp) is the set of coordinates of all point-cloud points in the rotated bounding box, the index i denotes the i-th valid point-cloud coordinate counted from the top-left corner of the target bounding box, and K is the number of valid point-cloud points in the rotated bounding box;
Finally, the position coordinates of the target relative to the UAV are found using the known intrinsics of the depth camera.
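By way of illustration only, the centroid computation of claim 7 amounts to averaging the valid point-cloud coordinates inside the rotated bounding box; the point values below are made-up examples.

```python
# Sketch of the claim-7 center computation: average the K valid
# point-cloud coordinates inside the rotated bounding box.

def target_center(points):
    """points: list of (X, Y, Z) valid point-cloud coordinates in the box."""
    K = len(points)
    xt = sum(p[0] for p in points) / K
    yt = sum(p[1] for p in points) / K
    zt = sum(p[2] for p in points) / K
    return xt, yt, zt

# Three example points (meters, depth-camera frame).
cloud = [(0.5, 0.0, 1.0), (1.0, 0.5, 1.0), (1.5, 1.0, 1.0)]
print(target_center(cloud))   # (1.0, 0.5, 1.0)
```

Averaging over the whole subregion makes the estimate robust to individual noisy depth pixels; the final transform into the UAV frame uses the camera intrinsics and extrinsics, which the claim treats as known.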
8. The method for intelligent UAV target grasping in high-risk environments of claim 7, characterized in that the onboard processor controls the rotation of the mechanical arm and commands the center-of-gravity compensation unit to adjust the UAV's center of gravity as follows:
The end-effector coordinate (x0, y0) is set equal to the target coordinate resolved in step S200, from which the angles θ1, θ2 that the first and second servos should rotate are computed by inverse kinematics, and θ3 is set equal to the detected target rotation angle θ. Four proportional-integral-derivative controllers (PID controllers) are provided in the onboard processor to regulate the servo rotation speeds: the returned actual angles θ1cur, θ2cur, θ3cur are compared with the desired angles θ1, θ2, θ3 to determine each servo's rotation speed v1, v2, v3, ensuring that the first, second, and third servos finally reach their ideal angle values. Meanwhile, using the actual angles θ1cur, θ2cur and the center-of-gravity calculation principle, the desired displacement Pb of the counterweight is solved; the actual displacement Pbcur of the counterweight from the center is then measured, and a fourth PID controller regulates the rotation speed of the fourth servo.
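By way of illustration only, the inverse-kinematics step of claim 8 for a two-link planar arm has the standard closed form below; the segment lengths are assumed values, and the elbow-down branch is chosen arbitrarily.

```python
import math

# Two-link inverse-kinematics sketch of the claim-8 computation:
# given the target coordinate (x0, y0), recover theta_1 and theta_2;
# theta_3 is simply set to the detected placement angle theta.
# L1, L2 are illustrative lengths, not values from the claims.

def inverse_kinematics(x0, y0, L1=20.0, L2=15.0):
    c2 = (x0 * x0 + y0 * y0 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of the arm's reach")
    theta2 = math.acos(c2)                       # elbow-down branch
    theta1 = math.atan2(y0, x0) - math.atan2(L2 * math.sin(theta2),
                                             L1 + L2 * math.cos(theta2))
    return theta1, theta2

# Round-trip check against the claim-2 forward kinematics:
t1, t2 = inverse_kinematics(25.0, 10.0)
x = 20.0 * math.cos(t1) + 15.0 * math.cos(t1 + t2)
y = 20.0 * math.sin(t1) + 15.0 * math.sin(t1 + t2)
print(round(x, 6), round(y, 6))   # 25.0 10.0
```

The angles returned here are the set-points θ1, θ2 that the first and second PID loops then drive the servos toward.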
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910123394.2A CN109934871A (en) | 2019-02-18 | 2019-02-18 | System and method for intelligent UAV target grasping in high-risk environments |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109934871A true CN109934871A (en) | 2019-06-25 |
Family
ID=66985631
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910123394.2A Pending CN109934871A (en) | 2019-02-18 | 2019-02-18 | A kind of system and method for the Intelligent unattended machine crawl target towards high-risk environment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109934871A (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110427043A (en) * | 2019-09-04 | 2019-11-08 | 福州大学 | Pose Control device design method based on operation flying robot's centre-of gravity shift |
CN110525642A (en) * | 2019-08-26 | 2019-12-03 | 核工业北京地质研究院 | A kind of verification of UAV system multisensor field and one-point measurement system |
US10822082B2 (en) | 2017-04-07 | 2020-11-03 | Mark Holbrook Hanna | Distributed-battery aerial vehicle and a powering method therefor |
CN112161173A (en) * | 2020-09-10 | 2021-01-01 | 国网河北省电力有限公司检修分公司 | Power grid wiring parameter detection device and detection method |
CN113351631A (en) * | 2021-07-05 | 2021-09-07 | 北京理工大学 | Photoelectric intelligent garbage sorting trolley system |
CN113602481A (en) * | 2021-09-01 | 2021-11-05 | 浙江科顿科技有限公司 | Unmanned aerial vehicle autonomous balance control method carrying manipulator and gravity balance device |
CN113702995A (en) * | 2021-09-01 | 2021-11-26 | 国网江苏省电力有限公司扬州供电分公司 | Space positioning system for assisting in hanging and placing grounding wire operation |
US11235823B2 (en) | 2018-11-29 | 2022-02-01 | Saudi Arabian Oil Company | Automation methods for UAV perching on pipes |
CN114115321A (en) * | 2021-12-13 | 2022-03-01 | 盐城工学院 | Automatic foreign matter removing aircraft for high-voltage transmission line and automatic foreign matter removing method thereof |
CN114429432A (en) * | 2022-04-07 | 2022-05-03 | 科大天工智能装备技术(天津)有限公司 | Multi-source information layered fusion method and device and storage medium |
CN117182354A (en) * | 2023-11-07 | 2023-12-08 | 中国铁建电气化局集团有限公司 | Foreign matter removing method, device, equipment and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105775152A (en) * | 2016-03-08 | 2016-07-20 | 谭圆圆 | Unmanned aerial vehicle with battery type counterweight device and counterweight method thereof |
CN106645205A (en) * | 2017-02-24 | 2017-05-10 | 武汉大学 | Unmanned aerial vehicle bridge bottom surface crack detection method and system |
CN107314762A (en) * | 2017-07-06 | 2017-11-03 | 广东电网有限责任公司电力科学研究院 | Atural object distance detection method below power line based on unmanned plane the sequence monocular image |
CN108780325A (en) * | 2016-02-26 | 2018-11-09 | 深圳市大疆创新科技有限公司 | System and method for adjusting unmanned vehicle track |
CN108858199A (en) * | 2018-07-27 | 2018-11-23 | 中国科学院自动化研究所 | The method of the service robot grasp target object of view-based access control model |
Non-Patent Citations (1)
Title |
---|
SHIJIE LIN et al.: "Toward Autonomous Rotation-Aware Unmanned Aerial Grasping", arXiv:1811.03921v1 * |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10822082B2 (en) | 2017-04-07 | 2020-11-03 | Mark Holbrook Hanna | Distributed-battery aerial vehicle and a powering method therefor |
US11811224B2 (en) | 2017-04-07 | 2023-11-07 | Mark Holbrook Hanna | Distributed-battery aerial vehicle and a powering method therefor |
US11235823B2 (en) | 2018-11-29 | 2022-02-01 | Saudi Arabian Oil Company | Automation methods for UAV perching on pipes |
CN110525642A (en) * | 2019-08-26 | 2019-12-03 | 核工业北京地质研究院 | A kind of verification of UAV system multisensor field and one-point measurement system |
CN110427043B (en) * | 2019-09-04 | 2021-09-28 | 福州大学 | Pose controller design method based on gravity center offset of operation flying robot |
CN110427043A (en) * | 2019-09-04 | 2019-11-08 | 福州大学 | Pose Control device design method based on operation flying robot's centre-of gravity shift |
CN112161173B (en) * | 2020-09-10 | 2022-05-13 | 国网河北省电力有限公司检修分公司 | Power grid wiring parameter detection device and detection method |
CN112161173A (en) * | 2020-09-10 | 2021-01-01 | 国网河北省电力有限公司检修分公司 | Power grid wiring parameter detection device and detection method |
CN113351631A (en) * | 2021-07-05 | 2021-09-07 | 北京理工大学 | Photoelectric intelligent garbage sorting trolley system |
CN113602481A (en) * | 2021-09-01 | 2021-11-05 | 浙江科顿科技有限公司 | Unmanned aerial vehicle autonomous balance control method carrying manipulator and gravity balance device |
CN113702995A (en) * | 2021-09-01 | 2021-11-26 | 国网江苏省电力有限公司扬州供电分公司 | Space positioning system for assisting in hanging and placing grounding wire operation |
CN114115321A (en) * | 2021-12-13 | 2022-03-01 | 盐城工学院 | Automatic foreign matter removing aircraft for high-voltage transmission line and automatic foreign matter removing method thereof |
CN114429432A (en) * | 2022-04-07 | 2022-05-03 | 科大天工智能装备技术(天津)有限公司 | Multi-source information layered fusion method and device and storage medium |
CN114429432B (en) * | 2022-04-07 | 2022-06-21 | 科大天工智能装备技术(天津)有限公司 | Multi-source information layered fusion method and device and storage medium |
CN117182354A (en) * | 2023-11-07 | 2023-12-08 | 中国铁建电气化局集团有限公司 | Foreign matter removing method, device, equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109934871A (en) | System and method for intelligent UAV target grasping in high-risk environments | |
CN106595631B (en) | Obstacle avoidance method and electronic device | |
CN108453738B (en) | Control method for aerial autonomous grasping operation of a quadrotor aircraft based on OpenCV image processing | |
CN111966114A (en) | Drop-off location planning for delivery vehicles | |
CN205453893U (en) | Unmanned aerial vehicle | |
CN105492985A (en) | Multi-sensor environment map building | |
Santos et al. | UAV obstacle avoidance using RGB-D system | |
Hui et al. | A novel autonomous navigation approach for UAV power line inspection | |
US20240126294A1 (en) | System and method for perceptive navigation of automated vehicles | |
CN111061266A (en) | Night on-duty robot for real-time scene analysis and space obstacle avoidance | |
Hsiao et al. | Autopilots for ultra lightweight robotic birds: Automatic altitude control and system integration of a sub-10 g weight flapping-wing micro air vehicle | |
CN110209202A (en) | Feasible space generation method and device, aircraft, and aircraft system | |
CN108415460A (en) | Centralized-distributed control method for a mobile manipulation robot combining separable rotors and legged locomotion | |
JP2020149186A (en) | Position attitude estimation device, learning device, mobile robot, position attitude estimation method, and learning method | |
Proctor et al. | Vision‐only control and guidance for aircraft | |
Zufferey et al. | Optic flow to steer and avoid collisions in 3D | |
Sa et al. | Close-quarters Quadrotor flying for a pole inspection with position based visual servoing and high-speed vision | |
Shastry et al. | Autonomous detection and tracking of a high-speed ground vehicle using a quadrotor UAV | |
Lee et al. | Landing Site Inspection and Autonomous Pose Correction for Unmanned Aerial Vehicles | |
CN110879607A (en) | Offshore wind turbine blade inspection method based on cooperative detection by a multi-UAV formation | |
CN116009583A (en) | Pure-vision-based distributed cooperative motion control method and device for unmanned aerial vehicles | |
US20230142863A1 (en) | Performance of autonomous vehicle operation in varying conditions by using imagery generated with machine learning for simulations | |
Sato et al. | A simple autonomous flight control method of quadrotor helicopter using only single Web camera | |
Abdessameud et al. | Dynamic image-based tracking control for VTOL UAVs | |
CN114330832A (en) | Intelligent express package distribution system and working method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20190625 |