CN113602799A - Airport luggage case carrying system and control method thereof

Airport luggage case carrying system and control method thereof

Info

Publication number
CN113602799A
Authority
CN
China
Prior art keywords
luggage case
straight line
grid
industrial camera
calculating
Prior art date
Legal status
Granted
Application number
CN202110897488.2A
Other languages
Chinese (zh)
Other versions
CN113602799B (en)
Inventor
祝会龙
张静
刘满禄
单毛毛
田凤莲
王姮
张华
段淇昱
周建
白克强
Current Assignee
Southwest University of Science and Technology
Original Assignee
Southwest University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Southwest University of Science and Technology filed Critical Southwest University of Science and Technology
Priority to CN202110897488.2A priority Critical patent/CN113602799B/en
Publication of CN113602799A publication Critical patent/CN113602799A/en
Application granted granted Critical
Publication of CN113602799B publication Critical patent/CN113602799B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G47/00Article or material-handling devices associated with conveyors; Methods employing such devices
    • B65G47/74Feeding, transfer, or discharging devices of particular kinds or types
    • B65G47/90Devices for picking-up and depositing articles or materials
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G47/00Article or material-handling devices associated with conveyors; Methods employing such devices
    • B65G47/52Devices for transferring articles or materials between conveyors i.e. discharging or feeding devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/80Geometric correction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Mechanical Engineering (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an airport luggage case carrying system and a control method thereof. The carrying system comprises a control unit, a detection unit in signal connection with the control unit, and a transportation unit for transporting luggage cases. The control unit comprises a control box and a gripper that is fixedly mounted on a mechanical arm and controlled by the control unit. The detection unit comprises a grid laser, an industrial camera and a pressure film sensor; the industrial camera and the grid laser are both mounted on the top plate of the gripper, and several point light sources are arranged around the industrial camera; the pressure film sensor is fixedly mounted on the inner side wall of the gripper. The transportation unit comprises a conveyor belt and a transport trolley arranged on one side of the conveyor belt. The invention uses the gripper at the end of the mechanical arm and the surrounding point light sources, in combination with the industrial camera and object detection from machine vision, to estimate the size and volume of the luggage case, and the mechanical arm controls the gripper to grasp the luggage case and place it on the luggage carrying trolley.

Description

Airport luggage case carrying system and control method thereof
Technical Field
The invention belongs to the technical field of robot handling, and particularly relates to an airport luggage case handling system and a control method thereof.
Background
In the face of an increasingly developed air transportation business, the mobility requirements of personnel, luggage and goods keep rising, greatly increasing the workload of airport operation. Checked luggage moves in and out of ports, sorting tasks grow increasingly heavy, and with manual sorting the labor intensity of workers is high, which seriously hinders the efficiency of airport luggage sorting and drives operating costs up rapidly.
Therefore, airports' automatic baggage handling and sorting systems need an intelligent upgrade to realize automated, intelligent baggage handling, avoid bag jamming, mis-sorting and damage during handling and sorting, improve the baggage and cargo handling efficiency of the airport, and raise the airport's level of intelligent construction and its image.
At present, luggage handling mostly relies on manual sorting, whose efficiency is very low. Because luggage cases are not uniform in size, they cannot be sorted well during handling, which further hinders the efficiency of airport luggage sorting and causes operating costs to rise sharply.
Disclosure of Invention
The present invention is directed to solving the above-mentioned problems, and provides an airport luggage case handling system and a control method thereof.
To achieve this purpose, the invention adopts the following technical scheme:
in one aspect, an airport luggage case handling system comprises a control unit, a detection unit in signal connection with the control unit, and a transportation unit for transporting luggage cases;
the control unit comprises a control box and a gripper which is fixedly arranged on the mechanical arm and is controlled by the control unit;
the detection unit comprises a grid laser, an industrial camera and a pressure film sensor; the industrial camera and the grid laser are both arranged on the top plate of the gripper, and a plurality of point light source lamps are arranged around the industrial camera; the pressure film sensor is fixedly arranged on the side wall of the inner side of the hand grip;
the transport unit comprises a conveyor belt and a transport trolley arranged on one side of the conveyor belt.
Furthermore, the grid laser, the industrial camera and the pressure film sensor are connected to a small embedded industrial personal computer and are in communication connection with the control box through it.
Further, the point light sources are LED light sources.
In another aspect, a method of controlling an airport baggage handling system comprises the following steps:
s1, industrial camera calibration, including parameter calibration in the industrial camera and calibration of hand-eye coordinate conversion relation;
s2, the industrial camera collects the image information of the luggage case on the conveyor belt, and transmits the collected image information to the control unit for processing to obtain the position information of the luggage case in the image;
s3, controlling the hand grip to move to a target position according to the position information of the luggage case, controlling the hand grip to move according to the moving track of the luggage case, and setting the vertical downward moving distance and the horizontal displacement of the hand grip until the hand grip moves to two sides of the luggage case;
s4, grasping the luggage case according to its weight and the pressure value fed back by the pressure film sensor, and adjusting the gripping force of the gripper in real time;
s5, the control unit drives the mechanical arm to rotate, the luggage case is transported to the upper part of the transport trolley, the mechanical arm is moved downwards continuously, the gripper is controlled to release the luggage case, and the luggage case falls on the transport trolley;
and S6, the control unit controls the mechanical arm to return to the initial position.
Further, the step S2, the industrial camera acquires image information of the luggage on the conveyor belt, and transmits the acquired image information to the control unit for processing, so as to obtain the position information of the luggage in the image, including:
marking and extracting the acquired image information data by adopting a silhouette processing method;
carrying out binarization processing on the extracted image information, and performing image segmentation on the binarized image to obtain image information containing the target area of the luggage case;
filtering the segmented image information;
and adopting skeletonization to identify the position information of the target area, and transmitting the position information to the control unit.
Further, the step S3 of calculating the moving track of the luggage case includes:
the grid laser projects a group of parallel red grid laser lines onto the surface of the luggage case to be grasped;
calculating a horizontal straight line equation and a vertical straight line equation in the grid by adopting standard Hough transform, and then calculating coordinates of start points and stop points of grid lines through intersection points of the horizontal straight lines and the vertical straight lines;
and drawing straight lines on the binary image according to the calculated start and stop coordinates, obtaining the grid-line detection result map.
Further, according to the result graph of the grid line detection, calculating the coordinates of the starting point of the grid horizontal line:
keeping the vertical straight line equation unchanged, sequentially traversing all the horizontal straight line equations, calculating the intersection coordinates of each horizontal straight line with the set vertical line, and obtaining the start point coordinates of the actual grid horizontal lines;
keeping the horizontal straight line equation unchanged, sequentially traversing all the vertical straight line equations, calculating the intersection coordinates of each vertical straight line with the set horizontal line, and obtaining the start point coordinates of the actual grid vertical lines;
and calculating the coordinates of the end points of the grid horizontal lines according to a result graph of grid line detection:
keeping the last vertical straight line equation unchanged, sequentially traversing all the horizontal straight line equations, calculating the intersection coordinates of each horizontal straight line with the set vertical line, and obtaining the end point coordinates of the actual grid horizontal lines;
keeping the last horizontal straight line equation unchanged, sequentially traversing all the vertical straight line equations, and calculating the intersection coordinates of each vertical straight line with the set horizontal line to obtain the end point coordinates of the actual grid vertical lines.
Further, the industrial camera calibration in step S1 includes:
carrying out internal parameter calibration and distortion correction of the industrial camera by adopting the Zhang Zhengyou calibration algorithm;
and performing hand-eye calibration by adopting a Tsai-Lenz calibration method.
Further, the method also includes calculating and correcting the volume of the luggage case:
when the grid laser irradiates the target luggage case, the industrial camera photographs it, the laser grid is identified on the photographed picture, and the edge at the farthest curved grid line of the luggage case is taken as the farthest edge;
calculating the area of the uppermost surface bounded by the farthest edge of the luggage case, and estimating the initial volume of the luggage case from it;
the calculated volume of the luggage case is then expanded by an edge dilation that fills a gray border totaling 5 cm outward from the farthest curved grid lines of the case.
The airport luggage case carrying system and the control method thereof provided by the invention have the following beneficial effects:
the size and the volume of the luggage case are estimated by utilizing the tongs at the tail end of the mechanical arm and the peripheral point light source lamps in combination with the object detection in the machine vision, and the mechanical arm controls the tongs to grab the luggage case and place the luggage case on the luggage case carrying trolley; the pressure film sensor is utilized to realize real-time detection and feedback of the strength of the gripper at the tail end of the mechanical arm, so that the luggage case can be gripped without damage; it can adapt to the suitcase that snatchs not unidimensional to the harmless of the suitcase and snatching and carrying that has realized snatching of controllable dynamics has been realized.
Drawings
FIG. 1 is a schematic diagram of the airport luggage handling system.
FIG. 2 is a view showing the construction of the gripper.
Fig. 3 is a block diagram of image processing.
Fig. 4 is a diagram showing an example of a straight line in a polar coordinate system.
FIG. 5 is an illustration of calculating grid line coordinates.
Fig. 6 is a diagram of hand-eye calibration and visual guidance, in which the left diagram is a diagram of hand-eye calibration coordinate transformation, and the right diagram is a diagram of visual guidance process.
Fig. 7 is a flow chart of hand-eye calibration.
FIG. 8 is a three-dimensional information measurement and localization diagram based on structured light.
Fig. 9 is a schematic diagram of luggage volume estimation.
Wherein: 1. conveyor belt; 2. top plate; 3. mechanical arm; 4. control box; 5. luggage case; 6. transport trolley; 7. industrial camera; 8. grid laser; 9. pressure film sensor; 10. gripper.
Detailed Description
The following description of the embodiments of the present invention is provided to facilitate understanding by those skilled in the art, but it should be understood that the invention is not limited to the scope of the embodiments. To those skilled in the art, various changes are apparent within the spirit and scope of the invention as defined in the appended claims, and all matters produced using the inventive concept are protected.
According to the first embodiment of the application, referring to fig. 1 and 2, the airport luggage case 5 handling system of the present scheme comprises a control unit, a detection unit in signal connection with the control unit, and a transportation unit for transporting the luggage case 5.
The control unit comprises a control box 4, a mechanical arm 3 and a gripper 10; the gripper 10 is mounted on the mechanical arm 3 and controlled by the control box 4, and the mechanical arm 3 has 6 degrees of freedom.
The control box 4 is connected with the 6-degree-of-freedom mechanical arm 3 through a communication line; the gripper 10 at the end of the mechanical arm 3 is connected with the control box 4 through a communication line; the grid laser 8, the industrial camera 7 and the pressure film sensor are first connected to a small embedded industrial personal computer and then communicate over a wired link with the control box of the mechanical arm 3.
The industrial camera 7 positions and classifies the luggage cases 5 by category; after classification and positioning the luggage cases 5 need to be sorted, and the force measured by the pressure film sensor 9 on the end gripper 10 is fed back to the mechanical arm 3 so that the gripper 10 performs force-controlled sorting.
The control unit is used for controlling the mechanical arm 3 and the gripper 10 to realize the grabbing of the luggage case 5, placing the grabbed luggage case 5 on the transport trolley 6, and finally returning to the initial position to wait for the next grabbing task.
The detection unit comprises a grid laser 8, an industrial camera 7 and a pressure film sensor 9. The industrial camera 7 and the grid laser 8 are both mounted on the top plate 2 of the gripper 10; the grid laser 8 projects a group of parallel red grid laser lines onto the surface of the object to be grasped, allowing a preliminary judgment of the object.
Several point light sources are arranged around the industrial camera 7. The point light sources highlight the features of the photographed target so that its different parts have sufficient contrast; LED light sources are chosen, which also compensates for the industrial camera 7's poor detection in dim environments and lets the two work well together.
The industrial camera 7 collects the object marked by the grid laser 8, and converts the visual image and the characteristics of the target object into a series of data which can be processed by calculation. The pressure film sensor 9 is fixedly arranged on the inner side wall of the hand grip 10.
The transport unit comprises a conveyor belt 1 and a transport trolley 6 arranged on one side of the conveyor belt 1.
According to the second embodiment of the present application, a method for controlling a system for handling a luggage case 5 at an airport comprises the following steps:
step S1, calibrating the industrial camera 7, including calibrating parameters in the industrial camera 7 and calibrating a hand-eye coordinate transformation relationship, which specifically includes:
the high-precision hand-eye calibration is a key for ensuring the precision of the system, and in the embodiment, the calibration comprises two parts of camera internal parameter calibration and hand-eye coordinate conversion relation calibration, wherein the camera internal parameter calibration and distortion correction are realized by adopting an improved Zhang Zhengyou calibration algorithm.
The calibration of the hand-eye coordinate relationship of the mechanical arm 3 mainly consists of solving the hand-eye mapping model, i.e. the nonlinear mapping from the robot's visual space to its working space, such as the transformation from the camera coordinate system O_c to the manipulator base coordinate system O_r in the left diagram of FIG. 6. The right diagram of FIG. 6 shows the coordinate transformation relationships in the visual guidance process.
Referring to fig. 7, in the present solution, a Tsai-Lenz calibration method is adopted to perform hand-eye calibration, so as to obtain a spatial pose between a robot base coordinate and a camera.
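For illustration only, the two calibration stages described above can be sketched with OpenCV, whose calibrateCamera and calibrateHandEye functions implement Zhang's method and the Tsai-Lenz method respectively. The checkerboard geometry (9 × 6 corners, 25 mm squares) and all names below are illustrative assumptions, not values from this patent.

```python
import cv2
import numpy as np

def calibrate_intrinsics(images, pattern=(9, 6), square_mm=25.0):
    """Zhang-style intrinsic calibration from checkerboard images (sketch)."""
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_mm
    obj_pts, img_pts, size = [], [], None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    # Returns the camera matrix K and distortion coefficients for correction.
    _, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    return K, dist, rvecs, tvecs

def calibrate_hand_eye(R_gripper2base, t_gripper2base, R_target2cam, t_target2cam):
    """Tsai-Lenz hand-eye calibration: pose of the camera w.r.t. the gripper."""
    return cv2.calibrateHandEye(
        R_gripper2base, t_gripper2base,
        R_target2cam, t_target2cam,
        method=cv2.CALIB_HAND_EYE_TSAI)
```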
Step S2, the industrial camera 7 collects the image information of the luggage case 5 on the conveyor belt 1, and transmits the collected image information to the control unit for processing, so as to obtain the position information of the luggage case 5 in the image;
referring to fig. 3, the image acquisition includes:
marking and extracting the acquired image information data by adopting a silhouette processing method;
carrying out binarization processing on the extracted image information, and performing image segmentation on the binarized image to obtain image information containing the target area of the luggage case 5;
filtering the segmented image information;
adopting skeletonization to identify the position information of the target area, and transmitting the position information to a control unit;
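As a hedged sketch of this pipeline (not part of the patent text), the binarization, segmentation, filtering and skeletonization steps could be composed with OpenCV and scikit-image roughly as follows; the Otsu threshold, largest-contour segmentation and median-filter kernel size are illustrative choices.

```python
import cv2
import numpy as np
from skimage.morphology import skeletonize

def locate_luggage(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Binarization: Otsu picks the threshold automatically.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Segmentation: keep the largest contour as the luggage-case target area.
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    target = max(contours, key=cv2.contourArea)
    mask = np.zeros_like(binary)
    cv2.drawContours(mask, [target], -1, 255, thickness=cv2.FILLED)
    # Filtering: median blur suppresses salt-and-pepper noise in the mask.
    mask = cv2.medianBlur(mask, 5)
    # Skeletonization thins the region; its centroid is the position estimate
    # that would be sent on to the control unit.
    skeleton = skeletonize(mask > 0)
    ys, xs = np.nonzero(skeleton)
    return float(xs.mean()), float(ys.mean())
```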
the control unit issues commands to the control box 4 and controls the operation of the mechanical arm 3 and the gripper 10 according to the position information of the luggage box 5.
Step S3, controlling the hand grip 10 to move to the target position according to the position information of the luggage 5, controlling the hand grip 10 to move according to the moving track of the luggage 5, and setting the vertical downward movement distance and the horizontal displacement of the hand grip 10 until the hand grip 10 moves to the two sides of the luggage 5, which specifically comprises:
The posture and volume of the luggage case 5 are intelligently identified: the grid laser 8 projects a group of parallel red grid laser lines onto the surface of the object to be grasped, and the coordinate position of the luggage case 5 within the grid is first calculated by combining the point light sources with the industrial camera 7 and object detection from machine vision;
Analysis of the Hough transform detection result shows several problems with using the Hough transform directly to detect the grid lines: multiple straight lines overlap at the same position, the laser is occluded by lighting, and the detected start and end points of the lines are not the actual start and end points. The line-overlap problem is solved by classifying and merging the Hough transform detection results. Because non-parallel straight lines have intersection points, the start and end coordinates of the actual grid lines can be obtained by intersecting the horizontal-direction and vertical-direction lines, which also resolves the partial occlusion and the non-actual start/end coordinates.
For the Hough transform, comparing the two Hough transform functions, the standard Hough transform is selected. The result of the standard Hough transform is a (θ, ρ) parameter pair in a polar coordinate system, where θ is the angle between the line and the horizontal axis, which provides a convenient basis for classifying lines as horizontal or vertical. In addition, the standard Hough transform function has only one dynamically changing parameter, whose value is easy to set.
Referring to fig. 4, the calculation is performed using the Hough transform line method:
point A (x)0,y0) The straight line passing through the point A satisfies the equation y ═ k x0+ b. (k is slope, b is intercept);
then the point A (x) is crossed in the XOY plane0,y0) The linear clusters of (a) may be represented by y ═ k × x0+ b, however, since the slope of the line perpendicular to the X-axis is infinite and cannot be represented, the special case can be solved by converting the rectangular coordinate system to the polar coordinate system;
in a polar coordinate system, the linear equation can be expressed as ρ ═ xcos θ + ysin θ (ρ represents the distance from the origin to the line), and the calculation method is shown in fig. 4.
Judging the line direction: by examining the parameter θ, a line with θ ∈ (π/4, 3π/4) is horizontal, otherwise it is vertical. Straight-line data with small differences are then sorted and merged, eliminating the problem of multiple overlapping lines at the same position, and the coordinate information of the lines in the rectangular coordinate system is finally obtained through coordinate-system conversion.
From these results the horizontal and vertical line equations are obtained, and the start and stop coordinates of the grid lines are then computed from the intersections of the horizontal and vertical lines; this resolves the problems caused by the Hough transform's non-actual start/stop coordinates and by light occlusion.
The straight lines are drawn on the binary image according to the computed start and stop coordinates, yielding the grid-line detection result map. Drawing the lines on the binary image not only recovers the laser lines at their original positions but also preserves the laser regions deformed by obstacles, which solves the occlusion of the grid laser 8 lines by lighting.
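A minimal sketch of this detection stage, assuming OpenCV's standard Hough transform cv2.HoughLines: lines come back as (ρ, θ) pairs, a line is classified as horizontal when θ ∈ (π/4, 3π/4), and near-duplicates at the same position are merged. The accumulator threshold and merge tolerances are illustrative assumptions.

```python
import cv2
import numpy as np

def detect_grid_lines(binary):
    lines = cv2.HoughLines(binary, 1, np.pi / 180, 120)
    horiz, vert = [], []
    for rho, theta in lines[:, 0, :]:
        # theta in (pi/4, 3*pi/4) -> nearly horizontal; otherwise vertical.
        (horiz if np.pi / 4 < theta < 3 * np.pi / 4 else vert).append((rho, theta))

    def merge(group, rho_tol=10.0, theta_tol=np.deg2rad(3)):
        group.sort()
        kept = []
        for rho, theta in group:
            if kept and abs(rho - kept[-1][0]) < rho_tol \
                    and abs(theta - kept[-1][1]) < theta_tol:
                continue  # overlapping duplicate of the previous line
            kept.append((rho, theta))
        return kept

    return merge(horiz), merge(vert)
```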
In FIG. 5, lines L_hs and L_he are the horizontal lines in the Hough transform detection result of the binary image, where L_hs is the first horizontal line in the result and L_he is the last. Lines L_vs and L_ve are the vertical lines in the detection result, where L_vs is the first vertical line and L_ve is the last. The coordinate points (x1, y1), (x2, y2), (x3, y3) and (x4, y4) in FIG. 5 are the start and end coordinates of the actual grid laser 8 lines.
For the coordinate points (x_v1, y_v1) and (x_v2, y_v2), x_v1 and x_v2 are computed by the Hough transform while y_v1 and y_v2 are 0. For (x_v3, y_v3) and (x_v4, y_v4), x_v3 and x_v4 are computed by the Hough transform while y_v3 and y_v4 equal the image height. For (x_h1, y_h1) and (x_h3, y_h3), x_h1 and x_h3 are 0 while y_h1 and y_h3 are computed by the Hough transform. For (x_h2, y_h2) and (x_h4, y_h4), x_h2 and x_h4 equal the image width while y_h2 and y_h4 are computed by the Hough transform.
Then, following the method for computing the start coordinates of the grid horizontal lines, the vertical line equation is kept fixed and all horizontal line equations are traversed in turn; the intersection of each horizontal line with the set vertical line is computed, finally giving the start coordinates of the actual grid horizontal lines, e.g. (x1, y1) and (x3, y3). Similarly, the horizontal line equation is kept fixed and all vertical line equations are traversed; the intersections of the horizontal and vertical lines are computed, finally giving the start coordinates of the actual grid vertical lines, e.g. (x1, y1) and (x2, y2).
Then, following the method for computing the end coordinates of the grid horizontal lines, the last vertical line equation is kept fixed and all horizontal line equations are traversed in turn; the intersection of each horizontal line with this set vertical line is computed, finally giving the end coordinates of the actual grid horizontal lines, e.g. (x2, y2) and (x4, y4). Similarly, the last horizontal line equation is kept fixed and all vertical line equations are traversed; the intersections are computed, finally giving the end coordinates of the actual grid vertical lines, e.g. (x3, y3) and (x4, y4).
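The start/end-point computation above reduces to intersecting two lines given in polar form ρ = x·cosθ + y·sinθ, i.e. a 2 × 2 linear solve. A sketch, under the assumption that the detected line lists are sorted from first to last:

```python
import numpy as np

def intersect(line_a, line_b):
    """Intersection of two (rho, theta) lines; assumes they are not parallel."""
    (rho1, th1), (rho2, th2) = line_a, line_b
    A = np.array([[np.cos(th1), np.sin(th1)],
                  [np.cos(th2), np.sin(th2)]])
    x, y = np.linalg.solve(A, np.array([rho1, rho2]))
    return x, y

def horizontal_endpoints(horiz, vert):
    # Start point: intersection with the first vertical line (L_vs in fig. 5);
    # end point: intersection with the last vertical line (L_ve).
    first_v, last_v = vert[0], vert[-1]
    return [(intersect(h, first_v), intersect(h, last_v)) for h in horiz]
```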
The size information measuring and positioning method based on the industrial camera 7 comprises the following steps:
referring to fig. 8, a reference light triggering method based on active grid laser 8 is adopted, and a size information measuring and positioning method based on an industrial camera 7 is utilized, so that the high precision of luggage size information measurement can be met, the time loss and the computing resource consumption caused by dense point cloud operation are avoided, and the real-time performance of the system is ensured. A schematic diagram of the dimensional measurement based on the industrial camera 7 is shown in fig. 7.
A ring of point light sources is arranged at the end of the mechanical arm 3 to work with the industrial camera 7; when the ambient light is dark, the point light sources provide sufficient light for the industrial camera 7. Meanwhile, when the grid laser 8 irradiates the surface of the luggage case 5, the point light sources highlight the grid laser 8 lines, providing sufficiently clear lines for the industrial camera 7 to photograph, so that the size and volume of the luggage case 5 can subsequently be estimated.
For calculating and correcting the volume of the luggage case 5: when the grid laser 8 irradiates the target luggage case 5, the industrial camera 7 photographs it and identifies the laser grid on the photographed picture, taking the edge at the farthest curved grid line of the luggage case 5 as the farthest edge;
a first step of calculating the area of the preliminary uppermost face of the furthest-side luggage case 5, and then calculating the volume of the preliminary luggage case 5;
the second step is to perform an edge expansion filling of the calculated volume of the luggage case 5 on the basis of the first step, in order for the gripper 10 at the end of the robot arm 3 to be able to completely wrap the edge of the luggage case 5, thus allowing both ends of the gripper 10 to be transferred to the luggage case 5.
Edge filling is an optimization made for the gripper 10 when the industrial camera 7 performs target size detection. A gray border totaling 5 cm is filled outward on the basis of the first step; this is virtual modeling performed when estimating the size of the luggage case 5, so that the opening of the gripper 10 is larger than the two actually longest sides of the luggage case 5's pose at detection time. If the distance between the longest sides were taken directly as the gripper opening, the gripper 10 could be blocked by its own thickness and fail to grasp; in practice, the size expansion lets the gripper adapt well to the grasping state so that every luggage case 5 can be grasped.
Taking a 20-inch luggage case 5 as an example:
since different sizes of the luggage case 5 are fixed, the grid laser 8 is irradiated above the luggage case 5, the industrial camera 7 performs detection shooting on the luggage case 5, the area of the front face of the case is estimated, 34 x 50cm in fig. 9 represents the real area of the front face of the luggage case 5(20 inches), the length and the width are respectively expanded by 2.5cm on the basis of the length and the width calculated by the camera to reserve enough space for grabbing, the broken line (39 x 55cm) in fig. 9 is an optimization made by the industrial camera 7 relative to the grab 10 when the industrial camera performs target size detection, and the edge filling does not exist actually but is a virtual modeling made when the size of the luggage case 5 is estimated, so that the expansion degree of the grab 10 is greater than the length of the actual longest two sides of the posture of the luggage case 5 during detection. Since the height of the luggage case 5 corresponds to the corresponding size, the volume of the luggage case 5 can be estimated by multiplying the height by the area of the front surface.
Step S4, the luggage case 5 is grasped according to its weight and the pressure value fed back by the pressure film sensor 9, and the gripping force of the gripper 10 is adjusted in real time; the pressure film sensor 9 detects and feeds back the force of the gripper 10 at the end of the mechanical arm 3 in real time, so that the luggage case 5 is grasped without damage.
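The patent states only that the pressure reading is fed back to adjust the gripping force in real time, without giving a control law; as a loudly hypothetical sketch, a proportional loop around placeholder read/command functions might look like this.

```python
def grip_with_feedback(read_pressure, set_grip_force, target_kpa,
                       gain=0.1, tol_kpa=1.0, max_steps=200):
    """Hypothetical proportional force loop; none of this is from the patent."""
    force = 0.0
    for _ in range(max_steps):
        error = target_kpa - read_pressure()
        if abs(error) < tol_kpa:
            break  # pressure settled near the target: hold this force
        force += gain * error  # proportional correction toward the target
        set_grip_force(force)
    return force
```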
Step S5, the control unit drives the mechanical arm 3 to rotate, the luggage case 5 is transported to the upper part of the transport trolley 6, the mechanical arm 3 is moved downwards continuously, the gripper 10 is controlled to release the luggage case 5, and the luggage case 5 falls on the transport trolley 6;
step S6, the control unit controls the robot arm 3 to return to the initial position.
While the embodiments of the invention have been described in detail in connection with the accompanying drawings, it is not intended to limit the scope of the invention. Various modifications and changes may be made by those skilled in the art without inventive step within the scope of the appended claims.

Claims (9)

1. An airport luggage case handling system, characterized in that it comprises: a control unit, a detection unit in signal connection with the control unit, and a transportation unit for transporting the luggage case;
the control unit comprises a control box and a gripper which is fixedly arranged on the mechanical arm and is controlled by the control unit;
the detection unit comprises a grid laser, an industrial camera and a pressure film sensor; the industrial camera and the grid laser are both arranged on the top plate of the gripper, and a plurality of point light source lamps are arranged around the industrial camera; the pressure film sensor is fixedly arranged on the side wall of the inner side of the hand grip;
the transportation unit comprises a conveyor belt and a transportation trolley arranged on one side of the conveyor belt.
2. The airport luggage handling system of claim 1, wherein: the grid laser, the industrial camera and the pressure film sensor are connected to a small embedded industrial personal computer and are in communication connection with the control box through it.
3. The airport luggage handling system of claim 1, wherein: the point light sources are LED light sources.
4. The method of controlling an airport baggage handling system of any one of claims 1 to 3, comprising the steps of:
s1, industrial camera calibration, including parameter calibration in the industrial camera and calibration of hand-eye coordinate conversion relation;
s2, the industrial camera collects the image information of the luggage case on the conveyor belt, and transmits the collected image information to the control unit for processing to obtain the position information of the luggage case in the image;
s3, controlling the hand grip to move to a target position according to the position information of the luggage case, controlling the hand grip to move according to the moving track of the luggage case, and setting the vertical downward moving distance and the horizontal displacement of the hand grip until the hand grip moves to two sides of the luggage case;
s4, grasping the luggage case according to its weight and the pressure value fed back by the pressure film sensor, and adjusting the gripping force of the gripper in real time;
s5, the control unit drives the mechanical arm to rotate, the luggage case is transported to the upper part of the transport trolley, the mechanical arm is moved downwards continuously, the gripper is controlled to release the luggage case, and the luggage case falls on the transport trolley;
and S6, the control unit controls the mechanical arm to return to the initial position.
5. The method as claimed in claim 4, wherein the step S2, the industrial camera capturing image information of the luggage on the conveyor belt and transmitting the captured image information to the control unit for processing to obtain the position information of the luggage in the image, comprises:
marking and extracting the acquired image information data by adopting a silhouette processing method;
carrying out binarization processing on the extracted image information, and performing image segmentation on the binarized image to obtain image information containing the target area of the luggage case;
filtering the segmented image information;
and adopting skeletonization to identify the position information of the target area, and transmitting the position information to the control unit.
6. The method as claimed in claim 4, wherein the step S3 of calculating the movement path of the luggage includes:
the grid laser projects a group of parallel grid red laser lines on the surface of the luggage case to be grabbed, the industrial camera is matched with the point light source to shoot images, and shot image information is transmitted to the control unit;
calculating a horizontal straight line equation and a vertical straight line equation in the grid by adopting standard Hough transform, and then calculating coordinates of start points and stop points of grid lines through intersection points of the horizontal straight lines and the vertical straight lines;
and drawing straight lines on the binary image according to the calculated start and stop coordinates, obtaining the grid-line detection result map.
7. The method of claim 6, wherein the grid horizontal line start point coordinates are calculated from the grid line detection result map:
keeping the vertical straight line equation unchanged, sequentially traversing all the horizontal straight line equations, calculating the intersection coordinates of each horizontal straight line with the set vertical line, and obtaining the start point coordinates of the actual grid horizontal lines;
keeping the horizontal straight line equation unchanged, sequentially traversing all the vertical straight line equations, calculating the intersection coordinates of each vertical straight line with the set horizontal line, and obtaining the start point coordinates of the actual grid vertical lines;
and calculating the coordinates of the end points of the grid horizontal lines according to a result graph of grid line detection:
keeping the last vertical straight line equation unchanged, sequentially traversing all the horizontal straight line equations, calculating the intersection coordinates of each horizontal straight line with the set vertical line, and obtaining the end point coordinates of the actual grid horizontal lines;
keeping the last horizontal straight line equation unchanged, sequentially traversing all the vertical straight line equations, and calculating the intersection coordinates of each vertical straight line with the set horizontal line to obtain the end point coordinates of the actual grid vertical lines.
8. The method for controlling the airport baggage handling system of claim 4, wherein the industrial camera calibration of step S1 comprises:
carrying out internal parameter calibration and distortion correction of the industrial camera by adopting the Zhang Zhengyou calibration algorithm;
and performing hand-eye calibration by adopting a Tsai-Lenz calibration method.
9. The method of claim 4, further comprising calculating and correcting a volume of the luggage:
when the grid laser irradiates the target luggage case, the industrial camera photographs it, identifies the laser grid on the photographed picture, and takes the edge at the farthest curved grid line of the luggage case as the farthest edge;
calculating the area of the uppermost surface bounded by the farthest edge of the luggage case, and estimating the initial volume of the luggage case from it;
the calculated volume of the luggage case is then expanded by an edge dilation that fills a gray border totaling 5 cm outward from the farthest curved grid lines of the case.
CN202110897488.2A 2021-08-05 2021-08-05 Airport luggage case carrying system and control method thereof Active CN113602799B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110897488.2A CN113602799B (en) 2021-08-05 2021-08-05 Airport luggage case carrying system and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110897488.2A CN113602799B (en) 2021-08-05 2021-08-05 Airport luggage case carrying system and control method thereof

Publications (2)

Publication Number Publication Date
CN113602799A true CN113602799A (en) 2021-11-05
CN113602799B CN113602799B (en) 2022-09-13

Family

ID=78307207

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110897488.2A Active CN113602799B (en) 2021-08-05 2021-08-05 Airport luggage case carrying system and control method thereof

Country Status (1)

Country Link
CN (1) CN113602799B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115108316A (en) * 2022-08-29 2022-09-27 民航成都物流技术有限公司 Baggage pickup clamp, apparatus and method
CN115112508A (en) * 2022-08-29 2022-09-27 民航成都物流技术有限公司 Device and method for identifying soft and hard bags of consigned luggage in civil aviation airport

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1067068A1 (en) * 1999-07-08 2001-01-10 ABBPATENT GmbH Method and device for luggage transport in airports
NL1023904C2 (en) * 2003-07-11 2005-01-12 Csi Ind B V Automatic airport baggage handling process, has automatic loading controlled using measured characteristics of baggage pieces and baggage container loading condition
CN106927079A (en) * 2017-03-21 2017-07-07 长春理工大学 A kind of industrial detonator crawl and packaging system and method based on machine vision
CN108161913A (en) * 2017-12-31 2018-06-15 柳州福能机器人开发有限公司 A kind of intelligent mobile transfer robot and its method of work
CN208795188U (en) * 2018-09-05 2019-04-26 无锡维胜威信息科技有限公司 A kind of structured light binocular vision detection system
CN110509281A (en) * 2019-09-16 2019-11-29 中国计量大学 The apparatus and method of pose identification and crawl based on binocular vision
CN110660104A (en) * 2019-09-29 2020-01-07 珠海格力电器股份有限公司 Industrial robot visual identification positioning grabbing method, computer device and computer readable storage medium
CN112047113A (en) * 2020-08-26 2020-12-08 苏州中科全象智能科技有限公司 3D visual stacking system and method based on artificial intelligence technology
CN112136506A (en) * 2020-09-27 2020-12-29 哈尔滨理工大学 Robot arm device with fruit maturity distinguishing function
CN112374119A (en) * 2020-11-05 2021-02-19 泉州装备制造研究所 Self-adaptive airport logistics system


Also Published As

Publication number Publication date
CN113602799B (en) 2022-09-13

Similar Documents

Publication Publication Date Title
CN108399639B (en) Rapid automatic grabbing and placing method based on deep learning
CN108555908B (en) Stacked workpiece posture recognition and pickup method based on RGBD camera
CN111791239B (en) Method for realizing accurate grabbing by combining three-dimensional visual recognition
CN109230580B (en) Unstacking robot system and unstacking robot method based on mixed material information acquisition
CN110580725A (en) Box sorting method and system based on RGB-D camera
CN113602799B (en) Airport luggage case carrying system and control method thereof
CN112010024B (en) Automatic container grabbing method and system based on laser and vision fusion detection
JPWO2009028489A1 (en) Object detection method, object detection apparatus, and robot system
CN113666028B (en) Garbage can detecting and grabbing method based on fusion of laser radar and camera
CN114952809A (en) Workpiece identification and pose detection method and system and grabbing control method of mechanical arm
CN114758236A (en) Non-specific shape object identification, positioning and manipulator grabbing system and method
KR20180058440A (en) Gripper robot control system for picking of atypical form package
CN111311691A (en) Unstacking method and system of unstacking robot
CN114155301A (en) Robot target positioning and grabbing method based on Mask R-CNN and binocular camera
CN115070781A (en) Object grabbing method and two-mechanical-arm cooperation system
CN116984269A (en) Gangue grabbing method and system based on image recognition
CN113021391A (en) Integrated vision robot clamping jaw and using method thereof
CN113715012A (en) Automatic assembly method and system for remote controller parts
JP7408107B2 (en) Systems and methods for robotic systems with object handling
CN110533717A (en) A kind of target grasping means and device based on binocular vision
CN115848715A (en) Disordered sorting robot, system and method
CN113731860B (en) Automatic sorting system and method for piled articles in container
CN112525157B (en) Hydraulic oil cylinder size measurement and pose estimation method and system based on video image
Rybakov et al. Application of a computer vision system for recognizing tomato fruits and determining their position relative to the gripper device of the harvesting robot
CN111360822B (en) Vision-based method for grabbing space cube by manipulator

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant