CN108942946A - Smart logistics environment robot loading method and device - Google Patents

Smart logistics environment robot loading method and device

Info

Publication number
CN108942946A
CN108942946A (application CN201810995897.4A); granted as CN108942946B
Authority
CN
China
Prior art keywords
cargo
wolf
loading area
sorting machine
firefly
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810995897.4A
Other languages
Chinese (zh)
Other versions
CN108942946B (en)
Inventor
刘辉
尹恒鑫
李燕飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Central South University
Original Assignee
Central South University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Central South University
Priority to CN201810995897.4A
Publication of CN108942946A
Application granted
Publication of CN108942946B
Legal status: Active
Anticipated expiration

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 — Programme-controlled manipulators
    • B25J 9/16 — Programme controls
    • B25J 9/1679 — Programme controls characterised by the tasks executed
    • B25J 9/1687 — Assembly, peg and hole, palletising, straight line, weaving pattern movement
    • B25J 9/1602 — Programme controls characterised by the control system, structure, architecture
    • B25J 9/1605 — Simulation of manipulator lay-out, design, modelling of manipulator

Abstract

The invention discloses a smart-logistics-environment robot loading method and device. The method comprises the following steps. Step 1: acquire cargo information in real time. Step 2: compute the placement coordinate of the cargo in the staging area corresponding to its delivery address. Step 3: use the sorting robot to grip the cargo from the conveyor belt into the corresponding temporary staging area. Step 4: compute the terminal coordinate of the cargo in the loading area of the intelligent AGV carrier. Step 5: carry the cargo onto the loading area of the intelligent AGV carrier according to the control matrix produced by the intelligent carrying control model. Machine vision is used to obtain the cargo size, delivery address, cargo position coordinates and related information automatically; by building neural network models, the sorting robot and the intelligent carrier classify and load the cargo automatically and intelligently, which greatly reduces the loading error rate and improves loading efficiency.

Description

Smart logistics environment robot loading method and device
Technical field
The invention belongs to the field of robotic cargo loading, and in particular relates to a smart-logistics-environment robot loading method and device.
Background art
In recent years the logistics industry has developed rapidly and "smart logistics" has attracted more and more attention. In a "smart logistics" environment, cargo loading is still generally completed manually, which makes labour costs high and loading efficiency low. Combining smart logistics with robots and realising automatic, intelligent cargo loading is a core direction for the development of the logistics industry.
Cargo loading in a smart logistics environment is an important link in the logistics management process, covering warehousing, inventory control, replenishment and picking, circulation and processing, packing, and delivery. At present these steps are poorly connected and complicated, and many operations still have to be done by hand. Integrating the logistics management process and truly combining "smart logistics" with robots is therefore necessary.
Traditional manual loading is no longer suited to the logistics industry because of drawbacks such as high cost. Having workers operate robot arms or palletising robots to load cargo leads to mis-loading, poor practicality for mixed cargo and low efficiency. Automatic loading robots now exist that obtain training samples via arm-joint dynamics equations and then load cargo with machine-learning methods, but experience shows that solutions based on dynamics equations suffer from poor stability and weak practicality. For these reasons, a more "intelligent" robot loading method and device is urgently needed.
Summary of the invention
Aiming at the defects of existing cargo loading approaches, the present invention proposes a smart-logistics-environment robot loading method and device. Vision is fully used to obtain cargo size and delivery-address information automatically, and a sorting robot and an intelligent carrier, driven by neural network models, classify and load the cargo, which greatly reduces the loading error rate and improves loading efficiency.
A smart-logistics-environment robot loading method comprises the following steps:
Step 1: acquire in real time, for the cargo to be loaded on the conveyor belt, the cargo size, the delivery address, the distance between the cargo and the sorting-robot arm at the instant the cargo is gripped, and the angle between the cargo-to-sorting-robot-arm line and the perpendicular to the conveyor belt at the cargo;
These quantities (cargo size, delivery address, gripping-instant distance to the sorting-robot arm, and the angle between the cargo-to-arm line and the conveyor perpendicular) are obtained by visual recognition with the ZED camera mounted on the sorting robot;
Step 2: from the cargo size and the delivery address, compute the placement coordinate of the cargo in the staging area (the to-be-loaded area) corresponding to that delivery address, and assign the cargo a unique number;
The staging-area coordinate system takes the lower-left corner of the staging area as the origin, the horizontal direction to the right as the X axis, the direction straight ahead as the Y axis and the vertical direction as the Z axis; the vertex of the cargo's lower surface that is nearest the X axis and farthest from the Y axis is taken as the cargo placement point;
Step 3: based on the obtained staging-area placement coordinate, use the sorting robot to grip the cargo from the conveyor belt into the corresponding temporary staging area, and send the numbers of the cargo newly stored in the staging area to the server;
Step 4: if the number of cargo items in the staging area exceeds 10, compute, from the sizes and stored positions of all cargo temporarily held in the staging area, the terminal coordinate of each item in the loading area of the intelligent AGV carrier; otherwise keep waiting for cargo to be loaded and return to Step 1;
The loading-area coordinate system takes the lower-left corner of the loading area as the origin, the horizontal direction to the right as the A axis, the direction straight ahead as the B axis and the vertical direction as the C axis; the vertex of the cargo's lower surface that is nearest the A axis and farthest from the B axis is taken as the cargo's terminal coordinate;
Step 5: according to the cargo information recorded in the server, issue a loading instruction to the intelligent AGV carrier; feed the cargo size, the staging-area placement coordinate and the terminal coordinate in the AGV loading area of each item in the staging area into the intelligent carrying control model to obtain the loading-arm joint control matrix Q2 of the intelligent AGV carrier, and carry the cargo onto the loading area of the AGV according to the obtained control matrix Q2;
The loading-arm joint control matrix Q2 of the intelligent AGV carrier has size N2×M2, where N2 is the number of loading-arm joints and M2 is the number of joint-position changes of the loading arm over the whole carrying process;
The intelligent carrying control model takes, as input data, the staging-area placement coordinate (x, y, z) of the cargo, its terminal coordinate (a, b, c) in the AGV loading area and the cargo size, and, as output data, the joint control matrix of the loading arm while the cargo is carried, and is obtained by training a grey neural network;
The joint control matrix Q2 contains, for every joint of the loading arm and every motion instant of the carrying process, the coordinate pair (α2, θ2), where α2 is the angle between the two arm segments connected at the joint and θ2 is the angle the joint has rotated from its starting position;
Training of the intelligent carrying control model proceeds as follows: the loading arm is driven from a manually controlled server to carry and load various cargo, producing loading training data; the staging-area placement coordinate (x, y, z), the AGV loading-area terminal coordinate (a, b, c) and the cargo size from these data are used as input, and the joint control matrix of the loading arm during the carrying as output, to train a grey neural network with 9 input nodes, 19 hidden nodes and N2 output nodes (the number of loading-arm joints); the maximum number of iterations is set to 500, the learning rate to 0.01 and the threshold to 0.05. A simplified sketch of this training set-up is given below.
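A minimal sketch of the set-up just described, under two assumptions not stated in the patent: the grey neural network is stood in for by a plain feed-forward network (its internal structure is not given), and N2 = 6 joints with synthetic training pairs are used purely for illustration. The stated layer sizes (9-19-N2), 500 iterations and learning rate 0.01 are kept; each network output is one column of Q2 (the joint values at one motion instant), and stacking M2 instants gives the full matrix.

import numpy as np

N2, HIDDEN, LR, EPOCHS = 6, 19, 0.01, 500      # N2 = 6 is an assumed example value

rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 0.1, (HIDDEN, 9)), np.zeros(HIDDEN)
W2, b2 = rng.normal(0, 0.1, (N2, HIDDEN)), np.zeros(N2)

def forward(x):
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2, h

# Hypothetical training pairs recorded while an operator drives the loading arm:
# each row is (x, y, z, a, b, c, l, w, h) -> N2 joint values at one instant.
X = rng.uniform(0, 1, (200, 9))
Y = rng.uniform(-np.pi, np.pi, (200, N2))

for _ in range(EPOCHS):
    for x, y in zip(X, Y):
        pred, h = forward(x)
        err = pred - y                          # gradient of squared error w.r.t. output
        W2 -= LR * np.outer(err, h); b2 -= LR * err
        dh = (W2.T @ err) * (1 - h ** 2)        # back-propagate through tanh hidden layer
        W1 -= LR * np.outer(dh, x); b1 -= LR * dh

Q2_column, _ = forward(X[0])                    # one column of Q2: joints at one instant
print(Q2_column)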
Further, the weights and thresholds of the grey neural network in the intelligent carrying control model are selected by optimisation with the firefly algorithm, as follows (a compact code sketch is given after step C6):
Step C1: initialise the firefly population and set the firefly parameters;
Each firefly position encodes a set of weights and thresholds of the grey neural network; the population is initialised at random;
The number of fireflies is chosen in [10, 400], the maximum attraction is β0 = 1, the light-absorption coefficient γ is chosen in [0.002, 200], the step factor α in [0.01, 1], the maximum number of iterations T in [300, 2000] and the search precision ε in [0.001, 0.1];
Step C2: set the fitness function and determine the initial brightest firefly position; set the iteration counter t = 1;
The weights and thresholds corresponding to each firefly position are substituted into the grey neural network, the resulting intelligent carrying control model is used to compute the loading-arm joint coordinates, and the first fitness function is the reciprocal of one plus the summed difference E between predicted and actual joint coordinates: f1(x) = 1/(ΣE + 1);
The fitness of every firefly position is computed with this function, and the position with the highest fitness is taken as the initial brightest firefly position;
Step C3: compute the relative brightness I and the attraction β of the fireflies in the population, and determine the movement direction of each firefly from the relative brightness;
The relative brightness of a firefly is I = I0·e^(−γ·rij), where I0 is the brightness of the brightest firefly, γ is the light-absorption coefficient and rij is the distance between fireflies i and j;
The attraction of a firefly is β = β0·e^(−γ·rij²), where β0 is the maximum attraction;
Step C4: update the firefly positions, and let the brightest firefly move randomly:
xi(t+1) = xi(t) + β(xj(t) − xi(t)) + α(rand − 1/2)
where xi(t) and xj(t) are the positions of fireflies i and j, α is the step factor and rand is a random factor uniformly distributed on [0, 1];
Step C5: compute the fitness of every firefly position in the current population;
The fireflies are sorted by fitness and the position with the highest fitness is taken as the new brightest firefly position;
Step C6: check whether the maximum number of iterations or the maximum search precision has been reached; if so, select the brightest firefly and output the best weights and thresholds of the grey neural network corresponding to its position, which gives the intelligent carrying control model; otherwise set t = t + 1 and return to step C3 for the next iteration.
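A compact sketch of the search loop in steps C1-C6. The fitness callback is a stand-in for f1 = 1/(ΣE + 1) from step C2, the parameter values are picked from the stated ranges, and the stopping test is illustrative.

import numpy as np

def firefly_optimize(fitness, dim, n=20, beta0=1.0, gamma=1.0, alpha=0.2,
                     max_iter=300, eps=0.01):
    """Steps C1-C6: n fireflies encode candidate weight/threshold vectors."""
    rng = np.random.default_rng(1)
    pos = rng.uniform(-1, 1, (n, dim))                 # C1: random initial population
    fit = np.array([fitness(p) for p in pos])          # C2: initial brightness
    for t in range(max_iter):
        for i in range(n):
            for j in range(n):
                if fit[j] > fit[i]:                    # j is brighter, so i moves toward j
                    r = np.linalg.norm(pos[i] - pos[j])
                    beta = beta0 * np.exp(-gamma * r ** 2)          # C3: attraction
                    pos[i] += beta * (pos[j] - pos[i]) \
                              + alpha * (rng.random(dim) - 0.5)     # C4: position update
                    fit[i] = fitness(pos[i])
        best = int(np.argmax(fit))                     # C5: rank the population
        pos[best] += alpha * (rng.random(dim) - 0.5)   # C4: brightest firefly moves randomly
        fit[best] = fitness(pos[best])
        if fit.max() >= 1.0 - eps:                     # C6: illustrative stopping test
            break
    return pos[int(np.argmax(fit))]

# Toy usage: the lambda stands in for f1 computed from the grey-network error.
best = firefly_optimize(lambda w: 1.0 / (1.0 + np.sum((w - 0.3) ** 2)), dim=5)
print(best)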
Further, the placement coordinate of the cargo in the staging area corresponding to its delivery address is computed as follows (a simplified code sketch is given after step A8):
Step A1: let the maximum extents of the staging area along the X, Y and Z axes be xmax, ymax and zmax, and set xd = xmax, yd = 0, zd = 0;
Step A2: check whether cargo already occupies the staging-area position (x, y, z) = (xd, yd, zd); if so go to step A3, otherwise go to step A8;
Step A3: let l(xd), l(yd), l(zd) be the lengths along the X, Y and Z axes of the cargo occupying position (xd, yd, zd);
Step A4: check whether l(yd) + yd > ymax; if so go to step A5, otherwise set yd = l(yd) + yd and return to step A2;
Step A5: check whether l(zd) + zd > zmax and ymax − l(yd) = 0 hold simultaneously; if so go to step A6; if only l(zd) + zd > zmax holds, go to step A4; otherwise set zd = l(zd) + zd and return to step A2;
Step A6: check whether xd − l(xd) < 0 and l(zd) + zd > zmax hold simultaneously; if so go to step A7; if only xd − l(xd) < 0 holds, go to step A4; otherwise set xd = xd − l(xd) and return to step A2;
Step A7: stop the calculation and issue a staging-area-full warning;
Step A8: check whether the length and width of the cargo to be placed are both smaller than the length and width of the cargo directly below this position; if so, take this position as the placement coordinate of the cargo to be placed; otherwise add the Y-axis length of the cargo directly below to yd as the new yd and return to step A2.
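A simplified reading of the placement search in steps A1-A8, sketched as a scan over an axis-aligned staging area. The scan order (Y first, then Z, then decreasing X) follows steps A4-A6; the step size, helper names and the omission of the step-A8 support test are simplifications, not taken from the patent.

def find_placement(occupied, box, area_dims, step=1.0):
    """occupied: list of (x, y, z, l, w, h) boxes already in the staging area.
    box: (l, w, h) of the incoming cargo.  area_dims: (xmax, ymax, zmax).
    Returns a placement point (x, y, z), or None if the area is full (step A7)."""
    xmax, ymax, zmax = area_dims
    l, w, h = box

    def collides(x, y, z):
        for (ox, oy, oz, ol, ow, oh) in occupied:
            if (x < ox + ol and ox < x + l and
                y < oy + ow and oy < y + w and
                z < oz + oh and oz < z + h):
                return True
        return False

    x = xmax - l                       # A1: start from the far X side
    while x >= 0:
        z = 0.0
        while z + h <= zmax:           # A5: climb in Z once a Y sweep is exhausted
            y = 0.0
            while y + w <= ymax:       # A4: sweep along Y
                if not collides(x, y, z):
                    return (x, y, z)   # A8 (the support test is omitted in this sketch)
                y += step
            z += step
        x -= step                      # A6: move back along X
    return None                        # A7: staging area full

# toy usage with one box already stored
print(find_placement([(9.0, 0.0, 0.0, 1.0, 2.0, 1.0)], (1.0, 1.0, 1.0), (10.0, 5.0, 3.0)))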
Further, the terminal coordinate of the cargo in the loading area of the intelligent AGV carrier is computed as follows (the placement sketch above can be reused; see after step B10):
Step B1: in the server's virtual environment, select the cargo with the largest upper-surface area in the staging area and obtain its number and its length, width and height; go to step B2;
Step B2: let the maximum extents of the loading area along the A, B and C axes be amax, bmax and cmax, and set ad = amax, bd = 0, cd = 0;
Step B3: check whether cargo already occupies the loading-area position (a, b, c) = (ad, bd, cd); if so go to step B4, otherwise go to step B9;
Step B4: let l(ad), l(bd), l(cd) be the lengths along the A, B and C axes of the cargo occupying position (ad, bd, cd);
Step B5: check whether l(bd) + bd > bmax; if so go to step B6, otherwise set bd = l(bd) + bd and return to step B3;
Step B6: check whether l(cd) + cd > cmax and bmax − l(bd) = 0 hold simultaneously; if so go to step B7; if only l(cd) + cd > cmax holds, go to step B5; otherwise set cd = l(cd) + cd and return to step B3;
Step B7: check whether ad − l(ad) < 0 and l(cd) + cd > cmax hold simultaneously; if so go to step B8; if only ad − l(ad) < 0 holds, go to step B5; otherwise set ad = ad − l(ad) and return to step B3;
Step B8: stop the calculation; the terminal coordinate in the loading area of each numbered cargo item has been obtained;
Step B9: check whether the length and width of the cargo to be placed are both smaller than the length and width of the cargo directly below this position; if so, take this position as the placement coordinate of this numbered cargo item and go to step B10; otherwise add the B-axis length of the cargo directly below to bd as the new bd and return to step B3;
Step B10: check whether every numbered cargo item in the staging area now has a terminal coordinate; if so, the terminal-coordinate calculation is complete; otherwise delete from the server's virtual environment the cargo whose terminal coordinates have already been computed, update the upper-surface cargo information of the staging area and return to step B1.
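Steps B1-B10 mirror steps A1-A8 in the AGV's (A, B, C) frame, so under the same simplified reading the find_placement sketch above can be reused, taking the cargo largest-upper-surface-first as in step B1. The dimensions below are illustrative.

staged = [(2.0, 1.5, 1.0), (1.0, 1.0, 1.0)]           # (l, w, h) of staged cargo
staged.sort(key=lambda b: b[0] * b[1], reverse=True)   # B1: largest footprint first

placed, targets = [], {}
for i, box in enumerate(staged):
    spot = find_placement(placed, box, area_dims=(4.0, 3.0, 2.5))
    if spot is None:                                   # loading area full
        break
    targets[i] = spot                                  # terminal coordinate (a, b, c)
    placed.append((*spot, *box))
print(targets)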
Further, gripping cargo from the conveyor belt into the corresponding temporary staging area with the sorting robot means: the cargo size obtained with the ZED camera on the sorting robot, the distance between the cargo and the sorting-robot arm at the gripping instant, the angle between the cargo-to-arm line and the conveyor perpendicular, and the staging-area placement coordinate of the cargo are fed as input data into the classification gripping model, which outputs the sorting-robot arm joint control matrix Q1 for the gripping operation; the cargo is then gripped from the conveyor belt into the staging area according to the obtained matrix Q1;
The sorting-robot arm joint control matrix Q1 has size N1×M1, where N1 is the number of sorting-robot arm joints and M1 is the number of joint-position changes of the arm over the whole gripping process;
Q1 contains, for every joint of the sorting-robot arm and every motion instant of the gripping process, the coordinate pair (α1, θ1), where α1 is the angle between the two arm segments connected at the joint and θ1 is the angle the joint has rotated from its starting position;
The classification gripping model takes, as input data, the cargo size, the gripping-instant distance between the cargo and the sorting-robot arm, the angle between the cargo-to-arm line and the conveyor perpendicular, and the staging-area placement coordinate (x, y, z), and, as output data, the joint control matrices of the sorting robot during gripping, and is obtained by training a wavelet neural network;
Training of the classification gripping model proceeds as follows: the sorting robot is first driven from a manually controlled server to grip various cargo, producing gripping training data; the cargo size, gripping-instant distance, line-to-perpendicular angle and staging-area placement coordinate (x, y, z) from these data are the inputs of the wavelet neural network, and the joint control matrices of the sorting robot during gripping are its outputs; the network has 8 input nodes, 17 hidden nodes and N1 output nodes (the number of sorting-robot arm joints), the maximum number of iterations is 600, the learning rate 0.01 and the threshold 0.05, and the weights, thresholds and scaling/translation coefficients of the wavelet neural network are selected by optimisation with the wolf pack algorithm. A minimal forward-pass sketch of such a wavelet network is given below.
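A minimal forward-pass sketch of a wavelet neural network with the stated 8-17-N1 layout. The Morlet mother wavelet, the assumed joint count N1 = 6 and the random parameter values are assumptions for illustration; the patent does not specify the wavelet or the joint count.

import numpy as np

def morlet(x):
    # A common mother-wavelet choice; the patent does not name the wavelet used.
    return np.cos(1.75 * x) * np.exp(-x ** 2 / 2.0)

def wnn_forward(x, W1, a, b, W2):
    """x: the 8 inputs (l, w, h, L, mu, x, y, z) of the gripping-information vector.
    W1: (17, 8) input weights; a, b: (17,) scaling / translation coefficients;
    W2: (N1, 17) output weights.  Returns N1 joint values for one motion instant."""
    net = W1 @ x
    hidden = morlet((net - b) / a)      # wavelet nodes stretched/shifted by a, b
    return W2 @ hidden

rng = np.random.default_rng(2)
N1 = 6                                  # assumed joint count, for illustration only
params = (rng.normal(0, 0.1, (17, 8)), np.ones(17), np.zeros(17),
          rng.normal(0, 0.1, (N1, 17)))
print(wnn_forward(rng.uniform(0, 1, 8), *params))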
Further, the weights, thresholds and scaling/translation coefficients of the wavelet neural network in the classification gripping model are selected by optimisation with the wolf pack algorithm, as follows (a compact code sketch is given after step E7):
Step E1: initialise the wolf pack and set the pack parameters;
The pack size is chosen in [5, 130], the step factor in [900, 3000], the scout-wolf proportion factor in [3, 10], the maximum number of wanderings in [5, 30], the distance-judgement factor in [100, 400], the maximum number of raids in [6, 20], the update scale factor in [2, 30], the maximum number of iterations in [100, 1500] and the maximum search precision in [0.001, 0.2];
Step E2: set the fitness function and determine the initial optimal lead-wolf position; set the iteration counter t = 1;
The parameter values corresponding to each individual wolf position are substituted in turn into the classification gripping model; the second fitness function is the reciprocal of the mean squared error MSE between the model output and the actual values: f2(x) = 1/MSE;
The fitness of every wolf position is computed with this function, and the position with the highest fitness is taken as the initial optimal lead-wolf position;
Step E3: scout wolves wander;
The wolf with the highest fitness is chosen as the lead wolf and scout wolves are selected at random; each scout wolf evaluates the fitness in every direction and moves in the direction of highest fitness; wandering ends when a scout wolf's fitness exceeds the lead wolf's or the maximum number of wanderings is reached;
Step E4: fierce wolves raid;
All wolves other than the lead wolf and the scout wolves are fierce wolves; they raid towards the lead wolf, continually evaluating the fitness of their positions;
If a fierce wolf's fitness exceeds the lead wolf's, the lead wolf is updated and the remaining fierce wolves raid towards the new lead wolf; a fierce wolf stops when its distance to the current lead wolf falls below the judgement distance; when all fierce wolves have stopped or the maximum number of raids is reached, the raid ends and the pack enters the siege state;
Step E5: the pack lays siege;
All wolves except the lead wolf take one step towards the lead wolf; each wolf compares the fitness of the stepped position with that of its current position and keeps whichever is better;
Step E6: after the siege, all wolves are sorted by current fitness from high to low; the wolf with the highest fitness becomes the lead wolf, the lowest-ranked artificial wolves are eliminated, and the same number of new artificial wolves are generated at random;
Step E7: when the maximum search precision or the maximum number of iterations is reached, output the best weights, thresholds and scaling/translation coefficients of the wavelet neural network corresponding to the latest lead wolf, which gives the classification gripping model; otherwise set t = t + 1 and return to step E3 for the next iteration.
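A compact sketch of the wander / raid / siege loop in steps E1-E7. The fitness callback is a stand-in for f2 = 1/MSE from step E2; the wolf counts, step sizes and stopping test are illustrative choices within the stated ranges, not the patent's exact update rules.

import numpy as np

def wolf_pack_optimize(fitness, dim, n=30, scouts=6, wander=10, raids=10,
                       siege_step=0.05, cull=4, max_iter=200, eps=1e-3):
    rng = np.random.default_rng(3)
    pos = rng.uniform(-1, 1, (n, dim))                          # E1: initialise the pack
    for t in range(max_iter):
        fit = np.array([fitness(p) for p in pos])
        lead = int(np.argmax(fit))                              # E2/E3: pick the lead wolf
        others = [k for k in range(n) if k != lead]
        for i in rng.choice(others, scouts, replace=False):     # E3: scout wolves wander
            for _ in range(wander):
                trial = pos[i] + 0.1 * rng.standard_normal(dim)
                if fitness(trial) > fitness(pos[i]):
                    pos[i] = trial
        for _ in range(raids):                                  # E4: fierce wolves raid
            pos += 0.2 * (pos[lead] - pos)                      # rush toward the lead wolf
            lead = int(np.argmax([fitness(p) for p in pos]))    # lead wolf may change
        for i in range(n):                                      # E5: siege with small test steps
            trial = pos[i] + siege_step * np.sign(pos[lead] - pos[i])
            if fitness(trial) > fitness(pos[i]):
                pos[i] = trial
        order = np.argsort([fitness(p) for p in pos])           # E6: cull the weakest wolves
        pos[order[:cull]] = rng.uniform(-1, 1, (cull, dim))
        if fitness(pos[lead]) > 1.0 / eps:                      # E7: illustrative precision test
            break
    return pos[int(np.argmax([fitness(p) for p in pos]))]

# Toy usage: the lambda stands in for f2 computed from the gripping-model MSE.
best = wolf_pack_optimize(lambda w: 1.0 / (1e-6 + np.sum((w - 0.2) ** 2)), dim=4)
print(best)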
Further, the delivery address is obtained either by recognising the bar code on the cargo with the ZED camera or by recognising the address characters on the cargo with the ZED camera;
Address-character recognition proceeds as follows (a preprocessing sketch follows the training description below):
Step D1: read the image captured by the ZED camera and convert it to grayscale and then to binary;
Step D2: apply skew correction to the image from step D1 and smooth the corrected image with filtering;
Step D3: extract the character region of the filtered image and segment it into single characters, giving one image matrix per character;
Step D4: feed the segmented single-character image matrices one by one into the trained Elman-neural-network character recognition model;
Step D5: assemble the output characters in recognition order and compare the result with an address base containing first-, second- and third-level addresses to obtain the recognised address;
The Elman-network character recognition model is trained as follows: images with known address information are processed according to steps D1–D3 to obtain single-character image matrices; each matrix is used as input data and the corresponding character name as output data; the network has 2 input nodes, 5 hidden nodes and 1 output node; the maximum number of iterations in training is 1000, the learning rate 0.01 and the threshold 0.02.
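A sketch of the preprocessing in steps D1-D3 using OpenCV. The Otsu binarisation, minimum-area-rectangle deskew and contour-based character split are common heuristics chosen here for illustration; the patent does not prescribe them, and the Elman recogniser of step D4 is only referenced in a comment.

import cv2
import numpy as np

def preprocess_and_segment(img_bgr):
    """Steps D1-D3: grayscale + binarise, skew-correct and smooth, then split the
    character region into single-character image patches."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)                    # D1: grayscale
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)  # D1: binarise
    coords = np.column_stack(np.where(binary > 0)).astype(np.float32)
    angle = cv2.minAreaRect(coords)[-1]                                 # D2: estimate skew
    angle = angle - 90 if angle > 45 else angle                         # heuristic unwrap
    h, w = binary.shape
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    deskewed = cv2.warpAffine(binary, M, (w, h))                        # D2: rotate back
    deskewed = cv2.medianBlur(deskewed, 3)                              # D2: smooth
    contours, _ = cv2.findContours(deskewed, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)             # D3: split characters
    boxes = sorted((cv2.boundingRect(c) for c in contours), key=lambda b: b[0])
    return [deskewed[y:y + bh, x:x + bw] for (x, y, bw, bh) in boxes]

# Step D4 would resize each patch to the trained input size and feed it to the
# Elman-network recogniser; that model is outside this sketch.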
Further, a first-level address is any province, autonomous region, municipality or special administrative region; a second-level address is any region, league, autonomous prefecture or prefecture-level city; a third-level address is any county, autonomous county, banner, autonomous banner, county-level city, municipal district, forest district or special district. A toy lookup against such a three-level address base is sketched below.
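A minimal sketch of the step-D5 comparison: the recognised character string is matched against a three-level address base. The two sample entries and the substring matching are purely illustrative.

ADDRESS_BASE = {
    "Hunan": {"Changsha": ["Yuelu", "Furong"]},
    "Guangdong": {"Shenzhen": ["Nanshan", "Futian"]},
}

def match_address(recognised: str):
    for first, prefectures in ADDRESS_BASE.items():      # first-level address f_p
        if first not in recognised:
            continue
        for second, counties in prefectures.items():     # second-level address s_p
            if second in recognised:
                third = next((c for c in counties if c in recognised), None)
                return first, second, third              # (f_p, s_p, t_p)
    return None

print(match_address("Hunan Changsha Yuelu 932 Lushan South Road"))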
A smart-logistics-environment robot loading device comprises:
a server, which stores the cargo information of the staging area and the loading area, builds their virtual environments and performs the data computations;
a conveyor belt, which transports the various cargo;
a sorting robot fitted with a multi-joint arm, which grips cargo from the conveyor belt into the corresponding temporary staging area using the method described above;
a ZED camera mounted on the sorting robot, which visually recognises the cargo size, the gripping-instant distance between the cargo and the sorting-robot arm, and the angle between the cargo-to-arm line and the conveyor perpendicular;
a staging area, which temporarily holds the cargo gripped from the conveyor belt by the sorting robot;
a Kinect camera mounted directly above the staging area so that it overlooks the whole staging area, which visually recognises the position and size-edge information of every cargo item stored there and performs numbered position tracking of the cargo;
an intelligent AGV carrier comprising a loading arm, a loading area and an SR200 camera, where the loading arm carries cargo from the staging area onto the loading area using the method described above, the loading area stores the cargo carried from the staging area, and the SR200 camera is installed so that it overlooks the whole loading area and recognises the positions and size-edge information of all cargo in the loading area.
Beneficial effects
The invention provides a smart-logistics-environment robot loading method and device. The method comprises: Step 1, acquire cargo information in real time; Step 2, compute the staging-area placement coordinate of the cargo corresponding to its delivery address; Step 3, based on the obtained placement coordinate, grip the cargo from the conveyor belt into the corresponding temporary staging area with the sorting robot; Step 4, if more than 10 items are in the staging area, compute the terminal coordinate of the cargo in the loading area of the intelligent AGV carrier, otherwise keep waiting for cargo; Step 5, according to the cargo information recorded in the server, issue a loading instruction to the intelligent AGV carrier, feed the cargo size, staging-area placement coordinate and AGV loading-area terminal coordinate of the cargo in the staging area into the intelligent carrying control model to obtain the loading-arm joint control matrix of the AGV, and carry the cargo onto its loading area according to the obtained matrix. The whole device is simple in structure and easy to operate, realises an essentially unmanned, intelligent loading process, and has great value for wider adoption.
The invention combines machine vision to obtain the cargo size, delivery address, cargo position coordinates and related information automatically; staging areas are assigned per delivery address so that cargo can be carried and loaded conveniently; an intelligent AGV carrier fitted with a loading arm, an SR200 camera and a loading area transports the cargo, greatly improving loading efficiency. A neural-network gripping classification model built from the conveyor-belt and sorting-robot position information and the staging-area placement coordinates grips cargo in different positions and states, improving gripping accuracy. An intelligent carrying control model built on the mapping from staging-area placement coordinates and loading-area terminal coordinates to the joint control matrices of the loading arm improves gripping efficiency for different start and end positions. The whole smart-logistics-environment robot loading method and device make the loading process automatic and intelligent, greatly reducing labour, lowering the loading error rate and improving loading efficiency.
Brief description of the drawings
Fig. 1 is a flow diagram of the method of the invention;
Fig. 2 is a structural schematic diagram of the device of the invention.
Detailed description of embodiments
The invention is described further below in conjunction with the drawings and embodiments.
As shown in Fig. 1, a smart-logistics-environment robot loading method comprises the following steps:
Step 1: acquire in real time, for the cargo to be loaded on the conveyor belt, the cargo size (l, w, h), the delivery address (f_p, s_p, t_p), the distance L between the cargo and the sorting-robot arm at the instant the cargo is gripped, and the angle μ between the cargo-to-sorting-robot-arm line and the perpendicular to the conveyor belt at the cargo;
These quantities are obtained by visual recognition with the ZED camera mounted on the sorting robot;
The cargo size comprises the length l, width w and height h of the cargo;
The delivery address comprises the first-level address f_p, the second-level address s_p and the third-level address t_p; the first-level address is the province, autonomous region, municipality or special administrative region of the destination; the second-level address is the region, league, autonomous prefecture or prefecture-level city; the third-level address is the county, autonomous county, banner, autonomous banner, county-level city, municipal district, forest district or special district;
The distance L between the cargo and the sorting-robot arm at the gripping instant is the distance, determined by the ZED camera at that instant, between the cargo centre and the centre of the sorting-robot base;
The angle μ is the angle, determined by the ZED camera at the gripping instant, between the line joining the cargo centre to the centre of the sorting-robot base and the perpendicular to the conveyor's transport direction at the cargo (a small geometric sketch follows);
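A small sketch of how μ (and L) could be computed from the ZED pose estimates with a standard dot-product formula. The top-down 2-D frame, the function name and the example values are assumptions; the patent only states that the camera determines the distance and the angle.

import numpy as np

def grip_angle_mu(cargo_center, base_center, belt_direction):
    """Angle between (cargo centre -> sorting-robot base centre) and the line
    perpendicular to the conveyor's transport direction, in an assumed top-down frame."""
    v = np.asarray(base_center, float) - np.asarray(cargo_center, float)
    d = np.asarray(belt_direction, float)
    perp = np.array([-d[1], d[0]])              # perpendicular to the transport direction
    cos_mu = abs(perp @ v) / (np.linalg.norm(perp) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos_mu, -1.0, 1.0))))

# L would simply be np.linalg.norm(v); example values are illustrative:
print(grip_angle_mu(cargo_center=(0.4, 0.1), base_center=(0.0, 0.6), belt_direction=(1.0, 0.0)))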
The delivery address is obtained either by recognising the bar code on the cargo with the ZED camera or by recognising the address characters on the cargo with the ZED camera;
Address-character recognition follows steps D1–D5 described above: the captured image is converted to grayscale and binarised (D1), skew-corrected and smoothed by filtering (D2), and segmented into single-character image matrices (D3); these are fed one by one into the trained Elman-network character recognition model (D4); the output characters are assembled in recognition order and compared with the first-/second-/third-level address base to obtain the recognised address (D5); the Elman network is trained on single-character image matrices from images with known addresses, with 2 input nodes, 5 hidden nodes and 1 output node, a maximum of 1000 iterations, a learning rate of 0.01 and a threshold of 0.02;
Step 2: from the cargo size (l, w, h) and the delivery address (f_p, s_p, t_p), compute the placement coordinate of the cargo in the staging area corresponding to that delivery address, assign the cargo a unique number i, and form the cargo gripping-information vector (li, wi, hi, Li, μi, xi, yi, zi), whose entries are the length li, width wi and height hi of cargo i, the gripping-instant distance Li between the cargo and the sorting-robot arm, the angle μi between the cargo-to-arm line and the conveyor perpendicular, and the staging-area placement coordinate (xi, yi, zi) (a small container sketch follows);
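A small container for the gripping-information vector just defined. The field names and example values are illustrative; the patent only gives the tuple (l, w, h, L, μ, x, y, z) plus the cargo number.

from dataclasses import dataclass

@dataclass
class GrippingInfo:
    number: int                         # unique cargo number i
    l: float; w: float; h: float        # cargo size
    L: float                            # cargo-centre to robot-base distance at the gripping instant
    mu: float                           # angle to the conveyor perpendicular
    x: float; y: float; z: float        # staging-area placement coordinate

    def model_input(self):
        """The 8 values fed to the classification gripping model (the number is not an input)."""
        return [self.l, self.w, self.h, self.L, self.mu, self.x, self.y, self.z]

print(GrippingInfo(1, 0.3, 0.2, 0.2, 1.1, 15.0, 9.7, 0.0, 0.0).model_input())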
The staging area is a rectangular region; a Kinect camera mounted above it points downward and its field of view covers the whole staging area. The staging-area coordinate system takes the lower-left corner as the origin, the horizontal direction to the right as the X axis, the direction straight ahead as the Y axis and the vertical direction as the Z axis, and the vertex of the cargo's lower surface nearest the X axis and farthest from the Y axis is the cargo placement point. The staging-area placement coordinate of the cargo for its delivery address is computed exactly as in steps A1–A8 described above;
Step 3: based on the obtained staging-area placement coordinate, use the sorting robot to grip the cargo from the conveyor belt into the corresponding temporary staging area, and send the numbers of the cargo newly stored in the staging area to the server;
Gripping the cargo from the conveyor belt into the staging area means feeding the gripping-information vector (li, wi, hi, Li, μi, xi, yi, zi) — the cargo size, gripping-instant distance, line-to-perpendicular angle and staging-area placement coordinate obtained with the ZED camera on the sorting robot — into the classification gripping model as input data to obtain the sorting-robot arm joint control matrix Q1 for the gripping operation, and then gripping the cargo from the conveyor belt into the staging area according to the obtained matrix Q1;
Q1 has size N1×M1, where N1 is the number of sorting-robot arm joints and M1 is the number of joint-position changes of the arm over the whole gripping process; it contains, for every joint and every motion instant of the gripping process, the pair (α1, θ1), where α1 is the angle between the two arm segments connected at the joint and θ1 is the angle the joint has rotated from its starting position;
The classification gripping model is a wavelet neural network with 8 input nodes, 17 hidden nodes and N1 output nodes, trained as described above (maximum 600 iterations, learning rate 0.01, threshold 0.05) on gripping data recorded while the sorting robot is driven from a manually controlled server, with the weights, thresholds and scaling/translation coefficients selected by the wolf pack algorithm;
The weights, thresholds and scaling/translation coefficients of the wavelet neural network in the classification gripping model are selected by optimisation with the wolf pack algorithm exactly as in steps E1–E7 described above;
Step 4: if the number of cargo items in the staging area exceeds 10, compute, from the sizes and stored positions of all cargo temporarily held in the staging area, the terminal coordinate of each item in the loading area of the intelligent AGV carrier; otherwise keep waiting for cargo to be loaded and return to Step 1;
The cargo size, the staging-area placement coordinate and the terminal coordinate in the AGV loading area form the cargo carrying-information vector (li, wi, hi, xi, yi, zi, ai, bi, ci), whose entries are the length li, width wi and height hi of cargo i, its staging-area placement coordinate (xi, yi, zi) and its terminal coordinate (ai, bi, ci) in the loading area of the intelligent AGV carrier;
The loading area on the intelligent AGV carrier is a rectangular region; a loading arm and an SR200 camera are installed at designated positions so that the arm's operating range covers the whole staging area and loading area and the camera's field of view covers the whole loading area. The loading-area coordinate system takes the lower-left corner as the origin, the horizontal direction to the right as the A axis, the direction straight ahead as the B axis and the vertical direction as the C axis, and the vertex of the cargo's lower surface nearest the A axis and farthest from the B axis is the cargo's terminal coordinate;
The terminal coordinate of the cargo in the AGV loading area is computed exactly as in steps B1–B10 described above;
Step 5: according to the cargo information recorded in the server, issue a loading instruction to the intelligent AGV carrier; feed the cargo size, the staging-area placement coordinate and the AGV loading-area terminal coordinate of each item in the staging area into the intelligent carrying control model to obtain the loading-arm joint control matrix Q2 of the intelligent AGV carrier, and carry the cargo onto the loading area of the AGV according to the obtained matrix Q2;
The matrix Q2, the intelligent carrying control model and its training (a grey neural network with 9 input nodes, 19 hidden nodes and N2 output nodes, a maximum of 500 iterations, a learning rate of 0.01 and a threshold of 0.05, trained on carrying data recorded while the loading arm is driven from a manually controlled server) are as described above;
The weights and thresholds of the grey neural network in the intelligent carrying control model are selected by optimisation with the firefly algorithm exactly as in steps C1–C6 described above.
As shown in Fig. 2, a smart-logistics-environment robot loading device comprises:
a server, which stores the cargo information of the staging area and the loading area, builds their virtual environments and performs the data computations;
a conveyor belt, which transports the various cargo;
a sorting robot fitted with a multi-joint arm, which grips cargo from the conveyor belt into the corresponding temporary staging area using the method described above;
a ZED camera mounted on the sorting robot, which visually recognises the cargo size, the gripping-instant distance between the cargo and the sorting-robot arm, and the angle between the cargo-to-arm line and the conveyor perpendicular;
a staging area, which temporarily holds the cargo gripped from the conveyor belt by the sorting robot;
a Kinect camera mounted directly above the staging area so that it overlooks the whole staging area, which visually recognises the position and size-edge information of every cargo item stored there and performs numbered position tracking of the cargo;
an intelligent AGV carrier comprising a loading arm, a loading area and an SR200 camera, where the loading arm carries cargo from the staging area onto the loading area using the method described above, the loading area stores the carried cargo, and the SR200 camera is installed so that it overlooks the whole loading area and recognises the positions and size-edge information of all cargo in the loading area.
The specific embodiments described herein are merely illustrative of the spirit of the invention. Those skilled in the art to which the invention belongs may make various modifications, additions or similar substitutions to the described embodiments without departing from the spirit of the invention or exceeding the scope of the appended claims.

Claims (9)

1. a kind of wisdom logistics environment robot stowage, which comprises the following steps:
Step 1: obtaining cargo size to be loaded in transmission belt, Shipping Address, cargo in real time and be crawled moment in transmission belt Line and cargo-transmission belt vertical line between the distance between sorting machine human arm and cargo-sorting machine human arm Angle;
Wherein, the cargo size, Shipping Address, cargo are crawled between moment and sorting machine human arm in transmission belt Line and cargo-transmission belt vertical line angle between distance and cargo-sorting machine human arm is by being mounted on classifier ZED camera on device people is obtained by visual identity;
Step 2: according to the cargo size and the shipping address, calculate the placement position coordinates of the cargo in the to-be-loaded area corresponding to the shipping address, and assign the cargo a unique number;
a three-dimensional coordinate system of the to-be-loaded area is established with the lower-left corner of the to-be-loaded area as the origin, the rightward direction in the horizontal plane as the X axis, the upward direction as the Y axis, and the direction perpendicular to the ground pointing skyward as the Z axis, and the point of the cargo's lower surface closest to the X axis and farthest from the Y axis is taken as the cargo placement point;
Step 3: based on the obtained placement position coordinates in the to-be-loaded area, use the sorting robot to clamp the cargo from the transmission belt into the corresponding to-be-loaded area for temporary storage, and send the number of the cargo newly stored in the to-be-loaded area to the server;
Step 4: if the number of cargos in the to-be-loaded area is greater than 10, calculate, according to the sizes and storage positions of all cargos temporarily stored in the to-be-loaded area, the final position coordinates of each cargo in the loading area of the intelligent AGV carrier; otherwise, continue to wait for cargos to be loaded and return to step 1;
a three-dimensional coordinate system of the loading area is established with the lower-left corner of the loading area as the origin, the rightward direction in the horizontal plane as the A axis, the upward direction as the B axis, and the direction perpendicular to the ground pointing skyward as the C axis, and the vertex of the cargo's lower surface closest to the A axis and farthest from the B axis is taken as the cargo's final position coordinate;
Step 5: according to the cargo information recorded in the server, issue a loading instruction to the intelligent AGV carrier; input the cargo size, the placement position coordinates in the to-be-loaded area and the final position coordinates in the loading area of the intelligent AGV carrier of each cargo in the to-be-loaded area into the intelligent carrying control model to obtain the loading-arm joint control matrix Q2 of the intelligent AGV carrier, and carry the cargos onto the loading area of the intelligent AGV carrier according to the obtained control matrix Q2;
the loading-arm joint control matrix Q2 of the intelligent AGV carrier has size N2*M2, where N2 is the number of joints of the loading robotic arm and M2 is the number of joint-position changes of the loading robotic arm during the entire carrying process;
the intelligent carrying control model is obtained by training a grey neural network, taking as input data the placement position coordinates (x, y, z) in the to-be-loaded area, the final position coordinates (a, b, c) in the loading area of the intelligent AGV carrier and the size of each cargo in the to-be-loaded area, and taking as output data the joint control matrix of the loading robotic arm during the cargo-carrying process;
the loading-arm joint control matrix Q2 contains the coordinate values (α2, θ2) of each joint of the loading robotic arm at each moment of motion during carrying, where α2 denotes the angle between the two arms connected at the joint and θ2 denotes the angle through which the joint has rotated from its starting position;
the training process of the intelligent carrying control model is as follows: the loading robotic arm is manually operated through the server to carry and load various cargos, and loading training data are obtained; the grey neural network is trained with the placement position coordinates (x, y, z) in the to-be-loaded area, the final position coordinates (a, b, c) in the loading area of the intelligent AGV carrier and the cargo size from the loading training data as input data, and the joint control matrix of the loading robotic arm during the carrying process as output data; the number of input-layer nodes is set to 9, the number of hidden-layer nodes to 19, and the number of output-layer nodes to the number of loading-arm joints N2; the maximum number of iterations during training is set to 500, the training learning rate is 0.01, and the threshold is 0.05.
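As a rough illustration of the stated training configuration (9 input nodes, 19 hidden nodes, N2 output nodes, 500 iterations, learning rate 0.01), the sketch below uses a plain feed-forward network as a stand-in; the grey-model accumulation step of a true grey neural network is not reproduced, and all function and variable names are assumptions.

import numpy as np

def train_carry_model(X, Y, n_hidden=19, lr=0.01, max_iter=500):
    """Hypothetical stand-in for training the intelligent carrying control model.

    X : (n_samples, 9)  -- (x, y, z), (a, b, c) and cargo length/width/height.
    Y : (n_samples, N2) -- flattened joint control targets for the loading arm.
    """
    n_in, n_out = X.shape[1], Y.shape[1]           # 9 inputs, N2 outputs
    rng = np.random.default_rng(0)
    W1 = rng.normal(0, 0.1, (n_in, n_hidden))
    W2 = rng.normal(0, 0.1, (n_hidden, n_out))
    for _ in range(max_iter):
        H = np.tanh(X @ W1)                        # hidden layer (19 nodes)
        P = H @ W2                                 # predicted joint coordinates
        E = P - Y
        W2 -= lr * H.T @ E / len(X)                # gradient step on output weights
        W1 -= lr * X.T @ ((E @ W2.T) * (1 - H ** 2)) / len(X)
    return W1, W2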
2. The method according to claim 1, characterized in that the weights and thresholds of the grey neural network in the intelligent carrying control model are optimized by the firefly algorithm, the detailed process being as follows:
Step C1: initialize the firefly population and set the firefly initial parameters;
the firefly positions are taken as the weights and thresholds of the grey neural network, and the firefly population is randomly initialized;
wherein the number of fireflies is set in the range [10,400], the maximum attractiveness is β0 = 1, the light-intensity absorption coefficient γ is set in the range [0.002,200], the step factor α is set in the range [0.01,1], the maximum number of iterations is set in the range [300,2000], and the search precision ε is set in the range [0.001,0.1];
Step C2: set the fitness function and determine the initial brightest firefly position and the iteration counter t, t = 1;
the weights and thresholds corresponding to each firefly position are substituted into the grey neural network, the intelligent carrying control model determined by that firefly position is used to compute the coordinate values of every joint of the loading robotic arm, and the reciprocal of the sum of the differences E between the predicted and actual joint coordinate values plus 1 is taken as the first fitness function f1(x), f1(x) = 1/(ΣE+1);
the fitness of every firefly position is computed with the first fitness function, and the firefly position with the maximum fitness is taken as the initial brightest firefly position;
Step C3: calculate the relative brightness I and the attractiveness β of the fireflies in the population, and determine the movement direction of each firefly from the relative brightness;
the relative brightness I of a firefly is:
I = I0·e^(-γ·r_ij)
where I0 denotes the brightness of the brightest firefly, γ denotes the light absorption coefficient, and r_ij denotes the distance between fireflies i and j;
the attractiveness β of a firefly is:
β = β0·e^(-γ·r_ij²)
where β0 denotes the maximum attractiveness;
Step C4: update the individual positions of the fireflies; the brightest firefly performs a random move;
x_i(t+1) = x_i(t) + β(x_j(t) - x_i(t)) + α(rand - 1/2)
where x_i(t) and x_j(t) denote the positions of firefly individuals i and j, α is the step factor, and rand is a random factor uniformly distributed on [0,1];
Step C5: calculate the fitness of every firefly position in the current population;
sort the firefly individuals in the population by fitness and take the position with the highest fitness as the new brightest firefly position;
Step C6: judge whether the maximum number of iterations or the maximum search precision has been reached; if so, select the brightest firefly individual, output the optimal weights and thresholds of the grey neural network corresponding to its position, and obtain the intelligent carrying control model; otherwise set t = t+1 and return to step C3 for the next iteration.
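A minimal sketch of the first fitness function defined in step C2, assuming the joint coordinate values (α2, θ2) are flattened into plain arrays and that the difference E is taken as an absolute difference; the function name is an assumption.

import numpy as np

def first_fitness(predicted_joints, actual_joints):
    """f1(x) = 1 / (sum of joint coordinate errors E + 1), as in step C2.

    predicted_joints, actual_joints : arrays of joint coordinate values produced by
    the candidate grey neural network and recorded from manual operation.
    """
    E = np.abs(np.asarray(predicted_joints) - np.asarray(actual_joints))
    return 1.0 / (E.sum() + 1.0)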
3. The method according to claim 1, characterized in that the placement position coordinates of the cargo in the to-be-loaded area corresponding to the shipping address are calculated as follows:
Step A1: set the maximum lengths of the to-be-loaded area along the X, Y and Z axes to xmax, ymax and zmax respectively, and let xd = xmax, yd = 0, zd = 0;
Step A2: judge whether a cargo exists at the position coordinates (x, y, z) = (xd, yd, zd) in the to-be-loaded area; if so, go to step A3, otherwise go to step A8;
Step A3: let l(xd), l(yd) and l(zd) be the lengths along the X, Y and Z axes of the cargo located at (x, y, z) = (xd, yd, zd);
Step A4: judge whether l(yd)+yd > ymax holds; if so, go to step A5; otherwise let yd = l(yd)+yd and return to step A2;
Step A5: judge whether l(zd)+zd > zmax and ymax - l(yd) = 0 hold simultaneously; if so, go to step A6; if only l(zd)+zd > zmax holds, go to step A4; otherwise let zd = l(zd)+zd and return to step A2;
Step A6: judge whether xd - l(xd) < 0 and l(zd)+zd > zmax hold simultaneously; if so, go to step A7; if only xd - l(xd) < 0 holds, go to step A4; otherwise let xd = xd - l(xd) and return to step A2;
Step A7: end the calculation and issue a warning that the to-be-loaded area is full;
Step A8: judge whether the length and width of the cargo to be placed are both smaller than the length and width of the cargo already present directly below the position coordinates; if so, take the position coordinates as the placement position coordinates of the cargo to be placed; otherwise, add the length along the Y axis of the cargo directly below the position coordinates to yd as the new yd, and return to step A2.
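For readability, the A1-A8 scan can be restated as a single loop over candidate positions; the sketch below is such a restatement, in which the occupancy and size-lookup helpers are assumptions and the extra sub-conditions of steps A5 and A6 are collapsed into simple bound checks.

def find_placement(cargo_len, cargo_wid, x_max, y_max, z_max, cargo_at, size_below):
    """Simplified restatement of the A1-A8 scan (hypothetical sketch).

    cargo_at(x, y, z)   -> (lx, ly, lz) of a cargo occupying (x, y, z), or None.
    size_below(x, y, z) -> (length, width) of the cargo directly below (x, y, z);
                           assumed to return the floor extent when nothing is below.
    Returns candidate placement coordinates, or None when the area is full (step A7).
    """
    xd, yd, zd = x_max, 0.0, 0.0                       # step A1
    while True:
        occupied = cargo_at(xd, yd, zd)                # step A2
        if occupied is None:                           # step A8: free position found
            below_len, below_wid = size_below(xd, yd, zd)
            if cargo_len < below_len and cargo_wid < below_wid:
                return (xd, yd, zd)
            yd += below_wid                            # shift along Y past the cargo below
            continue
        lx, ly, lz = occupied                          # step A3
        if ly + yd <= y_max:                           # step A4: advance along Y
            yd += ly
        elif lz + zd <= z_max:                         # step A5: advance along Z
            zd += lz
        elif xd - lx >= 0:                             # step A6: advance along X
            xd -= lx
        else:
            return None                                # step A7: area is full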
4. The method according to claim 1, characterized in that the final position coordinates of the cargo in the loading area of the intelligent AGV carrier are calculated as follows:
Step B1: in the server virtual environment, select the cargo in the to-be-loaded area whose upper surface has the largest area, obtain its number and its length, width and height, and go to step B2;
Step B2: set the maximum lengths of the loading area along the A, B and C axes to amax, bmax and cmax respectively, and let ad = amax, bd = 0, cd = 0;
Step B3: judge whether a cargo exists at the position coordinates (a, b, c) = (ad, bd, cd) in the loading area; if so, go to step B4, otherwise go to step B9;
Step B4: let l(ad), l(bd) and l(cd) be the lengths along the A, B and C axes of the cargo located at (a, b, c) = (ad, bd, cd);
Step B5: judge whether l(bd)+bd > bmax holds; if so, go to step B6; otherwise let bd = l(bd)+bd and return to step B3;
Step B6: judge whether l(cd)+cd > cmax and bmax - l(bd) = 0 hold simultaneously; if so, go to step B7; if only l(cd)+cd > cmax holds, go to step B5; otherwise let cd = l(cd)+cd and return to step B3;
Step B7: judge whether ad - l(ad) < 0 and l(cd)+cd > cmax hold simultaneously; if so, go to step B8; if only ad - l(ad) < 0 holds, go to step B5; otherwise let ad = ad - l(ad) and return to step B3;
Step B8: end the calculation and obtain the final position coordinates in the loading area of every numbered cargo;
Step B9: judge whether the length and width of the cargo to be placed are both smaller than the length and width of the cargo already present directly below the position coordinates; if so, take the position coordinates as the placement coordinates of that numbered cargo and go to step B10; otherwise, add the length along the B axis of the cargo directly below the position coordinates to bd as the new bd, and return to step B3;
Step B10: judge whether every numbered cargo in the to-be-loaded area has been assigned a final position coordinate; if so, the calculation of the final position coordinates of each numbered cargo is complete; otherwise, delete from the server virtual environment the cargos whose final position coordinates have already been computed, update the upper-surface cargo information of the to-be-loaded area, and go to step B1.
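The B1-B10 procedure mirrors the A1-A8 scan in the (A, B, C) coordinate system and wraps it in an outer loop over the numbered cargos; a compressed sketch of that outer loop, with the per-cargo scan passed in as a callable (for example, an A1-A8 style search), might look as follows; the dictionary keys and helper names are assumptions.

def assign_final_positions(cargos, placement_fn):
    """Hypothetical outer loop of steps B1-B10.

    cargos       : list of dicts with keys 'number', 'length', 'width' and
                   'top_area' (upper-surface area in the to-be-loaded area).
    placement_fn : callable (length, width) -> (a, b, c) final coordinates in the
                   loading area, or None when the loading area is full.
    Returns {cargo number: final (a, b, c) coordinates}.
    """
    ordered = sorted(cargos, key=lambda c: c["top_area"], reverse=True)  # step B1
    final_positions = {}
    for cargo in ordered:
        pos = placement_fn(cargo["length"], cargo["width"])              # steps B2-B9
        if pos is None:
            break                                                        # loading area full
        final_positions[cargo["number"]] = pos                           # step B10 bookkeeping
    return final_positions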
5. The method according to any one of claims 1-4, characterized in that the process of clamping the cargo from the transmission belt into the corresponding to-be-loaded area for temporary storage with the sorting robot is as follows: the cargo size obtained with the ZED camera on the sorting robot, the distance between the cargo and the sorting robot arm at the grasping moment on the transmission belt, the angle between the cargo-to-arm line and the vertical line from the cargo to the transmission belt, and the placement position coordinates of the cargo in the to-be-loaded area are used as input data to the classification clamping model, which outputs the sorting robot arm joint control matrix Q1 for the classification clamping process, and the cargo is clamped from the transmission belt into the to-be-loaded area according to the obtained sorting robot arm joint control matrix Q1;
the sorting robot arm joint control matrix Q1 has size N1*M1, where N1 is the number of joints of the sorting robot arm and M1 is the number of joint-position changes of the sorting robot arm during the entire clamping process;
the sorting robot arm joint control matrix Q1 contains the coordinate values (α1, θ1) of each joint of the sorting robot arm at each moment of motion during grasping, where α1 denotes the angle between the two arms connected at the joint and θ1 denotes the angle through which the joint has rotated from its starting position;
the classification clamping model is obtained by training a wavelet neural network, taking as input data the cargo size, the distance between the cargo and the sorting robot arm at the grasping moment on the transmission belt, the angle between the cargo-to-arm line and the vertical line from the cargo to the transmission belt, and the placement position coordinates (x, y, z) of the cargo in the to-be-loaded area, and taking as output data the joint control matrix of each joint of the sorting robot during the clamping process;
the training process of the classification clamping model is as follows: the sorting robot is first manually operated through the server to clamp various cargos, and clamping training data are obtained; the wavelet neural network is trained with the cargo size, the distance between the cargo and the sorting robot arm at the grasping moment on the transmission belt, the angle between the cargo-to-arm line and the vertical line from the cargo to the transmission belt, and the placement position coordinates (x, y, z) of the cargo in the to-be-loaded area from the clamping training data as input data, and the joint control matrix of each joint of the sorting robot during the clamping and carrying process as output data; the number of input-layer nodes is set to 8, the number of hidden-layer nodes to 17, and the number of output-layer nodes to the number of sorting robot arm joints N1; the maximum number of iterations is set to 600, the training learning rate is 0.01, the threshold is 0.05, and the weights, thresholds and scaling and translation coefficients of the wavelet neural network are optimized by the wolf pack algorithm.
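To make the 8-17-N1 wavelet network concrete, the sketch below shows one common forward pass using a Morlet mother wavelet with per-node scaling and translation coefficients; the choice of mother wavelet and all names are assumptions, since the claim does not specify them.

import numpy as np

def morlet(t):
    """Morlet mother wavelet, a common choice for wavelet neural networks."""
    return np.cos(1.75 * t) * np.exp(-t ** 2 / 2)

def wnn_forward(x, W1, W2, a, b):
    """Forward pass of an 8-17-N1 wavelet neural network (hypothetical sketch).

    x  : (8,) input vector: cargo length/width/height, grasping distance, angle,
         and placement coordinates (x, y, z) in the to-be-loaded area.
    W1 : (17, 8) input-to-hidden weights, W2 : (N1, 17) hidden-to-output weights.
    a, b : (17,) scaling and translation coefficients of the hidden wavelet nodes.
    """
    s = W1 @ x                         # weighted sums at the 17 hidden nodes
    h = morlet((s - b) / a)            # wavelet activation with scaling a, translation b
    return W2 @ h                      # joint control outputs (N1 values)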
6. The method according to claim 5, characterized in that the weights, thresholds and scaling and translation coefficients of the wavelet neural network in the classification clamping model are optimized by the wolf pack algorithm, the detailed process being as follows:
Step E1: initialize the wolf pack and set the wolf pack parameters;
the wolf pack size is set in the range [5,130], the step factor in the range [900,3000], the scout wolf ratio factor in the range [3,10], the maximum number of wanderings in the range [5,30], the distance determination factor in the range [100,400], the maximum number of raids in the range [6,20], the update scale factor in the range [2,30], the maximum number of iterations in the range [100,1500], and the maximum search precision in the range [0.001,0.2];
Step E2: set the fitness function and determine the initial optimal lead wolf position and the iteration counter t, t = 1;
the parameter values corresponding to each individual wolf position are substituted in turn into the classification clamping model, the output of the classification clamping model determined by that individual wolf position is computed, and the reciprocal of the mean square error MSE between the output and the actual values is taken as the second fitness function f2(x), f2(x) = 1/MSE;
the fitness of every individual wolf position is computed with the second fitness function, and the individual wolf position with the maximum fitness is taken as the initial optimal lead wolf position;
Step E3: scout wolf wandering;
the wolf with the maximum fitness is chosen from the pack as the lead wolf, and scout wolves are selected at random; each scout wolf computes its fitness in all directions and moves in the direction of maximum fitness; the wandering ends when the fitness of some scout wolf exceeds that of the lead wolf or the maximum number of wanderings is reached;
Step E4: fierce wolf raid;
the individual wolves other than the lead wolf and the scout wolves are fierce wolves; the fierce wolves raid toward the lead wolf and continuously compute the fitness of their positions;
if the fitness of some fierce wolf's position exceeds that of the lead wolf's position, the lead wolf is updated and the remaining fierce wolves redirect their raid toward the current lead wolf; a fierce wolf stops when its distance to the current lead wolf is smaller than the determined distance; when all fierce wolves have stopped raiding or the maximum number of raids is reached, the raid ends and the pack enters the siege state;
Step E5: wolf pack siege;
all individual wolves except the current lead wolf take one step forward toward the lead wolf, and for each individual wolf it is judged whether the fitness of the advanced position is better than that of the position before the step; if so, the advanced position becomes the new position of that individual wolf; otherwise the individual wolf keeps its original position;
Step E6: after the siege is completed, all individual wolves in the pack are sorted from high to low by current fitness; the individual wolf with the highest fitness is set as the lead wolf, the lowest-ranked artificial wolves are eliminated, and new artificial wolves are generated at random;
Step E7: when the maximum search precision or the maximum number of iterations is reached, the optimal weights, thresholds and scaling and translation coefficients of the wavelet neural network corresponding to the latest lead wolf are output and the classification clamping model is obtained; otherwise set t = t+1, return to step E3 and continue the next iteration.
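A compressed sketch of the E1-E7 loop, with the wandering, raid and siege phases reduced to a single move-toward-the-lead-wolf update that is kept only when it improves fitness; the step sizes, the fraction of wolves replaced in step E6 and all names are assumptions, and fitness_fn would be the second fitness function f2(x) = 1/MSE from step E2.

import numpy as np

def wolf_pack_optimize(fitness_fn, dim, n_wolves=30, max_iter=200,
                       step=0.1, replace_frac=0.1, seed=0):
    """Hypothetical wolf pack optimization of a parameter vector (weights,
    thresholds and scaling/translation coefficients flattened into `dim` values)."""
    rng = np.random.default_rng(seed)
    wolves = rng.uniform(-1, 1, (n_wolves, dim))
    for _ in range(max_iter):
        fitness = np.array([fitness_fn(w) for w in wolves])
        lead = wolves[np.argmax(fitness)].copy()            # lead wolf (steps E2-E3)
        # raid (E4) and siege (E5): move toward the lead wolf, keep improving moves
        for i in range(n_wolves):
            candidate = (wolves[i] + step * (lead - wolves[i])
                         + 0.01 * rng.standard_normal(dim))
            if fitness_fn(candidate) > fitness[i]:
                wolves[i] = candidate
        # step E6: eliminate the weakest wolves and regenerate them at random
        order = np.argsort([fitness_fn(w) for w in wolves])
        n_replace = max(1, int(replace_frac * n_wolves))
        wolves[order[:n_replace]] = rng.uniform(-1, 1, (n_replace, dim))
    fitness = np.array([fitness_fn(w) for w in wolves])
    return wolves[np.argmax(fitness)]                       # best parameter vector found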
7. The method according to any one of claims 1-4, characterized in that the shipping address is obtained by identifying the barcode on the cargo with the ZED camera, or by recognizing the address characters on the cargo with the ZED camera;
the address character recognition process is as follows:
Step D1: read the image acquired by the ZED camera and perform graying and binarization on the image;
Step D2: perform skew correction on the image processed in step D1, and apply smoothing filtering to the corrected image;
Step D3: extract the character region from the filtered image and segment it into single characters, obtaining an image matrix for each single character;
Step D4: take the segmented single-character image matrices in turn as input data and feed them into the trained character recognition model based on the Elman neural network;
Step D5: combine all output characters in recognition order and compare the result against an address base containing first-level, second-level and third-level addresses to obtain the recognized address;
the training process of the character recognition model based on the Elman neural network is as follows: images with known address information are processed according to steps D1-D3 to obtain the single-character image matrices; each single-character image matrix is used as input data and the corresponding character name as output data; the number of input-layer nodes is set to 2, the number of hidden-layer nodes to 5 and the number of output-layer nodes to 1, and the Elman neural network is trained; the maximum number of iterations during training is set to 1000, the training learning rate is 0.01, and the threshold is 0.02.
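As an illustration of the D1-D3 preprocessing only, the sketch below uses OpenCV-style graying, Otsu binarization, a rotation-based skew correction and contour-based character segmentation; the threshold choices, the angle convention (which varies across OpenCV versions) and the segmentation strategy are assumptions, and the Elman recognition step itself is not reproduced.

import cv2
import numpy as np

def preprocess_address_image(image_bgr):
    """Steps D1-D3: graying, binarization, deskew, smoothing, character segmentation."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)                  # D1: graying
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)  # D1: binarization
    coords = np.column_stack(np.where(binary > 0)).astype(np.float32)   # D2: estimate skew
    angle = cv2.minAreaRect(coords)[-1]
    angle = angle - 90 if angle > 45 else angle   # OpenCV angle convention varies by version
    h, w = binary.shape
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    deskewed = cv2.warpAffine(binary, M, (w, h))                        # D2: skew correction
    smoothed = cv2.GaussianBlur(deskewed, (3, 3), 0)                    # D2: smoothing
    contours, _ = cv2.findContours(smoothed, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)             # D3: segmentation
    boxes = sorted((cv2.boundingRect(c) for c in contours), key=lambda b: b[0])
    return [smoothed[y:y + bh, x:x + bw] for x, y, bw, bh in boxes]     # per-character matrices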
8. The method according to claim 7, characterized in that the first-level addresses comprise all provinces, autonomous regions, municipalities directly under the central government and special administrative regions; the second-level addresses comprise all prefectures, leagues, autonomous prefectures and prefecture-level cities; and the third-level addresses comprise all counties, autonomous counties, banners, autonomous banners, county-level cities, municipal districts, forest districts and special districts.
9. An intelligent logistics environment robot loading device, characterized by comprising:
a server, used to store cargo information, build virtual environments of the to-be-loaded area and the loading area, and perform data operations;
a transmission belt, used to transport the various cargos;
a sorting robot, equipped with a multi-joint robotic arm, which uses the method according to any one of claims 1-8 to clamp cargos from the transmission belt into the corresponding to-be-loaded area for temporary storage;
a ZED camera, mounted on the sorting robot, used to visually identify the cargo size, the distance between the cargo and the sorting robot arm at the grasping moment on the transmission belt, and the angle between the cargo-to-arm line and the vertical line from the cargo to the transmission belt;
a to-be-loaded area, used to temporarily store the cargos clamped over from the transmission belt by the sorting robot;
a Kinect camera, mounted directly above the to-be-loaded area so as to overlook the entire area, used to visually identify the positions and size/edge information of all cargos stored in the to-be-loaded area, and to number and track the cargos;
an intelligent AGV carrier, comprising a loading robotic arm, a loading area and an SR200 camera, wherein the loading robotic arm uses the method according to any one of claims 1-8 to carry cargos from the to-be-loaded area to the loading area, the loading area stores the cargos carried over from the to-be-loaded area, and the SR200 camera is mounted at a position overlooking the entire loading area and is used to identify the positions and size/edge information of all cargos in the loading area.
CN201810995897.4A 2018-08-29 2018-08-29 Intelligent logistics environment robot loading method and device Active CN108942946B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810995897.4A CN108942946B (en) 2018-08-29 2018-08-29 Intelligent logistics environment robot loading method and device

Publications (2)

Publication Number Publication Date
CN108942946A (en) 2018-12-07
CN108942946B CN108942946B (en) 2020-06-30

Family

ID=64474783

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810995897.4A Active CN108942946B (en) 2018-08-29 2018-08-29 Intelligent logistics environment robot loading method and device

Country Status (1)

Country Link
CN (1) CN108942946B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100013364A (en) * 2008-07-31 2010-02-10 주식회사 유진로봇 System and method for transporting object of mobing robot
US9272419B2 (en) * 2010-08-02 2016-03-01 Brightstar Corp. Robotic picking line for serialized products
CN103043359A (en) * 2011-10-17 2013-04-17 株式会社安川电机 Robot system, robot, and sorted article manufacturing method
CN105032783A (en) * 2015-07-02 2015-11-11 天津耀通科技发展有限公司 E-commerce intelligent storage-type goods sorting system and sorting method thereof
CN107414830A (en) * 2017-07-31 2017-12-01 中南大学 A kind of carrying machine human arm manipulation multi-level mapping intelligent control method and system
CN108297084A (en) * 2018-03-16 2018-07-20 五邑大学 A kind of intelligent conveyor type mechanical arm system based on image recognition

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110076777B (en) * 2019-05-05 2020-11-27 北京云迹科技有限公司 Goods taking method and device
CN110076777A (en) * 2019-05-05 2019-08-02 北京云迹科技有限公司 A kind of picking method and device
CN110533717B (en) * 2019-08-06 2023-08-01 武汉理工大学 Target grabbing method and device based on binocular vision
CN110533717A (en) * 2019-08-06 2019-12-03 武汉理工大学 A kind of target grasping means and device based on binocular vision
CN110751431A (en) * 2019-09-18 2020-02-04 国机工业互联网研究院(河南)有限公司 Material tracking method and system based on rail carrier
CN110751431B (en) * 2019-09-18 2024-02-06 国机工业互联网研究院(河南)有限公司 Material tracking method and system based on track carrier
CN111880405A (en) * 2020-07-03 2020-11-03 广东工业大学 AGV self-adaptive path planning real-time control method in flexible manufacturing workshop system
CN111880405B (en) * 2020-07-03 2022-06-14 广东工业大学 AGV self-adaptive path planning real-time control method in flexible manufacturing workshop system
CN112904865A (en) * 2021-01-28 2021-06-04 广东职业技术学院 Method and system for controlling transportation of ceramic material and computer readable storage medium
CN113589685A (en) * 2021-06-10 2021-11-02 常州工程职业技术学院 Vehicle moving robot control system based on deep neural network and method thereof
CN113589685B (en) * 2021-06-10 2024-04-09 常州工程职业技术学院 Vehicle moving robot control system and method based on deep neural network
CN113485330A (en) * 2021-07-01 2021-10-08 苏州罗伯特木牛流马物流技术有限公司 Robot logistics carrying system and method based on Bluetooth base station positioning and scheduling
CN113485330B (en) * 2021-07-01 2022-07-12 苏州罗伯特木牛流马物流技术有限公司 Robot logistics carrying system and method based on Bluetooth base station positioning and scheduling
CN115619300B (en) * 2022-11-14 2023-03-28 昆船智能技术股份有限公司 Automatic loading system and method for containers
CN115619300A (en) * 2022-11-14 2023-01-17 昆船智能技术股份有限公司 Automatic loading system and method for containers
CN116946610A (en) * 2023-09-21 2023-10-27 中科源码(成都)服务机器人研究院有限公司 Method and device for picking up goods in intelligent warehousing system
CN116946610B (en) * 2023-09-21 2023-12-12 中科源码(成都)服务机器人研究院有限公司 Method and device for picking up goods in intelligent warehousing system

Also Published As

Publication number Publication date
CN108942946B (en) 2020-06-30

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant