CN110026987A - Method, apparatus, device, and storage medium for generating a robotic arm grasping trajectory - Google Patents

Method, apparatus, device, and storage medium for generating a robotic arm grasping trajectory

Info

Publication number
CN110026987A
CN110026987A (application CN201910451779.1A)
Authority
CN
China
Prior art keywords
trajectory
robotic arm
grasped
grasping
converted
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910451779.1A
Other languages
Chinese (zh)
Other versions
CN110026987B (en)
Inventor
刘文印
叶子涵
陈俊洪
梁达勇
周小静
张启翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei Minglong Electronic Technology Co ltd
Original Assignee
Guangdong University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN201910451779.1A priority Critical patent/CN110026987B/en
Publication of CN110026987A publication Critical patent/CN110026987A/en
Application granted granted Critical
Publication of CN110026987B publication Critical patent/CN110026987B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/163Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a method, apparatus, device, and computer-readable storage medium for generating a robotic arm grasping trajectory. The scheme comprises: determining the basis function weights of a motion model from teaching trajectory information; obtaining a target image with a depth camera; inputting the image into a Mask R-CNN network to determine the start point and target point; and generating the grasping trajectory with the motion model. Because the basis function weights of the motion model are the same as those of the teaching trajectory, the generated grasping trajectory is similar in shape to the teaching trajectory, achieving the goal of trajectory learning. By means of the deep-learning-based detection method Mask R-CNN, targets in the image are effectively recognized and located, making the robotic arm's grasping of objects more intelligent. The scheme combines visual perception with the dynamic movement primitive framework, endowing the motion model with vision, so that in learning the teaching trajectory and planning the grasping trajectory the robotic arm interacts better with the external environment.

Description

Method, apparatus, device, and storage medium for generating a robotic arm grasping trajectory
Technical field
The present invention relates to the field of trajectory generation, and more specifically to a method, apparatus, device, and computer-readable storage medium for generating a robotic arm grasping trajectory.
Background technique
Learning from demonstration is a form of human-robot interaction in which a robot reproduces a demonstrated motion in a new environment. Depending on whether the demonstrator makes direct physical contact with the robot, it can be divided into kinesthetic teaching and remote teaching. In kinesthetic teaching, the operator directly manipulates the robot to complete the corresponding action, and the robot's own state information is collected and used as the learning object; this mode is not well suited to robotic arms with many degrees of freedom. Remote teaching generally controls the robot through vision, wearable sensors, or other remote-control tools. Traditional trajectory-planning methods represented by learning from demonstration emphasize learning from and imitating human demonstrations so that the robot can work collaboratively with the user. It is therefore of great significance to find a suitable and more general dynamical-system model for a given behavior or motion.
Dynamic movement primitives (DMPs) are a trajectory learning and planning method that can regenerate a trajectory when the target position changes, imitating the teaching trajectory to produce a motion trajectory with the same movement tendency. DMP-based learning from demonstration can effectively learn single- and multi-degree-of-freedom trajectories; in practical applications, however, such as object grasping by a robotic arm, the lack of visual perception of the environment makes the technique overly narrow and prevents good human-robot interaction. The artificial potential field trajectory-planning method, for its part, suffers from local minima and sometimes cannot drive the manipulator to the target position, so it has limitations.
Summary of the invention
The purpose of the present invention is to provide a method, apparatus, device, and computer-readable storage medium for generating a robotic arm grasping trajectory, so that by learning the teaching trajectory the robotic arm can autonomously adjust its motion strategy to different environments, reproducing and generalizing the initial trajectory.
To achieve the above object, the present invention provides a method for generating a robotic arm grasping trajectory, comprising:
determining the basis function weights of a motion model according to teaching trajectory information, the motion model being a model based on the dynamic movement primitive algorithm framework;
obtaining a target image with a depth camera;
inputting the target image into a Mask R-CNN network to determine the position of the object to be grasped and the position where the object is to be placed;
converting the position of the object to be grasped into the corresponding 7-DOF joint angles of the robotic arm, used as the start point position of the grasping trajectory; converting the position where the object is to be placed into the corresponding 7-DOF joint angles, used as the target point position of the grasping trajectory;
inputting the start point position and the target point position into the motion model to generate the grasping trajectory.
Optionally, determining the basis function weights of the motion model according to teaching trajectory information comprises:
obtaining the motion trajectory, start point information, and end point information of the teaching trajectory;
determining velocity information and acceleration information from the motion trajectory;
determining the target forcing function of the motion model from the start point information, end point information, velocity information, and acceleration information;
determining the basis function weights of the motion model using the target forcing function.
Optionally, converting the position of the object to be grasped into the corresponding 7-DOF joint angles as the start point position of the grasping trajectory, and converting the position where the object is to be placed into the corresponding 7-DOF joint angles as the target point position, comprises:
converting the position of the object to be grasped into a first three-dimensional coordinate in the robot base frame;
converting the first three-dimensional coordinate into the corresponding 7-DOF joint angles of the robotic arm, used as the start point position of the grasping trajectory;
converting the position where the object is to be placed into a second three-dimensional coordinate in the robot base frame;
converting the second three-dimensional coordinate into the corresponding 7-DOF joint angles of the robotic arm, used as the target point position of the grasping trajectory.
Optionally, inputting the start point position and the target point position into the motion model to generate the grasping trajectory comprises:
inputting the start point position and the target point position into the motion model to generate the grasping trajectory, the motion model comprising:

\tau \ddot{y} = \alpha_y (\beta_y (g - y) - \dot{y}) + f

f(x) = \frac{\sum_{i=1}^{N} \psi_i(x)\, \omega_i}{\sum_{i=1}^{N} \psi_i(x)}\, x (g - y_0)

where τ is the time-scaling factor, y is the system displacement, \dot{y} is the velocity, \ddot{y} is the acceleration, αy is the first system parameter, βy is the second system parameter, g is the target point position, f is the target forcing function, x is the time parameter, ψ(x) is a basis function, N is the number of basis functions, ω is a basis function weight, and y0 is the start point position.
To achieve the above object, the present invention further provides an apparatus for generating a robotic arm grasping trajectory, comprising:
a parameter determination module for determining the basis function weights of a motion model according to teaching trajectory information, the motion model being a model based on the dynamic movement primitive algorithm framework;
an obtaining module for obtaining a target image with a depth camera;
a position determination module for inputting the target image into a Mask R-CNN network to determine the position of the object to be grasped and the position where the object is to be placed;
a position conversion module for converting the position of the object to be grasped into the 7-DOF joint angles serving as the start point position, and converting the position where the object is to be placed into the 7-DOF joint angles serving as the target point position;
a trajectory generation module for inputting the start point position and the target point position into the motion model to generate the grasping trajectory.
Optionally, the parameter determination module comprises:
an information obtaining unit for obtaining the motion trajectory, start point information, and end point information of the teaching trajectory;
a velocity determination unit for determining velocity information from the motion trajectory;
an acceleration determination unit for determining acceleration information from the motion trajectory;
a function determination unit for determining the target forcing function of the motion model from the start point information, end point information, velocity information, and acceleration information;
a weight determination unit for determining the basis function weights of the motion model using the target forcing function.
Optionally, the position conversion module comprises:
a first three-dimensional coordinate conversion unit for converting the position of the object to be grasped into a first three-dimensional coordinate in the robot base frame;
a first joint angle conversion unit for converting the first three-dimensional coordinate into the corresponding 7-DOF joint angles, used as the start point position of the grasping trajectory;
a second three-dimensional coordinate conversion unit for converting the position where the object is to be placed into a second three-dimensional coordinate in the robot base frame;
a second joint angle conversion unit for converting the second three-dimensional coordinate into the corresponding 7-DOF joint angles, used as the target point position of the grasping trajectory.
Optionally, the trajectory generation module is specifically configured to:
input the start point position and the target point position into the motion model to generate the grasping trajectory, the motion model comprising:

\tau \ddot{y} = \alpha_y (\beta_y (g - y) - \dot{y}) + f

f(x) = \frac{\sum_{i=1}^{N} \psi_i(x)\, \omega_i}{\sum_{i=1}^{N} \psi_i(x)}\, x (g - y_0)

where τ is the time-scaling factor, y is the system displacement, \dot{y} is the velocity, \ddot{y} is the acceleration, αy is the first system parameter, βy is the second system parameter, g is the target point position, f is the target forcing function, x is the time parameter, ψ(x) is a basis function, N is the number of basis functions, ω is a basis function weight, and y0 is the start point position.
To achieve the above object, the present invention further provides a device for generating a robotic arm grasping trajectory, comprising:
a memory for storing a computer program; and
a processor which, when executing the computer program, implements the steps of the above method for generating a robotic arm grasping trajectory.
To achieve the above object, the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above method for generating a robotic arm grasping trajectory.
From the above scheme it can be seen that an embodiment of the present invention provides a method for generating a robotic arm grasping trajectory, comprising: determining the basis function weights of a motion model according to teaching trajectory information, the motion model being based on the dynamic movement primitive algorithm framework; obtaining a target image with a depth camera; inputting the target image into a Mask R-CNN network to determine the position of the object to be grasped and the position where it is to be placed; converting the position of the object to be grasped into the corresponding 7-DOF joint angles as the start point position of the grasping trajectory; converting the position where the object is to be placed into the corresponding 7-DOF joint angles as the target point position; and inputting the start point position and the target point position into the motion model to generate the grasping trajectory.
It can be seen that in the present solution the basis function weights of the motion model are the same as those of the teaching trajectory, so the newly generated grasping trajectory is similar in shape to the teaching trajectory, achieving the goal of trajectory learning. Moreover, by means of the deep-learning-based detection method Mask R-CNN, targets in the image are effectively recognized and located, making the robotic arm's grasping of objects more intelligent and better suited to the service-robot market. Furthermore, the present solution combines visual perception with the dynamic movement primitive framework, endowing the motion model with "vision", so that in learning the teaching trajectory and planning the grasping trajectory the robotic arm interacts better with the external environment and is more intelligent.
The invention further discloses an apparatus, a device, and a computer-readable storage medium for generating a robotic arm grasping trajectory, which achieve the same technical effects.
Detailed description of the invention
To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the accompanying drawings required in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flow diagram of a method for generating a robotic arm grasping trajectory disclosed in an embodiment of the present invention;
Fig. 2 is a schematic diagram of the network structure of the detection algorithm disclosed in an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of an apparatus for generating a robotic arm grasping trajectory disclosed in an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a device for generating a robotic arm grasping trajectory disclosed in an embodiment of the present invention.
Specific embodiment
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
It should be noted that systems in nature are nonlinear, so in most cases establishing models of them is an important research task. However, because of the parametric sensitivity of these systems, the complex changes caused by subtle parameter variations are difficult to analyze, and modeling goal-directed behavior with nonlinear systems is extremely difficult; intuition and time-consuming parameter tuning play a large role.
On this basis, Ijspeert et al. proposed the concept of dynamic movement primitives, a research method that uses statistical learning techniques to model the attractor behavior of autonomous nonlinear dynamical systems. Its essence is to start from a simple dynamical system, such as a one-dimensional linear differential equation, and transform it, by means of a learnable autonomous forcing term, into a weakly nonlinear system with a prescribed attractor, ultimately able to generate almost arbitrarily complex point attractors and limit-cycle attractors. Dynamic movement primitives are a trajectory learning and planning method that can regenerate a trajectory when the target position changes, imitating the teaching trajectory to produce a motion trajectory with the same movement tendency. The core of the DMP framework is a point-attractor subsystem that allows trajectories of arbitrary shape to be generated through policy parameters; these policy parameters can be learned and adjusted with locally weighted regression or with reinforcement learning. Given parameters such as movement duration, end point, and start point, a DMP can produce the corresponding planned trajectory.
At present, the study of teaching trajectories includes the following two approaches:
Approach one: learning from demonstration based on dynamic movement primitives. Based on DMP theory, an algorithm flow for trajectory learning is designed, and the validity of the algorithm is demonstrated by reproducing and generalizing a one-dimensional teaching trajectory. By sharing a canonical system, the dynamic movement primitives of multiple degrees of freedom are coupled, realizing multi-DOF trajectory learning; the parameters of dynamic movement primitives are analyzed and summarized through single- and multi-DOF trajectory reproduction and generalization simulation experiments.
Approach two: trajectory planning of a robotic arm can be done in joint space or in Cartesian space. Cartesian-space trajectory planning can intuitively express the motion trajectory of the arm or its end effector, but converting the planned trajectory into joint angles requires a large amount of computation. The artificial potential field method realizes trajectory planning in Cartesian coordinates by abstracting the robot's motion into a virtual force field that produces pose changes: the target point and obstacles exert attractive and repulsive forces on the robot, respectively, and the combined effect of these two virtual forces on the robotic arm controls its motion.
For these two approaches, the former lacks visual perception of the environment, so the technique is overly narrow and cannot realize good human-robot interaction; the latter can get stuck at local minima and sometimes cannot drive the manipulator to the target position, so it has significant limitations.
Therefore, embodiments of the present invention disclose a method, apparatus, device, and computer-readable storage medium for generating a robotic arm grasping trajectory, enabling the robotic arm, by learning the teaching trajectory, to autonomously adjust its motion strategy to different environments and to reproduce and generalize the initial trajectory. A deep-learning-based detection technique is integrated to effectively perceive the external environment visually, realizing trajectory planning for object grasping and better adapting to the changeable environments of service robots.
Referring to Fig. 1, a method for generating a robotic arm grasping trajectory provided by an embodiment of the present invention comprises:
S101, determining the basis function weights of a motion model according to teaching trajectory information; the motion model is a model based on the dynamic movement primitive algorithm framework.
Determining the basis function weights of the motion model according to teaching trajectory information comprises:
obtaining the motion trajectory, start point information, and end point information of the teaching trajectory;
determining velocity information and acceleration information from the motion trajectory;
determining the target forcing function of the motion model from the start point information, end point information, velocity information, and acceleration information;
determining the basis function weights of the motion model using the target forcing function.
In the present solution, in order to learn the teaching trajectory, the basis function weights of the teaching trajectory must be determined and used as the basis function weights of the motion model that generates the grasping trajectory. The motion model is a model based on the dynamic movement primitive algorithm framework, which originates from the second-order spring-damper system, whose characteristic is convergence toward the target position. The basic idea of DMP is to take a dynamical system with good stability and modulate it with a nonlinear term, i.e., to introduce a nonlinear function into a simple, stable dynamical system and let that function shape the motion of the system. DMP abstracts the spring-mass-damper model into an attractor subsystem and introduces a forcing term f:

\tau \ddot{y} = \alpha_y (\beta_y (g - y) - \dot{y}) + f \quad (1)

f(x) = \frac{\sum_{i=1}^{N} \psi_i(x)\, \omega_i}{\sum_{i=1}^{N} \psi_i(x)}\, x (g - y_0) \quad (2)

\psi_i = \exp(-h_i (x - c_i)^2) \quad (3)
where y is the motion state of the one-dimensional system, i.e., the displacement, and \dot{y}, \ddot{y} are the corresponding velocity and acceleration. g is the target value, also called the attractor, i.e., the desired motion state, such as a joint position of the robotic arm or a position in the Cartesian coordinate system. αy and βy are system parameters; by a suitable setting such as βy = αy/4, the system reaches critical damping, guaranteeing stability, so that the system state y changes gradually over time and finally converges to the target value g. Formula (1) is called the transformation system.
The forcing function f is the core of the DMP, where the ψi are called basis functions and N is their number. y0 is the initial state of the system, which can be the initial value of the given teaching trajectory or a specified start coordinate. ω are the basis function weights; each basis function obeys a Gaussian distribution centered at ci with variance hi. Since the forcing function f is composed of a series of nonlinear functions, the whole DMP system is also nonlinear. The canonical system is

\tau \dot{x} = -\alpha_x x \quad (4)

where τ is called the time-scaling factor and is used to adjust the decay rate of the canonical system, αx is the canonical-system parameter, x is the time parameter, and \dot{x} is its first derivative.
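The system of equations (1)-(4) can be integrated step by step with a simple explicit Euler scheme. The following sketch is illustrative only (the function and variable names are our own, not from the patent); it rolls out a one-dimensional DMP given basis-function weights, a start point, and a goal.

```python
import numpy as np

def dmp_rollout(w, c, h, y0, g, tau=1.0, alpha_y=25.0, alpha_x=1.0,
                dt=0.001, T=1.0):
    """Integrate the DMP of equations (1)-(4) with explicit Euler steps.

    w, c, h : weights, centers, and widths of the N Gaussian basis functions.
    y0, g   : start and goal of the movement; tau scales movement duration.
    """
    beta_y = alpha_y / 4.0                # critical damping, as in the text
    y, yd = y0, 0.0                       # state: displacement and velocity
    x = 1.0                               # canonical phase, decays 1 -> 0
    traj = [y]
    for _ in range(int(T / dt)):
        psi = np.exp(-h * (x - c) ** 2)                  # eq. (3)
        f = psi @ w / psi.sum() * x * (g - y0)           # eq. (2)
        ydd = (alpha_y * (beta_y * (g - y) - yd) + f) / tau  # eq. (1)
        yd += ydd * dt
        y += yd * dt
        x += (-alpha_x * x / tau) * dt                   # eq. (4)
        traj.append(y)
    return np.array(traj)

# With zero weights the forcing term vanishes and the system is a pure
# critically damped spring: it converges smoothly to the goal g.
N = 10
traj = dmp_rollout(w=np.zeros(N), c=np.linspace(0.0, 1.0, N),
                   h=np.full(N, 100.0), y0=0.0, g=1.0)
```

Nonzero weights learned from a demonstration shape the transient of the trajectory while the attractor still guarantees convergence to g.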
Further, to learn the motion from the teaching trajectory, the weights must be determined from the teaching trajectory information, which may include the motion trajectory, start point information, and end point information. From these, the basis function weights can be determined as follows:
First, the motion trajectory of the teaching trajectory, ydemo(t) with t ∈ [0, …, T], is recorded and differentiated to obtain the velocity information \dot{y}_{demo}(t) and the acceleration information \ddot{y}_{demo}(t). The start point y0 in the formula is set to the start point of the teaching trajectory, and the target value g to its end point:

y_0 = y_{demo}(t = 0) \quad (5)

g = y_{demo}(t = T) \quad (6)

Rearranging formula (1) and substituting the initial conditions gives

f_{target} = \tau \ddot{y}_{demo} - \alpha_y (\beta_y (g - y_{demo}) - \dot{y}_{demo}) \quad (7)

After the target forcing function f_{target} is obtained, finding the weights ω can be converted into minimizing the error

J = \sum_t \left(f_{target}(t) - f(t)\right)^2

so that the forcing function f approaches f_{target}. This is a linear regression problem that can be solved by methods such as locally weighted regression (Locally Weighted Regression), yielding the basis function weights. By adjusting the weight parameters of the forcing term with a learning algorithm in this way, trajectories of arbitrarily complex shape from the start point to the target point can be generated.
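The weight-fitting step can be sketched as follows. This is a minimal illustration under stated assumptions (function names and the synthetic demonstration are our own; the patent leaves the regression method open, naming locally weighted regression as one option): compute f_target via equation (7), then solve one independent weighted least-squares problem per basis function.

```python
import numpy as np

def learn_dmp_weights(y_demo, dt, c, h, tau=1.0, alpha_y=25.0, alpha_x=1.0):
    """Fit basis-function weights from one demonstration, per eqs (5)-(7).

    y_demo : sampled teaching trajectory y_demo(t).
    c, h   : centers and widths of the N Gaussian basis functions (in x-space).
    Returns (w, y0, g).
    """
    beta_y = alpha_y / 4.0
    yd = np.gradient(y_demo, dt)            # velocity of the demonstration
    ydd = np.gradient(yd, dt)               # acceleration
    y0, g = y_demo[0], y_demo[-1]           # eqs (5) and (6)
    # Target forcing term, eq. (7).
    f_target = tau * ydd - alpha_y * (beta_y * (g - y_demo) - yd)
    # Canonical phase, eq. (4), in closed form: x(t) = exp(-alpha_x * t / tau).
    t = np.arange(len(y_demo)) * dt
    x = np.exp(-alpha_x * t / tau)
    xi = x * (g - y0)                       # scaling factor from eq. (2)
    psi = np.exp(-h[:, None] * (x[None, :] - c[:, None]) ** 2)  # (N, T)
    # Locally weighted regression: one 1-D weighted fit per basis function.
    w = (psi * xi * f_target).sum(axis=1) / ((psi * xi ** 2).sum(axis=1) + 1e-10)
    return w, y0, g

# Learn from a smooth synthetic demonstration (minimum-jerk curve from 0 to 1).
N = 20
t = np.linspace(0.0, 1.0, 500)
demo = 10 * t**3 - 15 * t**4 + 6 * t**5
w, y0, g = learn_dmp_weights(demo, dt=t[1] - t[0],
                             c=np.exp(-np.linspace(0.0, 1.0, N)),
                             h=np.full(N, 50.0))
```

Feeding the learned w back into the DMP and integrating equations (1)-(4) reproduces the demonstration; changing g then generalizes the motion to a new target while keeping its shape.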
S102, obtaining a target image with a depth camera.
In the present solution, the depth camera may be the color camera of a Kinect V2 depth camera, through which a color image can be read as the target image. The target image may contain the object to be grasped and its placement location. For example, if a spoon needs to be put into a fixed position, the spoon is the object to be grasped, and the finally generated grasping trajectory is the trajectory that puts the spoon into that fixed position. The purpose of the target image obtained in S102 is to determine the start point position and target point position of the grasping trajectory.
S103, inputting the target image into a Mask R-CNN network to determine the position of the object to be grasped and the position where it is to be placed.
At present, machine vision plays an ever more important role in intelligent manufacturing; with the arrival of Industry 4.0, object detection methods of all kinds are increasingly applied in industrial and service robotics. Deep learning is a hot topic that has developed rapidly in recent years, and deep-learning-based object detection can solve object recognition and classification more effectively. In 2014, Ross Girshick et al. proposed R-CNN, which uses a convolutional neural network (CNN) for object detection and raised the detection rate on PASCAL VOC from 35.1% to 53.7%, but suffered from long training and testing times. In 2015 the same team proposed Fast R-CNN, with a more compact and elegant pipeline that greatly improved detection speed and solved R-CNN's problems of slow training and testing and large storage requirements. Faster R-CNN was then introduced, replacing the earlier selective search with a neural network for extracting candidate boxes, greatly saving time; at this point the four basic steps of object detection (candidate region generation, feature extraction, classification, and bounding-box regression) were unified within a single deep neural network framework. Mask R-CNN inherits from Faster R-CNN, adding only a mask prediction branch and improving ROI Pooling with the proposed ROI Align, and performs well in instance segmentation.
Therefore, the detection algorithm used in this application is Mask R-CNN; Fig. 2 is a schematic diagram of its network structure. As the figure shows, Mask R-CNN first uses a set of basic convolution + activation + pooling layers to extract the feature maps of the input image; the feature maps are then fed into a Region Proposal Network (RPN) to generate cropped and corrected candidate regions. Mask R-CNN then uses ROI Align instead of an ROI Pooling layer to produce fixed-size feature maps; ROI Align removes ROI Pooling's rounding of each feature segment and uses bilinear interpolation to locate the features corresponding to each block accurately. Finally, the fixed-size feature maps are fed into fully connected layers; classification is performed with Softmax, bounding-box regression with an L1 loss, and a mask branch is added for instance segmentation.
Before positions can be recognized with this algorithm, the present solution requires the Mask R-CNN network to be trained: a training dataset in which the objects to be detected have been annotated in advance is fed into the network for training. After training, the color image read by the color camera of the Kinect V2 depth camera is input into the Mask R-CNN network as the target image, yielding the position of the object to be grasped and the position where it is to be placed. Each position can be characterized by a four-tuple (x, y, w, h) describing the object's bounding box in the target image, where the four parameters denote the two-dimensional coordinates of the box's upper-left corner and the box's width and height.
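To hand a detection on to the coordinate conversion of S104, a single pixel must be picked from the box. A common choice, assumed here for illustration (the patent does not specify which pixel is used), is the box center:

```python
def box_center(box):
    """Center pixel of a detection box given as the four-tuple (x, y, w, h),
    where (x, y) is the upper-left corner and (w, h) the width and height."""
    x, y, w, h = box
    return (x + w / 2.0, y + h / 2.0)

# A 40x20 box whose upper-left corner is at pixel (100, 50):
center = box_center((100, 50, 40, 20))
```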
S104, converting the position of the object to be grasped into the corresponding 7-DOF joint angles of the robotic arm as the start point position of the grasping trajectory, and converting the position where the object is to be placed into the corresponding 7-DOF joint angles as the target point position.
Specifically, converting the position of the object to be grasped into the corresponding 7-DOF joint angles as the start point position of the grasping trajectory, and converting the position where the object is to be placed into the corresponding 7-DOF joint angles as the target point position, comprises:
converting the position of the object to be grasped into a first three-dimensional coordinate in the robot base frame, then converting the first three-dimensional coordinate into the corresponding 7-DOF joint angles, used as the start point position of the grasping trajectory;
converting the position where the object is to be placed into a second three-dimensional coordinate in the robot base frame, then converting the second three-dimensional coordinate into the corresponding 7-DOF joint angles, used as the target point position of the grasping trajectory.
In this solution, visual perception is performed by a Kinect V2 depth camera whose intrinsic and extrinsic parameters have been calibrated in advance: the intrinsic calibration corrects the distortion of the camera imaging, and the extrinsic calibration realizes the mapping of three-dimensional coordinates from the Kinect coordinate system to the robot base coordinate system.
S103 has already yielded the position of the object to be grasped and its to-be-placed position; these are positions of the target in the two-dimensional image, i.e., two-dimensional coordinates in the Kinect vision coordinate system. Further, the MapColorFrameToCameraSpace function provided in the Kinect SDK converts a pixel in the RGB image into the corresponding three-dimensional coordinate in the Kinect coordinate system, and the extrinsic transformation matrix obtained from calibration then converts that coordinate into a three-dimensional coordinate in the robot base coordinate system. The three-dimensional coordinates finally obtained in this way are the first three-dimensional coordinate and the second three-dimensional coordinate of this solution.
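The extrinsic step amounts to applying a homogeneous 4×4 transform T (camera frame → robot base frame) to each 3-D point. A minimal sketch, with a made-up translation-only matrix standing in for a real calibration result:

```python
def transform_point(T, p):
    """Apply a 4x4 homogeneous transform T to a 3-D point p,
    mapping camera-frame coordinates into robot base coordinates."""
    x, y, z = p
    v = (x, y, z, 1.0)
    return tuple(sum(T[i][j] * v[j] for j in range(4)) for i in range(3))

# Hypothetical extrinsics: the camera sits 0.5 m along the robot's
# x axis and 1.0 m above its base, with no rotation.
T = [[1.0, 0.0, 0.0, 0.5],
     [0.0, 1.0, 0.0, 0.0],
     [0.0, 0.0, 1.0, 1.0],
     [0.0, 0.0, 0.0, 1.0]]

p_base = transform_point(T, (0.2, -0.1, 0.8))
print(tuple(round(c, 6) for c in p_base))  # (0.7, -0.1, 1.8)
```

A real calibration produces a full rotation-plus-translation matrix, but the application of T to each point is identical.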
It can be understood that inverse kinematics (IK) converts an end-effector position of the mechanical arm in Cartesian space into the individual joint angles. Therefore, this solution uses the Kinematics and Dynamics Library (KDL) kinematics plug-in integrated in MoveIt! to solve the inverse kinematics: the first three-dimensional coordinate and the second three-dimensional coordinate are converted into 7-degree-of-freedom joint angles of the Baxter mechanical arm, which serve as the start position and the target position, respectively.
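The patent delegates the IK solve to the KDL plug-in; as a self-contained stand-in for what an IK solver does (not Baxter's 7-DOF solver), the closed-form solution for a planar two-link arm shows how an end-effector position maps to joint angles:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar 2-link arm with
    link lengths l1, l2: return joint angles (theta1, theta2), in
    radians, that place the end effector at (x, y)."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)  # elbow-down solution
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

# Arm fully stretched along x: both joint angles are zero.
t1, t2 = two_link_ik(2.0, 0.0, 1.0, 1.0)
print(round(t1, 6), round(t2, 6))  # 0.0 0.0
```

A redundant 7-DOF arm such as Baxter's has infinitely many solutions for a given pose, which is why a numerical library like KDL is used rather than a closed form.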
S105: the start position and the target position are input into the motion model to generate the grasping trajectory.
Wherein, inputting the start position and the target position into the motion model to generate the grasping trajectory comprises: inputting the start position and the target position into the motion model, which generates the grasping trajectory.
The motion model comprises:

\tau^2 \ddot{y} = \alpha_y \bigl( \beta_y (g - y) - \tau \dot{y} \bigr) + f(x)

f(x) = \frac{\sum_{i=1}^{N} \psi_i(x)\,\omega_i}{\sum_{i=1}^{N} \psi_i(x)}\; x\,(g - y_0)

where τ is the time-scaling factor, y is the system displacement, ẏ is the velocity, ÿ is the acceleration, αy is the first system parameter, βy is the second system parameter, g is the target position, and f is the target forcing function; x is the time parameter, ψ(x) is the basis function, N is the number of basis functions, ω are the basis function weights, and y0 is the start position.
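Under the symbols above, a minimal Euler integration of the transformation system sketches how a start position y0 and target position g produce a trajectory once the weights ω are fixed. The canonical-system gain, basis-function centres/widths and gain values below are invented for illustration (the patent does not specify them), and zero weights are used so the rollout reduces to the point-attractor behaviour:

```python
import math

def dmp_rollout(y0, g, weights, tau=1.0, alpha_y=25.0, beta_y=25.0 / 4,
                alpha_x=3.0, dt=0.001, steps=1000):
    """Integrate tau^2*ddy = alpha_y*(beta_y*(g - y) - tau*dy) + f(x)
    via z = tau*dy, with canonical system tau*dx = -alpha_x*x."""
    n = len(weights)
    # Hypothetical Gaussian basis parameters spaced along the phase x.
    centers = [math.exp(-alpha_x * i / (n - 1)) for i in range(n)]
    widths = [n ** 1.5 / c for c in centers]
    y, z, x = y0, 0.0, 1.0
    traj = [y]
    for _ in range(steps):
        psi = [math.exp(-h * (x - c) ** 2) for h, c in zip(widths, centers)]
        f = (sum(p * w for p, w in zip(psi, weights)) / sum(psi)) \
            * x * (g - y0)
        z += dt / tau * (alpha_y * (beta_y * (g - y) - z) + f)
        y += dt / tau * z
        x += dt / tau * (-alpha_x * x)
        traj.append(y)
    return traj

traj = dmp_rollout(y0=0.0, g=1.0, weights=[0.0] * 10)
print(round(traj[-1], 3))  # converges near the goal g = 1.0
```

Reusing the same weights with a different y0 and g rescales the trajectory shape to the new endpoints, which is exactly the generalization property the scheme relies on in S105.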
It should be noted that the basis function weights in the motion model were already determined in S101. The start position and target position obtained in the previous step are then input into the motion model, which computes the system states y, ẏ, ÿ under the new target. Because the forcing-function weights of the new motion are identical to those of the teaching trajectory, the new motion trajectory is similar in shape to the teaching trajectory, thereby achieving the aim of trajectory learning.
Compared with the mechanical-arm teaching-learning method based on dynamic motion primitives of mode one, the trajectory generation method proposed by this solution extends the experimental setup with a depth camera such as the Kinect V2: on the basis of teaching-trajectory learning with dynamic motion primitives, it perceives the external environment visually through a deep-learning-based object detection method, so that the grasping trajectory planning of the mechanical arm is no longer fixed and the interaction between the robot and the external environment is better realized. Compared with the artificial potential field method of mode two, the trajectory planning method of this solution obtains its weights by feature extraction from the teaching trajectory; the weights do not change when the environment changes, the motion tendency remains the same, and problems such as local optima do not arise, so the method has higher robustness.
In conclusion mechanical arm grasping body method for planning track provided by the present invention can not only effectively realize machine Tool arm to the reproduction of teaching track and extensive, more can view-based access control model perception preferably interacted with external environment.By non-thread Property regulation of the function to dynamical system is stablized so that the no longer unification of the track of learning from instruction, but joined according to different environment Number independently realizes different trajectory planning strategies, has reached better generalization ability;Pass through the target detection based on deep learning Method Mask R-CNN, the target in image effectively identified and positioned, so that mechanical arm has more the crawl of object Intelligence, to better adapt to service robot market.
The generating means provided by embodiments of the present invention are introduced below; the generating means described below and the generation method described above may be referred to in correspondence with each other.
Referring to Fig. 3, a generating means of a mechanical arm grasping trajectory provided by an embodiment of the present invention comprises:
a parameter determination module 100, configured to determine basis function weights of a motion model according to teaching trajectory information, the motion model being a model based on a dynamic motion primitive algorithm framework;
an acquisition module 200, configured to acquire a target image through a depth camera;
a position determination module 300, configured to input the target image into a Mask R-CNN network and determine the position of an object to be grasped and the to-be-placed position of the object to be grasped;
a position conversion module 400, configured to convert the position of the object to be grasped into a start position in mechanical-arm 7-degree-of-freedom joint angles, and to convert the to-be-placed position of the object to be grasped into a target position in mechanical-arm 7-degree-of-freedom joint angles;
a trajectory generation module 500, configured to input the start position and the target position into the motion model to generate the grasping trajectory.
Wherein, the parameter determination module comprises:
an information acquisition unit, configured to acquire the movement trajectory, start-point information and end-point information of the teaching trajectory;
a speed determination unit, configured to determine velocity information according to the movement trajectory;
an acceleration determination unit, configured to determine acceleration information according to the movement trajectory;
a function determination unit, configured to determine the target forcing function of the motion model according to the start-point information, the end-point information, the velocity information and the acceleration information;
a weight determination unit, configured to determine the basis function weights of the motion model using the target forcing function.
Wherein, the position conversion module comprises:
a first three-dimensional-coordinate conversion unit, configured to convert the position of the object to be grasped into a first three-dimensional coordinate in the robot base frame;
a first joint-angle conversion unit, configured to convert the first three-dimensional coordinate into corresponding mechanical-arm 7-degree-of-freedom joint angles as the start position of the grasping trajectory;
a second three-dimensional-coordinate conversion unit, configured to convert the to-be-placed position of the object to be grasped into a second three-dimensional coordinate in the robot base frame;
a second joint-angle conversion unit, configured to convert the second three-dimensional coordinate into corresponding mechanical-arm 7-degree-of-freedom joint angles as the target position of the grasping trajectory.
Wherein, the trajectory generation module is specifically configured to:
input the start position and the target position into the motion model to generate the grasping trajectory; the motion model comprising:

\tau^2 \ddot{y} = \alpha_y \bigl( \beta_y (g - y) - \tau \dot{y} \bigr) + f(x)

f(x) = \frac{\sum_{i=1}^{N} \psi_i(x)\,\omega_i}{\sum_{i=1}^{N} \psi_i(x)}\; x\,(g - y_0)

where τ is the time-scaling factor, y is the system displacement, ẏ is the velocity, ÿ is the acceleration, αy is the first system parameter, βy is the second system parameter, g is the target position, and f is the target forcing function; x is the time parameter, ψ(x) is the basis function, N is the number of basis functions, ω are the basis function weights, and y0 is the start position.
The embodiment of the present invention also discloses a generating device of a mechanical arm grasping trajectory, comprising:
a memory, configured to store a computer program; and
a processor which, when executing the computer program, implements the steps of the generation method of a mechanical arm grasping trajectory described in the method embodiments above.
Referring to Fig. 4, the generating device may comprise a memory 11, a processor 12 and a bus 13.
The memory 11 comprises at least one type of readable storage medium, including flash memory, hard disk, multimedia card, card-type memory (for example, SD or DX memory), magnetic memory, magnetic disk, optical disc, etc. In some embodiments the memory 11 may be an internal storage unit of the device 1, for example a hard disk of the device 1; in other embodiments it may be an external storage device of the device 1, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a Flash Card with which the device 1 is equipped. Further, the memory 11 may comprise both an internal storage unit and an external storage device of the device 1. The memory 11 can be used not only to store application software installed on the device 1 and various kinds of data, such as the code executing the generation method of the grasping trajectory, but also to temporarily store data that has been output or will be output.
In some embodiments the processor 12 may be a central processing unit (CPU), controller, microcontroller, microprocessor or other data processing chip, used to run the program code stored in the memory 11 or to process data, for example to execute the code of the generation method of the grasping trajectory.
The bus 13 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like, and may be divided into an address bus, a data bus, a control bus, etc. For ease of representation, only one thick line is shown in Fig. 4, but this does not mean that there is only one bus or only one type of bus.
Further, the device may also comprise a network interface 14, which may optionally comprise a wired interface and/or a wireless interface (such as a Wi-Fi interface or a Bluetooth interface), typically used to establish a communication connection between the device 1 and other electronic devices.
Optionally, the device may also comprise a user interface, which may comprise a display (Display) and an input unit such as a keyboard (Keyboard); the optional user interface may also comprise standard wired and wireless interfaces. Optionally, in some embodiments, the display may be an LED display, a liquid crystal display, a touch liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, etc. The display, which may also appropriately be called a display screen or display unit, is used to display the information processed in the device 1 and to display a visual user interface.
Fig. 4 shows only the device 1 with components 11-14; those skilled in the art will understand that the structure shown in Fig. 4 does not limit the device 1, which may comprise fewer or more components than illustrated, combine certain components, or have a different component arrangement.
The embodiment of the present invention also discloses a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the generation method of a mechanical arm grasping trajectory described in the method embodiments above are implemented.
The storage medium may comprise various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to one another.
The foregoing description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be realized in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A generation method of a mechanical arm grasping trajectory, characterized by comprising:
determining basis function weights of a motion model according to teaching trajectory information, the motion model being a model based on a dynamic motion primitive algorithm framework;
acquiring a target image through a depth camera;
inputting the target image into a Mask R-CNN network, and determining a position of an object to be grasped and a to-be-placed position of the object to be grasped;
converting the position of the object to be grasped into corresponding mechanical-arm 7-degree-of-freedom joint angles as a start position of the grasping trajectory, and converting the to-be-placed position of the object to be grasped into corresponding mechanical-arm 7-degree-of-freedom joint angles as a target position of the grasping trajectory;
inputting the start position and the target position into the motion model to generate the grasping trajectory.
2. The generation method according to claim 1, characterized in that determining the basis function weights of the motion model according to the teaching trajectory information comprises:
acquiring a movement trajectory, start-point information and end-point information of a teaching trajectory;
determining velocity information and acceleration information according to the movement trajectory;
determining a target forcing function of the motion model according to the start-point information, the end-point information, the velocity information and the acceleration information;
determining the basis function weights of the motion model using the target forcing function.
3. The generation method according to claim 2, characterized in that converting the position of the object to be grasped into corresponding mechanical-arm 7-degree-of-freedom joint angles as the start position of the grasping trajectory, and converting the to-be-placed position of the object to be grasped into corresponding mechanical-arm 7-degree-of-freedom joint angles as the target position of the grasping trajectory, comprises:
converting the position of the object to be grasped into a first three-dimensional coordinate in a robot base frame;
converting the first three-dimensional coordinate into corresponding mechanical-arm 7-degree-of-freedom joint angles as the start position of the grasping trajectory;
converting the to-be-placed position of the object to be grasped into a second three-dimensional coordinate in the robot base frame;
converting the second three-dimensional coordinate into corresponding mechanical-arm 7-degree-of-freedom joint angles as the target position of the grasping trajectory.
4. The generation method according to any one of claims 1 to 3, characterized in that inputting the start position and the target position into the motion model to generate the grasping trajectory comprises:
inputting the start position and the target position into the motion model to generate the grasping trajectory;
the motion model comprising:

\tau^2 \ddot{y} = \alpha_y \bigl( \beta_y (g - y) - \tau \dot{y} \bigr) + f(x)

f(x) = \frac{\sum_{i=1}^{N} \psi_i(x)\,\omega_i}{\sum_{i=1}^{N} \psi_i(x)}\; x\,(g - y_0)

where τ is the time-scaling factor, y is the system displacement, ẏ is the velocity, ÿ is the acceleration, αy is the first system parameter, βy is the second system parameter, g is the target position, and f is the target forcing function; x is the time parameter, ψ(x) is the basis function, N is the number of basis functions, ω are the basis function weights, and y0 is the start position.
5. A generating means of a mechanical arm grasping trajectory, characterized by comprising:
a parameter determination module, configured to determine basis function weights of a motion model according to teaching trajectory information, the motion model being a model based on a dynamic motion primitive algorithm framework;
an acquisition module, configured to acquire a target image through a depth camera;
a position determination module, configured to input the target image into a Mask R-CNN network and determine a position of an object to be grasped and a to-be-placed position of the object to be grasped;
a position conversion module, configured to convert the position of the object to be grasped into a start position in mechanical-arm 7-degree-of-freedom joint angles, and to convert the to-be-placed position of the object to be grasped into a target position in mechanical-arm 7-degree-of-freedom joint angles;
a trajectory generation module, configured to input the start position and the target position into the motion model to generate the grasping trajectory.
6. The generating means according to claim 5, characterized in that the parameter determination module comprises:
an information acquisition unit, configured to acquire a movement trajectory, start-point information and end-point information of a teaching trajectory;
a speed determination unit, configured to determine velocity information according to the movement trajectory;
an acceleration determination unit, configured to determine acceleration information according to the movement trajectory;
a function determination unit, configured to determine a target forcing function of the motion model according to the start-point information, the end-point information, the velocity information and the acceleration information;
a weight determination unit, configured to determine the basis function weights of the motion model using the target forcing function.
7. The generating means according to claim 6, characterized in that the position conversion module comprises:
a first three-dimensional-coordinate conversion unit, configured to convert the position of the object to be grasped into a first three-dimensional coordinate in a robot base frame;
a first joint-angle conversion unit, configured to convert the first three-dimensional coordinate into corresponding mechanical-arm 7-degree-of-freedom joint angles as the start position of the grasping trajectory;
a second three-dimensional-coordinate conversion unit, configured to convert the to-be-placed position of the object to be grasped into a second three-dimensional coordinate in the robot base frame;
a second joint-angle conversion unit, configured to convert the second three-dimensional coordinate into corresponding mechanical-arm 7-degree-of-freedom joint angles as the target position of the grasping trajectory.
8. The generating means according to any one of claims 5 to 7, characterized in that the trajectory generation module is specifically configured to:
input the start position and the target position into the motion model to generate the grasping trajectory; the motion model comprising:

\tau^2 \ddot{y} = \alpha_y \bigl( \beta_y (g - y) - \tau \dot{y} \bigr) + f(x)

f(x) = \frac{\sum_{i=1}^{N} \psi_i(x)\,\omega_i}{\sum_{i=1}^{N} \psi_i(x)}\; x\,(g - y_0)

where τ is the time-scaling factor, y is the system displacement, ẏ is the velocity, ÿ is the acceleration, αy is the first system parameter, βy is the second system parameter, g is the target position, and f is the target forcing function; x is the time parameter, ψ(x) is the basis function, N is the number of basis functions, ω are the basis function weights, and y0 is the start position.
9. A generating device of a mechanical arm grasping trajectory, characterized by comprising:
a memory, configured to store a computer program; and
a processor which, when executing the computer program, implements the steps of the generation method of a mechanical arm grasping trajectory according to any one of claims 1 to 4.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the generation method of a mechanical arm grasping trajectory according to any one of claims 1 to 4 are implemented.
CN201910451779.1A 2019-05-28 2019-05-28 Method, device and equipment for generating grabbing track of mechanical arm and storage medium Active CN110026987B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910451779.1A CN110026987B (en) 2019-05-28 2019-05-28 Method, device and equipment for generating grabbing track of mechanical arm and storage medium


Publications (2)

Publication Number Publication Date
CN110026987A true CN110026987A (en) 2019-07-19
CN110026987B CN110026987B (en) 2022-04-19

Family

ID=67243671


Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110480637A (en) * 2019-08-12 2019-11-22 浙江大学 A kind of mechanical arm part image identification grasping means based on Kinect sensor
CN110640736A (en) * 2019-10-23 2020-01-03 南京工业大学 Mechanical arm motion planning method for intelligent manufacturing
CN110666804A (en) * 2019-10-31 2020-01-10 福州大学 Motion planning method and system for cooperation of double robots
CN110861083A (en) * 2019-10-25 2020-03-06 广东省智能制造研究所 Robot teaching method and device, storage medium and robot
CN110895810A (en) * 2019-10-24 2020-03-20 中科院广州电子技术有限公司 Chromosome image example segmentation method and device based on improved Mask RCNN
CN110926852A (en) * 2019-11-18 2020-03-27 迪普派斯医疗科技(山东)有限公司 Automatic film changing system and method for digital pathological section
CN111002302A (en) * 2019-09-09 2020-04-14 浙江瀚镪自动化设备股份有限公司 Mechanical arm grabbing track planning method combining Gaussian mixture model and dynamic system
CN111015655A (en) * 2019-12-18 2020-04-17 深圳市优必选科技股份有限公司 Mechanical arm grabbing method and device, computer readable storage medium and robot
CN111890353A (en) * 2020-06-24 2020-11-06 深圳市越疆科技有限公司 Robot teaching track reproduction method and device and computer readable storage medium
CN112183188A (en) * 2020-08-18 2021-01-05 北京航空航天大学 Mechanical arm simulation learning method based on task embedded network
CN112207835A (en) * 2020-09-18 2021-01-12 浙江大学 Method for realizing double-arm cooperative work task based on teaching learning
CN112287728A (en) * 2019-07-24 2021-01-29 鲁班嫡系机器人(深圳)有限公司 Intelligent agent trajectory planning method, device, system, storage medium and equipment
CN112605974A (en) * 2020-11-27 2021-04-06 广东省科学院智能制造研究所 Robot complex operation skill characterization method and system
CN113043251A (en) * 2021-04-23 2021-06-29 江苏理工学院 Robot teaching reproduction track learning method
CN113125463A (en) * 2021-04-25 2021-07-16 济南大学 Teaching method and device for detecting weld defects of automobile hub
CN113344967A (en) * 2021-06-07 2021-09-03 哈尔滨理工大学 Dynamic target identification tracking method under complex background
CN113386123A (en) * 2020-03-13 2021-09-14 欧姆龙株式会社 Control device, robot, learning device, robot system, and learning method
CN113479405A (en) * 2021-07-19 2021-10-08 合肥哈工龙延智能装备有限公司 Control method for stably opening paper box by high-speed box packing machine
CN113552871A (en) * 2021-01-08 2021-10-26 腾讯科技(深圳)有限公司 Robot control method and device based on artificial intelligence and electronic equipment
CN113547521A (en) * 2021-07-29 2021-10-26 中国科学技术大学 Method and system for autonomous grabbing and accurate moving of mobile robot guided by vision
CN113657551A (en) * 2021-09-01 2021-11-16 陕西工业职业技术学院 Robot grabbing posture task planning method for sorting and stacking multiple targets
CN114227187A (en) * 2021-11-30 2022-03-25 浪潮(山东)计算机科技有限公司 Plug-in component mounting method and system and related assembly
CN114310918A (en) * 2022-03-14 2022-04-12 珞石(北京)科技有限公司 Mechanical arm track generation and correction method under man-machine cooperation
CN116061187A (en) * 2023-03-07 2023-05-05 睿尔曼智能科技(江苏)有限公司 Method for identifying, positioning and grabbing goods on goods shelves by composite robot
CN118710786A (en) * 2024-08-23 2024-09-27 山东云小华数字科技有限公司 Digital human action generation method and generation system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2957397A2 (en) * 2014-06-20 2015-12-23 Ricoh Company, Ltd. Measurement system, object pickup system, measurement method, and carrier means
US9486918B1 (en) * 2013-03-13 2016-11-08 Hrl Laboratories, Llc System and method for quick scripting of tasks for autonomous robotic manipulation
CN109108942A (en) * 2018-09-11 2019-01-01 武汉科技大学 The mechanical arm motion control method and system of the real-time teaching of view-based access control model and adaptive DMPS
CN109108978A (en) * 2018-09-11 2019-01-01 武汉科技大学 Three-dimensional space manipulator motion planning method based on study Generalization Mechanism
CN109397285A (en) * 2018-09-17 2019-03-01 鲁班嫡系机器人(深圳)有限公司 A kind of assembly method, assembly device and assembly equipment
CN109711325A (en) * 2018-12-25 2019-05-03 华南农业大学 A kind of mango picking point recognition methods


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Liu Chongde: "Research on Teaching-Learning Methods for Mechanical Arms Based on Dynamic Motion Primitives", China Masters' Theses Full-text Database, Information Science and Technology *


Also Published As

Publication number Publication date
CN110026987B (en) 2022-04-19

Similar Documents

Publication Publication Date Title
CN110026987A (en) Method, apparatus, device and storage medium for generating a mechanical arm grasping trajectory
Andrychowicz et al. Learning dexterous in-hand manipulation
CN110000785B (en) Agricultural scene calibration-free robot motion vision cooperative servo control method and equipment
CN110405730B (en) Human-computer interaction mechanical arm teaching system based on RGB-D image
CN114080583B (en) Visual teaching and repetitive movement manipulation system
Billard et al. Discovering optimal imitation strategies
Mühlig et al. Interactive imitation learning of object movement skills
CN109483534B (en) Object grabbing method, device and system
CN111028317B (en) Animation generation method, device and equipment for virtual object and storage medium
Kaipa et al. Self discovery enables robot social cognition: Are you my teacher?
WO2020058669A1 (en) Task embedding for device control
CN109108942A (en) Mechanical arm motion control method and system based on real-time visual teaching and adaptive DMPs
CN110730970A (en) Policy controller using image embedding to optimize robotic agents
Papadopoulos et al. Towards open and expandable cognitive AI architectures for large-scale multi-agent human-robot collaborative learning
CN110385694A (en) Robot action teaching device, robot system and robot controller
Raessa et al. Teaching a robot to use electric tools with regrasp planning
Teng et al. Multidimensional deformable object manipulation based on DN-transporter networks
Deng et al. A learning framework for semantic reach-to-grasp tasks integrating machine learning and optimization
CN113927593B (en) Mechanical arm operation skill learning method based on task decomposition
Li et al. Robot brush-writing system of Chinese calligraphy characters
CN112276947B (en) Robot motion simulation method, device, equipment and storage medium
Pattar et al. Automatic data collection for object detection and grasp-position estimation with mobile robots and invisible markers
Gonçalves et al. Defining behaviors for autonomous agents based on local perception and smart objects
Österberg Skill Imitation Learning on Dual-arm Robotic Systems
Zhu Robot Learning Assembly Tasks from Human Demonstrations

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240818

Address after: 230000 B-1015, Woyuan Garden, 81 Ganquan Road, Shushan District, Hefei, Anhui.

Patentee after: Hefei Minglong Electronic Technology Co., Ltd.

Country or region after: China

Address before: No.729, Dongfeng East Road, Yuexiu District, Guangzhou City, Guangdong Province 510060

Patentee before: Guangdong University of Technology

Country or region before: China