CN113911728A - Electric toothbrush brush head dynamic feeding system and feeding method based on vision - Google Patents

Info

Publication number: CN113911728A (granted as CN113911728B)
Application number: CN202111161544.2A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: image, robot, brush head, point, motion
Legal status: Granted; Active
Inventors: 潘全科 (Pan Quanke), 朱雄伟 (Zhu Xiongwei), 黄延军 (Huang Yanjun), 王德明 (Wang Deming)
Original and current assignee: Jiangsu Baiwang Oral Care Products Co., Ltd.
Application filed by Jiangsu Baiwang Oral Care Products Co., Ltd.

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B65: CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G: TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G47/00: Article or material-handling devices associated with conveyors; Methods employing such devices
    • B65G47/74: Feeding, transfer, or discharging devices of particular kinds or types
    • B65G47/90: Devices for picking-up and depositing articles or materials
    • B65G47/91: Devices for picking-up and depositing articles or materials incorporating pneumatic, e.g. suction, grippers
    • B65G47/914: Devices for picking-up and depositing articles or materials incorporating pneumatic grippers provided with drive systems incorporating rotary and rectilinear movements
    • B65G43/00: Control devices, e.g. for safety, warning or fault-correcting
    • B65G43/08: Control devices operated by article or material being fed, conveyed or discharged
    • B65G47/22: Devices influencing the relative position or the attitude of articles during transit by conveyors
    • B65G47/24: Devices influencing the relative position or the attitude of articles during transit by conveyors orientating the articles
    • B65G2203/00: Indexing code relating to control or detection of the articles or the load carriers during conveying
    • B65G2203/04: Detection means
    • B65G2203/041: Camera

Abstract

The invention relates to a vision-based dynamic feeding system for electric toothbrush brush heads. The system comprises a conveying device, a visual detection device and a control module, which respectively carry out system calibration, brush head identification and positioning, grabbing sequence optimization and robot trajectory planning. The brush heads are conveyed from the hopper into the working space of the robot by the conveying device; images are collected by the visual detection device, the brush heads are identified and their poses calculated using a support vector machine and ellipse fitting, and the poses are updated in real time. Once the pose information of the brush heads is obtained, the grabbing sequence is planned with an improved tabu search algorithm; after receiving the instruction, the robot performs trajectory planning and carries each brush head from its grabbing position to the target position. The invention can adapt to the feeding of various brush heads, offers higher automation and flexibility than existing feeding methods, shortens the moving distance of the robot while completing the feeding, and can be used for feeding in processes such as brush head bristle planting and packaging.

Description

Electric toothbrush brush head dynamic feeding system and feeding method based on vision
Technical Field
The invention relates to the field of electric toothbrush manufacturing, in particular to a dynamic feeding system and method for an electric toothbrush head based on vision.
Background
In recent years, with the continuous improvement of industrial automation in China, machine vision has become increasingly mature in material sorting applications. Traditional toothbrush feeding relies mainly on manual labor, which is costly and inefficient, and improperly placed toothbrushes often disrupt the normal production takt. Existing automatic toothbrush feeding equipment occupies a large floor area, is structurally complex and lacks flexibility, and cannot meet the feeding requirements of different toothbrushes.
Disclosure of Invention
The purpose of the invention is as follows: aiming at the above problems, the invention provides a vision-based dynamic feeding system for electric toothbrush brush heads, achieving rapid brush head feeding with high efficiency and low cost.
The invention also provides a feeding method using the feeding system.
The technical scheme of the vision-based dynamic feeding system for electric toothbrush brush heads disclosed by the invention comprises: a robot, a conveying device, an image acquisition device, an image processing module, a grabbing sequence planning module, a robot control module and a support vector machine;
the image acquisition device acquires a brush head image and transmits the brush head image to the image processing module, and the image processing module identifies and positions the brush head through image processing to obtain position information and posture information of the brush head;
the grabbing sequence planning module updates the position of the brush head in real time, plans the optimal grabbing sequence and then sends an instruction to the robot control module;
the robot control module receives the pose information of the brush head, performs trajectory planning according to a given motion path, and controls the robot to grab the brush head according to the planned trajectory for dynamic feeding.
Further, when the image processing module obtains the position information and posture information of the brush head: in image preprocessing, Gaussian filtering is applied to the image collected by the vision acquisition device to remove noise, and optical compensation enhancement is performed; the image is cropped to remove the regions outside the conveyor belt, and threshold segmentation is performed with the Otsu method; contours are detected in the segmented image, then screened and filtered to determine the toothbrush head contours; each toothbrush head contour is filled, and the moments of the filled image are calculated to obtain the center of gravity and direction angle of the toothbrush head; the direction of the toothbrush head is determined through template matching, and the coordinates of the grabbing point are obtained through ellipse fitting; an AND operation between the filled image and the original image yields the image of a single toothbrush brush head, from which a histogram of oriented gradients is extracted and input into the support vector machine for classification, determining the orientation of the brush head.
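As an editorial illustration of the segmentation and positioning steps above, a minimal Python/NumPy sketch of Otsu thresholding and moment-based centroid/orientation follows. The image, its values and the helper names are hypothetical; the patent specifies the Otsu method and image moments but provides no code:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: choose the threshold that maximizes between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                  # gray-level probabilities
    omega = np.cumsum(p)                   # class-0 probability up to each level
    mu = np.cumsum(p * np.arange(256))     # cumulative first moment
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.argmax(np.nan_to_num(sigma_b)))

def centroid_and_angle(mask):
    """Center of gravity and direction angle from second-order central moments."""
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    theta = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)  # major-axis direction angle
    return (cx, cy), theta

# Synthetic test image: dark conveyor background with one bright rectangular blob.
img = np.full((60, 60), 30, dtype=np.uint8)
img[20:40, 10:50] = 200
t = otsu_threshold(img)
mask = img > t
(cx, cy), theta = centroid_and_angle(mask)
```

For the axis-aligned blob above, the recovered centroid is (29.5, 29.5) and the direction angle is 0, as expected for a horizontal rectangle.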
Further, the grabbing sequence planning module generates initial solutions for all brush head targets based on a neighborhood method: several initial solutions are randomly generated, iterative search is then carried out from each of them, and the grabbing sequence with the best evaluation function value is finally selected as the global optimal solution; all initial solutions share one tabu table during the search, so that information generated in earlier searches can guide subsequent searches. A neighborhood operator and an evaluation function are set: a swap operator serves as the neighborhood operator, randomly selecting two positions in a solution sequence and exchanging their serial numbers to generate a candidate solution; the overall feeding path of the robot is planned with the shortest moving path as the objective and the total path length as the evaluation function. An aspiration (privilege) criterion and a tabu tenure are set: if a solution in the tabu table is superior to every non-tabu candidate solution, it is released by the aspiration criterion, ensuring that the algorithm can still reach the optimal solution; the tabu tenure is set as
Figure BDA0003290100330000026
where n is the number of candidate solutions. A perturbation mechanism is introduced: if the number of iterations without an update of the best solution during one search reaches a specified value, a variable neighborhood search (VNS) algorithm is used for perturbation; the current solution serves as the initial solution of the VNS for alternating search, and when the perturbed solution is superior to the best solution of the current search, the tabu table is updated and the perturbation exits. A stopping criterion is set with a maximum number of iterations: the search ends when the iteration count reaches the set value, and the global optimal solution is output.
Further, the robot control module adopts a 3-4-5 degree polynomial motion law with smooth acceleration and deceleration; combined with the requirement that the velocity and acceleration of the robot execution end are zero at the start and end positions, the acceleration a varies with time as:

a(t) = (S/T^2)·[60(t/T) − 180(t/T)^2 + 120(t/T)^3]

in the formula: S is the total length of the motion path, T is the time for the robot to complete one motion, and t is the current motion moment; integrating the acceleration gives the variation of the velocity v with time:

v(t) = (S/T)·[30(t/T)^2 − 60(t/T)^3 + 30(t/T)^4]

and the variation of the displacement s with time:

s(t) = S·[10(t/T)^3 − 15(t/T)^4 + 6(t/T)^5]

When t = (3 − √3)T/6, the acceleration reaches its maximum value, a_max = 10S/(√3·T^2).
Planning the toothbrush grabbing trajectory: first a curved segment from the grabbing point to a fixed point, then a straight segment from the fixed point to the placing point. A piecewise planning approach is adopted: the robot applies the 3-4-5 degree polynomial motion law to the motions in the X-axis and Y-axis directions separately, and the curved trajectory is realized by synthesizing the motions in the two directions; the robot execution end grabs and puts down the brush head along this trajectory. Let the coordinates in the robot base coordinate system be A(x_a, y_a) for the grabbing point, C(x_c, y_c) for the fixed point, and B(x_b, y_b) for the placing point; then the movement distance in the Y-axis direction is h = |y_a − y_c|, the movement distance in the X-axis direction is b = |x_a − x_b|, and the X-direction distance between A and C is b_1 = |x_a − x_c|. The single-movement time T of the robot is determined by the working takt of the bristle-planting machine and the packaging machine, and the movement time in the X-axis direction is t_x = T. Let the movement time in the Y-axis direction be t_y; since b and t_x are known, the maximum acceleration a_max in the X-axis direction can be determined; from a_max and b_1, the time t_AC from point A to point C is calculated, and t_y = t_AC. The parameter equations of the interpolation points are established from the motion laws in the X-axis and Y-axis directions, and integrating the resulting parametric expressions gives the length of the curve from point A to point C; the motion trajectory of the robot execution end is then completely determined.
Further, the robot is a horizontal multi-joint robot with 4 degrees of freedom, comprising a robot body, an end grabbing device and a motion controller. The robot body comprises a first rotary joint, a large arm, a second rotary joint, a small arm, a third rotary joint and a fourth linear joint. The first rotary joint comprises a base, a servo motor and a hollow rotary platform; the servo motor is connected with the input end of the hollow rotary platform, the hollow rotary platform is installed on the base, and its output disc surface is fixedly connected with the large arm. The second rotary joint comprises a servo motor, a speed reducer, a synchronous pulley set, a transmission shaft, a lock nut, a bearing and a bearing seat; the servo motor is fixedly connected with the input end of the speed reducer and installed on the large arm close to the first rotary joint, the output shaft of the speed reducer drives the transmission shaft through the synchronous pulley set, the transmission shaft is supported in the bearing seat by the bearing and locked by the lock nut, the bearing seat is installed at the front end of the large arm, and the small arm is fixedly connected with the transmission shaft. The third rotary joint comprises a servo motor, a synchronous pulley set, an output shaft, a sliding bearing and a bearing seat; the servo motor is installed on the small arm close to the second rotary joint, the motor drives the output shaft through the synchronous pulley set, the output shaft is mounted in the bearing seat by the sliding bearing, and the bearing seat is installed at the front end of the small arm. The fourth linear joint adopts a double-rod cylinder fixedly connected with the output shaft. The end grabbing device comprises a support, a vacuum chuck, a pneumatic finger and a profiling clamping jaw, the profiling clamping jaw being fixedly connected with the pneumatic finger; during feeding, the suction cup adsorbs onto the flat head of the brush head and the profiling clamping jaw clamps its tail. The motion controller communicates with the computer, receives control signals from it, plans the robot trajectory, and feeds robot data back to the computer.
Further, the conveying device comprises a hopper, a lifting machine and a conveying belt, the hopper being arranged at the starting point of the lifting machine; the image acquisition device comprises an industrial camera, a light source and a camera support, the industrial camera being fixedly connected with the support and arranged on one side of the conveying belt; the light source is installed above the conveying belt.
Further, the system comprises a system calibration module; the system calibration module covers camera calibration, robot kinematics calibration, robot hand-eye calibration, tool coordinate system calibration and conveyor belt calibration.
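As an editorial sketch of one of the calibrations listed above, conveyor-belt (pixel-to-world) calibration is commonly modelled as an affine map fitted by least squares. The marker coordinates and helper names below are hypothetical and not taken from the patent:

```python
import numpy as np

# Pixel coordinates of calibration markers on the belt and their measured
# positions (mm) in the robot base frame (illustrative values).
pixels = np.array([[100, 100], [500, 120], [480, 400], [120, 380]], float)
world = np.array([[10.0, 20.0], [90.0, 24.0], [86.0, 80.0], [14.0, 76.0]], float)

# Solve world = [u, v, 1] @ A in the least-squares sense (A is a 3x2 affine matrix).
P = np.hstack([pixels, np.ones((len(pixels), 1))])
A, *_ = np.linalg.lstsq(P, world, rcond=None)

def pixel_to_world(u, v):
    """Map a camera pixel (u, v) to belt coordinates in the robot base frame."""
    return np.array([u, v, 1.0]) @ A
```

With the exact affine data above (scale 0.2, X offset −10 mm), the fit is recovered exactly, so an interior pixel such as (300, 250) maps to (50, 50) mm.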
The feeding method using the vision-based dynamic feeding system for electric toothbrush brush heads provided by the invention comprises the following steps:
s1: putting the electric toothbrush head into a hopper, and opening a conveying belt;
s2: the image acquisition device acquires a brush head image and transmits the image to the visual detection software, and the brush head is identified and positioned through image processing to obtain position information and posture information of the brush head;
s3: the grabbing sequence planning module updates the position of the brush head in real time, plans the optimal grabbing sequence and then sends a command to the motion controller;
s4: and the motion controller receives the pose information of the brush head, plans a track according to a given motion path and controls the horizontal multi-joint robot to grab the brush head for dynamic feeding.
Further, in S2 the brush head is identified and positioned through image processing to obtain its position information and posture information, including:
s2.1: image preprocessing, namely performing Gaussian filtering on the image acquired by the vision acquisition device, removing noise in the image and performing optical compensation enhancement on the image;
s2.2: cutting the image, removing partial images outside the conveyor belt in the image by cutting, and performing threshold segmentation by adopting an Otsu method;
s2.3: detecting the outline in the segmented image, screening and filtering the outline, and determining the outline of the toothbrush head;
s2.4: filling the outline of the toothbrush head, and calculating the moment of the filled image to obtain the gravity center and the direction angle of the toothbrush head;
s2.5: determining the direction of the toothbrush head through template matching, and obtaining the coordinates of a gripping point of the toothbrush head through ellipse fitting;
s2.6: and performing AND operation on the filled image and the original image to obtain an image of the single toothbrush brush head, extracting a direction gradient histogram of the image, and inputting the histogram into a support vector machine for classification judgment to determine the direction of the brush head.
Further, the S2.6 includes:
s2.6.1: acquiring an image of a single toothbrush head:
Dst=Src&Fill
wherein Dst is an image of a single toothbrush brush head, Src is an original image, and Fill is an image after filling;
S2.6.2: Performing Gamma normalization on the image color;
S2.6.3: Calculating the gradients of the image in the horizontal and vertical directions to obtain the gradient magnitude m(x, y) and direction θ(x, y) of each pixel (x, y):

f_x(x, y) = f(x, y) − f(x + 1, y)
f_y(x, y) = f(x, y) − f(x, y + 1)
m(x, y) = √(f_x(x, y)² + f_y(x, y)²)
θ(x, y) = arctan(f_y(x, y) / f_x(x, y))
S2.6.4: dividing pixel blocks, and calculating a gradient histogram by taking the pixel blocks as units;
S2.6.5: Combining the pixel blocks into a block, computing the gradient direction histogram vector of the block as a unit, and normalizing the feature quantity;
S2.6.6: Collecting the HOG features: all overlapped blocks in the detection window are collected, the collected HOG features are combined into a feature vector, and the feature vector is input into the support vector machine for classification.
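The gradient step S2.6.3 can be sketched in a few lines of NumPy, using the simple-difference gradients given above (the ramp test image and function name are illustrative, not from the patent):

```python
import numpy as np

def gradient_magnitude_direction(f):
    """Gradient magnitude and direction per step S2.6.3, using the patent's
    simple differences f_x = f(x,y) - f(x+1,y) and f_y = f(x,y) - f(x,y+1)."""
    f = f.astype(float)
    fx = np.zeros_like(f)
    fy = np.zeros_like(f)
    fx[:, :-1] = f[:, :-1] - f[:, 1:]   # horizontal difference (x runs along columns)
    fy[:-1, :] = f[:-1, :] - f[1:, :]   # vertical difference (y runs along rows)
    m = np.hypot(fx, fy)                # gradient magnitude m(x, y)
    theta = np.arctan2(fy, fx)          # gradient direction (arctan2 avoids division by zero)
    return m, theta

# A horizontal ramp: intensity increases by 1 per column, so the interior
# gradient has magnitude 1 and points along the negative x direction.
ramp = np.tile(np.arange(5.0), (5, 1))
m, theta = gradient_magnitude_direction(ramp)
```

Note that arctan2 is used in place of the plain arctan of the formula so that a zero horizontal gradient does not divide by zero; this is an implementation convenience, not a change of method.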
Further, in S3 an improved tabu search algorithm is adopted to plan the grabbing sequence of the brush heads obtained by image processing, including:
S3.1: Generating an initial solution based on the neighborhood method: the placing point B is selected as the reference, and the distance of the point closest to B is denoted r; an inner circle of radius r and an outer circle of radius βr (β > 1), both centered on B, form a ring that serves as the neighborhood of B; among the brush heads falling in the neighborhood of B, the next target is selected at random and added to the initial solution, until all identified brush heads have been added to the initial solution;
s3.2: adopting a multi-target search strategy, randomly generating a plurality of initial solutions by adopting a neighborhood method when generating the initial solutions, then respectively carrying out iterative search on the initial solutions, and finally selecting a grabbing sequence with an optimal evaluation function as a global optimal solution; each initial solution shares a tabu table during searching, and information generated in the previous searching process can guide subsequent searching;
s3.3: setting a neighborhood operator and an evaluation function, adopting an exchange operator as the neighborhood operator, randomly selecting two positions in a solution sequence and exchanging corresponding serial numbers of the positions to generate a candidate solution; the overall feeding path of the robot is planned by taking the shortest moving path as a target and taking the total path length as an evaluation function;
S3.4: Setting the aspiration (privilege) criterion and the tabu tenure: if a solution in the tabu table is superior to every non-tabu candidate solution, it is released by the aspiration criterion, ensuring that the algorithm can still reach the optimal solution; the tabu tenure is set as
Figure BDA0003290100330000055
where n is the number of candidate solutions;
S3.5: Introducing a perturbation mechanism: if the number of iterations without an update of the best solution during one search reaches a specified value, a variable neighborhood search (VNS) algorithm is used for perturbation; the current solution serves as the initial solution of the VNS for alternating search, and when the perturbed solution is superior to the best solution of the current search, the tabu table is updated and the perturbation exits;
s3.6: and setting a stopping criterion, giving the maximum iteration times, finishing the search when the iteration times of the algorithm reach a set value, and outputting a global optimal solution.
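The loop S3.1-S3.6 can be sketched as follows. This is a simplified editorial illustration, not the patent's implementation: the evaluation function is reduced to an open-path length from the place point, the tabu tenure is an assumed √n (the patent gives its formula only in a figure), and the VNS perturbation of S3.5 is omitted:

```python
import math
import random

def tour_length(order, pts, start):
    """Simplified evaluation function: length of the open path that leaves
    the place point and visits every grab point in the given order."""
    cur, total = start, 0.0
    for i in order:
        total += math.dist(cur, pts[i])
        cur = pts[i]
    return total

def tabu_search(pts, start, n_starts=4, max_iter=200, seed=1):
    """Multi-start tabu search with a swap neighbourhood, one tabu table
    shared by all starts, and an aspiration (privilege) criterion."""
    rng = random.Random(seed)
    n = len(pts)
    tenure = max(1, round(math.sqrt(n)))  # assumed tenure, not the patent's formula
    tabu = {}                             # move -> iteration until which it stays tabu
    g = 0                                 # global iteration counter shared by all starts
    best, best_len = None, float("inf")
    for _ in range(n_starts):
        cur = list(range(n))
        rng.shuffle(cur)                  # random initial grabbing order
        cur_len = tour_length(cur, pts, start)
        if cur_len < best_len:
            best, best_len = cur[:], cur_len
        for _ in range(max_iter):
            cand = None
            for i in range(n - 1):        # full swap neighbourhood
                for j in range(i + 1, n):
                    nb = cur[:]
                    nb[i], nb[j] = nb[j], nb[i]
                    nb_len = tour_length(nb, pts, start)
                    move = (min(nb[i], nb[j]), max(nb[i], nb[j]))
                    # aspiration: a tabu move is kept only if it beats the global best
                    if tabu.get(move, -1) > g and nb_len >= best_len:
                        continue
                    if cand is None or nb_len < cand[1]:
                        cand = (nb, nb_len, move)
            if cand is not None:
                cur, cur_len, move = cand
                tabu[move] = g + tenure
                if cur_len < best_len:
                    best, best_len = cur[:], cur_len
            g += 1
    return best, best_len

# Demo: six brush-head positions (mm, hypothetical) with place point B at the origin.
pts = [(120, 40), (30, 80), (200, 60), (75, 20), (160, 90), (50, 55)]
order, length = tabu_search(pts, start=(0, 0))
```

The search always moves to the best admissible neighbour, even when it worsens the current solution, which is what lets tabu search escape local optima; the tabu table then blocks the immediate reverse swap.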
Further, in S4 trajectory planning is performed according to the given motion path, including:
S4.1: The 3-4-5 degree polynomial motion law with smooth acceleration and deceleration is adopted; combined with the requirement that the velocity and acceleration of the robot execution end are zero at the initial and final positions, the acceleration a varies with time as:

a(t) = (S/T^2)·[60(t/T) − 180(t/T)^2 + 120(t/T)^3]

in the formula: S is the total length of the motion path, T is the time for the robot to complete one motion, and t is the current motion moment; integrating the acceleration gives the variation of the velocity v with time:

v(t) = (S/T)·[30(t/T)^2 − 60(t/T)^3 + 30(t/T)^4]

and the variation of the displacement s with time:

s(t) = S·[10(t/T)^3 − 15(t/T)^4 + 6(t/T)^5]

When t = (3 − √3)T/6, the acceleration reaches its maximum value, a_max = 10S/(√3·T^2).
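The 3-4-5 polynomial law is uniquely fixed by the six boundary conditions (s(0) = 0, s(T) = S, and zero velocity and acceleration at both ends), which can be checked numerically; the stroke S and period T below are illustrative values only:

```python
import math

def motion_345(S, T, t):
    """3-4-5 polynomial motion law: displacement, velocity and acceleration at time t."""
    tau = t / T
    s = S * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)
    v = S / T * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)
    a = S / T**2 * (60 * tau - 180 * tau**2 + 120 * tau**3)
    return s, v, a

S, T = 0.3, 0.5                      # 300 mm stroke in 0.5 s (illustrative)
s0, v0, a0 = motion_345(S, T, 0.0)   # start: v = a = 0
sT, vT, aT = motion_345(S, T, T)     # end: s = S, v = a = 0
# Peak acceleration occurs at t = (3 - sqrt(3)) T / 6 and equals 10 S / (sqrt(3) T^2).
t_peak = (3 - math.sqrt(3)) * T / 6
a_peak = motion_345(S, T, t_peak)[2]
```

Evaluating the three expressions at t = 0 and t = T confirms the rest-to-rest boundary conditions, and the peak acceleration matches the closed-form maximum.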
S4.2: planning a toothbrush grabbing track, namely firstly moving a section of curve from a grabbing point to a fixed point, and then moving a section of straight line from the fixed point to a placing point;
s4.3: by adopting a sectional planning mode, the robot adopts a 3-4-5 degree polynomial motion rule for the motion in the X-axis direction and the motion in the Y-axis direction, and the curve track is realized by synthesizing the motions in the two directions; the robot execution end realizes the grabbing and putting down of the brush head according to the curve track;
S4.4: Let the grabbing point be A(x_a, y_a), the fixed point C(x_c, y_c) and the placing point B(x_b, y_b); then the movement distance in the Y-axis direction is h = |y_a − y_c|, the movement distance in the X-axis direction is b = |x_a − x_b|, and the X-direction distance between A and C is b_1 = |x_a − x_c|; the single-movement time T of the robot is determined by the working takt of the bristle-planting machine and the packaging machine, and the movement time in the X-axis direction is t_x = T;
S4.5: Let the movement time in the Y-axis direction be t_y; since b and t_x are known, the maximum acceleration a_max in the X-axis direction can be determined; from a_max and b_1, the time t_AC from point A to point C is calculated, and t_y = t_AC;
S4.6: and (3) establishing a parameter equation of the interpolation point according to the motion rules in the x-axis direction and the Y-axis direction, integrating the obtained parameter equation expression to obtain the length of a curve from the point A to the point C, and completely determining the motion track of the robot execution end at the moment.
The beneficial effects of the invention are as follows. First, compared with manual feeding, the robot in the proposed feeding system and method feeds quickly and reliably, saving production cost and improving production efficiency in practical applications. Second, compared with existing automatic feeding equipment, the robot has a simple structure and strong adaptability, and can adapt to feeding different toothbrush heads simply by replacing the end profiling fixture. Third, the invention plans the grabbing sequence of the identified brush heads, which reduces the movement distance of the robot and improves its grabbing efficiency while maintaining the brush head feeding rate.
Drawings
FIG. 1 is a block diagram of a vision-based dynamic feeding system for an electric toothbrush brush head in accordance with an embodiment of the present invention;
FIG. 2 is a schematic diagram of a horizontal articulated robot body according to an embodiment of the present invention;
FIG. 3 is a flow chart of the image processing software of the present invention;
FIG. 4 is a diagram of the trajectory of the robot of the present invention;
fig. 5 is a flow chart of the grabbing order planning based on the improved tabu search algorithm of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention.
As shown in figure 1, the invention provides a vision-based dynamic feeding system for electric toothbrush brush heads, comprising a conveying device, an image acquisition module, a vision processing module and a control module. The conveying device comprises a hopper 1, a lifting machine 2 and a conveying belt 4, the hopper being arranged at the front end of the lifting machine 2. The image acquisition module comprises an industrial camera 3, a light source and a camera support. The vision processing module comprises the toothbrush head identification and positioning software. The control module comprises the robot 5, an end clamp and a motion controller.
The robot 5 is a horizontal multi-joint robot, has 4 degrees of freedom, and comprises a robot body, a tail end grabbing device and a motion controller.
As shown in fig. 2, the robot body includes a first rotary joint, a large arm, a second rotary joint, a small arm, a third rotary joint, and a fourth linear joint. The first rotary joint comprises a base 5-1, a servo motor 5-2 and a hollow rotary platform 5-3, wherein the servo motor is connected with the input end of the hollow rotary platform, the hollow rotary platform is arranged on the base, and the output disc surface is fixedly connected with a large arm 5-4; the second rotary joint comprises a servo motor 5-5, a speed reducer 5-6, a synchronous pulley set 5-21, a transmission shaft 5-9, a lock nut 5-22, a bearing 5-8 and a bearing seat 5-7, wherein the servo motor 5-5 is fixedly connected with the input end of the speed reducer 5-6 and is installed at the large arm 5-4 and close to the first rotary joint, the output shaft of the speed reducer 5-6 is transmitted with the transmission shaft 5-9 through the synchronous pulley set 5-21, the transmission shaft 5-9 is supported and installed in the bearing seat 5-7 through the bearing 5-8 and is locked through the lock nut 5-22, the bearing seat 5-7 is installed at the front end of the large arm 5-4, and the small arm 5-12 is fixedly connected with the transmission shaft 5-9; the third rotary joint comprises a servo motor 5-10, a synchronous pulley group 5-11, an output shaft 5-13, a sliding bearing 5-14 and a bearing seat 5-15, the servo motor 5-10 is installed at the position of the small arm 5-12 and close to the second rotary joint, the motor 5-10 and the output shaft 5-13 are driven through the synchronous pulley group 5-11, the output shaft 5-13 is installed in the bearing seat 5-15 through the sliding bearing 5-14, and the bearing seat 5-15 is installed at the front end of the small arm 5-12; the fourth linear joint is fixedly connected with the output shaft 5-13 by adopting a double-rod cylinder 5-17.
The tail end gripping device comprises a support 5-16, a vacuum chuck 5-20, a pneumatic finger 5-18 and a profiling clamping jaw 5-19, wherein the profiling clamping jaw 5-19 is fixedly connected with the pneumatic finger 5-18; in the feeding process, the suction cups 5 to 20 are adsorbed on the plane of the head part of the brush head, and the profiling clamping jaws 5 to 19 clamp the tail part of the brush head.
The motion controller is communicated with the computer, receives a control signal of the computer, plans the track of the robot and feeds back robot data to the computer.
The system calibration module comprises camera calibration, robot kinematics calibration, robot hand-eye calibration, tool coordinate system calibration and conveyer belt calibration.
In practical use, the dynamic feeding system for the electric toothbrush head based on vision in fig. 1 and 2 dynamically feeds the electric toothbrush head according to the following feeding method, as shown in fig. 3, including:
s1: opening the conveying belt, and putting the electric toothbrush head into a hopper;
s2: the image acquisition device acquires a brush head image and transmits the image to the visual detection software, and the brush head is identified and positioned through image processing to obtain position information and posture information of the brush head;
s3: the grabbing sequence planning module updates the toothbrush position in real time and plans an optimal grabbing sequence, and then sends an instruction to the motion controller;
s4: and the motion controller receives the information of the brush head, plans a track according to a given motion path and controls the horizontal multi-joint robot to grab the toothbrush for feeding.
According to an embodiment of the present invention, S2 includes the following steps:
s2.1: image preprocessing, namely performing Gaussian filtering on the image acquired by the vision acquisition device, removing noise in the image and performing optical compensation enhancement on the image;
s2.2: cutting the image, removing partial images outside the conveyor belt in the image by cutting, and performing threshold segmentation by adopting an Otsu method;
s2.3: detecting the outline in the segmented image, screening and filtering the outline, and determining the outline of the toothbrush head;
s2.4: filling the outline of the toothbrush head, and calculating the moment of the filled image to obtain the gravity center and the direction angle of the toothbrush head;
s2.5: determining the direction of the toothbrush head through template matching, and obtaining the coordinates of a gripping point of the toothbrush head through ellipse fitting;
s2.6: and performing AND operation on the filled image and the original image to obtain an image of the single toothbrush brush head, extracting a direction gradient histogram of the image, and inputting the histogram into a support vector machine for classification judgment to determine the direction of the brush head.
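As an illustration of steps S2.2 and S2.4, the Otsu threshold and the moment-based centre of gravity and direction angle can be sketched with plain NumPy (the function names are hypothetical; a production system would more likely use OpenCV's cv2.threshold with THRESH_OTSU and cv2.moments):

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu threshold of an 8-bit grayscale image (step S2.2):
    pick the level that maximises the between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # class-0 probability up to level k
    mu = np.cumsum(p * np.arange(256))      # class-0 cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.nanargmax(sigma_b))

def centroid_and_angle(mask):
    """Centre of gravity and principal-axis direction angle from
    image moments of the filled region (step S2.4)."""
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()           # centroid m10/m00, m01/m00
    mu20 = ((xs - cx) ** 2).mean()          # central second-order moments
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)   # orientation in radians
    return (cx, cy), theta
```

For a horizontally elongated blob the returned angle is close to zero; for a rotated brush head it gives the direction angle used to orient the gripper.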
According to one embodiment of the invention, S2.6 comprises the steps of:
s2.6.1: acquiring an image of a single toothbrush head:
Dst=Src&Fill
wherein Dst is the image of a single toothbrush brush head, Src is the original image, and Fill is the image after filling.
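In NumPy the AND operation of S2.6.1 applies directly to uint8 arrays; a tiny sketch with made-up pixel values:

```python
import numpy as np

src = np.array([[10, 200], [30, 40]], dtype=np.uint8)    # Src: original image
fill = np.array([[255, 255], [0, 0]], dtype=np.uint8)    # Fill: filled contour mask
dst = src & fill    # Dst = Src & Fill keeps original pixels where the mask is 255
```

Pixels under the 255-valued mask keep their original intensity; everything else becomes zero, isolating the single brush head.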
S2.6.2: performing Gamma normalization processing on the image color;
s2.6.3: calculating the gradients of the image in the horizontal and vertical directions to obtain, for each pixel point (x, y), the gradient magnitude m(x, y) and direction θ(x, y):

fx(x,y) = f(x,y) - f(x+1,y)
fy(x,y) = f(x,y) - f(x,y+1)
m(x,y) = √(fx(x,y)² + fy(x,y)²)
θ(x,y) = arctan(fy(x,y)/fx(x,y))
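The forward differences and the magnitude/direction formulas of S2.6.3 translate directly into NumPy (the function name is illustrative; x is taken as the column index and y as the row index):

```python
import numpy as np

def pixel_gradients(f):
    """Per-pixel gradient magnitude and direction using the forward
    differences fx = f(x,y) - f(x+1,y) and fy = f(x,y) - f(x,y+1)."""
    f = f.astype(float)
    fx = np.zeros_like(f)
    fy = np.zeros_like(f)
    fx[:, :-1] = f[:, :-1] - f[:, 1:]    # difference along x (columns)
    fy[:-1, :] = f[:-1, :] - f[1:, :]    # difference along y (rows)
    m = np.hypot(fx, fy)                 # gradient magnitude m(x, y)
    theta = np.arctan2(fy, fx)           # gradient direction θ(x, y)
    return m, theta
```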
S2.6.4: dividing pixel blocks, and calculating a gradient histogram by taking the pixel blocks as units;
s2.6.5: combining the pixel blocks into a block, solving a gradient direction histogram vector of the block by taking the block as a unit, and carrying out normalization processing on the characteristic quantity;
s2.6.6: collecting the HOG features: all overlapping blocks in the detection window are collected, the collected HOG features are combined into one feature vector, and the feature vector is input into a support vector machine for classification.
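A compact, dependency-light sketch of steps S2.6.2 to S2.6.6 (cell size, block size and bin count are assumed values, not taken from the patent; a real system would typically use cv2.HOGDescriptor or skimage.feature.hog, and feed the resulting vector to a trained SVM such as sklearn.svm.LinearSVC):

```python
import numpy as np

def hog_descriptor(gray, cell=8, block=2, bins=9):
    """HOG sketch: gamma normalisation, per-cell orientation histograms,
    and L2-normalised overlapping blocks (steps S2.6.2 - S2.6.6)."""
    g = np.sqrt(gray.astype(float) / 255.0)               # Gamma (sqrt) normalisation
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    gx[:, :-1] = g[:, :-1] - g[:, 1:]                     # forward differences
    gy[:-1, :] = g[:-1, :] - g[1:, :]
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0          # unsigned orientation
    ny, nx = g.shape[0] // cell, g.shape[1] // cell
    bin_idx = np.minimum((ang / (180.0 / bins)).astype(int), bins - 1)
    hist = np.zeros((ny, nx, bins))
    for i in range(ny):                                   # S2.6.4: per-cell histograms
        for j in range(nx):
            sl = (slice(i * cell, (i + 1) * cell), slice(j * cell, (j + 1) * cell))
            hist[i, j] = np.bincount(bin_idx[sl].ravel(),
                                     weights=mag[sl].ravel(), minlength=bins)
    feats = []
    for i in range(ny - block + 1):                       # S2.6.5 - S2.6.6: blocks
        for j in range(nx - block + 1):
            v = hist[i:i + block, j:j + block].ravel()
            feats.append(v / (np.linalg.norm(v) + 1e-6))  # L2 block normalisation
    return np.concatenate(feats)
```

For a 32×32 window with these parameters the descriptor has 3×3 overlapping blocks of 2×2×9 values each, i.e. 324 components; the trained classifier then decides which way the brush head points.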
According to an embodiment of the present invention, S3 includes the steps of:
s3.1: generating an initial solution based on a neighborhood method: the placement point B is selected as the reference, and the distance to the point closest to B is denoted r. An inner circle of radius r and an outer circle of radius βr (β > 1) are drawn around B; the ring between them is the neighborhood of B. To improve search diversity and avoid being trapped in a local optimum, the next target is selected at random from the targets inside this neighborhood and added to the initial solution, and so on until all targets have been added;
s3.2: and (3) adopting a multi-target search strategy, randomly generating a plurality of initial solutions by adopting a neighborhood method when generating the initial solutions, then respectively carrying out iterative search on the initial solutions, and finally selecting the optimal grabbing sequence of the evaluation function as the global optimal solution. In order to reduce the repeated searching times, each initial solution shares a tabu table during searching, and information generated in the previous searching process can guide subsequent searching, so that the repeated searching process can be effectively avoided;
s3.3: setting a neighborhood operator and an evaluation function, adopting an exchange operator as the neighborhood operator, randomly selecting two positions in a solution sequence and exchanging corresponding serial numbers of the positions to generate a candidate solution. The overall feeding path of the robot is planned by taking the shortest moving path as a target and taking the total path length as an evaluation function;
s3.4: setting an aspiration criterion and the tabu length: if a solution in the tabu list is superior to any non-tabu candidate solution, the aspiration criterion allows it to be accepted, ensuring that the algorithm can still search for the optimal solution. The tabu length is set as
Figure BDA0003290100330000102
N is the number of candidate solutions;
s3.5: introducing a perturbation mechanism: if the best solution is not updated for a specified number of iterations within one search, a variable neighborhood search (VNS) algorithm is used to perturb it. The current solution serves as the initial solution of the VNS; to keep an overlong perturbation from degrading algorithm performance, as soon as the perturbed solution is superior to the best solution found in this search, the tabu table is updated and the perturbation exits;
s3.6: and setting a stopping criterion, giving the maximum iteration times, finishing the search when the iteration times of the algorithm reach a set value, and outputting a global optimal solution.
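A self-contained sketch of the tabu search of S3.1 to S3.6, simplified for illustration: the pick path is modelled here as a tour from the place point through all targets, β is set to 1.5, the tabu tenure to about √N, and the VNS perturbation of S3.5 is omitted; all names and parameters are illustrative, not the patent's exact settings:

```python
import math
import random

def path_length(seq, pts, start):
    """Evaluation function (S3.3): total travel of the pick path,
    modelled as a tour from the place point through all targets."""
    total, cur = 0.0, start
    for i in seq:
        total += math.dist(cur, pts[i])
        cur = pts[i]
    return total

def tabu_search(pts, start, iters=200, beta=1.5, seed=0):
    rng = random.Random(seed)
    n = len(pts)
    # S3.1: neighbourhood-based initial solution - from the current position,
    # pick randomly among the targets inside the ring of radius [r, beta*r]
    seq, left, cur = [], set(range(n)), start
    while left:
        r = min(math.dist(cur, pts[i]) for i in left)
        ring = [i for i in left if math.dist(cur, pts[i]) <= beta * r]
        nxt = rng.choice(ring)
        seq.append(nxt)
        left.remove(nxt)
        cur = pts[nxt]
    best, best_len = seq[:], path_length(seq, pts, start)
    cur_seq = seq[:]
    tabu, tenure = {}, max(2, math.isqrt(n))   # tabu tenure ~ sqrt(N) (assumed)
    for it in range(iters):
        cand = None
        for _ in range(20):                    # S3.3: sampled swap moves
            a, b = rng.sample(range(n), 2)
            nseq = cur_seq[:]
            nseq[a], nseq[b] = nseq[b], nseq[a]
            nlen = path_length(nseq, pts, start)
            move = (min(a, b), max(a, b))
            # S3.4: skip tabu moves unless they beat the global best (aspiration)
            if tabu.get(move, -1) >= it and nlen >= best_len:
                continue
            if cand is None or nlen < cand[1]:
                cand = (nseq, nlen, move)
        if cand is None:
            continue
        cur_seq, cur_len, move = cand
        tabu[move] = it + tenure               # forbid reversing this swap
        if cur_len < best_len:
            best, best_len = cur_seq[:], cur_len
    return best, best_len
```

With targets strung out along the conveyor, the search quickly settles on the nearest-first pick order that minimises total robot travel.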
According to an embodiment of the present invention, S4 includes the steps of:
s4.1: a 3-4-5 degree polynomial motion law with smooth acceleration and deceleration is adopted; combined with the requirement that the velocity and acceleration of the robot are zero at the initial and final positions, the acceleration a varies with time as

a(t) = (S/T²)·(60(t/T) - 180(t/T)² + 120(t/T)³)

in the formula: S is the total length of the motion path, T is the time for the robot to complete one motion, and t is the current motion moment. Integrating the acceleration gives the velocity v as a function of time:

v(t) = (S/T)·(30(t/T)² - 60(t/T)³ + 30(t/T)⁴)

and the displacement s as a function of time:

s(t) = S·(10(t/T)³ - 15(t/T)⁴ + 6(t/T)⁵)

When t = (1/2 - √3/6)·T, the acceleration reaches its maximum value

amax = 10√3·S/(3T²)
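A minimal numerical sketch of the 3-4-5 polynomial motion law of S4.1, assuming the standard single-segment form with zero boundary velocity and acceleration (the function name is illustrative):

```python
import math

def motion_345(S, T, t):
    """3-4-5 polynomial motion law (S4.1): displacement s, velocity v and
    acceleration a at time t, for a stroke of length S finished in time T."""
    tau = t / T
    s = S * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)
    v = (S / T) * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)
    a = (S / T**2) * (60 * tau - 180 * tau**2 + 120 * tau**3)
    return s, v, a
```

At t = (1/2 - √3/6)·T the acceleration peaks at 10√3·S/(3T²) ≈ 5.77·S/T², which is what sizes the joint motors for a given cycle time T.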
S4.2: considering the obstacles on two sides of the hair planting machine and the packaging machine in the actual feeding process, in order to prevent collision during movement, planning a toothbrush grabbing track, firstly, moving a section of curve from a grabbing point to a fixed point, and then, moving a section of straight line from the fixed point to a placing point;
s4.3: by adopting a sectional planning mode, the robot adopts a 3-4-5 degree polynomial motion rule for the motion in the X-axis direction and the motion in the Y-axis direction, and the curve track is realized by synthesizing the motions in the two directions;
s4.4: setting the coordinates of the grabbing point in the robot base coordinate system as A(xa, ya), the fixed point as C(xc, yc) and the placement point as B(xb, yb); then, in the robot base coordinate system, the movement distance in the Y-axis direction is h = |ya - yc|, the movement distance in the X-axis direction is b = |xa - xb|, and the X-direction distance between A and C is b1 = |xa - xc|. The single-motion time T of the robot is determined by the working cycles of the tufting machine and the packaging machine, and the movement time in the X-axis direction is tx = T;
S4.5: let the movement time in the Y-axis direction be tyDue to b and txIt is known that the maximum acceleration a in the X-axis direction can be foundmaxAccording to amaxAnd b1The time t from the point A to the point C of the robot can be obtainedACAnd t isy=tAC
S4.6: The parametric equations of the interpolation points are established according to the motion laws in the X-axis and Y-axis directions, and the resulting parametric expressions are integrated to obtain the length of the curve from point A to point C; the motion trajectory of the robot is then completely determined.
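A numerical sketch of S4.4 to S4.6 under assumed geometry (all names are illustrative): instead of deriving t_AC from amax as the text does, this version inverts the monotone X displacement profile by bisection to find when the robot reaches xc, then integrates the A→C parametric curve numerically:

```python
import math

def plan_grab_trajectory(A, C, B, T, n=2000):
    """Sketch of S4.4 - S4.6: X follows a 3-4-5 profile over the full stroke
    A->B in time T; the Y motion A->C ends at t_AC, when X has covered b1."""
    s345 = lambda tau: 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    b = abs(A[0] - B[0])             # X travel for the whole move A->B
    b1 = abs(A[0] - C[0])            # X travel during the curved segment A->C
    h = abs(A[1] - C[1])             # Y travel A->C
    lo, hi = 0.0, T                  # bisection: solve b * s(t/T) = b1 for t
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if b * s345(mid / T) < b1:
            lo = mid
        else:
            hi = mid
    t_ac = 0.5 * (lo + hi)
    length, prev = 0.0, (0.0, 0.0)   # numerical length of the A->C curve (S4.6)
    for k in range(1, n + 1):
        t = t_ac * k / n
        p = (b * s345(t / T), h * s345(t / t_ac))
        length += math.hypot(p[0] - prev[0], p[1] - prev[1])
        prev = p
    return t_ac, length
```

Because both displacement profiles are monotone, the curve length always lies between the straight-line distance A→C and the Manhattan distance b1 + h.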
It should be understood that the above-described embodiments are merely illustrative of the present method and not limiting; those skilled in the art may make various other modifications and changes based on the above description. Therefore, the protection scope of the present invention should be defined by the appended claims.

Claims (12)

1. The utility model provides an electronic toothbrush brush head dynamic feeding system based on vision which characterized in that includes: the robot comprises a robot, a conveying device, an image acquisition device, an image processing module, a grabbing sequence planning module, a robot control module and a support vector machine;
the image acquisition device acquires a brush head image and transmits the brush head image to the image processing module, and the image processing module identifies and positions the brush head through image processing to obtain position information and posture information of the brush head;
the grabbing sequence planning module updates the position of the brush head in real time, plans the optimal grabbing sequence and then sends an instruction to the robot control module;
the robot control module receives the pose information of the brush head, performs trajectory planning according to a given motion path, and controls the robot to grab the brush head along the planned trajectory for dynamic feeding.
2. The vision-based dynamic loading system for a brushhead of an electric toothbrush of claim 1, wherein: when the image processing module obtains the position information and the posture information of the brush head, the image preprocessing is used for carrying out Gaussian filtering on the image collected by the vision collecting device, removing the noise in the image and carrying out optical compensation enhancement on the image; cutting the image, removing partial image outside the conveyor belt in the image, and performing threshold segmentation by adopting an Otsu method; detecting the outline in the segmented image, screening and filtering the outline and determining the outline of the toothbrush head; refilling the outline of the toothbrush head, and calculating the moment of the filled image to obtain the gravity center and the direction angle of the toothbrush head; determining the direction of the toothbrush head through template matching, and obtaining the coordinates of the grabbing points of the toothbrush head through ellipse fitting; and performing AND operation on the filled image and the original image to obtain an image of the single toothbrush brush head, extracting a direction gradient histogram of the image, and inputting the histogram into a support vector machine for classification judgment to determine the direction of the brush head.
3. The vision-based dynamic loading system for a brushhead of an electric toothbrush of claim 1, wherein: the grabbing sequence planning module generates initial solutions over all brush head targets based on a neighborhood method; a plurality of initial solutions are generated randomly by the neighborhood method, an iterative search is then carried out from each initial solution, and the grabbing sequence with the optimal evaluation function is finally selected as the global optimal solution; the initial solutions share one tabu table during searching, so that information generated in earlier searches guides subsequent searches; a neighborhood operator and an evaluation function are set, an exchange operator serving as the neighborhood operator: two positions in the solution sequence are randomly selected and their serial numbers exchanged to generate a candidate solution; the overall feeding path of the robot is planned with the shortest moving path as the objective and the total path length as the evaluation function; an aspiration criterion and the tabu length are set: if a solution in the tabu list is better than any non-tabu candidate solution, the aspiration criterion ensures that the algorithm can still search for the optimal solution; the tabu length is set as
Figure FDA0003290100320000011
n is the number of candidate solutions; introducing a disturbance mechanism, and if the number of times that the optimal solution is not updated in the process of one-time searching reaches a specified value, adopting a variable neighborhood searching algorithm to carry out disturbance; performing alternate search by taking the current solution as the initial solution of the VNS, updating a tabu table when the solution generated by disturbance is superior to the optimal solution searched at this time, and quitting the disturbance; and setting a stopping criterion, giving the maximum iteration times, finishing the search when the iteration times of the algorithm reach a set value, and outputting a global optimal solution.
4. The vision-based dynamic loading system for a brushhead of an electric toothbrush of claim 1, wherein: the robot control module adopts a 3-4-5 degree polynomial motion law with smooth acceleration and deceleration; combined with the requirement that the velocity and the acceleration of the robot execution end are zero at the start and end positions, the acceleration a varies with time as

a(t) = (S/T²)·(60(t/T) - 180(t/T)² + 120(t/T)³)

in the formula: S is the total length of the motion path, T is the time for the robot to complete one motion, and t is the current motion moment; integrating the acceleration gives the velocity v as a function of time:

v(t) = (S/T)·(30(t/T)² - 60(t/T)³ + 30(t/T)⁴)

and the displacement s as a function of time:

s(t) = S·(10(t/T)³ - 15(t/T)⁴ + 6(t/T)⁵)

when t = (1/2 - √3/6)·T, the acceleration reaches its maximum value

amax = 10√3·S/(3T²)
Planning the toothbrush grabbing trajectory: first a curved segment from the grabbing point to a fixed point, then a straight segment from the fixed point to the placement point; a segmented planning mode is adopted, in which the robot applies the 3-4-5 degree polynomial motion law to the motions in the X-axis and Y-axis directions, and the curved trajectory is realized by synthesizing the motions in the two directions; the robot execution end grabs and puts down the brush head along this trajectory; setting the coordinates of the grabbing point in the robot base coordinate system as A(xa, ya), the fixed point as C(xc, yc) and the placement point as B(xb, yb), then, in the robot base coordinate system, the movement distance in the Y-axis direction is h = |ya - yc|, the movement distance in the X-axis direction is b = |xa - xb|, and the X-direction distance between A and C is b1 = |xa - xc|; the single-motion time T of the robot is determined by the working cycles of the tufting machine and the packaging machine, and the movement time in the X-axis direction is tx = T; let the movement time in the Y-axis direction be ty; since b and tx are known, the maximum acceleration amax in the X-axis direction is determined; from amax and b1, the time tAC from point A to point C is calculated, and ty = tAC; the parametric equations of the interpolation points are established according to the motion laws in the X-axis and Y-axis directions, and the resulting parametric expressions are integrated to obtain the length of the curve from point A to point C, at which point the motion trajectory of the robot execution end is completely determined.
5. The vision-based dynamic loading system for an electric toothbrush brushhead of any one of claims 1 to 4, wherein: the robot is a horizontal multi-joint robot and has 4 degrees of freedom, and comprises a robot body, a tail end grabbing device and a motion controller;
the robot body comprises a first rotary joint, a large arm, a second rotary joint, a small arm, a third rotary joint and a fourth linear joint; the first rotary joint comprises a base (5-1), a servo motor (5-2) and a hollow rotary platform (5-3), the servo motor being connected to the input end of the hollow rotary platform, the hollow rotary platform being mounted on the base, and its output disc surface being fixedly connected to the large arm (5-4); the second rotary joint comprises a servo motor (5-5), a speed reducer (5-6), a synchronous pulley set (5-21), a transmission shaft (5-9), a lock nut (5-22), a bearing (5-8) and a bearing seat (5-7), wherein the servo motor (5-5) is fixedly connected to the input end of the speed reducer (5-6) and mounted on the large arm (5-4) close to the first rotary joint, the output shaft of the speed reducer (5-6) drives the transmission shaft (5-9) through the synchronous pulley set (5-21), the transmission shaft (5-9) is supported in the bearing seat (5-7) by the bearing (5-8) and locked by the lock nut (5-22), the bearing seat (5-7) is mounted at the front end of the large arm (5-4), and the small arm (5-12) is fixedly connected to the transmission shaft (5-9); the third rotary joint comprises a servo motor (5-10), a synchronous pulley set (5-11), an output shaft (5-13), a sliding bearing (5-14) and a bearing seat (5-15), wherein the servo motor (5-10) is mounted on the small arm (5-12) close to the second rotary joint and drives the output shaft (5-13) through the synchronous pulley set (5-11), the output shaft (5-13) is mounted in the bearing seat (5-15) by the sliding bearing (5-14), and the bearing seat (5-15) is mounted at the front end of the small arm (5-12); the fourth linear joint is a double-rod cylinder (5-17) fixedly connected to the output shaft (5-13);
the tail end grabbing device comprises a support (5-16), a vacuum chuck (5-20), a pneumatic finger (5-18) and a profiling clamping jaw (5-19), the profiling clamping jaw (5-19) being fixedly connected to the pneumatic finger (5-18); during feeding, the suction cup (5-20) adheres to the flat head of the brush head, and the profiling clamping jaw (5-19) clamps the tail of the brush head;
the motion controller is communicated with the computer, receives a control signal of the computer, plans the track of the robot and feeds back robot data to the computer.
6. The vision-based dynamic loading system for a brushhead of an electric toothbrush of claim 5, wherein: the conveying device comprises a hopper (1), a lifter (2) and a conveying belt (4), wherein the hopper is arranged at the starting point of the lifter; the image acquisition device comprises an industrial camera (3), a light source and a camera bracket, wherein the industrial camera (3) is fixedly connected with the bracket and is arranged on one side of the conveying belt; the light source is installed above the conveyer belt.
7. The vision-based dynamic loading system for a brushhead of an electric toothbrush of claim 1, wherein: the system calibration module is also included; the system calibration module comprises camera calibration, robot kinematics calibration, robot eye calibration, tool coordinate system calibration and conveyer belt calibration.
8. A method of loading using the vision based dynamic loading system for a power toothbrush brushhead of any one of claims 1 to 7, comprising:
s1: putting the electric toothbrush head into a hopper, and opening a conveying belt;
s2: the image acquisition device acquires a brush head image and transmits the image to the visual detection software, and the brush head is identified and positioned through image processing to obtain position information and posture information of the brush head;
s3: the grabbing sequence planning module updates the position of the brush head in real time, plans the optimal grabbing sequence and then sends a command to the motion controller;
s4: and the motion controller receives the pose information of the brush head, plans a track according to a given motion path and controls the horizontal multi-joint robot to grab the brush head for dynamic feeding.
9. The dynamic feeding method as claimed in claim 8, wherein the step S2 is implemented by recognizing and positioning the brushhead through image processing, and obtaining the position information and the posture information of the brushhead, and comprises:
s2.1: image preprocessing, namely performing Gaussian filtering on the image acquired by the vision acquisition device, removing noise in the image and performing optical compensation enhancement on the image;
s2.2: cutting the image, removing partial images outside the conveyor belt in the image by cutting, and performing threshold segmentation by adopting an Otsu method;
s2.3: detecting the outline in the segmented image, screening and filtering the outline, and determining the outline of the toothbrush head;
s2.4: filling the outline of the toothbrush head, and calculating the moment of the filled image to obtain the gravity center and the direction angle of the toothbrush head;
s2.5: determining the direction of the toothbrush head through template matching, and obtaining the coordinates of a gripping point of the toothbrush head through ellipse fitting;
s2.6: and performing AND operation on the filled image and the original image to obtain an image of the single toothbrush brush head, extracting a direction gradient histogram of the image, and inputting the histogram into a support vector machine for classification judgment to determine the direction of the brush head.
10. The dynamic loading method according to claim 9, wherein the S2.6 comprises:
s2.6.1: acquiring an image of a single toothbrush head:
Dst=Src&Fill
wherein Dst is an image of a single toothbrush brush head, Src is an original image, and Fill is an image after filling;
s2.6.2: performing Gamma normalization processing on the image color;
s2.6.3: calculating the gradients of the image in the horizontal and vertical directions to obtain, for each pixel point (x, y), the gradient magnitude m(x, y) and direction θ(x, y):

fx(x,y) = f(x,y) - f(x+1,y)
fy(x,y) = f(x,y) - f(x,y+1)
m(x,y) = √(fx(x,y)² + fy(x,y)²)
θ(x,y) = arctan(fy(x,y)/fx(x,y));
S2.6.4: dividing pixel blocks, and calculating a gradient histogram by taking the pixel blocks as units;
s2.6.5: combining the pixel blocks into a block, solving a gradient direction histogram vector of the block by taking the block as a unit, and carrying out normalization processing on the characteristic quantity;
s2.6.6: collecting the HOG features: all overlapping blocks in the detection window are collected, the collected HOG features are combined into one feature vector, and the feature vector is input into a support vector machine for classification.
11. The dynamic loading method according to claim 8, wherein the S3 adopts an improved tabu search algorithm to perform grabbing sequence planning on the image-processed brush head, including:
s3.1: generating an initial solution based on a neighborhood method: the placement point B is selected as the reference, and the distance to the point closest to B is denoted r; an inner circle of radius r and an outer circle of radius βr (β > 1) are drawn around B, the ring between them forming the neighborhood of B; the brush heads inside the neighborhood of B serve as targets, the next target is randomly selected from them and added to the initial solution, and so on until all identified brush heads have been added to the initial solution;
s3.2: adopting a multi-target search strategy, randomly generating a plurality of initial solutions by adopting a neighborhood method when generating the initial solutions, then respectively carrying out iterative search on the initial solutions, and finally selecting a grabbing sequence with an optimal evaluation function as a global optimal solution; each initial solution shares a tabu table during searching, and information generated in the previous searching process can guide subsequent searching;
s3.3: setting a neighborhood operator and an evaluation function, adopting an exchange operator as the neighborhood operator, randomly selecting two positions in a solution sequence and exchanging corresponding serial numbers of the positions to generate a candidate solution; the overall feeding path of the robot is planned by taking the shortest moving path as a target and taking the total path length as an evaluation function;
S3.4:setting a privilege criterion and a taboo length, if the solution in the taboo table is superior to any one of the candidate solutions which are not taboo, the privilege ensures that the algorithm can search for the optimal solution; the length of the taboo is set as
Figure FDA0003290100320000055
n is the number of candidate solutions;
s3.5: introducing a disturbance mechanism, and if the number of times that the optimal solution is not updated in the process of one-time searching reaches a specified value, adopting a variable neighborhood searching algorithm to carry out disturbance; performing alternate search by taking the current solution as the initial solution of the VNS, updating a tabu table when the solution generated by disturbance is superior to the optimal solution searched at this time, and quitting the disturbance;
s3.6: and setting a stopping criterion, giving the maximum iteration times, finishing the search when the iteration times of the algorithm reach a set value, and outputting a global optimal solution.
12. The dynamic loading method according to claim 9, wherein the S4 performs trajectory planning according to a given path, including:
s4.1: the variation of the acceleration a with time is obtained by adopting a 3-4-5 degree polynomial motion law with smooth acceleration and deceleration, combined with the requirement that the velocity and acceleration of the robot execution end are zero at the initial and final positions:

a(t) = (S/T²)·(60(t/T) - 180(t/T)² + 120(t/T)³)

in the formula: S is the total length of the motion path, T is the time for the robot to complete one motion, and t is the current motion moment; integrating the acceleration gives the velocity v as a function of time:

v(t) = (S/T)·(30(t/T)² - 60(t/T)³ + 30(t/T)⁴)

and the displacement s as a function of time:

s(t) = S·(10(t/T)³ - 15(t/T)⁴ + 6(t/T)⁵)

when t = (1/2 - √3/6)·T, the acceleration reaches its maximum value

amax = 10√3·S/(3T²)
S4.2: planning a toothbrush grabbing track, namely firstly moving a section of curve from a grabbing point to a fixed point, and then moving a section of straight line from the fixed point to a placing point;
s4.3: by adopting a sectional planning mode, the robot adopts a 3-4-5 degree polynomial motion rule for the motion in the X-axis direction and the motion in the Y-axis direction, and the curve track is realized by synthesizing the motions in the two directions; the robot execution end realizes the grabbing and putting down of the brush head according to the curve track;
s4.4: let the grabbing point be A(xa, ya), the fixed point C(xc, yc) and the placement point B(xb, yb); then the movement distance in the Y-axis direction is h = |ya - yc|, the movement distance in the X-axis direction is b = |xa - xb|, and the X-direction distance between A and C is b1 = |xa - xc|; the single-motion time T of the robot is determined by the working cycles of the tufting machine and the packaging machine, and the movement time in the X-axis direction is tx = T;
S4.5: let the movement time in the Y-axis direction be tyDue to b and txIt is known to determine the maximum acceleration a in the X-axis directionmax(ii) a According to amaxAnd b1Calculating the time t from the point A to the point CACAnd t isy=tAC
S4.6: The parametric equations of the interpolation points are established according to the motion laws in the X-axis and Y-axis directions, and the resulting parametric expressions are integrated to obtain the length of the curve from point A to point C, whereupon the motion trajectory of the robot execution end is completely determined.
CN202111161544.2A 2021-09-30 2021-09-30 Dynamic feeding system and method for electric toothbrush brush head based on vision Active CN113911728B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111161544.2A CN113911728B (en) 2021-09-30 2021-09-30 Dynamic feeding system and method for electric toothbrush brush head based on vision


Publications (2)

Publication Number Publication Date
CN113911728A true CN113911728A (en) 2022-01-11
CN113911728B CN113911728B (en) 2023-02-28

Family

ID=79237460

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111161544.2A Active CN113911728B (en) 2021-09-30 2021-09-30 Dynamic feeding system and method for electric toothbrush brush head based on vision

Country Status (1)

Country Link
CN (1) CN113911728B (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090065330A1 (en) * 2007-09-07 2009-03-12 Dematic Corp. Conveyor systems
CN106557844A (en) * 2016-11-23 2017-04-05 华东理工大学 A kind of welding robot paths planning method
CN110153987A (en) * 2019-06-26 2019-08-23 东北大学秦皇岛分校 A kind of intelligent recognition transfer robot and its control method
CN111573125A (en) * 2020-05-11 2020-08-25 盐城工学院 Modular intelligent logistics system planning and designing method based on omnidirectional wheel


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114800544A (en) * 2022-03-09 2022-07-29 始途科技(杭州)有限公司 Robot control method, robot control device and robot
CN115469555A (en) * 2022-11-14 2022-12-13 中国科学院光电技术研究所 Space image prediction and image quality optimization method for sensor chip projection lithography machine
CN115469555B (en) * 2022-11-14 2023-03-31 中国科学院光电技术研究所 Space image prediction and image quality optimization method for sensor chip projection lithography machine
CN115676383A (en) * 2022-12-29 2023-02-03 长沙湘丰智能装备股份有限公司 Zongzi leaf feeding control system and method based on symmetrical distribution
CN115676383B (en) * 2022-12-29 2023-03-07 长沙湘丰智能装备股份有限公司 Zongzi leaf feeding control system and method based on symmetrical distribution
CN116494248A (en) * 2023-06-26 2023-07-28 深圳市长荣科机电设备有限公司 Visual positioning method of industrial robot
CN116494248B (en) * 2023-06-26 2023-08-29 深圳市长荣科机电设备有限公司 Visual positioning method of industrial robot
CN117125469A (en) * 2023-09-12 2023-11-28 天津锐新昌科技股份有限公司 Automatic loading and unloading control method, system, device, equipment and medium for radiating fins
CN117125469B (en) * 2023-09-12 2024-03-15 天津锐新昌科技股份有限公司 Automatic loading and unloading control method, system, device, equipment and medium for radiating fins

Also Published As

Publication number Publication date
CN113911728B (en) 2023-02-28

Similar Documents

Publication Publication Date Title
CN113911728B (en) Dynamic feeding system and method for electric toothbrush brush head based on vision
CN110509281A (en) Apparatus and method for pose recognition and grasping based on binocular vision
CN108182689B (en) Three-dimensional recognition and positioning method for plate-shaped workpieces in robot handling and polishing applications
CN107618030B (en) Robot dynamic tracking grabbing method and system based on vision
CN112132889A (en) Soft magnet posture recognition and automatic grabbing method based on binocular vision
CN111347411B (en) Two-arm cooperative robot three-dimensional visual recognition grabbing method based on deep learning
CN107627299B (en) A kinematic parameter error calibration method for a cable-driven parallel robot
CN106546173B (en) Device for detecting components and detection method thereof
CN107053173A (en) Robot grasping system and workpiece grabbing method
CN111462154B (en) Target positioning method and device based on depth vision sensor and automatic grabbing robot
CN111645074A (en) Robot grabbing and positioning method
CN103529855A (en) Rotary adjustable binocular vision target recognition and positioning device and application thereof in agricultural fruit harvesting machinery
CN109454638A (en) A robot grasping system based on vision guidance
CN111067197A (en) Robot sole dynamic gluing system and method based on 3D scanning
CN108501009A (en) A kind of Jian Dan robots
CN106925530A (en) Camera lens automatic sorting device
CN107009357A (en) A method for grasping objects based on a NAO robot
CN106144524A (en) CCD vision positioning method and device for high-speed motion
CN113689509A (en) Binocular vision-based disordered grabbing method and system and storage medium
US20220241982A1 (en) Work robot and work system
CN114758236A (en) Non-specific shape object identification, positioning and manipulator grabbing system and method
CN113519272A (en) Vision recognition-based small fruit picking robot with bionic centipede claw structure
CN108393676B (en) Model setting method for automatic makeup assembly
CN115861780A (en) Mechanical arm detection and grabbing method based on YOLO-GGCNN
CN206748400U (en) An industrial handling robot based on autonomous learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant