CN116922400B - Robot hand system and method based on image recognition - Google Patents

Robot hand system and method based on image recognition

Info

Publication number
CN116922400B
Authority
CN
China
Prior art keywords
execution
connecting rod
grabbing
grabbed
executing
Prior art date
Legal status
Active
Application number
CN202311193116.7A
Other languages
Chinese (zh)
Other versions
CN116922400A (en)
Inventor
何深宝
林嘉基
唐香君
张颍
Current Assignee
Shenzhen Panfeng Precision Technology Co Ltd
Original Assignee
Shenzhen Panfeng Precision Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Panfeng Precision Technology Co Ltd filed Critical Shenzhen Panfeng Precision Technology Co Ltd
Priority to CN202311193116.7A priority Critical patent/CN116922400B/en
Publication of CN116922400A publication Critical patent/CN116922400A/en
Application granted granted Critical
Publication of CN116922400B publication Critical patent/CN116922400B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J15/00Gripping heads and other end effectors
    • B25J15/08Gripping heads and other end effectors having finger members
    • B25J15/10Gripping heads and other end effectors having finger members with three or more finger members
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems

Abstract

The invention relates to the technical field of robots and particularly discloses a robot hand system based on image recognition. The system identifies the contour region of the object to be grasped in the picture, extracts the outline of that region according to a color recognition interval to obtain a grasping position area, and acquires the included angle between the normal of the bottom of the object and the central axis of the conveyor belt. The first execution motor then runs to align each first articulated arm with the grasping position area, the second execution motor runs to rotate the first articulated arm so that the second articulated arm is parallel to the grasping position area, and the execution cylinder drives the three second articulated arms to reach the grasping position area synchronously, the position of the second articulated arms being adjusted to the size and posture of the object. The weight m of the object is obtained, the driving gear then rotates to move the rack, the moving rack drives the second articulated arms to clamp the object, and the grasping force is adjusted according to the object's weight.

Description

Robot hand system and method based on image recognition
Technical Field
The invention relates to the technical field of robots, and particularly discloses a robot hand system and method based on image recognition.
Background
A robot hand is an automatic operating device that mimics certain motion functions of the human hand and arm in order to grasp and handle objects or operate tools according to a fixed program. It can complete a variety of expected operations through programming and combines, in its construction and performance, the advantages of both the human and the machine.
Existing robot hands are mainly used to grasp objects on a conveyor belt, but their grasping position, grasping range, and grasping force are fixed. This clearly limits the types of workpieces they can grasp, imposes requirements on how the workpieces are placed, and still leaves situations in which manual assistance is needed.
Disclosure of Invention
The present invention is directed to a robot hand system and method based on image recognition for solving the problems described in the background art.
The aim of the invention can be achieved by the following technical scheme:
The robot system based on image recognition comprises a control component and an execution component, wherein the control component is used for recognizing the posture and the size of the article to be grabbed based on the image, generating a grabbing strategy in combination with the weight of the article to be grabbed, and controlling the execution component to grab the article according to the grabbing strategy;
the execution component is used for executing the grabbing action according to the grabbing strategy generated by the control component.
As a further description of the scheme of the invention, the executing assembly comprises a first executing motor, wherein the output shaft of the first executing motor is fixedly connected with a first driving gear, an executing mechanism is arranged on the side of the first driving gear, and three executing robots distributed in an array are arranged below the executing mechanism.
As a further description of the solution of the present invention, the actuator comprises an actuator gear, the actuator gear is meshed with the first driving gear, and a grabbing mechanism is arranged on the actuator gear.
As a further description of the scheme of the invention, the executing gear comprises an executing gear body, three mounting blocks distributed in an array are arranged at the top of the executing gear body, a sliding groove is formed in each mounting block, a mounting groove penetrating through the executing gear body is formed in the bottom of each sliding groove, and a fixing rod is further arranged at the top of the executing gear body.
As a further description of the scheme of the invention, the grabbing mechanism comprises an execution cylinder, one end of the execution cylinder is movably connected with a fixed rod, the other end of the execution cylinder is movably connected with a first connecting rod, the middle of the first connecting rod is movably connected with an execution gear body, the other end of the first connecting rod is movably connected with a second connecting rod, one end of the second connecting rod is movably connected with a third connecting rod, the middle of the third connecting rod is movably connected with the execution gear body, the other end of the third connecting rod is movably connected with a fourth connecting rod, one end of the fourth connecting rod is movably connected with a fifth connecting rod, the fifth connecting rod is movably connected with the execution gear body, and three moving mechanisms distributed in a circumferential array are arranged above the execution gear body.
As a further description of the scheme of the invention, the moving mechanism comprises a sliding block, the sliding block is arranged in a sliding groove in a sliding way, the bottom of the sliding block is provided with a connecting block, the connecting block penetrates through the mounting groove and is in sliding connection with the side wall of the mounting groove, one end of the sliding block is rotationally connected with a pushing block, two rotating rods which are symmetrically distributed are arranged on the upper side and the lower side of the pushing block, and three pushing blocks are respectively and movably connected with a first connecting rod, a third connecting rod and a fifth connecting rod through the rotating rods.
According to the technical scheme, the execution robot comprises a mounting box; the three mounting boxes are each fixedly connected with a connecting block. A second execution motor is fixedly installed in the mounting box, and its output shaft is fixedly connected with a worm, one end of which is rotatably connected with the inner wall of the mounting box. A worm wheel is arranged in the mounting box, meshed with the worm, and rotatably connected with the inner wall of the mounting box. One side of the worm wheel is fixedly connected with a first joint arm, which is rotatably connected with the side wall of the mounting box. One end of the first joint arm is rotatably connected with a second driving gear, which is driven to rotate by a motor; a rack is meshed with the second driving gear and is slidably arranged in the first joint arm. One end of the rack is fixedly connected with a spring, the other end of the spring is fixedly connected with the inner wall of the first joint arm, and the other end of the rack extends out of the first joint arm and is fixedly connected with a second joint arm.
As a further description of the solution of the present invention, the control component includes an image acquisition module, an analysis module and a control module;
the image acquisition module is used for acquiring an image of an object to be grabbed;
the analysis module is used for carrying out feature recognition on the acquired image, judging the pre-grabbing state of the object to be grabbed according to the result of the feature recognition, and generating a grabbing strategy according to the judging result;
the control module is used for controlling the execution assembly to grasp the article to be grasped according to the grasping strategy.
A method of using the robot system based on image recognition comprises the following steps:
Step S1: the image acquisition module uses a light source generating device to emit light of a preset wavelength toward the object to be grabbed;
Step S2: identifying the contour region of the object to be grabbed in the picture based on the image;
Step S3: determining a corresponding color recognition interval according to the preset wavelength, and extracting the outline of the contour region according to the color recognition interval to obtain the grabbing position area;
Step S4: acquiring the included angle between the normal of the bottom of the object to be grabbed and the central axis of the conveyor belt;
Step S5: acquiring the weight m of the object to be grabbed;
Step S6: generating a grabbing strategy from the data acquired in steps S3-S5;
Step S7: the control component controls the execution component to execute the grabbing action according to the grabbing strategy.
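Steps S2 through S4 can be illustrated with a minimal image-processing sketch. The pipeline below assumes OpenCV, an HSV color interval chosen for the preset illumination wavelength, and that the conveyor-belt central axis coincides with the image x-axis; the function name and the example bounds are illustrative, not taken from the patent.

# A minimal sketch of steps S2-S4 under the stated assumptions.
import cv2
import numpy as np

def locate_grab_region(frame_bgr, hsv_lower, hsv_upper):
    """Return the grabbing position area (rotated rectangle) and the angle, in
    degrees, between the object's bottom normal and the conveyor-belt axis."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, hsv_lower, hsv_upper)            # color recognition interval
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, None
    contour = max(contours, key=cv2.contourArea)             # contour region of the object
    rect = cv2.minAreaRect(contour)                          # grabbing position area
    (_, _), (w, h), theta = rect
    # Angle of the rectangle's long side relative to the image x-axis, taken
    # here as the conveyor-belt central axis (an editorial convention).
    angle = theta if w >= h else theta + 90.0
    return rect, angle % 180.0

# Example (hypothetical HSV bounds for a red-illuminated object):
# rect, alpha = locate_grab_region(frame, np.array([0, 120, 80]), np.array([10, 255, 255]))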
As a further description of the scheme of the invention, the grabbing strategy is generated as follows:
the included angle between the normal of the bottom of the object to be grabbed and the central axis of the conveyor belt is acquired, then the first driving gear is rotated n turns by operating the first execution motor, where n is given by a formula [not reproduced here; rendered as an image in the original publication];
the first articulated arm is driven to rotate by operating the second execution motor so that the second articulated arm is parallel to the grabbing position area; the straight-line distance x from the edge of the second articulated arm nearest the grabbing position area to that area is obtained, and the three second articulated arms are then driven to reach the grabbing position area synchronously by the execution cylinder, whose stroke L is given by a formula [not reproduced];
the weight m of the object to be grabbed is acquired, the rack is driven to move by rotating the second driving gear, and the moving rack drives the second articulated arm to clamp the object, with clamping force F and a spring deformation given by formulas [not reproduced];
in the formulas, the quantities used are the number of teeth of the execution gear body, the number of teeth of the first driving gear, and the stroke conversion coefficient; k is the spring rate and g is the gravitational acceleration.
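Because the formulas above appear only as images in the source text, their exact form cannot be reproduced here. The sketch below shows one plausible reading under explicitly stated assumptions: n scales the bottom-normal angle by the tooth-count ratio, the cylinder stroke is the stroke conversion coefficient times the measured distance x, the clamping force is proportional to the object's weight, and the spring deformation follows Hooke's law. All symbol and parameter names are editorial.

# A hedged sketch of the grabbing-strategy numbers; every relation below is an
# assumption, not the patent's formula.
def grab_strategy(alpha_deg, x, m,
                  teeth_exec_body, teeth_first_drive,
                  stroke_coeff, spring_rate,
                  grip_gain=1.5, g=9.81):
    # Turns of the first driving gear needed to swing the execution gear body
    # through alpha degrees (assumed simple gear-ratio relation).
    n = (alpha_deg / 360.0) * (teeth_exec_body / teeth_first_drive)
    # Execution-cylinder stroke, assumed proportional to the measured distance x.
    L = stroke_coeff * x
    # Clamping force, assumed proportional to the object's weight m * g.
    F = grip_gain * m * g
    # Spring deformation corresponding to that clamping force (Hooke's law).
    deformation = F / spring_rate
    return n, L, F, deformation

# Example call with made-up values:
# n, L, F, dx = grab_strategy(alpha_deg=30, x=0.04, m=1.2,
#                             teeth_exec_body=120, teeth_first_drive=24,
#                             stroke_coeff=1.1, spring_rate=2000)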
The invention has the following beneficial effects: the contour region of the object to be grabbed is identified in the picture based on the image, the corresponding color recognition interval is determined according to the preset wavelength, the outline of the contour region is extracted according to that interval to obtain the grabbing position area, and the included angle between the normal of the bottom of the object and the central axis of the conveyor belt is acquired. The first execution motor then rotates the driving gear n turns so that each first joint arm is aligned with the grabbing position area, the second execution motor rotates the first joint arm so that the second joint arm is parallel to the grabbing position area, and the execution cylinder drives the three second joint arms to reach the grabbing position area synchronously, the position of the second joint arms being adjusted to the size and posture of the object. The weight m of the object is obtained, the second driving gear then rotates to move the rack, the moving rack drives the second joint arms to clamp the object, and the grabbing force is adjusted according to the object's weight.
Of course, it is not necessary for any one product to practice the invention to achieve all of the advantages set forth above at the same time.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of the structure of the image recognition-based robotic system execution assembly of the present invention;
FIG. 2 is a schematic diagram of the structure of the manipulator system actuator based on image recognition according to the present invention;
FIG. 3 is a schematic diagram of a gear structure of the image recognition-based robotic system of the present invention;
FIG. 4 is a schematic diagram of a grabbing mechanism of the robot system based on image recognition;
FIG. 5 is a schematic diagram of a motion mechanism of the robot system based on image recognition;
FIG. 6 is a schematic diagram of a robot system execution robot based on image recognition according to the present invention;
fig. 7 is a schematic structural diagram of a control component of the robot system based on image recognition.
In the figure: 1. a first execution motor; 2. a first driving gear; 3. an actuating mechanism; 4. an execution robot; 31. an execution gear; 32. a grabbing mechanism; 41. a mounting box; 42. a second execution motor; 43. a worm; 44. a worm wheel; 45. a first joint arm; 46. a second driving gear; 47. a rack; 48. a spring; 49. a second joint arm; 311. an execution gear body; 312. a mounting block; 313. a sliding groove; 314. a mounting groove; 315. a fixing rod; 321. an execution cylinder; 322. a first connecting rod; 323. a third connecting rod; 324. a second connecting rod; 325. a fourth connecting rod; 326. a fifth connecting rod; 327. a moving mechanism; 3271. a sliding block; 3272. a connecting block; 3273. a pushing block; 3274. a rotating rod.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The robot system based on image recognition comprises a control component and an execution component, wherein the control component is used for recognizing the posture and the size of the article to be grabbed based on the image, generating a grabbing strategy in combination with the weight of the article to be grabbed, and controlling the execution component to grab the article according to the grabbing strategy;
the execution component is used for executing the grabbing action according to the grabbing strategy generated by the control component.
Referring to fig. 1, the executing assembly includes a first executing motor 1, an output shaft of the first executing motor 1 is fixedly connected with a first driving gear 2, an executing mechanism 3 is disposed on a side of the first driving gear 2, and three executing robots 4 distributed in an array are disposed below the executing mechanism 3.
Referring to fig. 2, the actuator 3 includes an actuator gear 31, the actuator gear 31 is meshed with the first driving gear 2, and a grabbing mechanism 32 is disposed on the actuator gear 31.
Referring to fig. 3, the executing gear 31 includes an executing gear body 311, three mounting blocks 312 distributed in an array are disposed at the top of the executing gear body 311, each mounting block 312 is provided with a sliding groove 313, a mounting groove 314 penetrating the executing gear body 311 is disposed at the bottom of each sliding groove 313, and a fixing rod 315 is further disposed at the top of the executing gear body 311.
Referring to fig. 4, the grabbing mechanism 32 includes an actuating cylinder 321, one end of the actuating cylinder 321 is movably connected with a fixed rod 315, the other end of the actuating cylinder 321 is movably connected with a first connecting rod 322, the middle of the first connecting rod 322 is movably connected with an actuating gear body 311, the other end of the first connecting rod 322 is movably connected with a second connecting rod 324, one end of the second connecting rod 324 is movably connected with a third connecting rod 323, the middle of the third connecting rod 323 is movably connected with the actuating gear body 311, the other end of the third connecting rod 323 is movably connected with a fourth connecting rod 325, one end of the fourth connecting rod 325 is movably connected with a fifth connecting rod 326, the fifth connecting rod 326 is movably connected with the actuating gear body 311, and three moving mechanisms 327 distributed in a circumferential array are arranged above the actuating gear body 311.
Referring to fig. 5, the moving mechanism 327 includes a sliding block 3271, the sliding block 3271 is slidably disposed in the sliding groove 313, a connecting block 3272 is disposed at the bottom of the sliding block 3271, the connecting block 3272 penetrates through the mounting groove 314, the connecting block 3272 is slidably connected with a side wall of the mounting groove 314, one end of the sliding block 3271 is rotatably connected with a pushing block 3273, two symmetrically distributed rotating rods 3274 are disposed on the upper and lower sides of the pushing block 3273, and three pushing blocks 3273 are movably connected with the first connecting rod 322, the third connecting rod 323 and the fifth connecting rod 326 through the rotating rods 3274 respectively.
When in use, the execution cylinder 321 operates to drive the first connecting rod 322 to rotate, the first connecting rod 322 rotates to drive the second connecting rod 324 to rotate, the second connecting rod 324 rotates to drive the third connecting rod 323 to rotate, the third connecting rod 323 rotates to drive the fourth connecting rod 325 to rotate, the fourth connecting rod 325 rotates to drive the fifth connecting rod 326 to rotate, the first connecting rod 322, the third connecting rod 323 and the fifth connecting rod 326 simultaneously rotate to drive the pushing block 3273 to move, the pushing block 3273 moves to drive the sliding block 3271 to slide along the sliding groove 313, and therefore three execution robots 4 are driven to move relatively.
Referring to fig. 6, the execution robot 4 includes a mounting box 41; the three mounting boxes 41 are each fixedly connected with a connecting block 3272. A second execution motor 42 is fixedly mounted in the mounting box 41, and its output shaft is fixedly connected with a worm 43, one end of which is rotatably connected with the inner wall of the mounting box 41. A worm wheel 44 is disposed in the mounting box 41, meshed with the worm 43, and rotatably connected with the inner wall of the mounting box 41. One side of the worm wheel 44 is fixedly connected with a first joint arm 45, which is rotatably connected with the side wall of the mounting box 41. A second driving gear 46 is rotatably connected with one end of the first joint arm 45 and is driven to rotate by a motor; the second driving gear 46 is meshed with a rack 47 that is slidably disposed in the first joint arm 45. One end of the rack 47 is fixedly connected with a spring 48, the other end of the spring 48 is fixedly connected with the inner wall of the first joint arm 45, and the other end of the rack 47 extends out of the first joint arm 45 and is fixedly connected with a second joint arm 49.
In use, the second execution motor 42 operates to drive the worm 43 to rotate, the worm 43 rotates to drive the worm wheel 44, and the worm wheel 44 rotates the first joint arm 45, so that the position of the second joint arm 49 is adjusted and the second joint arm 49 is kept parallel to the clamping position of the article to be clamped. The second driving gear 46 then rotates to drive the rack 47 to move, and the rack drives the second joint arm 49 to clamp the article.
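As a rough numerical companion to this clamping action, the sketch below converts rotation of the second driving gear 46 into travel of the rack 47 and estimates the clamping force from the deformation of the spring 48. The pitch-radius parameter, the contact-point convention, and the use of Hooke's law are illustrative assumptions rather than values taken from the patent.

# A minimal sketch of the rack-and-pinion clamping relation, under assumed geometry.
def rack_travel(gear_angle_rad, pitch_radius):
    """Linear travel of the rack for a given rotation of the second driving gear."""
    return gear_angle_rad * pitch_radius

def clamp_force(travel_after_contact, spring_rate):
    """Clamping force once the second joint arm has touched the object:
    further rack travel compresses the spring, so F = k * deformation."""
    return spring_rate * travel_after_contact

# Example with made-up numbers: a 90-degree gear rotation, 10 mm pitch radius,
# contact after 12 mm of travel, spring rate 2.0 N/mm.
# travel = rack_travel(3.1416 / 2, 10.0)   # ~15.7 mm
# F = clamp_force(travel - 12.0, 2.0)      # ~7.4 N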
Referring to fig. 7, the control assembly includes an image acquisition module, an analysis module and a control module;
the image acquisition module is used for acquiring an image of an object to be grabbed;
the analysis module is used for carrying out feature recognition on the acquired image, judging the pre-grabbing state of the object to be grabbed according to the result of the feature recognition, and generating a grabbing strategy according to the judging result;
the control module is used for controlling the execution assembly to grasp the article to be grasped according to the grasping strategy.
A method of using the robot system based on image recognition comprises the following steps:
Step S1: the image acquisition module uses a light source generating device to emit light of a preset wavelength toward the object to be grabbed;
Step S2: identifying the contour region of the object to be grabbed in the picture based on the image;
Step S3: determining a corresponding color recognition interval according to the preset wavelength, and extracting the outline of the contour region according to the color recognition interval to obtain the grabbing position area;
Step S4: acquiring the included angle between the normal of the bottom of the object to be grabbed and the central axis of the conveyor belt;
Step S5: acquiring the weight m of the object to be grabbed;
Step S6: generating a grabbing strategy from the data acquired in steps S3-S5.
the process of the grabbing strategy generation is as follows:
acquiring an included angle between the normal line of the bottom of the object to be grabbed and the central axis of the conveyer beltThen, the first driving gear 2 is rotated n times by the operation of the first actuator motor 1, wherein,/-therein>
Running the belt by a second actuator motor 42The first joint arm 45 is moved to rotate, the second joint arm 49 is parallel to the grabbing position area, the linear distance x between the most edge of one side of the second joint arm 49, which is close to the grabbing position area, and the grabbing position area is obtained, then the three second joint arms 49 are driven by the execution cylinder 321 to synchronously reach the grabbing position area, the stroke of the execution cylinder 321 is L, wherein,
acquiring the weight m of the object to be grabbed, driving the rack 47 to move through the rotation of the second driving gear 46, driving the second joint arm 49 to clamp the object to be grabbed through the movement of the rack 47, wherein the clamping force of the second joint arm 49 is F, and the deformation amount of the spring 48 isWherein->,/>
In the method, in the process of the invention,to execute the number of teeth of the gear body 311 +.>For the number of teeth of the first drive gear 2, +.>For the travel conversion factor, +.>K is the spring rate and g is the gravitational acceleration.
And S7, the control component controls the execution component to execute the grabbing action according to the grabbing strategy.
The control assembly sets the working parameters of all the execution components in the execution assembly to complete grabbing, and the execution assembly returns to the initial position after grabbing is completed.
Working principle: in this embodiment, the execution assembly is located directly above the conveyor belt, and the object to be grabbed may fall onto the conveyor belt in any posture. When the conveyor belt carries the object into the working range of the execution assembly, the contour region of the object is identified in the picture based on the image, the corresponding color recognition interval is determined according to the preset wavelength, the outline of the contour region is extracted according to that interval to obtain the grabbing position area, and the included angle between the normal of the bottom of the object and the central axis of the conveyor belt is obtained. The first execution motor 1 then rotates the first driving gear 2 by n turns so that each first joint arm 45 is aligned with the grabbing position area, the second execution motor 42 rotates the first joint arm 45 so that the second joint arm 49 is parallel to the grabbing position area, and the execution cylinder 321 drives the three second joint arms 49 to reach the grabbing position area synchronously, the position of the second joint arms 49 being adjusted to the size and posture of the object. The weight m of the object is obtained, the second driving gear 46 rotates to move the rack 47, the moving rack 47 drives the second joint arms 49 to clamp the object, and the grabbing force is adjusted according to the object's weight.
The foregoing describes one embodiment of the present invention in detail, but the description is only a preferred embodiment of the present invention and should not be construed as limiting the scope of the invention. All equivalent changes and modifications within the scope of the present invention are intended to be covered by the present invention.

Claims (4)

1. The robot system based on image recognition comprises a control component and an execution component, and is characterized in that the control component is used for recognizing the posture and the size of the article to be grabbed based on the image, generating a grabbing strategy in combination with the weight of the article to be grabbed, and controlling the execution component to grab the article according to the grabbing strategy;
the execution component is used for executing the grabbing action according to the grabbing strategy generated by the control component;
the execution assembly comprises a first execution motor (1), wherein an output shaft of the first execution motor (1) is fixedly connected with a first driving gear (2), an execution mechanism (3) is arranged on the side of the first driving gear (2), and three execution robots (4) distributed in an array are arranged below the execution mechanism (3);
the actuating mechanism (3) comprises an actuating gear (31), the actuating gear (31) is meshed with the first driving gear (2), and a grabbing mechanism (32) is arranged on the actuating gear (31);
the executing gear (31) comprises an executing gear body (311), three installing blocks (312) distributed in an array are arranged at the top of the executing gear body (311), sliding grooves (313) are formed in each installing block (312), installing grooves (314) penetrating through the executing gear body (311) are formed in the bottom of each sliding groove (313), and fixing rods (315) are further arranged at the top of the executing gear body (311);
the grabbing mechanism (32) comprises an execution cylinder (321), one end of the execution cylinder (321) is movably connected with a fixed rod (315), the other end of the execution cylinder (321) is movably connected with a first connecting rod (322), the middle of the first connecting rod (322) is movably connected with an execution gear body (311), the other end of the first connecting rod (322) is movably connected with a second connecting rod (324), one end of the second connecting rod (324) is movably connected with a third connecting rod (323), the middle of the third connecting rod (323) is movably connected with the execution gear body (311), the other end of the third connecting rod (323) is movably connected with a fourth connecting rod (325), one end of the fourth connecting rod (325) is movably connected with a fifth connecting rod (326), and three moving mechanisms (327) distributed in a circumferential array are arranged above the execution gear body (311);
the motion mechanism (327) comprises a sliding block (3271), the sliding block (3271) is arranged in a sliding groove (313) in a sliding way, a connecting block (3272) is arranged at the bottom of the sliding block (3271), the connecting block (3272) penetrates through the mounting groove (314), the connecting block (3272) is in sliding connection with the side wall of the mounting groove (314), one end of the sliding block (3271) is rotationally connected with a pushing block (3273), two rotating rods (3274) which are symmetrically distributed are arranged on the upper side and the lower side of the pushing block (3273), and the three pushing blocks (3273) are respectively movably connected with a first connecting rod (322), a third connecting rod (323) and a fifth connecting rod (326) through the rotating rods (3274);
the execution robot (4) comprises a mounting box (41), three mounting boxes (41) are fixedly connected with a connecting block (3272) respectively, a second execution motor (42) is fixedly mounted in the mounting boxes (41), an output shaft of the second execution motor (42) is fixedly connected with a worm (43), one end of the worm (43) is rotationally connected with the inner wall of the mounting boxes (41), a worm wheel (44) is arranged in the mounting boxes (41), the worm wheel (44) is meshed with the worm (43), the worm wheel (44) is rotationally connected with the inner wall of the mounting boxes (41), one side of the worm wheel (44) is fixedly connected with a first joint arm (45), the first joint arm (45) is rotationally connected with the side wall of the mounting boxes (41), one end of the first joint arm (45) is rotationally connected with a second driving gear (46), the second driving gear (46) is rotationally controlled by the motor, the second driving gear (46) is meshed with a rack (47), the rack (47) is slidingly arranged in the first joint arm (45), one end of the rack (45) is fixedly connected with a spring (48), the other end of the rack (47) extends out of the first joint arm (45) and is fixedly connected with a second joint arm (49).
2. The image recognition-based robotic system of claim 1, wherein the control assembly comprises an image acquisition module, an analysis module, and a control module;
the image acquisition module is used for acquiring an image of an object to be grabbed;
the analysis module is used for carrying out feature recognition on the acquired image, judging the pre-grabbing state of the object to be grabbed according to the result of the feature recognition, and generating a grabbing strategy according to the judging result;
the control module is used for controlling the execution assembly to grasp the article to be grasped according to the grasping strategy.
3. A method of using the image recognition-based robotic system of claim 2, the method comprising the following steps:
step S1, the image acquisition module is used for emitting light with a preset wavelength to an object to be grabbed by utilizing a light source generating device;
s2, identifying a contour area of the object to be grabbed in the picture based on the image;
s3, determining a corresponding color recognition interval according to a preset wavelength, and extracting the outline of the outline area according to the color recognition interval to obtain a grabbing position area;
S4, acquiring the included angle between the normal line at the bottom of the object to be grabbed and the central axis of the conveying belt;
S5, acquiring the weight m of the object to be grabbed;
s6, generating a grabbing strategy according to the data acquired in the steps S3-S5;
and S7, the control component controls the execution component to execute the grabbing action according to the grabbing strategy.
4. The method for using the robot system based on image recognition according to claim 3, wherein the grabbing strategy is generated as follows:
acquiring the included angle between the normal of the bottom of the object to be grabbed and the central axis of the conveyor belt, then driving the first driving gear (2) to rotate n turns by operating the first execution motor (1), where n is given by a formula [not reproduced here; rendered as an image in the original publication];
driving the first articulated arm (45) to rotate by operating the second execution motor (42) so that the second articulated arm (49) is parallel to the grabbing position area, obtaining the straight-line distance x from the edge of the second articulated arm (49) nearest the grabbing position area to that area, then driving the three second articulated arms (49) to reach the grabbing position area synchronously via the execution cylinder (321), whose stroke L is given by a formula [not reproduced];
acquiring the weight m of the object to be grabbed, then driving the rack (47) to move by rotating the second driving gear (46), the moving rack (47) driving the second articulated arm (49) to clamp the object, the clamping force of the second articulated arm (49) being F and the deformation of the spring (48) being given by formulas [not reproduced];
in the formulas, the quantities used are the number of teeth of the execution gear body (311), the number of teeth of the first driving gear (2), and the stroke conversion coefficient; k is the spring rate and g is the gravitational acceleration.
CN202311193116.7A 2023-09-15 2023-09-15 Robot hand system and method based on image recognition Active CN116922400B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311193116.7A CN116922400B (en) 2023-09-15 2023-09-15 Robot hand system and method based on image recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311193116.7A CN116922400B (en) 2023-09-15 2023-09-15 Robot hand system and method based on image recognition

Publications (2)

Publication Number Publication Date
CN116922400A CN116922400A (en) 2023-10-24
CN116922400B true CN116922400B (en) 2023-12-08

Family

ID=88386491

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311193116.7A Active CN116922400B (en) 2023-09-15 2023-09-15 Robot hand system and method based on image recognition

Country Status (1)

Country Link
CN (1) CN116922400B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR200379813Y1 (en) * 2005-01-04 2005-03-28 김천경 rotary prop that power supply is available
CN213054885U (en) * 2020-07-31 2021-04-27 无锡朗琦金属科技有限公司 Intelligent manipulator with adjustable gripping strength
CN114998578A (en) * 2022-04-14 2022-09-02 西安建筑科技大学 Dynamic multi-article positioning, grabbing and packaging method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6640792B2 (en) * 2017-06-30 2020-02-05 ファナック株式会社 Hand control device, hand control method, and hand simulation device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR200379813Y1 (en) * 2005-01-04 2005-03-28 김천경 rotary prop that power supply is available
CN213054885U (en) * 2020-07-31 2021-04-27 无锡朗琦金属科技有限公司 Intelligent manipulator with adjustable gripping strength
CN114998578A (en) * 2022-04-14 2022-09-02 西安建筑科技大学 Dynamic multi-article positioning, grabbing and packaging method and system

Also Published As

Publication number Publication date
CN116922400A (en) 2023-10-24

Similar Documents

Publication Publication Date Title
EP2660011B1 (en) Gripping device for manipulating flexible elements
CN1319702C (en) Movable manipulator system
CN110561396B (en) High-precision quick-reaction manipulator
CN204431245U (en) A kind of circular cylindrical coordinate machinery hand being applicable to workpiece handling
CN110497378A (en) A kind of automatic loading and unloading robot for part processing
CN105328697A (en) Modularized six-degree-freedom mechanical hand and control method thereof
CN1846951A (en) Control device and method for intelligent mobile robot capable of picking up article automatically
CN109845491B (en) Picking and grading method for picking circular grading and collecting robot by utilizing double-arm apples
CN105598476A (en) Robot-assisted numerical control full-automatic production line
CN111618838A (en) Snatch manipulator
CN113618709A (en) Multi-mode force-control lossless grabbing device for intelligent production line
CN116922400B (en) Robot hand system and method based on image recognition
CN111590622A (en) Master-slave cooperative flexible palm surface self-adaptive robot hand device
CN107953339B (en) Glass bottle sorting parallel robot with compliant structure
CN116872186B (en) Industrial robot for intelligent manufacturing workshop
CN205386836U (en) Automatic feed bin system that snatchs of aided industrial robot
CN2810918Y (en) Intelligent mobile robot controller capable of collecting articles automatically
CN109772710B (en) Round fruit transmission grading collection system of double-arm apple picking grading collection robot
CN203680293U (en) Automatic mechanism of robot
CN115229770A (en) Robot vision device and control method
CN213796545U (en) Clamping jaw with vision and force feedback control function
CN206242047U (en) A kind of Wearable drives hydraulic efficiency manipulator
CN205414425U (en) Full automatic production line of supplementary numerical control of robot
EP1885523A1 (en) Robot comprising a lever system with movable fulcra
EP3786743B1 (en) Robot gripper for a transfer system for setting a piece in a required orientation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant