CN116899832B - Dispensing manipulator control system and dispensing device - Google Patents

Dispensing manipulator control system and dispensing device

Info

Publication number
CN116899832B
Authority
CN
China
Prior art keywords
product
dispensing
gesture
standard
dispensed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311174454.6A
Other languages
Chinese (zh)
Other versions
CN116899832A (en)
Inventor
蔡蔓婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Quanfeng Intelligent Equipment Co ltd
Original Assignee
Guangdong Quanfeng Intelligent Equipment Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Quanfeng Intelligent Equipment Co ltd filed Critical Guangdong Quanfeng Intelligent Equipment Co ltd
Priority to CN202311174454.6A priority Critical patent/CN116899832B/en
Publication of CN116899832A publication Critical patent/CN116899832A/en
Application granted granted Critical
Publication of CN116899832B publication Critical patent/CN116899832B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B05 - SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05C - APPARATUS FOR APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05C11/00 - Component parts, details or accessories not specifically provided for in groups B05C1/00 - B05C9/00
    • B05C11/10 - Storage, supply or control of liquid or other fluent material; Recovery of excess liquid or other fluent material
    • B05C11/1002 - Means for controlling supply, i.e. flow or pressure, of liquid or other fluent material to the applying apparatus, e.g. valves
    • B05C11/1015 - Means for controlling supply, i.e. flow or pressure, of liquid or other fluent material to the applying apparatus, e.g. valves responsive to a condition of ambient medium or target, e.g. humidity, temperature; responsive to position or movement of the coating head relative to the target
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B05 - SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05C - APPARATUS FOR APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05C5/00 - Apparatus in which liquid or other fluent material is projected, poured or allowed to flow on to the surface of the work
    • B05C5/02 - Apparatus in which liquid or other fluent material is projected, poured or allowed to flow on to the surface of the work the liquid or other fluent material being discharged through an outlet orifice by pressure, e.g. from an outlet device in contact or almost in contact, with the work
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 - Manipulators not otherwise provided for
    • B25J11/0075 - Manipulators for painting or coating
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 - Controls for manipulators
    • B25J13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 - Sensing devices
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 - Sensing devices
    • B25J19/04 - Viewing devices
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1656 - Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 - Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a dispensing manipulator control system and a dispensing device. The system comprises: a camera unit arranged on the dispensing manipulator, which moves laterally with the manipulator relative to the product and collects an image of the next product to be dispensed; a posture capturing unit arranged on the dispensing manipulator directly below the camera unit, which collects posture detection information of a product passing beneath the camera unit; and a controller electrically connected to the camera unit and the posture capturing unit and intended to be electrically connected to the dispensing manipulator. The controller is configured to: analyze the image and the posture detection information to obtain the posture of the next product to be dispensed; derive the latest spatial coordinates of the position to be dispensed from the product posture, the preset standard posture of the product and the standard spatial coordinates of the position to be dispensed; and correct the next dispensing control amount of the dispensing manipulator according to the latest spatial coordinates of the position to be dispensed. The application has the effect of improving dispensing accuracy.

Description

Dispensing manipulator control system and dispensing device
Technical Field
The application relates to the technical field of industrial manipulators, in particular to a dispensing manipulator control system and a dispensing device.
Background
With the spread of automation technology, dispensing for insoles and electronic components has shifted from manual gluing to automatic manipulator dispensing in order to improve working efficiency.
At present, the dispensing position and dispensing action of a common dispensing manipulator depend on a pre-written program and fixed control amounts, so once the workpiece to be dispensed is placed inaccurately, dispensing fails. The usual remedy is to configure a tooling fixture, fix the workpiece to be dispensed on the fixture, and then dispense.
Although this approach can improve the dispensing pass rate, loading and unloading the workpiece onto the fixture takes time, and when automatic dispensing is applied to assembly-line processing, conveying deviations between the loading station and the dispensing station amplify the dispensing errors caused by non-standard placement, which affects the product yield.
Disclosure of Invention
In order to improve the accuracy of dispensing, the application provides a dispensing manipulator control system and a dispensing device.
In a first aspect, the present application provides a dispensing manipulator control system, which adopts the following technical scheme:
A dispensing manipulator control system, comprising:
a camera unit arranged on the dispensing manipulator, which moves laterally with the dispensing manipulator relative to the product and is used for collecting an image of the next product to be dispensed;
a posture capturing unit arranged on the dispensing manipulator directly below the camera unit, used for collecting posture detection information of a product passing beneath the camera unit; and a controller electrically connected to the camera unit and the posture capturing unit, and intended to be electrically connected to the dispensing manipulator;
wherein the controller is configured to:
analyzing the image and the posture detection information to obtain the posture of the next product to be dispensed;
analyzing the product posture, the preset standard posture of the product and the standard spatial coordinates of the position to be dispensed to obtain the latest spatial coordinates of the position to be dispensed;
and correcting the next dispensing control amount of the dispensing manipulator according to the latest spatial coordinates of the position to be dispensed.
In a second aspect, the present application provides a dispensing device, which adopts the following technical scheme:
The dispensing device comprises a dispensing manipulator and further comprises the above dispensing manipulator control system, with the controller electrically connected to the dispensing manipulator.
Optionally, the dispensing manipulator includes a horizontal movable arm, a longitudinal movable arm and a dispensing head; the longitudinal movable arm is connected to the moving part of the horizontal movable arm, and the dispensing head is mounted on the moving part of the longitudinal movable arm;
the camera unit is connected to the moving part of the horizontal movable arm with its lens facing the moving path of the product; the posture capturing unit is connected to the moving part of the horizontal movable arm and is located below the camera.
Optionally, the posture capturing unit includes a plurality of ranging sensors distributed around the lens of the camera unit.
Optionally, analyzing the image and the posture detection information to obtain the posture of the next product to be dispensed includes: when the detection value of each ranging sensor at a certain moment matches the detection-value distribution rule of the standard posture of the product, determining that the posture of the next product to be dispensed is the standard posture.
Optionally, analyzing the image and the posture detection information to obtain the posture of the next product to be dispensed includes:
when the detection value of a ranging sensor is larger than the lower-limit threshold of the product height, determining that a product is passing below the posture capturing unit, and defining the start time of the pass as t1 and the leaving time as t2;
if the posture of the product has not been determined to be the standard posture within a duration T, performing to-be-glued feature recognition on the image within the period from t1 to t2, where (t2-t1)/2 ≤ T < (t2-t1);
searching a preset posture sample set according to the to-be-glued feature recognition result and the change rule of the detection values within the duration T, to obtain the matched posture of the next product to be dispensed.
Optionally, analyzing the product posture, the preset standard posture of the product and the standard spatial coordinates of the position to be dispensed to obtain the latest spatial coordinates of the position to be dispensed includes:
if the product posture is the standard posture, calling the standard spatial coordinates;
assuming that the detection values of all ranging sensors at a moment t4 match the detection-value distribution rule of the product in the standard posture, acquiring the X-axis and Y-axis coordinate values of the dispensing head at moment t4;
calculating the coordinate difference between the X-axis and Y-axis coordinate values of the dispensing head at moment t4 and the preset initial position of the dispensing head;
and compensating the standard spatial coordinates with the coordinate difference to obtain the latest spatial coordinates of the position to be dispensed.
Optionally, analyzing the product posture, the preset standard posture of the product and the standard spatial coordinates of the position to be dispensed to obtain the latest spatial coordinates of the position to be dispensed includes:
calculating the lateral deflection angle of the product according to the to-be-glued feature recognition result and the top-view contour of the product in the standard posture;
calculating the lateral offset according to the center line of the to-be-glued feature recognition result and the center line of the product in the standard posture;
calculating the interval deviation between the feed length L and the standard product interval;
adjusting a pre-established three-dimensional model of the product in a matched three-dimensional coordinate system according to the lateral offset, the lateral deflection angle and the interval deviation to obtain a new model;
and reading the spatial coordinates of the position to be dispensed from the new model.
In summary, the present application provides at least the following beneficial technical effect: even if a product is placed inaccurately on the tooling, or products are conveyed directly on a production line without tooling, the system can automatically identify and calculate the spatial coordinates of the position to be dispensed for each product and correct the action of the dispensing manipulator, thereby reducing dispensing errors caused by fixing deviation and conveying deviation and improving dispensing accuracy.
Drawings
FIG. 1 is a schematic diagram of the system architecture of the present application;
FIG. 2 is a schematic structural view of the device of the present application;
FIG. 3 is a schematic view of the structure of FIG. 2.
Reference numerals: 1. Camera unit; 2. Ranging sensor; 3. Controller; 41. Horizontal movable arm; 411. Linear motor; 412. Connecting frame; 413. Transverse plate; 414. Mounting ring; 42. Longitudinal movable arm; 43. Dispensing head.
Detailed Description
The present application is described in further detail below in conjunction with fig. 1-3.
The embodiment of the application discloses a dispensing manipulator control system.
Referring to FIG. 1, the dispensing manipulator control system includes a camera unit 1, a posture capturing unit, and a controller 3.
The camera unit 1 is arranged on the dispensing manipulator, moves laterally with the dispensing manipulator relative to the product, and is used for collecting an image of the next product to be dispensed.
Compared with the conventional fixed-position camera arrangement, this arrangement, working together with the posture capturing unit, obtains better product information, for the following reasons:
When products are conveyed by an assembly line or a conveyor belt, the feeding process is usually only roughly controlled, so the spacing between two adjacent products on the belt does not vary greatly. After the dispensing manipulator finishes dispensing the current product, the next product has not yet arrived; while waiting for it to come into place, the camera unit 1, which moves with the manipulator, can capture an image of the next product in which the product is relatively close to the image center. The effect is especially notable with automatic feeding: because feeding is performed with a fixed action, the positions of successive identical products differ little, so as long as the previous product could be dispensed, an image of the next product can certainly be acquired.
If a fixed-position camera were used instead, then to ensure that the product image is captured, the camera would either have to be mounted higher to enlarge the shooting range, or mounted lower with a wide-angle or low-magnification lens to enlarge the field of view. Both approaches introduce more unnecessary features into the image, and the former is also very likely to be blocked by the machinery.
The posture capturing unit comprises a plurality of ranging sensors 2, is arranged on the dispensing manipulator directly below the camera unit 1, and is used for collecting posture detection information of products passing beneath the camera unit. Its position is chosen for the same reason as above, so that the next product can be detected relatively centered.
The specific mounting arrangement of the camera unit 1 and the posture capturing unit is set forth in the device embodiment below.
The controller 3 may be a separate control circuit board with its own processing chip, independent of the dispensing manipulator, electrically connected to the camera unit 1 and the posture capturing unit. The controller 3 is configured to perform the following steps (sketched in code after the list):
analyzing the image and the posture detection information to obtain the posture of the next product to be dispensed;
analyzing the product posture, the preset standard posture of the product and the standard spatial coordinates of the position to be dispensed to obtain the latest spatial coordinates of the position to be dispensed;
and correcting the next dispensing control amount of the dispensing manipulator according to the latest spatial coordinates of the position to be dispensed.
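The three configured steps can be pictured as one correction cycle per product. The following is a minimal sketch only: the patent does not prescribe the implementation of any step, so the step functions are passed in as callables, and all names here (correct_next_dispense, estimate_posture and so on) are illustrative assumptions rather than terms from the original text.

```python
from typing import Callable, Sequence, Tuple

Coords = Tuple[float, float, float]  # (x, y, z) of a position to be dispensed

def correct_next_dispense(
    image,                                   # image of the next product from the camera unit
    readings: Sequence[float],               # one detection value per ranging sensor
    standard_posture,                        # preset standard posture of the product
    standard_coords: Coords,                 # standard coordinates of the position to be dispensed
    estimate_posture: Callable,              # step 1: posture from image + readings
    compute_coords: Callable[..., Coords],   # step 2: latest coordinates of the position to be dispensed
    apply_correction: Callable[[Coords], None],  # step 3: correct the next dispensing control amount
) -> Coords:
    posture = estimate_posture(image, readings, standard_posture)
    coords = compute_coords(posture, standard_posture, standard_coords)
    apply_correction(coords)
    return coords
```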
With this arrangement, even if the product is placed inaccurately on the tooling, or products are conveyed directly on the assembly line without tooling, the system can automatically identify and calculate the spatial coordinates of the position to be dispensed for each product and correct the action of the dispensing manipulator, thereby reducing dispensing errors caused by fixing deviation and conveying deviation and improving dispensing accuracy.
The embodiment of the application also discloses a dispensing device.
Referring to FIGS. 2 and 3, the dispensing device includes a dispensing manipulator and the dispensing manipulator control system described above, with the controller 3 of the control system electrically connected to the control module of the dispensing manipulator.
The dispensing manipulator includes a horizontal movable arm 41, a longitudinal movable arm 42, and a dispensing head 43. The horizontal movable arm 41 comprises a linear motor 411 and a connecting frame 412; the linear motor 411 is arranged beside the assembly line on a bracket at its bottom. A transverse plate 413 is fixed on the slider of the linear motor 411, one end of the transverse plate 413 extends beyond the side of the linear motor, and the connecting frame 412 is fixed at the outer end of the transverse plate 413. In plan view the connecting frame 412 is a rectangular (口-shaped) frame located on one side of the transverse plate 413.
The longitudinal movable arm 42 comprises an electric cylinder; the cylinder body is fixed on the side of the connecting frame 412 facing away from the transverse plate 413, and the piston-rod end of the electric cylinder passes through the frame body and carries the dispensing head 43, which faces downward. The linear motor and the electric cylinder are each electrically connected to a manipulator control host through a servo drive controller, and the control host is electrically connected to the controller 3.
The camera unit 1 comprises a camera fixed by a bracket at the center of the connecting frame 412, with the lens facing downward. A mounting ring 414 is fixed below the camera, also on a bracket attached to the connecting frame 412; a plurality of mounting holes are formed in the mounting ring 414, evenly spaced around its center, and the probe of each ranging sensor 2 is inserted and fixed in a respective mounting hole, facing downward.
In use, the product passes under the lens of the camera and then moves under the dispensing head 43.
This distribution of the ranging sensors 2 ensures, as far as possible, that products of any size can be detected by the ranging sensors 2. Because the ring of ranging sensors 2 is symmetrical in pairs, the height difference between symmetric points can be used to quickly distinguish the posture (orientation) of an asymmetric product, as in the sketch below.
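As an illustration of this pairwise-symmetry idea, the sketch below assumes the sensors are indexed so that sensor i and sensor i + n/2 sit diametrically opposite on the mounting ring, and that the readings have already been converted to surface heights in millimetres; the threshold and the example values are invented for illustration only.

```python
def orientation_hint(readings, pair_threshold_mm=2.0):
    """Return the sensor pairs whose height difference exceeds the threshold.
    For an asymmetric product (e.g. an insole), a consistently signed
    difference along the feed axis hints at which way the product is facing."""
    n = len(readings)
    assert n % 2 == 0, "sensors are assumed to be mounted in opposing pairs"
    half = n // 2
    diffs = [readings[i] - readings[i + half] for i in range(half)]
    return [(i, d) for i, d in enumerate(diffs) if abs(d) > pair_threshold_mm]

# Example with 8 sensors: the thick end of the product lies under sensors 0-3,
# so every pair shows a difference of roughly 18 mm.
print(orientation_hint([30.0, 31.0, 30.5, 30.8, 12.0, 12.5, 12.2, 12.4]))
```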
The dispensing process after the intervention of the controller 3 is specifically explained below.
Analyzing the image and the posture detection information to obtain the posture of the next product to be dispensed comprises the following steps:
When the detection value of a ranging sensor 2 is greater than the lower-limit threshold of the product height, it is determined that a product is passing below the posture capturing unit; the start time of the pass is defined as t1 and the leaving time as t2.
1. When the detection value of each ranging sensor 2 at a certain moment matches the detection-value distribution rule of the product in the standard posture, the posture of the next product to be dispensed is determined to be the standard posture.
Here the standard posture is the posture defined by the operator, including orientation and the like; for example, the small end of a sole facing forward and the large end facing backward may be defined as the standard posture.
With this arrangement, the posture of the product can be identified during use from the distribution rule of the distance detection values, rather than by performing more complex analysis such as image recognition every time, which reduces the computing load on the controller 3 and improves its response rate. A minimal code sketch of this check follows.
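A minimal sketch of this check, assuming the readings are heights in millimetres and the stored "detection value distribution rule" for the standard posture is encoded as a per-sensor profile with a tolerance; this encoding, the thresholds and the numbers are assumptions for illustration, as the text does not specify them.

```python
HEIGHT_LOWER_LIMIT_MM = 5.0   # lower-limit threshold of the product height (illustrative)

def product_present(readings):
    """A product is taken to be passing below the ring (the t1..t2 window)
    once any ranging sensor reads above the product-height lower limit."""
    return any(r > HEIGHT_LOWER_LIMIT_MM for r in readings)

def matches_standard_posture(readings, standard_profile, tolerance_mm=1.5):
    """True when every sensor reading agrees, within tolerance, with the
    profile recorded for the product in the standard posture."""
    return all(abs(r - s) <= tolerance_mm
               for r, s in zip(readings, standard_profile))

# Illustrative profile for an 8-sensor ring and one live set of readings
standard_profile = [30.0, 31.0, 30.5, 30.8, 12.0, 12.5, 12.2, 12.4]
live = [29.6, 31.2, 30.1, 31.0, 12.3, 12.1, 12.5, 12.6]
if product_present(live) and matches_standard_posture(live, standard_profile):
    print("next product is in the standard posture")
```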
2. If the posture of the product has not been determined to be the standard posture within a duration T, that is, the posture cannot be identified directly from the distance detection values, to-be-glued feature recognition is performed on the image taken within the period from t1 to t2, where (t2-t1)/2 ≤ T < (t2-t1).
It will be appreciated that if the product were in the standard posture, it should have been identified from the distribution rule of the ranging detection values by (t2-t1)/2; if it has not been, the product is most likely not in the standard posture, and image recognition is brought in at this point.
This arrangement greatly reduces the amount of image recognition and avoids unnecessary data processing.
The preset posture sample set is then searched according to the to-be-glued feature recognition result and the change rule of the detection values within the duration T, to obtain the matched posture of the next product to be dispensed.
Because the camera shoots from above, the glue-feature recognition result is a top view. Features that look different front-to-back or left-to-right in top view can be recognized directly from it; features that look alike in top view and differ only in height must be matched against the change rule of the readings within the duration T.
It can be understood that the preset posture sample set, that is, the pre-verified change rules of the detection values over the various T windows, is prepared in advance so that it can be matched together with the image features to identify the real posture of the product. A sketch of this fallback matching follows.
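The fallback can be sketched as follows. Taking T at its lower bound (t2-t1)/2, and representing each sample in the posture sample set as a stored top-view feature vector plus a stored reading history, are assumptions made here for illustration; the description only requires that both the glue-feature result and the T-window change rule be matched against pre-verified samples.

```python
def needs_image_recognition(standard_confirmed_at, t1, t2):
    """Trigger to-be-glued feature recognition when the standard posture was
    not confirmed within T; T = (t2 - t1)/2 is the lower bound permitted by
    the description and is used here as an illustrative choice."""
    T = (t2 - t1) / 2.0
    return standard_confirmed_at is None or (standard_confirmed_at - t1) > T

def match_posture(feature_vec, reading_history, posture_samples):
    """Search the preset posture sample set: pick the sample whose stored
    top-view feature vector and T-window reading history are closest to what
    was just observed (simple squared-error scoring)."""
    def score(sample):
        feat_err = sum((a - b) ** 2 for a, b in zip(feature_vec, sample["feature"]))
        hist_err = sum((a - b) ** 2 for a, b in zip(reading_history, sample["history"]))
        return feat_err + hist_err
    return min(posture_samples, key=score)["posture"]
```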
Analyzing the product posture, the preset standard posture of the product and the standard spatial coordinates of the position to be dispensed to obtain the latest spatial coordinates of the position to be dispensed includes the following.
First, the moving speed v of the product is acquired and the feed length L of the product is calculated from the time difference t3 between two adjacent t1 instants. The moving speed v is obtained from the line controller, or calculated from the product size and the time the product takes to pass under the posture capturing unit. The feed length L, that is, the interval length between two adjacent products, is L = v × t3 (the interval duration). A worked sketch follows.
Subsequently, in particular:
1) If the product posture is the standard posture, then:
the pre-stored standard spatial coordinates are called; the standard spatial coordinates are the coordinates (or coordinate point data set) of the position to be dispensed that match the initially set control amount of the dispensing manipulator when the product posture and position are standard.
Assuming that the detection values of the ranging sensors at a moment t4 match the detection-value distribution rule of the product in the standard posture, the X-axis and Y-axis coordinate values (X1, Y1) of the dispensing head at moment t4 are acquired.
The coordinate difference between (X1, Y1) and the preset initial position of the dispensing head is then calculated, and the standard spatial coordinates are compensated with this difference to obtain the spatial coordinates of the position to be dispensed, as sketched below.
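A minimal sketch of this compensation, assuming the standard spatial coordinates are stored as a list of (x, y, z) dispensing points and that, as stated above, only the X/Y offset of the head is compensated; the data layout and numbers are illustrative.

```python
def compensate_standard_coords(standard_points, head_at_t4, head_initial):
    """Shift every standard dispensing coordinate by the X/Y offset between
    the head position at t4 (when the standard posture was recognised) and
    the head's preset initial position."""
    dx = head_at_t4[0] - head_initial[0]
    dy = head_at_t4[1] - head_initial[1]
    return [(x + dx, y + dy, z) for (x, y, z) in standard_points]

standard_points = [(10.0, 20.0, 5.0), (15.0, 22.0, 5.0)]
print(compensate_standard_coords(standard_points,
                                 head_at_t4=(102.0, 51.0),
                                 head_initial=(100.0, 50.0)))
# -> [(12.0, 21.0, 5.0), (17.0, 23.0, 5.0)]
```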
2) If the product posture is not the standard posture, then:
the lateral deflection angle of the product is calculated according to the to-be-glued feature recognition result (i.e., the feature contour at that moment) and the top-view contour of the product in the standard posture;
the lateral offset is calculated according to the center line of the to-be-glued feature recognition result and the center line of the product in the standard posture;
the interval deviation between the feed length L and the standard product interval is calculated;
the pre-established three-dimensional model of the product is adjusted in a matched three-dimensional coordinate system according to the lateral offset, the lateral deflection angle and the interval deviation to obtain a new model;
and the spatial coordinates of the position to be dispensed are read from the new model. A sketch of this adjustment follows.
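One way to picture the adjustment is as a planar rigid transform of the stored model points, sketched below. Treating the lateral deflection angle as a rotation about the model origin and the lateral/interval offsets as translations along assumed axes (x along the feed direction, y across the line) is an illustrative simplification; the description only states that the three-dimensional model is adjusted by these three quantities and that the coordinates of the position to be dispensed are then read from the new model.

```python
import math

def adjust_model_points(points, deflection_deg, lateral_offset, interval_offset):
    """Rotate the stored (x, y, z) model points by the lateral deflection angle
    in the horizontal plane, then translate by the interval offset along the
    feed direction (x) and the lateral offset across the line (y)."""
    a = math.radians(deflection_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    adjusted = []
    for (x, y, z) in points:
        xr = x * cos_a - y * sin_a
        yr = x * sin_a + y * cos_a
        adjusted.append((xr + interval_offset, yr + lateral_offset, z))
    return adjusted

model_points = [(10.0, 20.0, 5.0), (15.0, 22.0, 5.0)]
print(adjust_model_points(model_points, deflection_deg=3.0,
                          lateral_offset=1.5, interval_offset=-2.0))
```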
With this arrangement, the device can automatically update the spatial coordinates of the position to be dispensed according to the actual posture and position of the product, providing a basis for adjusting the action of the dispensing manipulator.
The foregoing are all preferred embodiments of the present application and are not intended to limit its protection scope in any way; therefore, all equivalent changes made in structure, shape and principle according to this application shall be covered by the protection scope of this application.

Claims (4)

1. A dispensing manipulator control system, comprising:
a camera unit arranged on the dispensing manipulator, which moves laterally with the dispensing manipulator relative to the product and is used for collecting an image of the next product to be dispensed;
a posture capturing unit arranged on the dispensing manipulator directly below the camera unit, used for collecting posture detection information of a product passing beneath the camera unit, the posture capturing unit comprising a plurality of ranging sensors distributed around the lens of the camera unit; and a controller electrically connected to the camera unit and the posture capturing unit, and intended to be electrically connected to the dispensing manipulator;
wherein the controller is configured to:
analyzing the image and the posture detection information to obtain the posture of the next product to be dispensed;
analyzing the product posture, the preset standard posture of the product and the standard spatial coordinates of the position to be dispensed to obtain the latest spatial coordinates of the position to be dispensed;
correcting the next dispensing control amount of the dispensing manipulator according to the latest spatial coordinates of the position to be dispensed;
wherein analyzing the image and the posture detection information to obtain the posture of the next product to be dispensed comprises: when the detection value of each ranging sensor at a certain moment matches the detection-value distribution rule of the standard posture of the product, determining that the posture of the next product to be dispensed is the standard posture;
and further comprises:
when the detection value of a ranging sensor is larger than the lower-limit threshold of the product height, determining that a product is passing below the posture capturing unit, and defining the start time of the pass as t1 and the leaving time as t2;
if the posture of the product has not been determined to be the standard posture within a duration T, performing to-be-glued feature recognition on the image within the period from t1 to t2, where (t2-t1)/2 ≤ T < (t2-t1);
searching a preset posture sample set according to the to-be-glued feature recognition result and the change rule of the detection values within the duration T to obtain the matched posture of the next product to be dispensed;
wherein analyzing the product posture, the preset standard posture of the product and the standard spatial coordinates of the position to be dispensed to obtain the latest spatial coordinates of the position to be dispensed comprises:
calculating the lateral deflection angle of the product according to the to-be-glued feature recognition result and the top-view contour of the product in the standard posture;
calculating the lateral offset according to the center line of the to-be-glued feature recognition result and the center line of the product in the standard posture;
calculating the interval deviation between the feed length L and the standard product interval, the feed length L being the interval length between two adjacent products;
adjusting the pre-established three-dimensional model in a matched three-dimensional coordinate system according to the lateral offset, the lateral deflection angle and the interval deviation to obtain a new model;
and reading the spatial coordinates of the position to be dispensed from the new model.
2. A dispensing device, comprising a dispensing manipulator, characterized by further comprising the dispensing manipulator control system of claim 1, wherein the controller is electrically connected to the dispensing manipulator.
3. The dispensing device of claim 2, wherein the dispensing manipulator comprises a horizontal movable arm, a longitudinal movable arm and a dispensing head, the longitudinal movable arm being connected to the moving part of the horizontal movable arm, and the dispensing head being mounted on the moving part of the longitudinal movable arm;
the camera unit is connected to the moving part of the horizontal movable arm with its lens facing the moving path of the product; and the posture capturing unit is connected to the moving part of the horizontal movable arm and is located below the camera.
4. The dispensing device of claim 3, wherein analyzing the product posture, the preset standard posture of the product and the standard spatial coordinates of the position to be dispensed to obtain the latest spatial coordinates of the position to be dispensed comprises:
if the product posture is the standard posture, calling the standard spatial coordinates;
assuming that the detection values of all ranging sensors at a moment t4 match the detection-value distribution rule of the product in the standard posture, acquiring the X-axis and Y-axis coordinate values of the dispensing head at moment t4;
calculating the coordinate difference between the X-axis and Y-axis coordinate values of the dispensing head at moment t4 and the preset initial position of the dispensing head;
and compensating the standard spatial coordinates with the coordinate difference to obtain the latest spatial coordinates of the position to be dispensed.
CN202311174454.6A 2023-09-13 2023-09-13 Dispensing manipulator control system and dispensing device Active CN116899832B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311174454.6A CN116899832B (en) 2023-09-13 2023-09-13 Dispensing manipulator control system and dispensing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311174454.6A CN116899832B (en) 2023-09-13 2023-09-13 Dispensing manipulator control system and dispensing device

Publications (2)

Publication Number Publication Date
CN116899832A (en) 2023-10-20
CN116899832B (en) 2023-12-29

Family

ID=88356992

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311174454.6A Active CN116899832B (en) 2023-09-13 2023-09-13 Dispensing manipulator control system and dispensing device

Country Status (1)

Country Link
CN (1) CN116899832B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006130383A (en) * 2004-11-02 2006-05-25 Seiko Epson Corp Method and device for detection of dot shift
JP2011238048A (en) * 2010-05-11 2011-11-24 Nippon Telegr & Teleph Corp <Ntt> Position attitude measurement device and position attitude measurement program
CN106493042A (en) * 2016-10-18 2017-03-15 凌云光技术集团有限责任公司 Dispensing method and dispenser system
CN106853430A (en) * 2016-12-30 2017-06-16 杭州力视科技有限公司 A kind of automatically dropping glue tracking and device based on streamline
CN110538766A (en) * 2019-08-12 2019-12-06 苏州富强科技有限公司 Height-based dispensing head closed-loop control method and system
CN111299078A (en) * 2020-03-17 2020-06-19 欣辰卓锐(苏州)智能装备有限公司 Automatic tracking dispensing method based on assembly line
CN111921788A (en) * 2020-08-07 2020-11-13 欣辰卓锐(苏州)智能装备有限公司 High-precision dynamic tracking dispensing method and device
CN115518838A (en) * 2022-11-23 2022-12-27 苏州佳祺仕科技股份有限公司 Dispensing control method, device, equipment and storage medium
CN115846131A (en) * 2022-11-16 2023-03-28 杭州长川科技股份有限公司 Multi-station adhesive dispensing mechanism displacement conversion method and multi-station adhesive dispensing device
CN116393322A (en) * 2023-04-21 2023-07-07 大连理工大学 Device and method for realizing dispensing of micro parts
WO2023159611A1 (en) * 2022-02-28 2023-08-31 深圳市大疆创新科技有限公司 Image photographing method and device, and movable platform


Also Published As

Publication number Publication date
CN116899832A (en) 2023-10-20


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant