CN110514664B - Cheese yarn rod positioning and detecting robot and method - Google Patents

Cheese yarn rod positioning and detecting robot and method

Info

Publication number
CN110514664B
CN110514664B (application CN201910767593.7A)
Authority
CN
China
Prior art keywords
yarn
coordinate
image
axis support
axis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910767593.7A
Other languages
Chinese (zh)
Other versions
CN110514664A (en)
Inventor
王文胜
李天剑
卢影
冉宇辰
黄民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Information Science and Technology University
Original Assignee
Beijing Information Science and Technology University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Information Science and Technology University filed Critical Beijing Information Science and Technology University
Priority to CN201910767593.7A priority Critical patent/CN110514664B/en
Publication of CN110514664A publication Critical patent/CN110514664A/en
Application granted granted Critical
Publication of CN110514664B publication Critical patent/CN110514664B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • D TEXTILES; PAPER
    • D06 TREATMENT OF TEXTILES OR THE LIKE; LAUNDERING; FLEXIBLE MATERIALS NOT OTHERWISE PROVIDED FOR
    • D06B TREATING TEXTILE MATERIALS USING LIQUIDS, GASES OR VAPOURS
    • D06B23/00 Component parts, details, or accessories of apparatus or machines, specially adapted for the treating of textile materials, not restricted to a particular kind of apparatus, provided for in groups D06B1/00 - D06B21/00
    • D06B23/04 Carriers or supports for textile materials to be treated
    • D06B23/047 Replacing or removing the core of the package
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C9/00 Measuring inclination, e.g. by clinometers, by levels
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention relates to a cheese yarn rod positioning and detecting robot and method. The robot comprises a motion system, a vision system and a control system; the motion system and the vision system are both mounted on a gantry structure and connected with the control system. The X-axis support, the Y-axis support, the Z-axis support and their servo motors, which form a three-coordinate system, take the form of linear guide rails; the vision system, comprising an industrial camera and an annular light source, is mounted on the side face of the bottom of the Z-axis support. The servo motor on the Z-axis support drives the Z-axis support up and down and carries the industrial camera and the annular light source with it; the X-axis and Y-axis supports, through their respective servo motors, move the vision system horizontally so that it reaches the coordinate position of a designated yarn rod. A yarn cage rotating station is arranged on the ground; the A-axis support is installed at the rotating station and uses a rotating support structure to bear a standard yarn cage, which is driven to move under the control of the A-axis servo motor.

Description

Cheese yarn rod positioning and detecting robot and method
Technical Field
The invention relates to the field of textile equipment, in particular to a cheese yarn rod positioning and detecting robot and a cheese yarn rod positioning and detecting method.
Background
The intelligent cheese dyeing factory is built on the "complete technology and equipment for digital automatic cheese dyeing", which won a first prize of the National Science and Technology Progress Award. Through comprehensive digitalization, informatization and intelligence upgrading, it realizes digital and intelligent production of the whole process from raw yarn to dyed finished yarn, creating a new full-process intelligent yarn-dyeing manufacturing mode with controllable production quality, remote operation and maintenance and the like, in which the auxiliary dyeing and winding system, the production scheduling system and the logistics system are automatically coordinated, production tasks are executed automatically, special robots replace manual work, and the parameters of the whole dyeing process are detected and fed back on line.
One process in the dyeing procedure is yarn loading, i.e. loading a spindle onto a yarn rod in a yarn cage. As a yarn cage is used continuously, its yarn rods gradually become deflected; when the deflection is too large, automatic loading of the spindles is affected. The larger the offset of a yarn rod, the longer loading and unloading take and the lower the accuracy; a problem on a single yarn cage disrupts subsequent workshop operations and, in serious cases, may cause safety accidents. When the deflection exceeds a certain amount, the yarn rod needs to be corrected. At present this correction is still done manually: workers measure and record the data of every rod, compare the measurements to find the numbers of the rods with excessive deviation, and then make the corresponding adjustments.
Disclosure of Invention
In view of the above problems, an object of the present invention is to provide a cheese yarn rod positioning and detecting robot and method that can automatically measure the current position of a yarn rod and determine whether it needs to be corrected, offering fast measurement and high efficiency.
In order to achieve the above purpose, the invention adopts the following technical scheme: a cheese yarn rod positioning and detecting robot comprises a gantry structure, a motion system, a vision system and a control system; the motion system and the vision system are both arranged on the gantry structure and are connected with the control system; the vision system transmits the acquired image information to the control system, and the control system controls the motion system to act according to the received image information; the motion system comprises an X-axis support, a Y-axis support, a Z-axis support, an A-axis support and servo motors respectively arranged on the axis supports, each servo motor is connected with the control system, and the control system controls each servo motor to act so as to drive the corresponding axis support to move; the X-axis support, the Y-axis support, the Z-axis support and the corresponding servo motors forming a three-coordinate system are in the form of linear guide rails; the vision system is arranged on the side face of the bottom of the Z-axis support and comprises an industrial camera and an annular light source, the industrial camera being arranged on the side edge of the lower end of the Z-axis support and the annular light source being sleeved on the lower part of the industrial camera; a servo motor on the Z-axis support drives the Z-axis support to move up and down and simultaneously drives the industrial camera and the annular light source to move up and down with it; the X-axis support and the Y-axis support control the vision system to move horizontally through their respective servo motors and drive the vision system to the coordinate position of a designated yarn rod; a yarn cage rotating station is arranged on the ground, the A-axis support is arranged at the yarn cage rotating station and uses a rotating support structure to bear a standard yarn cage, and the borne yarn cage is driven to move under the control of the servo motor of the A axis.
Furthermore, the end of the lower end of the Z-axis support is provided with an automatic cheese loading and unloading gripper mechanism.
Further, the industrial camera adopts a Daheng Imaging Mercury series GigE digital camera; the distance between the yarn rod head and the lens of the industrial camera is 200-300 mm; the industrial camera is installed parallel to the Z-axis support, with an angle error of no more than 1 degree.
Further, the control system comprises a controller, an X-axis driver, a Y-axis driver, a Z-axis driver, an A-axis driver and a power supply, wherein the controller is powered by the power supply; the controller receives the image information transmitted by the vision system through the industrial switch, processes the received image information and converts the processed image information into a control instruction, the control instruction is transmitted to the X-axis driver, the Y-axis driver, the Z-axis driver and the A-axis driver through the data bus respectively, and each driver drives the servo motor of the corresponding axis to act.
Further, the controller adopts a Siemens S7-1217 controller, and the data bus adopts a Profinet bus.
A cheese yarn rod positioning detection method based on the robot comprises the following steps: 1) the industrial camera acquires an image of each yarn rod and transmits the image information to the controller, which carries out image processing; 2) the controller performs image processing on the received image information, carries out visual positioning calibration on each yarn rod, and calibrates and records the centering reference coordinate of each yarn rod for the industrial camera; 3) visual detection of the yarn rods whose original centering coordinates have been calibrated and recorded: the vision system issues a command to detect a new yarn rod, and the number of the currently requested yarn rod is written into the vision system; whether the yarn rod number at the current position is the same as the requested one is judged in real time; when the positions differ, a command to continue the request is sent, and when they are the same, the system moves above the yarn rod requested by the vision system and returns the current yarn rod number to indicate the position; the vision system reads that the current yarn rod number equals the requested number, detects the yarn rod and records the detected coordinate result, completing the yarn rod positioning visual detection.
Further, in the step 1), the image processing includes the following steps: 1.1) calibrating an industrial camera, aligning the center of the industrial camera through a standard circle, and calculating the corresponding relation between a pixel value and the diameter of the standard circle; 1.2) acquiring an original acquisition image containing an image of a head of a yarn rod to be detected; 1.3) carrying out automatic threshold segmentation on the original acquired image, and extracting a connected domain to obtain a connected domain image of the head of the yarn rod to be detected; 1.4) carrying out centroid extraction on the connected domain of the yarn rod head to be detected, the centroid being the center position coordinate of the yarn rod head in the image coordinate system; 1.5) converting a coordinate system, converting an image coordinate system into coordinates under an actual coordinate system, and comparing the coordinates with original position coordinates to obtain an actual offset size; 1.6) carrying out defect estimation, carrying out characteristic analysis on the obtained connected domain image, calculating to obtain area and shape characteristics, comparing the area and shape characteristics with those of a standard yarn rod head, further judging whether the yarn rod head has defects or not, evaluating the defect degree through the area ratio, and giving the credibility of the yarn rod coordinate.
Further, in the step 1.5), the coordinate system conversion includes the following steps: 1.5.1) adopting a standard circle with known size to be arranged at the top end of the yarn rod to ensure that the standard circle and the top end of the yarn rod are in the same horizontal plane; 1.5.2) acquiring and obtaining a standard circle image by using a camera calibration method, and performing Hough transformation to obtain the size of a standard circle radius pixel; 1.5.3) calculating the ratio of the obtained standard circle radius pixel size to the actual size to obtain a calibration scale; 1.5.4) the coordinate of the center position in the coordinate system of the yarn bar head image is multiplied by the scale to obtain the coordinate of the final actual position.
Further, in the step 1.6), the defect evaluation includes the following steps: 1.6.1) carrying out area screening on the obtained connected domain image, and extracting the connected domains within the preset value range of the yarn rod; 1.6.2) analyzing the distance between the centroid position of each connected domain and the center of the image, and judging whether there is a connected domain within 20 pixels of the image center whose area is within the range; 1.6.3) if the deviation from the preset value is within 50 pixels, judging the roundness and rectangularity to analyze the defect degree; 1.6.4) if the deviation is not within 50 pixels, judging whether the center positions of the second and third connected domains differ by less than 10 pixels, and if so, performing dilation to connect the two connected domains and then repeating the processing of step 1.6.2).
Further, in the step 2), the positioning method includes the following steps: 2.1) sending a positioning request signal to a vision system, updating an identification serial number and carrying out vision identification; 2.2) the industrial camera moves to a preset original standard alignment position after receiving the positioning request signal, performs photographing identification, feeds back an identified coordinate result to the controller, refreshes the identification serial number and the positioning data together, and the controller compares the fed back identification result with the requested serial number; 2.3) correcting the centering coordinate according to the comparison result, judging whether the deviation value between the position coordinate of the current industrial camera and the identified yarn rod coordinate is within a preset allowable range, if so, moving to the next yarn rod coordinate to be positioned to continue positioning, otherwise, moving to the correction position by the industrial camera, continuing calculating the deviation, and circularly changing the reference coordinate until the deviation is within the set allowable range; when the feedback identification result is consistent with the requested serial number, the vision system is considered to have finished positioning work and data processing, and the X/Y axis deviation in the data area is true and effective.
Due to the adoption of the technical scheme, the invention has the following advantages: the invention can automatically carry out positioning detection on the yarn bar head on the cheese yarn cage, and compare the yarn bar head with the standard position, thereby realizing automatic measurement of the current yarn bar position and judgment on whether the current yarn bar needs to be corrected, and solving the problems of manpower resource waste, low accuracy and the like caused by measurement and judgment by manpower; meanwhile, the device has high automation degree, and the intelligent modernization step of the whole cheese dyeing industry is greatly accelerated.
Drawings
FIG. 1 is a schematic view of the overall structure of the present invention;
FIG. 2 is a schematic diagram of the hardware architecture of the control system of the present invention;
FIG. 3 is a flow chart of a positioning method of the present invention;
FIG. 4 is a flow chart of the detection method of the present invention.
Detailed Description
The invention is described in detail below with reference to the figures and examples.
As shown in fig. 1, the present invention provides a cheese rod positioning and detecting robot, which comprises a gantry structure, a motion system, a vision system and a control system. The motion system and the vision system are both arranged on the portal structure and are connected with the control system; the vision system transmits the acquired image information to the control system, and the control system controls the motion system to act according to the received image information. The portal structure comprises four upright posts 1 and four cross beams 2, each cross beam 2 is arranged between the tops of the adjacent two upright posts 1, and supporting beams 3 for reinforcement are arranged at the joints of the cross beams 2 and the upright posts 1.
The motion system comprises an X-axis support 4, a Y-axis support 5, a Z-axis support 6, an A-axis support 7 and servo motors respectively arranged on the axis supports; each servo motor is connected with the control system, and the control system controls each servo motor to act so as to drive the corresponding axis support to move. The X-axis and Y-axis use conventional linear guide rails driven synchronously by their motors, the Z-axis uses a conventional lead-screw transmission, and the A-axis performs rotational motion through the gear engagement of a conventional motor.
The X-axis support 4, the Y-axis support 5, the Z-axis support 6 and the corresponding servo motors forming the three-coordinate system are conventional linear guide rails in which the respective servo motors drive belt pulleys for synchronous movement, so their detailed structures and movement principles are not described here. The side face of the bottom of the Z-axis support 6 carries the vision system, which can effectively capture image information of the top end face of the yarn cage's yarn rods; the vision system comprises an industrial camera and an annular light source, the industrial camera is mounted on the side edge of the lower end of the Z-axis support 6 and fixed with angle iron and screws, and the annular light source is sleeved on the lower part of the industrial camera and fixed to it with screws. The servo motor on the Z-axis support 6 drives the Z-axis support 6 up and down and carries the industrial camera and annular light source with it, so that the distance between the industrial camera and the yarn rod head can be adjusted. The X-axis support 4 and the Y-axis support 5 move the vision system horizontally through their respective servo motors and can drive it to the coordinate position of a designated yarn rod for imaging, positioning and detection. A yarn cage rotating station is arranged on the ground; the A-axis support 7 is installed at the rotating station and uses a rotating support structure to bear a standard yarn cage, which is driven to move under the control of the A-axis servo motor.
In the above embodiment, the end of the lower end of the Z-axis support 6 is provided with an automatic bobbin loading and unloading gripper mechanism.
In the above embodiment, the industrial camera is a Daheng Imaging Mercury series GigE digital camera with an 8 mm focal-length lens. The distance between the yarn rod head and the lens of the industrial camera is 200-300 mm; the industrial camera is installed parallel to the Z-axis support 6 so that its field-of-view coordinate system coincides with the Z-axis motion coordinate system, with an angle error of no more than 1 degree.
In the above embodiment, the annular light source is an annular LED light source with adjustable brightness, which is used to illuminate the head of the yarn rod and blur the background.
As shown in fig. 2, the control system includes a controller, an X-axis driver, a Y-axis driver, a Z-axis driver, an a-axis driver, and a power supply, the controller being powered by the power supply. The controller receives the image information transmitted by the vision system through the industrial switch, processes the received image information and converts the processed image information into a control instruction, the control instruction is transmitted to the X-axis driver, the Y-axis driver, the Z-axis driver and the A-axis driver through the data bus respectively, and each driver drives the servo motor of the corresponding axis to act.
In the above embodiment, the control system further includes an industrial touch screen, and the industrial touch screen performs information interaction with the controller through the industrial switch; and inputting external instruction information into the controller through the industrial touch screen.
In the above embodiment, the controller is a Siemens S7-1217 controller; the Siemens S7-1200 series PLC can handle tasks such as simple logic control, advanced logic control, HMI and network communication. The data bus is a Profinet bus, which enables real-time control of the servo motors.
In the above embodiments, the controller, the vision system and the industrial touch screen are connected via industrial Ethernet, and the S7 communication protocol is used for human-machine information interaction and communication.
Based on the robot, the invention also provides a cheese yarn rod positioning detection method, which comprises the following steps:
1) the industrial camera collects images of each yarn rod, transmits the image information to the controller, and the controller processes the images;
the image processing comprises the following steps:
1.1) calibrating an industrial camera, aligning the center of the industrial camera through a standard circle, and calculating the corresponding relation between a pixel value and the diameter of the standard circle;
1.2) acquiring an original acquisition image containing an image of a head of a yarn rod to be detected;
1.3) carrying out automatic threshold segmentation on the original collected image, and extracting a connected domain to obtain a connected domain image of the head of the yarn rod to be detected;
the method comprises the following steps of screening characteristic conditions such as area, roundness and rectangle degree to obtain a connected domain which meets preset conditions as a connected domain of a yarn rod head to be detected; the roundness is the ratio of the area of the connected domain to the minimum circumscribed circle area of the connected domain, and the rectangularity is the ratio of the area of the connected domain to the minimum circumscribed rectangle area.
1.4) carrying out centroid extraction on the connected domain of the yarn rod head to be detected; the centroid obtained is the center position coordinate of the yarn rod head in the image coordinate system;
The centroid of the connected domain of the yarn rod head to be detected can be obtained either by analyzing the connected domain directly and extracting its centroid, or by extracting the circular contour from the original acquired image of step 1.2) or the connected-domain image of step 1.3) through a Hough transform and taking the center of that contour (the moment-based option appears in the sketch after the 1.3.x steps below);
and 1.5) converting a coordinate system, converting the image coordinate system into coordinates under an actual coordinate system, and comparing the coordinates with the original position coordinates to obtain the actual offset size.
1.6) carrying out defect estimation, carrying out characteristic analysis on the obtained connected domain image, calculating to obtain characteristics such as area and shape, comparing the characteristics with the characteristics such as the area and the shape of a standard yarn rod head, further judging whether the yarn rod head has defects or not, evaluating the defect degree through an area ratio, and giving the credibility of the yarn rod coordinate.
In the step 1.3), the connected-domain image of the yarn rod head to be detected is obtained with an image segmentation algorithm, which comprises the following steps (a sketch of this pipeline follows the list):
1.3.1) filtering the original collected image (not limited to mean filtering), and then carrying out contrast stretching;
1.3.2) adopting an adaptive threshold segmentation method on the preprocessed image to obtain an image of candidate regions for the rod head; the adaptive threshold segmentation uses, but is not limited to, the Otsu (OTSU) method, and other methods such as the bimodal method may also be adopted;
1.3.3) extracting connected domains of the obtained candidate region images, and calculating the size and position information of each candidate region in the candidate region set;
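For illustration only, the following Python/OpenCV sketch combines the pipeline of steps 1.3.1)-1.3.3) with the area/roundness/rectangularity screening of step 1.3) and the moment-based centroid extraction of step 1.4). It is not taken from the patent; the function name, the area limits and the shape thresholds are assumptions.

import cv2
import numpy as np

def locate_rod_head(gray, min_area=200, max_area=5000):
    # 1.3.1) filtering (mean filtering here, though not limited to it) and contrast stretching
    blurred = cv2.blur(gray, (5, 5))
    stretched = cv2.normalize(blurred, None, 0, 255, cv2.NORM_MINMAX)
    # 1.3.2) automatic threshold segmentation of the preprocessed image (Otsu's method)
    _, binary = cv2.threshold(stretched, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # 1.3.3) connected-domain extraction with the size and position of each candidate region
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best = None
    for c in contours:
        area = cv2.contourArea(c)
        if not (min_area <= area <= max_area):            # area screening (preset range assumed)
            continue
        (_, _), r = cv2.minEnclosingCircle(c)
        roundness = area / (np.pi * r * r + 1e-9)          # area / minimum circumscribed circle
        (_, (w, h), _) = cv2.minAreaRect(c)
        rectangularity = area / (w * h + 1e-9)             # area / minimum circumscribed rectangle
        if roundness < 0.7 or rectangularity < 0.6:        # shape conditions (assumed values)
            continue
        m = cv2.moments(c)
        centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])   # step 1.4) centroid, in pixels
        if best is None or area > best["area"]:
            best = {"area": area, "roundness": roundness,
                    "rectangularity": rectangularity, "centroid": centroid}
    return binary, best

The returned centroid is still in image (pixel) coordinates; step 1.5) converts it into the actual coordinate system.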
In the step 1.5), the coordinate system conversion includes the following steps (sketched after the list):
1.5.1) adopting a standard circle with known size to be arranged at the top end of the yarn rod to ensure that the standard circle and the top end of the yarn rod are in the same horizontal plane;
1.5.2) acquiring and obtaining a standard circle image by using a camera calibration method, and performing Hough transformation to obtain the size of a standard circle radius pixel;
1.5.3) calculating the ratio of the obtained standard circle radius pixel size to the actual size to obtain a calibration scale;
1.5.4) multiplying the central position coordinate in the step 1.4) by a scale to obtain the final actual position coordinate.
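A minimal sketch of the conversion in steps 1.5.1)-1.5.4), assuming a standard circle of known diameter placed flush with the rod top. The 50 mm diameter matches the embodiment described later; the Hough parameters are illustrative assumptions.

import cv2

STD_CIRCLE_DIAMETER_MM = 50.0   # known diameter of the standard circle placed on the rod top

def calibration_scale(calib_gray):
    # 1.5.2)-1.5.3): Hough transform on the standard-circle image, then the mm-per-pixel scale
    circles = cv2.HoughCircles(calib_gray, cv2.HOUGH_GRADIENT, dp=1, minDist=200,
                               param1=100, param2=40, minRadius=50, maxRadius=500)
    radius_px = float(circles[0][0][2])
    return STD_CIRCLE_DIAMETER_MM / (2.0 * radius_px)

def actual_offset_mm(center_px, reference_px, scale):
    # 1.5.4) scale the center pixel coordinate, then compare with the original position (step 1.5))
    return ((center_px[0] - reference_px[0]) * scale,
            (center_px[1] - reference_px[1]) * scale)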
In the step 1.6), the defect evaluation includes the following steps (a compressed sketch follows the list):
1.6.1) carrying out area screening on the obtained connected domain image, and extracting the connected domain within the preset value range of the yarn rod;
1.6.2) analyzing the distance between the centroid position of each connected domain and the center of the image, and judging whether there is a connected domain within 20 pixels of the image center whose area is within the range;
1.6.3) if the deviation from the preset value is within 50 pixels, judging the roundness and rectangularity to analyze the defect degree, where the roundness is the ratio of the connected-domain area to the area of its minimum circumscribed circle and the rectangularity is the ratio of the connected-domain area to the area of its minimum circumscribed rectangle.
1.6.4) if the deviation is not within 50 pixels, judging whether the center positions of the second and third connected domains differ by less than 10 pixels, and if so, performing dilation to connect the two connected domains and then repeating the processing of step 1.6.2).
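A compressed, hedged sketch of the decision flow in steps 1.6.1)-1.6.4). The 20, 50 and 10 pixel thresholds follow the text; the standard-head area, the 20 % tolerance and all names are assumptions, and the roundness/rectangularity judgment of step 1.6.3) would reuse the features from the earlier sketch.

import cv2
import numpy as np

def evaluate_defect(binary, area_lo, area_hi, std_area):
    # binary: segmented image; std_area: connected-domain area of a standard rod head, in pixels
    h, w = binary.shape
    img_center = np.array([w / 2.0, h / 2.0])

    def candidates(img):
        n, _, stats, cents = cv2.connectedComponentsWithStats(img)
        keep = [i for i in range(1, n)                       # 1.6.1) area screening
                if area_lo <= stats[i, cv2.CC_STAT_AREA] <= area_hi]
        keep.sort(key=lambda i: np.linalg.norm(cents[i] - img_center))
        return [(int(stats[i, cv2.CC_STAT_AREA]),
                 float(np.linalg.norm(cents[i] - img_center)), cents[i]) for i in keep]

    cands = candidates(binary)
    # 1.6.4) no candidate near the image center, but two candidates almost coincide
    # (centers less than 10 px apart): dilate to connect them and re-analyze
    if (not cands or cands[0][1] > 20) and len(cands) >= 3 \
            and np.linalg.norm(cands[1][2] - cands[2][2]) < 10:
        cands = candidates(cv2.dilate(binary, np.ones((5, 5), np.uint8), iterations=2))

    # 1.6.2)-1.6.3) candidate within 20 px of the image center: defect degree from the area ratio
    if cands and cands[0][1] <= 20:
        ratio = cands[0][0] / float(std_area)
        return {"defective": abs(1.0 - ratio) > 0.2,          # 20 % tolerance (assumed)
                "coordinate_confidence": max(0.0, 1.0 - abs(1.0 - ratio))}
    return {"defective": True, "coordinate_confidence": 0.0}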
2) The controller performs image processing according to the received image information, performs visual positioning calibration on each yarn rod, and calibrates and records centering reference coordinates of each yarn rod shot by the industrial camera;
as shown in fig. 3, the positioning method includes the following steps:
2.1) sending a positioning request signal to the industrial camera, updating the identification serial number and carrying out visual identification;
2.2) the industrial camera moves to a preset original standard alignment position after receiving the positioning request signal, performs photographing identification, feeds back an identified coordinate result to the controller, refreshes the identification serial number and the positioning data together, and the controller compares the fed back identification result with the requested serial number;
2.3) correcting the centering coordinate according to the comparison result (the centering coordinate is the position coordinate aligned with the central point of the picture of the industrial camera, namely the coordinate can be used as the reference coordinate for comparing the error of the final yarn rod), judging whether the deviation value of the position coordinate where the current industrial camera is located and the identified yarn rod coordinate is within the preset allowable range (0.1 mm in the embodiment), if so, moving to the next yarn rod coordinate to be positioned to continue positioning, otherwise, moving the industrial camera to the correction position, continuing to calculate the deviation, and circularly changing the reference coordinate until the deviation is within the set allowable range;
when the feedback identification result is consistent with the requested serial number, the vision system is considered to have finished positioning work and data processing, and the X/Y axis deviation in the data area is true and effective.
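The loop below is a minimal control-flow sketch of steps 2.1)-2.3), using the 0.1 mm allowance mentioned in the embodiment. The helpers move_camera_to() and capture_and_locate() are hypothetical stand-ins for the motion commands and the image processing described above.

TOLERANCE_MM = 0.1          # preset allowable deviation used in this embodiment

def calibrate_rod(reference_xy, move_camera_to, capture_and_locate, max_iter=10):
    # Iteratively re-center the camera over one yarn rod and return its centering coordinate.
    camera_xy = list(reference_xy)                 # preset original standard alignment position
    for _ in range(max_iter):
        move_camera_to(camera_xy)
        rod_xy = capture_and_locate()              # photograph + image processing, result in mm
        dx = rod_xy[0] - camera_xy[0]
        dy = rod_xy[1] - camera_xy[1]
        if abs(dx) <= TOLERANCE_MM and abs(dy) <= TOLERANCE_MM:
            return tuple(camera_xy)                # deviation within range: record and move on
        camera_xy[0] += dx                         # otherwise move to the correction position
        camera_xy[1] += dy                         # and change the reference coordinate
    raise RuntimeError("positioning did not converge within the allowed iterations")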
3) Visual inspection of the yarn rods whose original centering coordinates have been calibrated and recorded: as shown in fig. 4, the vision system issues a command to detect a new yarn rod, and the number of the currently requested yarn rod is written into the vision system; whether the yarn rod number at the current position is the same as the one to be detected is judged in real time; when the positions differ, a command to continue the request is sent, and when they are the same, the system moves above the yarn rod requested by the vision system and returns the current yarn rod number to indicate the position. The vision system reads that the current yarn rod number equals the requested number, detects the yarn rod and records the detected coordinate result, completing the yarn rod positioning visual detection (a sketch of this handshake follows).
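A small illustrative sketch (hypothetical interface functions, not the patent's PLC data exchange) of the request/compare handshake described in step 3):

import time

def detect_rod(rod_no, write_request, read_current_rod_no, run_detection, record):
    write_request(rod_no)                          # write the currently requested yarn rod number
    while read_current_rod_no() != rod_no:         # numbers differ: keep sending the request
        write_request(rod_no)
        time.sleep(0.05)
    result = run_detection()                       # numbers match: photograph and compute the coordinate
    record(rod_no, result)                         # record the detection coordinate result
    return result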
In each of the above steps, the yarn cages and yarn rods in the dyeing and finishing workshop are produced in a standardized way: each yarn cage carries 120 yarn rods, numbered in sequence; the flat circle at the top of each yarn rod is 7 mm in diameter with a 5 mm threaded hole, and the rods are made of stainless steel with a reflective surface. A wavy ring on the disc at the bottom of each yarn rod helps fix the cheese; under illumination it produces diffuse reflection, which causes background interference and strong glare. By adjusting the brightness of the annular light source, the background-reflection interference is suppressed, the background is blurred, and the reflection from the rod top is enhanced.
In each of the above steps, the vision system is calibrated before it is used as a measuring tool. When the camera is reinstalled for any reason its position cannot be guaranteed to be exactly the same as before, and the top height of the yarn rods differs somewhat from cage to cage, so the pixel/distance scale needs to be calibrated before each use.
In conclusion, the invention uses the annular light source to control the ambient brightness, takes the image acquired in real time by a high-resolution industrial camera as the main information source for positioning and analyzing the yarn rods, and realizes on-line automatic detection of yarn rod positioning. The recognition result in the image is a pixel deviation on the X/Y axes, which is converted into an actual deviation during calibration; the same pixel deviation corresponds to different actual deviations at different installation heights, and the height of each camera and each yarn cage cannot be guaranteed to remain unchanged between installations. A fixed scale is therefore required to scale the image. A standard circle with a diameter of 50 mm is used as the scale; it is sleeved onto the top surface of the yarn rod so that it is flush with the rod top. The pixel/distance scale is calibrated by measuring the diameter (in pixels) of the standard circle in the camera image, and the data measured in the visual positioning calibration and detection procedures are converted with this scale (a short worked example follows).
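As a worked example with assumed numbers: if the 50 mm standard circle images with a diameter of 400 pixels, the scale is 0.125 mm per pixel, so an 8-pixel deviation on the X or Y axis corresponds to an actual offset of 1 mm.

scale_mm_per_px = 50.0 / 400.0      # standard-circle diameter / measured pixel diameter = 0.125
offset_mm = 8 * scale_mm_per_px     # an 8-pixel deviation corresponds to 1.0 mm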
Since the industrial camera and the robot each have their own coordinate system, there are two sets of coordinate systems with a deviation between them. Before the visual positioning calibration, the two coordinate systems need to be calibrated against each other. Calibrating the origin of a coordinate system directly is complex, and the position of the origin does not actually need to be known during the experiments. Therefore, during positioning calibration, the deviation value of the current yarn rod is computed and stored in the controller, and it is evaluated against the robot coordinate system (i.e. the XYZ-axis coordinate system) so that the current coordinate becomes (0, 0); this coordinate is taken as the reference value, i.e. a new origin. During visual detection, this reference coordinate is recalled, and the detection output is the deviation of the coordinate of the circle center at the top of the yarn rod.
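The reference-value idea above amounts to zeroing the stored calibration coordinate and reporting all later detections relative to it; a tiny sketch with hypothetical names:

references = {}                                    # yarn rod number -> calibrated (x, y) in mm

def store_reference(rod_no, measured_xy):
    references[rod_no] = measured_xy               # this position is treated as (0, 0), the new origin

def detection_deviation(rod_no, measured_xy):
    ref = references[rod_no]
    return measured_xy[0] - ref[0], measured_xy[1] - ref[1]   # deviation of the rod-top circle center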
The above embodiments are only for illustrating the present invention, and the structure, size, arrangement position and steps of each component can be changed, and on the basis of the technical scheme of the present invention, the improvement and equivalent transformation of the individual components and steps according to the principle of the present invention should not be excluded from the protection scope of the present invention.

Claims (9)

1. A cheese yarn rod positioning and detecting robot, characterized in that: it comprises a gantry structure, a motion system, a vision system and a control system; the motion system and the vision system are both arranged on the gantry structure and are connected with the control system; the vision system transmits the acquired image information to the control system, and the control system controls the motion system to act according to the received image information;
The motion system comprises an X-axis support, a Y-axis support, a Z-axis support, an A-axis support and servo motors respectively arranged on the axis supports, each servo motor is connected with the control system, and the control system controls each servo motor to act so as to drive the corresponding axis support to move;
the X-axis support, the Y-axis support, the Z-axis support and the corresponding servo motors forming a three-coordinate system are in the form of linear guide rails; the vision system is arranged on the side face of the bottom of the Z-axis support and comprises an industrial camera and an annular light source, the industrial camera being arranged on the side edge of the lower end of the Z-axis support and the annular light source being sleeved on the lower part of the industrial camera; a servo motor on the Z-axis support drives the Z-axis support to move up and down and simultaneously drives the industrial camera and the annular light source to move up and down with it; the X-axis support and the Y-axis support control the vision system to move horizontally through their respective servo motors and drive the vision system to the coordinate position of a designated yarn rod; a yarn cage rotating station is arranged on the ground, the A-axis support is arranged at the yarn cage rotating station and uses a rotating support structure to bear a standard yarn cage, and the borne yarn cage is driven to move under the control of the servo motor of the A axis;
The control system comprises a controller, an X-axis driver, a Y-axis driver, a Z-axis driver, an A-axis driver and a power supply, wherein the controller is powered by the power supply; the controller receives the image information transmitted by the vision system through the industrial switch, processes the received image information and converts the processed image information into a control instruction, the control instruction is transmitted to the X-axis driver, the Y-axis driver, the Z-axis driver and the A-axis driver through the data bus respectively, and each driver drives the servo motor of the corresponding axis to act.
2. The robot of claim 1, wherein: the end of the lower end of the Z-axis support is provided with an automatic cheese loading and unloading gripper mechanism.
3. The robot of claim 1, wherein: the industrial camera adopts a Daheng Imaging Mercury series GigE digital camera; the distance between the yarn rod head and the lens of the industrial camera is 200-300 mm; the industrial camera is installed parallel to the Z-axis support, with an angle error of no more than 1 degree.
4. The robot of claim 1, wherein: the controller adopts a Siemens S7-1217 controller, and the data bus adopts a Profinet bus.
5. A cheese rod positioning and detecting method based on the robot as claimed in any one of claims 1 to 4, characterized by comprising the following steps:
1) After each yarn rod is subjected to image acquisition by an industrial camera, transmitting image information to a controller, and carrying out image processing by the controller;
2) the controller carries out image processing according to the received image information, carries out visual positioning calibration on each yarn rod, and carries out calibration and recording on centering reference coordinates of each yarn rod of the industrial camera;
3) visual detection of the yarn rods whose original centering coordinates have been calibrated and recorded: the vision system issues a command to detect a new yarn rod, and the number of the currently requested yarn rod is written into the vision system; whether the yarn rod number at the current position is the same as the requested one is judged in real time; when the positions differ, a command to continue the request is sent, and when they are the same, the system moves above the yarn rod requested by the vision system and returns the current yarn rod number to indicate the position; and the vision system reads that the current yarn rod number equals the requested number, detects the yarn rod and records the detected coordinate result, completing the yarn rod positioning visual detection.
6. The method of claim 5, wherein: in the step 1), the image processing includes the following steps:
1.1) calibrating an industrial camera, aligning the center of the industrial camera through a standard circle, and calculating the corresponding relation between a pixel value and the diameter of the standard circle;
1.2) acquiring an original acquisition image containing an image of a head of a yarn rod to be detected;
1.3) carrying out automatic threshold segmentation on the original collected image, and extracting a connected domain to obtain a connected domain image of the head of the yarn rod to be detected;
1.4) extracting the centroid of the connected domain of the yarn rod head to be detected; the centroid obtained is the center position coordinate of the yarn rod head in the image coordinate system;
1.5) converting a coordinate system, converting an image coordinate system into coordinates under an actual coordinate system, and comparing the coordinates with original position coordinates to obtain an actual offset size;
1.6) carrying out defect estimation, carrying out characteristic analysis on the obtained connected domain image, calculating to obtain area and shape characteristics, comparing the area and shape characteristics with the area and shape characteristics of a standard yarn rod head, further judging whether the yarn rod head has defects or not, evaluating the defect degree through the area ratio, and giving the reliability of the yarn rod coordinate.
7. The method of claim 6, wherein: in the step 1.5), the coordinate system conversion comprises the following steps:
1.5.1) adopting a standard circle with known size to be arranged at the top end of the yarn rod to ensure that the standard circle and the top end of the yarn rod are in the same horizontal plane;
1.5.2) acquiring and obtaining a standard circle image by using a camera calibration method, and performing Hough transformation to obtain the size of a standard circle radius pixel;
1.5.3) calculating the ratio of the obtained standard circle radius pixel size to the actual size to obtain a calibration scale;
1.5.4) the coordinate of the center position in the coordinate system of the yarn bar head image is multiplied by the scale to obtain the coordinate of the final actual position.
8. The method of claim 6, wherein: in step 1.6), the defect evaluation includes the following steps:
1.6.1) carrying out area screening on the obtained connected domain image, and extracting the connected domain within the preset value range of the yarn rod;
1.6.2) analyzing the distance between the centroid position of each connected domain and the center of the image, and judging whether there is a connected domain within 20 pixels of the image center whose area is within the range;
1.6.3) if the deviation from the preset value is within 50 pixels, judging the roundness and rectangularity to analyze the defect degree;
1.6.4) if the deviation is not within 50 pixels, judging whether the center positions of the second and third connected domains differ by less than 10 pixels, and if so, performing dilation to connect the two connected domains and then repeating the processing of step 1.6.2).
9. The method of any of claims 5 to 8, wherein: in the step 2), the positioning method comprises the following steps:
2.1) sending a positioning request signal to a vision system, updating an identification serial number and carrying out vision identification;
2.2) the industrial camera moves to a preset original standard alignment position after receiving the positioning request signal, performs photographing identification, feeds back an identified coordinate result to the controller, refreshes the identification serial number and the positioning data together, and the controller compares the fed back identification result with the requested serial number;
2.3) correcting the centering coordinate according to the comparison result, judging whether the deviation value between the position coordinate of the current industrial camera and the identified yarn rod coordinate is within a preset allowable range, if so, moving to the next yarn rod coordinate to be positioned to continue positioning, otherwise, moving to the correction position by the industrial camera, continuing calculating the deviation, and circularly changing the reference coordinate until the deviation is within the set allowable range;
when the feedback identification result is consistent with the requested serial number, the vision system is considered to have finished positioning work and data processing, and the X/Y axis deviation in the data area is true and effective.
CN201910767593.7A 2019-08-20 2019-08-20 Cheese yarn rod positioning and detecting robot and method Active CN110514664B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910767593.7A CN110514664B (en) 2019-08-20 2019-08-20 Cheese yarn rod positioning and detecting robot and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910767593.7A CN110514664B (en) 2019-08-20 2019-08-20 Cheese yarn rod positioning and detecting robot and method

Publications (2)

Publication Number Publication Date
CN110514664A CN110514664A (en) 2019-11-29
CN110514664B 2022-08-12

Family

ID=68626657

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910767593.7A Active CN110514664B (en) 2019-08-20 2019-08-20 Cheese yarn rod positioning and detecting robot and method

Country Status (1)

Country Link
CN (1) CN110514664B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110961289B (en) * 2019-12-09 2021-06-29 国网智能科技股份有限公司 Transformer substation insulator anti-pollution flashover coating spraying tool and spraying method
CN110992358B (en) * 2019-12-18 2023-10-20 北京机科国创轻量化科学研究院有限公司 Method and device for positioning yarn rod of yarn cage, storage medium and processor

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103869186A (en) * 2014-03-05 2014-06-18 随州泰华电子科技有限公司 Tester for quartz crystal wafer
WO2015120734A1 (en) * 2014-02-17 2015-08-20 华南理工大学 Special testing device and method for correcting welding track based on machine vision
CN206187342U (en) * 2016-11-24 2017-05-24 西南科技大学 Novel four rotor motor fixing structure
CN207123496U (en) * 2017-06-22 2018-03-20 惠州市德赛电池有限公司 A kind of battery plus-negative plate identification equipment
CN108476282A (en) * 2016-01-22 2018-08-31 富士胶片株式会社 Radiography assisting device and photography householder method
CN108802040A (en) * 2017-05-04 2018-11-13 南京市特种设备安全监督检验研究院 A kind of unmanned plane device and detection method for crane surface defects detection
US10207421B1 (en) * 2016-09-26 2019-02-19 Wein Holding LLC Automated multi-headed saw and method for lumber

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6753965B2 (en) * 2001-01-09 2004-06-22 The University Of Hong Kong Defect detection system for quality assurance using automated visual inspection
ITFI20060211A1 (en) * 2006-08-24 2008-02-25 Tecnorama Srl DEVICE AND PROCEDURE TO PERFORM OPTICAL READINGS ON TEXTILE MATERIALS UNDER DYEING.
CN102706274B (en) * 2012-04-25 2014-08-06 复旦大学 System for accurately positioning mechanical part by machine vision in industrially-structured scene
US9285296B2 (en) * 2013-01-02 2016-03-15 The Boeing Company Systems and methods for stand-off inspection of aircraft structures
CN103197599A (en) * 2013-03-25 2013-07-10 东华大学 System and method for numerical control (NC) workbench error self correction based on machine vision
CN103406905B (en) * 2013-08-20 2015-07-08 西北工业大学 Robot system with visual servo and detection functions
CN104609175A (en) * 2014-12-05 2015-05-13 机械科学研究总院先进制造技术研究中心 Gripping device and method for high-level sarong to load and unload yarn rolls
CN105180905B (en) * 2015-07-23 2018-03-02 陕西科技大学 A kind of double camera vision positioning system and method
CN107014818B (en) * 2017-02-28 2020-08-04 深圳市维图视技术有限公司 Laser etching defect visual detection system and method
CN107102009A (en) * 2017-05-04 2017-08-29 武汉理工大学 A kind of method of the cylinder spool quality testing based on machine vision
CN107742289A (en) * 2017-10-15 2018-02-27 哈尔滨理工大学 One kind is based on machine vision revolving body workpieces detection method
CN207806014U (en) * 2017-11-17 2018-09-04 青岛杰瑞工控技术有限公司 Spool sorting system based on machine vision
CN209009859U (en) * 2018-05-23 2019-06-21 广州赫伽力智能科技有限公司 A kind of outer diameter crawl band u-turn yarn grabbing fixture
CN109059810B (en) * 2018-07-24 2020-05-26 天津大学 Method and device for detecting surface landform of fixed abrasive grinding tool
CN109085179A (en) * 2018-09-17 2018-12-25 周口师范学院 A kind of board surface flaw detection device and detection method
CN109108607A (en) * 2018-10-24 2019-01-01 泰安康平纳机械有限公司 Yarn bar positioning device
CN109507192B (en) * 2018-11-02 2021-07-02 江苏理工学院 Magnetic core surface defect detection method based on machine vision
CN109520420B (en) * 2018-12-21 2020-10-09 中国航空工业集团公司北京航空精密机械研究所 Method for determining space coordinates of rotary center of rotary table

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015120734A1 (en) * 2014-02-17 2015-08-20 华南理工大学 Special testing device and method for correcting welding track based on machine vision
CN103869186A (en) * 2014-03-05 2014-06-18 随州泰华电子科技有限公司 Tester for quartz crystal wafer
CN108476282A (en) * 2016-01-22 2018-08-31 富士胶片株式会社 Radiography assisting device and photography householder method
US10207421B1 (en) * 2016-09-26 2019-02-19 Wein Holding LLC Automated multi-headed saw and method for lumber
CN206187342U (en) * 2016-11-24 2017-05-24 西南科技大学 Novel four rotor motor fixing structure
CN108802040A (en) * 2017-05-04 2018-11-13 南京市特种设备安全监督检验研究院 A kind of unmanned plane device and detection method for crane surface defects detection
CN207123496U (en) * 2017-06-22 2018-03-20 惠州市德赛电池有限公司 A kind of battery plus-negative plate identification equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Real-time detection system for automobile oil pump support rod dimensions based on machine vision; Chen Jun; China Master's Theses Full-text Database, Engineering Science and Technology II; 2017-08-15 (No. 8); C035-49 *

Also Published As

Publication number Publication date
CN110514664A (en) 2019-11-29

Similar Documents

Publication Publication Date Title
CN110501342B (en) Cheese yarn rod positioning visual detection method
CN210155545U (en) Be used for automatic on-line measuring equipment of switch manufacturing process
CN107764839B (en) Machine vision-based steel wire rope surface defect online detection method and device
CN102141374A (en) Image type spinneret plate automatic detector
CN110514664B (en) Cheese yarn rod positioning and detecting robot and method
CN106584273B (en) A kind of online vision detection system for robot polishing
CN111733496B (en) Yarn empty bobbin detection device and method
CN108489394A (en) A kind of large-scale sheet metal works almost T-stable automatic detection device and method
CN109840900A (en) A kind of line detection system for failure and detection method applied to intelligence manufacture workshop
CN109693140B (en) Intelligent flexible production line and working method thereof
CN113847881A (en) Free-form surface profile tolerance detection method based on machine vision
CN208042989U (en) A kind of large-scale sheet metal works almost T-stable automatic detection device
CN107056089A (en) A kind of on-line control system of optical fiber coating die holder
CN206405908U (en) A kind of online vision detection system polished for robot
CN113983965A (en) Flat cable quality detection device and detection method
CN213543477U (en) Power battery pole piece coating uniformity online metering test system
CN113334380A (en) Robot vision calibration method, control system and device based on binocular vision
CN114392940B (en) Pin detection method and device for special-shaped component
CN208012837U (en) A kind of homogeneity test device of heavy caliber uniform source of light
CN212459435U (en) Textile fabric surface flaw detection device
TWI693374B (en) Non-contact measurement system for measuring object contour
CN109825944B (en) Online fabric weaving defect detection method based on line laser
CN114113116A (en) Accurate detection process method for micro-defects on surface of large-diameter element
CN113715935A (en) Automatic assembling system and automatic assembling method for automobile windshield
CN217008240U (en) Full-automatic cylinder work piece character recognition device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant