CN111005163A - Automatic leather sewing method, device, equipment and computer readable storage medium - Google Patents

Automatic leather sewing method, device, equipment and computer readable storage medium

Info

Publication number
CN111005163A
Authority
CN
China
Prior art keywords
leather
sewing
image
robot
edge
Prior art date
Legal status
Granted
Application number
CN201911405470.5A
Other languages
Chinese (zh)
Other versions
CN111005163B (en)
Inventor
李文智
刘培超
郎需林
刘主福
Current Assignee
Shenzhen Yuejiang Technology Co Ltd
Original Assignee
Shenzhen Yuejiang Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Yuejiang Technology Co Ltd filed Critical Shenzhen Yuejiang Technology Co Ltd
Priority to CN201911405470.5A priority Critical patent/CN111005163B/en
Publication of CN111005163A publication Critical patent/CN111005163A/en
Application granted granted Critical
Publication of CN111005163B publication Critical patent/CN111005163B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • D TEXTILES; PAPER
    • D05 SEWING; EMBROIDERING; TUFTING
    • D05B SEWING
    • D05B 15/00 Machines for sewing leather goods
    • D05B 19/00 Programme-controlled sewing machines
    • D05B 19/02 Sewing machines having electronic memory or microprocessor control unit
    • D05B 19/04 Sewing machines having electronic memory or microprocessor control unit characterised by memory aspects
    • D05B 19/08 Arrangements for inputting stitch or pattern data to memory; Editing stitch or pattern data
    • D05B 19/10 Arrangements for selecting combinations of stitch or pattern data from memory; Handling data in order to control stitch format, e.g. size, direction, mirror image
    • D05B 19/12 Sewing machines having electronic memory or microprocessor control unit characterised by control of operation of machine
    • D05B 19/16 Control of workpiece movement, e.g. modulation of travel of feed dog

Landscapes

  • Engineering & Computer Science (AREA)
  • Textile Engineering (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Sewing Machines And Sewing (AREA)

Abstract

The invention discloses an automatic leather sewing method comprising the following steps: acquiring an image of the leather to be sewn; calculating the grabbing position coordinates of a robot from the leather image; performing image processing on the leather image to obtain sewing track data of the leather; controlling the robot to transfer the leather to the position of a sewing machine according to the grabbing position coordinates; and controlling the sewing machine to sew the leather automatically according to the sewing track data. The method enables automatic sewing of leather; compared with manual sewing it offers higher sewing efficiency and a better sewing effect, and it can greatly reduce labor cost. The invention also discloses an automatic leather sewing device, equipment and a computer readable storage medium.

Description

Automatic leather sewing method, device, equipment and computer readable storage medium
Technical Field
The invention relates to the field of industrial sewing machines, in particular to a method, a device and equipment for automatically sewing leather and a computer readable storage medium.
Background
With the continuous improvement of living standards, leather products such as automobile floor mats, automobile seat cushions and sofa cushions appear more and more in people's daily life.
Currently, most factories sew leather materials by hand. Manual sewing requires workers to be trained and then to sew along a sewing track marked on the leather in advance. However, manual sewing suffers from low sewing efficiency. Moreover, sewing technique varies from worker to worker, so the consistency of the sewing result is poor.
Nowadays, with the continuous development of industrial automation, automatic sewing technology that replaces manual work with automated equipment has appeared in the sewing industry. However, the existing automatic sewing technology mainly targets garment products with simple manufacturing processes and single product types, and is rarely applied to the sewing of leather materials, which have an equally large market demand.
Disclosure of Invention
The invention mainly aims to provide an automatic leather sewing method, and aims to solve the technical problems of low sewing efficiency and inconsistent sewing effect of the conventional manual leather sewing mode.
In order to solve the technical problem, the invention provides an automatic leather sewing method, which comprises the following steps: acquiring a leather image to be sewn; calculating the current coordinate position of the leather according to the leather image; performing image processing on the leather image to obtain sewing track data of the leather; controlling a robot to transfer the leather to the position of a sewing machine according to the current coordinate position of the leather; and controlling the sewing machine to automatically sew the leather according to the sewing track data.
Preferably, calculating the current coordinate position of the leather according to the leather image comprises: acquiring the pixel coordinates of the leather in the leather image; and, according to the pixel coordinates and a preset coordinate conversion formula:
x_i = m·(u·cos θ - v·sin θ) + d_x
y_i = m·(u·sin θ + v·cos θ) + d_y
calculating the grabbing position coordinates of the robot; wherein (u, v) are the pixel coordinates of the leather, (x_i, y_i) are the grabbing position coordinates of the robot, d_x and d_y are translation amounts, m is a scale factor, and θ is the included angle between the camera coordinate system and the robot coordinate system.
Preferably, performing image processing on the leather image to obtain the sewing track data of the leather comprises: extracting the contour edge of the leather, and converting the contour edge into an edge image; performing expansion processing on the edge image, and then performing retraction processing on the expanded edge image to obtain an inner edge of the expanded edge image; and acquiring the sewing track data of the leather according to the inner edge of the edge image.
Preferably, the acquiring the sewing track data of the leather material according to the inner edge of the edge image comprises: sequentially and equidistantly extracting points on the inner edge, and connecting adjacent two points to form a line segment; calculating an included angle between the two adjacent line segments, and deleting points between the two adjacent line segments with the included angle smaller than a preset angle; acquiring corner point coordinates of the inner edge, and calculating the distance between the remaining points on the inner edge and the corner points; inquiring the serial number of a point closest to the corner point, and extracting a point before the point according to the serial number; and judging the position of the corner point according to a vector included angle between a first vector formed by the previous point and the corner point and a second vector formed by the corner point and the point closest to the corner point.
Preferably, the image processing of the leather material image to obtain the sewing track data of the leather material further includes: and respectively calculating the included angle between two adjacent line segments to obtain the sewing posture of the sewing track.
The invention further provides an automatic leather sewing device, which comprises: the image acquisition module is used for acquiring a leather image to be sewn; the coordinate acquisition module is used for calculating the current coordinate position of the leather according to the leather image; the data acquisition module is used for carrying out image processing on the leather image so as to acquire sewing track data of the leather; the first control module is used for controlling the robot to transfer the leather to the position of the sewing machine according to the current coordinate position of the leather; and the second control module is used for controlling the sewing machine to automatically sew the leather according to the sewing track data.
Preferably, the coordinate acquiring module includes: a pixel coordinate acquisition unit for acquiring the pixel coordinates of the leather in the leather image; and a robot coordinate acquisition unit for calculating the grabbing position coordinates of the robot according to the pixel coordinates and a preset coordinate conversion formula:
x_i = m·(u·cos θ - v·sin θ) + d_x
y_i = m·(u·sin θ + v·cos θ) + d_y
wherein (u, v) are the pixel coordinates of the leather, (x_i, y_i) are the grabbing position coordinates of the robot, d_x and d_y are translation amounts, m is a scale factor, and θ is the included angle between the camera coordinate system and the robot coordinate system.
Preferably, the data acquisition module includes: the edge image acquisition unit is used for extracting the contour edge of the leather and converting the contour edge into an edge image; an inner edge obtaining unit, configured to perform expansion processing on the edge image, and then perform retraction processing on the expanded edge image to obtain an inner edge of the expanded edge image; and the sewing track acquisition unit is used for acquiring the sewing track data of the leather according to the inner edge of the edge image.
The invention also provides automatic leather sewing equipment, which comprises a memory and a processor, wherein the memory is used for storing a computer program, and the processor is used for implementing the steps of the automatic leather sewing method when executing the computer program.
The invention also provides a computer readable storage medium, wherein a computer program is stored on the computer readable storage medium, and when the computer program is executed by a processor, the steps of the automatic leather sewing method are realized.
The embodiment of the invention has the beneficial effects that: the automatic leather sewing method based on machine vision replaces the existing manual sewing method, has the characteristics of high sewing efficiency and excellent sewing effect, and can greatly reduce the labor cost so as to reduce the production cost of leather products. In addition, the leather is automatically sewn through automatic equipment, so that the sewn leather has high consistency.
Drawings
FIG. 1 is a flow chart of a first embodiment of an automatic sewing method for leather according to the present invention;
FIG. 2 is a flow chart of a second embodiment of the automatic sewing method for leather in accordance with the present invention;
FIG. 3 is a flow chart of a third embodiment of the automatic sewing method for leather according to the present invention;
FIG. 4 is an image of the leather processed by the automatic leather sewing method of the present invention;
FIG. 5 is a flow chart of a fourth embodiment of the automatic leather sewing method according to the present invention;
FIG. 6 is a schematic diagram of the corner point position determination of the automatic leather sewing method of the present invention;
FIG. 7 is an overall flow chart of the automatic sewing method for leather according to the present invention;
FIG. 8 is a schematic structural view of the automatic leather sewing device of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be illustrative of the present invention and should not be construed as limiting the present invention, and all other embodiments that can be obtained by one skilled in the art based on the embodiments of the present invention without inventive efforts shall fall within the scope of protection of the present invention.
The invention provides an automatic leather sewing method, which comprises the following steps of:
step S10, obtaining a leather material image to be sewn;
In this embodiment, an industrial camera is used to photograph the leather to be sewn placed directly below it, and the captured leather image is sent to the industrial computer, which performs image processing and data acquisition on the leather image according to the built-in automatic leather sewing program. In addition, to obtain a clearer leather image, an LED light source that shines vertically downward is arranged below the industrial camera to provide supplementary lighting for the leather to be sewn.
Step S20, calculating the grabbing position coordinates of the robot according to the leather images;
in this embodiment, before the leather material to be sewn is transferred to the position of the sewing machine, the position of the leather material needs to be acquired first, so that the robot can move to the position of the leather material and grab the leather material to the position of the sewing machine, and the sewing machine can perform automatic sewing on the leather material. Before the leather is automatically sewn, the industrial camera and the robot are calibrated to obtain calibration parameters, and the parameters and the pixel coordinates are called to calculate in each detection, so that the coordinates of the grabbing position of the robot can be obtained.
Step S30, processing the leather image to obtain the sewing track data of the leather;
In this embodiment, mean filtering, binarization and other processing are first performed on the leather image to obtain the object features in the image; next, interfering parts of the image are removed by threshold analysis and Blob processing (analysis of connected domains of identical pixels in the image, such connected domains being called blobs), so that only the leather features are kept; then edge processing is performed on the retained leather features, and edge burrs are eliminated with an edge smoothing method to obtain a smooth edge; finally, the sewing track data of the leather is extracted from the contour edge.
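As a rough illustration of this preprocessing chain (mean filtering, binarization, Blob analysis), the following Python/OpenCV sketch keeps only sufficiently large connected components; Otsu thresholding, the kernel size and the area threshold are assumptions for illustration, since the patent does not specify them.

```python
import cv2
import numpy as np

def extract_leather_mask(gray: np.ndarray, min_area: int = 5000) -> np.ndarray:
    """Return a binary mask that keeps only large blobs (assumed to be the leather)."""
    blurred = cv2.blur(gray, (5, 5))                                  # mean filtering
    _, binary = cv2.threshold(blurred, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)    # binarization
    # Blob analysis: keep connected components whose area exceeds the threshold
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
    mask = np.zeros_like(binary)
    for i in range(1, n):                                             # label 0 is background
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            mask[labels == i] = 255
    return mask
```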
In addition, the actual sewing track does not follow the real edge of the leather; it follows a sewing edge formed by retracting the real edge inward by a certain distance, so the processed edge image must be retracted. The retraction method proposed in this embodiment differs from the conventional one: the edge is first expanded, and the inner edge of the expanded contour edge is then taken as the retracted sewing edge.
Step S40, controlling the robot to transfer the leather to the position of the sewing machine according to the grabbing position coordinates of the robot;
in this embodiment, after the grabbing position coordinates of the robot are obtained, according to the grabbing position coordinates, the robot is controlled to transfer the leather to be sewn to the position of the sewing machine, so that the sewing machine can automatically sew the leather. The robot comprises a mechanical arm capable of moving along multiple axes and a grabbing clamp arranged at the tail end of the mechanical arm, the mechanical arm drives the grabbing clamp to move to the position of the leather material to be sewn, and the leather material is grabbed to a station where the sewing machine is located by controlling an air valve on the grabbing clamp.
And step S50, controlling the sewing machine to automatically sew the leather according to the sewing track data.
In this embodiment, after the leather to be sewn is transferred to the station where the sewing machine is located, the sewing machine is controlled to sew the leather according to the calculated sewing track data. In addition, sewing parameters of the leather to be sewn, including but not limited to stitch pitch and sewing speed, can be input through the industrial control computer.
In a preferred embodiment, referring to fig. 2, step S20 includes:
step S21, acquiring the pixel coordinates of the leather;
step S22, according to the pixel coordinates and the preset coordinate transformation formula:
x_i = m·(u·cos θ - v·sin θ) + d_x
y_i = m·(u·sin θ + v·cos θ) + d_y
calculating the coordinates of the grabbing position of the robot;
wherein (u, v) are the pixel coordinates of the leather, (x_i, y_i) are the coordinates of the grabbing position of the robot, d_x and d_y are translation amounts, m is a scale factor, and θ is the included angle between the camera coordinate system and the robot coordinate system.
In this embodiment, after the pixel coordinates of the leather are obtained, the pixel coordinates are substituted into a preset coordinate conversion formula, so that the grabbing position coordinates of the robot can be calculated, and the robot transfers the leather to be sewn to the station where the sewing machine is located according to the grabbing position coordinates.
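A minimal sketch of this conversion, assuming the scaled rotation-plus-translation form given above (itself a reconstruction from the parameters named in the text); the calibration values in the usage comment are illustrative only.

```python
import math

def pixel_to_robot(u: float, v: float,
                   m: float, theta: float,
                   d_x: float, d_y: float) -> tuple[float, float]:
    """Convert a pixel coordinate (u, v) into a robot grabbing coordinate (x_i, y_i)."""
    x_i = m * (u * math.cos(theta) - v * math.sin(theta)) + d_x
    y_i = m * (u * math.sin(theta) + v * math.cos(theta)) + d_y
    return x_i, y_i

# Example (hypothetical calibration): m = 0.42 mm/px, theta = 0.031 rad, offsets in mm
# x, y = pixel_to_robot(812, 603, 0.42, 0.031, 125.0, -340.0)
```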
In another preferred embodiment, referring to fig. 3, step S30 includes:
step S31, extracting the contour edge of the leather, and converting the contour edge into an edge image;
step S32, performing expansion processing on the edge image, and performing retraction processing on the expanded edge image to obtain the inner edge of the expanded edge image;
and step S33, acquiring sewing track data of the leather according to the inner edge of the edge image.
In this embodiment, the obtained leather image is first subjected to edge processing to extract the contour edge of the leather. The edge image is then expanded: the edge image is binarized to obtain a binary image, each pixel of the image is scanned with a 3x3 structuring element, and an OR operation is performed between the structuring element and the binary pixels it covers; if all of the covered pixels are 0, the output pixel is 0, otherwise it is 1. Each pass of this operation grows the binary edge skeleton by one ring of pixels, and if one ring is not enough, the expansion is repeated until the required expansion width is reached. The expanded edge image is binarized again, and the inner edge of the expanded edge contour is extracted with an edge detection method; the extracted edge is the result shown in fig. 4.
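A minimal sketch of this expand-then-take-the-inner-edge step, assuming OpenCV; the offset (number of dilation passes) and the smallest-contour heuristic for picking the inner boundary are illustrative assumptions, not values from the patent.

```python
import cv2
import numpy as np

def inner_sewing_edge(edge_image: np.ndarray, offset_px: int = 10) -> np.ndarray:
    """Dilate a closed contour-edge image and return an image of its inner edge."""
    _, binary = cv2.threshold(edge_image, 127, 255, cv2.THRESH_BINARY)
    kernel = np.ones((3, 3), np.uint8)                       # 3x3 structuring element
    dilated = cv2.dilate(binary, kernel, iterations=offset_px)  # widen the edge band
    # The dilated edge forms a closed band; take its innermost boundary as the sewing edge.
    contours, _ = cv2.findContours(dilated, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_NONE)
    inner = min(contours, key=cv2.contourArea)               # smallest enclosed area = inner boundary
    canvas = np.zeros_like(binary)
    cv2.drawContours(canvas, [inner], -1, 255, 1)
    return canvas
```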
In another preferred embodiment, referring to fig. 5 and 6, the step S33 includes:
step S331, sequentially extracting points on the inner edge at equal intervals, and connecting adjacent two points to form a line segment;
in this embodiment, points at equal intervals are extracted from the inner edge in a certain order, a serial number is assigned to each point, and then two adjacent points are connected to form a line segment.
Step S332, calculating an included angle between two adjacent line segments, and deleting points between the two adjacent line segments with the included angle smaller than a preset angle;
in this embodiment, an included angle between two adjacent line segments is calculated, and if the included angle is smaller than a preset angle, for example, 1 ° or 2 °, it indicates that the two line segments form a straight line, and all points between the two line segments are deleted, and only two end points are reserved.
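A minimal NumPy sketch of the equidistant sampling and collinear-point removal described in steps S331 and S332; the sampling step and the angle threshold are illustrative values.

```python
import numpy as np

def sample_and_simplify(contour: np.ndarray, step: int = 10,
                        angle_thresh_deg: float = 2.0) -> np.ndarray:
    """contour: (N, 2) array of ordered inner-edge points; returns the simplified points."""
    points = contour[::step]                          # equidistant sampling along the edge
    keep = [points[0]]
    for i in range(1, len(points) - 1):
        a = points[i] - keep[-1]                      # previous segment
        b = points[i + 1] - points[i]                 # next segment
        cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
        turn_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
        if turn_deg > angle_thresh_deg:               # direction changes: keep the point
            keep.append(points[i])                    # otherwise the middle point is dropped
    keep.append(points[-1])
    return np.array(keep)
```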
Step S333, acquiring corner point coordinates of the inner edge, and calculating the distance between the remaining points on the inner edge and the corner points;
In this embodiment, the corner points of the inner edge are obtained by corner detection: the cornerHarris function in OpenCV is used to run the Harris corner detection operator on the image. After the corner coordinates are acquired, the distances between the remaining points and the corner points are calculated.
Step S334, inquiring the serial number of the point nearest to the corner point, and extracting a point before the point according to the serial number;
In this embodiment, after the distances between the remaining points and the corner point are calculated, the serial number of the point closest to the corner point is queried, and the point before that point is extracted according to the serial number. Let the corner point be H, the point closest to the corner point be B, and the point before it be A.
Step S335, the position of the corner point is judged according to the vector included angle between the first vector formed by the previous point and the corner point and the second vector formed by the corner point and the point nearest to the corner point.
In this embodiment, referring to fig. 6, the first vector formed by the previous point and the corner point is the vector AH, and the second vector formed by the corner point and the point closest to it is the vector HB. The included angle between the vector AH and the vector HB is calculated: if it is close to 180°, the corner point H lies after the point B; otherwise, the corner point H lies between the point A and the point B.
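A minimal sketch combining steps S333 to S335, assuming OpenCV's cornerHarris; the Harris parameters, the response threshold and the 150° cutoff standing in for "close to 180°" are illustrative assumptions.

```python
import cv2
import numpy as np

def locate_corners(edge_img: np.ndarray, track_pts: np.ndarray):
    """edge_img: 8-bit single-channel inner-edge image; track_pts: (N, 2) ordered points."""
    response = cv2.cornerHarris(np.float32(edge_img), 2, 3, 0.04)   # blockSize, ksize, k
    ys, xs = np.where(response > 0.01 * response.max())             # threshold the response
    corners = np.stack([xs, ys], axis=1)                            # corner coordinates (x, y)
    for h in corners:
        dists = np.linalg.norm(track_pts - h, axis=1)
        b_idx = int(np.argmin(dists))             # B: trajectory point nearest to corner H
        a = track_pts[b_idx - 1]                  # A: the point before B (wraps at index 0)
        ah, hb = h - a, track_pts[b_idx] - h      # first vector AH, second vector HB
        cos_a = np.dot(ah, hb) / (np.linalg.norm(ah) * np.linalg.norm(hb) + 1e-9)
        angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
        # Angle near 180 deg: H lies after B; otherwise H lies between A and B.
        yield h, b_idx, ("after B" if angle > 150 else "between A and B")
```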
In another preferred embodiment, the step S33 further includes:
step S336, calculating an included angle between two adjacent line segments respectively to obtain a sewing posture of the sewing track.
In this embodiment, since the robot sews according to its external control point (ECP point) during the sewing process, a sewing posture must be added to the sewing trajectory, and the robot sewing trajectory data is formed from the sewing trajectory and the sewing posture. Specifically, each line segment of the sewing track is expressed as a directional vector, and the included angle between two adjacent vectors is the sewing posture at their connection point. Assuming the two vectors are a and b, the included angle is obtained from cos θ = (a·b)/(|a|·|b|).
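A minimal NumPy sketch of this posture calculation using the dot-product formula above; the (N, 2) trajectory array layout is an assumption.

```python
import numpy as np

def sewing_postures(track_pts: np.ndarray) -> np.ndarray:
    """track_pts: (N, 2) ordered trajectory points; returns the N-2 joint angles in degrees."""
    segs = np.diff(track_pts, axis=0)                 # directional segment vectors
    a, b = segs[:-1], segs[1:]                        # adjacent vector pairs
    cos_theta = np.sum(a * b, axis=1) / (
        np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1) + 1e-9)
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
```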
The automatic leather sewing system corresponding to the automatic leather sewing method comprises a color industrial camera, two white bar-shaped LED light sources, a PC, an industrial robot, an industrial sewing machine and a visual image analysis and processing system. The industrial camera is fixed on a camera mounting bracket, the white bar LED light sources are placed below the camera and shine vertically downward, the leather is placed directly under the camera, and the automatic leather sewing system software is installed on the PC. The work flow of the entire system is shown in fig. 7:
1. The automatic leather sewing system software is started and a communication connection between the system and the robot is established automatically; once the connection is established, the vision system receives a success signal from the robot and the robot starts to initialize.
2. After initialization is completed, the robot sends a photographing signal to the automatic leather sewing system, and the system controls the camera to photograph and collect pictures.
3. The automatic leather sewing system first performs template matching on the collected picture. If no match is found, the current piece is not the leather to be sewn: an alarm signal is sent, the robot moves the piece to an idle placing area, and after a new piece of leather is fed, the system is notified to photograph and detect again. If the piece is the leather to be sewn, the system first obtains the coordinate position of the target and then performs image processing.
4. The image processing first performs processing such as mean filtering and binarization to obtain the detected object features captured in the image.
5. After Blob processing, interfering parts of the image are removed by threshold analysis, and only the leather material features are retained.
6. And performing edge processing on the reserved features, and eliminating edge burrs by using an edge smoothing method to obtain a smooth edge. Converting the obtained edge into an image, and processing the edge:
(1) binarization is carried out on the edge image to obtain a binary image
(2) Each pixel of the image is scanned with a 3x3 structuring element, and an OR operation is performed between the structuring element and the binary pixels it covers; if all of the covered pixels are 0, the output pixel is 0, otherwise it is 1. Each pass grows the binary edge skeleton by one ring of pixels, and if one ring is not enough, the expansion is repeated until the required expansion width is reached.
(3) And performing binarization processing on the expanded edge image, and extracting to obtain an inner edge of the expanded edge contour by using an edge detection method.
7. Extracting sewing tracks of sewing edges:
(1) extracting points on edges at equal intervals
(2) And extracting coordinates of corner points of the contour by a method of searching the corner points.
(3) The angle between every two adjacent line segments is calculated, and the intermediate points of two adjacent segments whose included angle is smaller than a threshold are deleted, so that a straight portion of the contour keeps only its two end points.
(4) And calculating the distances between the corner points and all the rest points, and calculating to obtain the position of the point closest to the corner point.
(5) The serial number of the point closest to the corner point is queried; assuming that point is B, the point before it is A and the corner point is H, the included angle between the vector AH and the vector HB is calculated, and the position at which the corner point should be inserted (between A and B, or after B) is judged from this vector angle.
8. Because the industrial robot sews according to the ECP point during sewing, not according to its end coordinate system, a sewing posture must be added to the sewing track. The track posture is calculated as follows: each line segment of the track is expressed as a directional vector, and the included angle between two adjacent vectors gives the posture at their connection point. Assuming the two vectors are a and b, the included angle is obtained from cos θ = (a·b)/(|a|·|b|).
9. The automatic leather sewing system converts the calculated target position coordinates and sewing track and sends them to the industrial robot by TCP/IP communication (a communication sketch follows this list).
10. After receiving the sewing data, the robot moves to the grabbing point, actuates the air valve on the grabbing clamp, transfers the leather to the station where the sewing machine is located, and the leather is sewn automatically along its sewing track. After sewing, the finished leather is placed in a loading area, and the sewing of one piece of leather is complete.
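As an illustration of step 9 above, the sketch below sends the grab coordinates and sewing trajectory over a TCP socket; the host address, port and JSON message layout are assumptions for illustration only, since the patent does not specify the communication payload.

```python
import json
import socket

def send_sewing_data(grab_xy, trajectory, postures,
                     host: str = "192.168.1.10", port: int = 6001) -> None:
    """Send grab coordinates, trajectory points and postures to the robot over TCP/IP."""
    payload = json.dumps({
        "grab": [float(v) for v in grab_xy],                 # robot grabbing coordinates
        "track": [[float(x), float(y)] for x, y in trajectory],  # sewing trajectory points
        "posture": [float(a) for a in postures],             # sewing posture at each joint
    }).encode("utf-8")
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(payload)
```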
The automatic leather sewing method provided by the invention has the following advantages:
1. If other cloth materials are mixed in during the sewing process, the automatic leather sewing system can automatically distinguish them and pick out the mixed-in material.
2. The system can adapt to sewing leather materials of various shapes, and switching between different leather materials can be completed through a system selection. If a peripheral barcode scanner is added, the switching can be automatic: each piece of leather carries a QR code, the scanner reads the code to obtain the information corresponding to that piece, such as the leather model, and the system then switches the sewing accordingly (see the QR-code sketch after this list).
3. Setting up the sewing of a new material is simple; the setup can be completed with a few buttons on the system setting interface.
4. The sewing effect is consistent from piece to piece.
5. The system can work continuously for 24 hours a day, has high sewing efficiency, and can effectively reduce production cost.
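As an illustration of the QR-code-based switching mentioned in advantage 2, the sketch below uses OpenCV's QRCodeDetector to read the code on the leather and look up a sewing template; the template table and its keys are hypothetical.

```python
import cv2
from typing import Optional

TEMPLATES = {"MAT-A12": "car_foot_mat_a12", "CUSH-07": "seat_cushion_07"}  # hypothetical table

def select_template(image_bgr) -> Optional[str]:
    """Decode the QR code on the leather and return the matching sewing template name."""
    detector = cv2.QRCodeDetector()
    model_code, _, _ = detector.detectAndDecode(image_bgr)   # empty string if no code found
    return TEMPLATES.get(model_code) if model_code else None
```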
Based on the aforementioned proposed automatic leather sewing method, the present invention further provides an automatic leather sewing device, referring to fig. 8, the automatic leather sewing device comprising:
the image acquisition module 1 is used for acquiring a leather image to be sewn;
the coordinate acquisition module 2 is used for calculating the coordinates of the grabbing position of the robot according to the leather images;
the data acquisition module 3 is used for carrying out image processing on the leather image so as to acquire sewing track data of the leather;
the first control module 4 is used for controlling the robot to transfer the leather to the position of the sewing machine according to the grabbing position coordinate of the robot;
and the second control module 5 is used for controlling the sewing machine to automatically sew the leather according to the sewing track data.
In a preferred embodiment, the coordinate acquisition module 2 includes:
the pixel coordinate acquisition unit 21 is used for acquiring the pixel coordinate of the leather;
a robot coordinate obtaining unit 22, configured to calculate the coordinates of the grabbing position of the robot according to the pixel coordinates and a preset coordinate conversion formula:
x_i = m·(u·cos θ - v·sin θ) + d_x
y_i = m·(u·sin θ + v·cos θ) + d_y
wherein (u, v) are the pixel coordinates of the leather, (x_i, y_i) are the coordinates of the grabbing position of the robot, d_x and d_y are translation amounts, m is a scale factor, and θ is the included angle between the camera coordinate system and the robot coordinate system.
In another preferred embodiment, the data acquisition module 3 comprises:
an edge image obtaining unit 31, configured to extract a contour edge of the leather material, and convert the contour edge into an edge image;
an inner edge obtaining unit 32, configured to perform expansion processing on the edge image, and then perform retraction processing on the expanded edge image to obtain an inner edge of the expanded edge image;
and a sewing track obtaining unit 33 for obtaining sewing track data of the leather material according to the inner edge of the edge image.
Based on the automatic leather sewing method described above, the invention also provides automatic leather sewing equipment, which comprises:
a memory for storing a computer program;
a processor, configured to implement the steps of the automatic leather sewing method in the foregoing embodiments when executing a computer program, where the steps of the automatic leather sewing method at least include:
step S10, obtaining a leather material image to be sewn;
step S20, calculating the grabbing position coordinates of the robot according to the leather images;
step S30, processing the leather image to obtain the sewing track data of the leather;
step S40, controlling the robot to transfer the leather to the position of the sewing machine according to the grabbing position coordinates of the robot;
and step S50, controlling the sewing machine to automatically sew the leather according to the sewing track data.
Based on the automatic leather sewing method, the invention further provides a computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the automatic leather sewing method in the embodiment are realized, and the steps of the automatic leather sewing method at least comprise:
step S10, obtaining a leather material image to be sewn;
step S20, calculating the grabbing position coordinates of the robot according to the leather images;
step S30, processing the leather image to obtain the sewing track data of the leather;
step S40, controlling the robot to transfer the leather to the position of the sewing machine according to the grabbing position coordinates of the robot;
and step S50, controlling the sewing machine to automatically sew the leather according to the sewing track data.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical division, and other divisions may be realized in practice, for example, a plurality of modules or components may be combined or integrated into another apparatus, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description covers only some preferred embodiments of the present invention, and neither the text nor the drawings should be construed as limiting its scope; all equivalent structural changes made using the contents of the specification and drawings, whether applied directly or in other related technical fields, fall within the scope of protection of the present invention.

Claims (10)

1. An automatic sewing method for leather is characterized by comprising the following steps:
acquiring a leather image to be sewn;
calculating the grabbing position coordinate of the robot according to the leather image;
performing image processing on the leather image to obtain sewing track data of the leather;
controlling the robot to transfer the leather to the position of a sewing machine according to the grabbing position coordinate of the robot;
and controlling the sewing machine to automatically sew the leather according to the sewing track data.
2. The automatic leather material sewing method according to claim 1, wherein the calculating of the gripping position coordinates of the robot from the leather material image comprises:
acquiring a pixel coordinate of the leather;
according to the pixel coordinates and a preset coordinate conversion formula:
x_i = m·(u·cos θ - v·sin θ) + d_x
y_i = m·(u·sin θ + v·cos θ) + d_y
calculating the grabbing position coordinates of the robot;
wherein (u, v) are the pixel coordinates of the leather, (x_i, y_i) are the grabbing position coordinates of the robot, d_x and d_y are translation amounts, m is a scale factor, and θ is the included angle between the camera coordinate system and the robot coordinate system.
3. The automatic leather material sewing method according to claim 1, wherein the image processing of the leather material image to obtain sewing track data of the leather material comprises:
extracting the contour edge of the leather, and converting the contour edge into an edge image;
performing expansion processing on the edge image, and performing retraction processing on the expanded edge image to obtain an inner edge of the expanded edge image;
and acquiring sewing track data of the leather according to the inner edge of the edge image.
4. The automatic leather material sewing method according to claim 3, wherein the obtaining of the sewing trajectory data of the leather material according to the inner edge of the edge image comprises:
sequentially and equidistantly extracting points on the inner edge, and connecting adjacent two points to form a line segment;
calculating an included angle between the two adjacent line segments, and deleting points between the two adjacent line segments with the included angle smaller than a preset angle;
acquiring corner point coordinates of the inner edge, and calculating the distance between the remaining points on the inner edge and the corner points;
inquiring the serial number of a point closest to the corner point, and extracting a point before the point according to the serial number;
and judging the position of the corner point according to a vector included angle between a first vector formed by the previous point and the corner point and a second vector formed by the corner point and the point closest to the corner point.
5. The automatic leather material sewing method according to claim 4, wherein the image processing of the leather material image to obtain sewing track data of the leather material further comprises:
and respectively calculating the included angle between two adjacent line segments to obtain the sewing posture of the sewing track.
6. An automatic leather sewing device, characterized by comprising:
the image acquisition module is used for acquiring a leather image to be sewn;
the coordinate acquisition module is used for calculating the grabbing position coordinate of the robot according to the leather image;
the data acquisition module is used for carrying out image processing on the leather image so as to acquire sewing track data of the leather;
the first control module is used for controlling the robot to transfer the leather to the position of the sewing machine according to the grabbing position coordinate of the robot;
and the second control module is used for controlling the sewing machine to automatically sew the leather according to the sewing track data.
7. The automatic leather sewing device of claim 6, wherein the coordinate acquisition module comprises:
the pixel coordinate acquisition unit is used for acquiring the pixel coordinate of the leather;
and the robot coordinate acquisition unit is used for calculating the grabbing position coordinates of the robot according to the pixel coordinates and a preset coordinate conversion formula:
x_i = m·(u·cos θ - v·sin θ) + d_x
y_i = m·(u·sin θ + v·cos θ) + d_y
wherein (u, v) are the pixel coordinates of the leather, (x_i, y_i) are the grabbing position coordinates of the robot, d_x and d_y are the translation amounts, m is a scale factor, and θ is the included angle between the camera coordinate system and the robot coordinate system.
8. The automatic leather material sewing device according to claim 6, wherein the data acquisition module comprises:
the edge image acquisition unit is used for extracting the contour edge of the leather and converting the contour edge into an edge image;
an inner edge obtaining unit, configured to perform expansion processing on the edge image, and then perform retraction processing on the expanded edge image to obtain an inner edge of the expanded edge image;
and the sewing track acquisition unit is used for acquiring the sewing track data of the leather according to the inner edge of the edge image.
9. An automatic sewing machine for leather materials, comprising:
a memory for storing a computer program;
a processor for implementing the steps of the automatic leather sewing method according to any one of claims 1 to 5 when executing the computer program.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the automatic leather sewing method according to any one of claims 1 to 5.
CN201911405470.5A 2019-12-30 2019-12-30 Automatic leather sewing method, device, equipment and computer readable storage medium Active CN111005163B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911405470.5A CN111005163B (en) 2019-12-30 2019-12-30 Automatic leather sewing method, device, equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911405470.5A CN111005163B (en) 2019-12-30 2019-12-30 Automatic leather sewing method, device, equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111005163A true CN111005163A (en) 2020-04-14
CN111005163B CN111005163B (en) 2022-04-26

Family

ID=70119694

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911405470.5A Active CN111005163B (en) 2019-12-30 2019-12-30 Automatic leather sewing method, device, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111005163B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3552334B2 (en) * 1995-04-28 2004-08-11 ブラザー工業株式会社 Embroidery data processing device
US20120222602A1 (en) * 2011-03-01 2012-09-06 Brother Kogyo Kabushiki Kaisha Sewing machine, stitch data generating device and stitch data generating program
CN103437074A (en) * 2013-09-13 2013-12-11 吴江市菀坪宝得利缝制设备机械厂 Trace conversion mechanism of industrial zigzag seam sewing machine
CN103668793A (en) * 2013-12-02 2014-03-26 中国船舶重工集团公司第七〇五研究所 Method for controlling electronic pattern machine to perform multiple sewing
CN104695139A (en) * 2015-04-01 2015-06-10 华中科技大学 Industrial sewing machine system and cut part sewing processing method by same
CN106054874A (en) * 2016-05-19 2016-10-26 歌尔股份有限公司 Visual positioning calibrating method and device, and robot
CN110004600A (en) * 2018-09-20 2019-07-12 浙江大学台州研究院 Intelligent sewing device and method based on machine vision
CN109629122A (en) * 2018-12-25 2019-04-16 珞石(山东)智能科技有限公司 A kind of robot method of sewing based on machine vision
CN109623821A (en) * 2018-12-26 2019-04-16 深圳市越疆科技有限公司 The visual guide method of mechanical hand crawl article
CN110258031A (en) * 2019-06-20 2019-09-20 西北工业大学 A kind of two-needle sewing machine curve seam power matching method and device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111633647A (en) * 2020-05-26 2020-09-08 山东大学 Multi-mode fusion robot sewing method and system based on deep reinforcement learning
CN111633647B (en) * 2020-05-26 2021-06-22 山东大学 Multi-mode fusion robot sewing method and system based on deep reinforcement learning
CN112848631A (en) * 2021-03-04 2021-05-28 临沂金钰纺织品有限公司 Production process of anti-skid automobile foot mat
KR20230107956A (en) * 2022-01-10 2023-07-18 (주) 엠엔비젼 Product overlapping system using computer vision that does not use a reference image when sewing clothing parts
KR102629757B1 (en) * 2022-01-10 2024-01-29 (주) 엠엔비젼 Product overlapping system using computer vision that does not use a reference image when sewing clothing parts

Also Published As

Publication number Publication date
CN111005163B (en) 2022-04-26

Similar Documents

Publication Publication Date Title
CN111005163B (en) Automatic leather sewing method, device, equipment and computer readable storage medium
CN207861446U (en) Control system for robot destacking apparatus
CN107992881B (en) Robot dynamic grabbing method and system
CN109785317B (en) Automatic pile up neatly truss robot's vision system
US7747080B2 (en) System and method for scanning edges of a workpiece
TWI398157B (en) System and method for boundary scan of an image
CN104695139A (en) Industrial sewing machine system and cut part sewing processing method by same
CN111923053A (en) Industrial robot object grabbing teaching system and method based on depth vision
CN110926330A (en) Image processing apparatus, image processing method, and program
US20220080581A1 (en) Dual arm robot teaching from dual hand human demonstration
CN109822568B (en) Robot control method, system and storage medium
CN105690393A (en) Four-axle parallel robot sorting system based on machine vision and sorting method thereof
CN110548698B (en) Sewing equipment and cut piece sorting method, sorting device and sorting system applied to sewing equipment
JPWO2020144784A1 (en) Image processing equipment, work robots, substrate inspection equipment and sample inspection equipment
CN111985420B (en) Unmanned inspection method for power distribution station based on machine vision
CN110640741A (en) Grabbing industrial robot with regular-shaped workpiece matching function
CN114463244A (en) Vision robot grabbing system and control method thereof
JP6424432B2 (en) Control device, robot system, robot and robot control method
CN109355812B (en) Visual positioning automatic sewing system and sewing method
CN117576094B (en) 3D point cloud intelligent sensing weld joint pose extraction method, system and equipment
CN117506931A (en) Groove cutting path planning and correcting equipment and method based on machine vision
CN108669600A (en) A kind of full-automatic mushroom separator
CN112347837A (en) Image processing system
CN116594351A (en) Numerical control machining unit system based on machine vision
KR102462591B1 (en) Sewing line automatic inspection method using vision system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant