WO2018044176A1 - Methods, systems and computer program products for shape recognition based programming of sewing robots

Methods, systems and computer program products for shape recognition based programming of sewing robots

Info

Publication number
WO2018044176A1
WO2018044176A1 (PCT/NO2017/050214)
Authority
WO
WIPO (PCT)
Prior art keywords
sewing
workpiece
shape
computer
automated
Prior art date
Application number
PCT/NO2017/050214
Other languages
English (en)
Inventor
Tor Ronny GJELSTENLI
Terje RIKSHEIM
Svein Even BLAKSTAD
Kenneth REVNE
Sebastian DRANSFELD
Lars Erik WETTERWALLD
Original Assignee
Amatec As
Priority date
Filing date
Publication date
Application filed by Amatec As filed Critical Amatec As
Publication of WO2018044176A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/18 Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/401 Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by control arrangements for measuring, e.g. calibration and initialisation, measuring workpiece for machining purposes
    • D TEXTILES; PAPER
    • D05 SEWING; EMBROIDERING; TUFTING
    • D05B SEWING
    • D05B19/00 Programme-controlled sewing machines
    • D05B19/02 Sewing machines having electronic memory or microprocessor control unit
    • A HUMAN NECESSITIES
    • A41 WEARING APPAREL
    • A41H APPLIANCES OR METHODS FOR MAKING CLOTHES, e.g. FOR DRESS-MAKING OR FOR TAILORING, NOT OTHERWISE PROVIDED FOR
    • A41H42/00 Multi-step production lines for making clothes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • D TEXTILES; PAPER
    • D05 SEWING; EMBROIDERING; TUFTING
    • D05B SEWING
    • D05B19/00 Programme-controlled sewing machines
    • D05B19/02 Sewing machines having electronic memory or microprocessor control unit
    • D05B19/04 Sewing machines having electronic memory or microprocessor control unit characterised by memory aspects
    • D05B19/08 Arrangements for inputting stitch or pattern data to memory; Editing stitch or pattern data
    • D TEXTILES; PAPER
    • D05 SEWING; EMBROIDERING; TUFTING
    • D05B SEWING
    • D05B19/00 Programme-controlled sewing machines
    • D05B19/02 Sewing machines having electronic memory or microprocessor control unit
    • D05B19/12 Sewing machines having electronic memory or microprocessor control unit characterised by control of operation of machine
    • D TEXTILES; PAPER
    • D05 SEWING; EMBROIDERING; TUFTING
    • D05B SEWING
    • D05B19/00 Programme-controlled sewing machines
    • D05B19/02 Sewing machines having electronic memory or microprocessor control unit
    • D05B19/12 Sewing machines having electronic memory or microprocessor control unit characterised by control of operation of machine
    • D05B19/14 Control of needle movement, e.g. varying amplitude or period of needle movement
    • D TEXTILES; PAPER
    • D05 SEWING; EMBROIDERING; TUFTING
    • D05B SEWING
    • D05B19/00 Programme-controlled sewing machines
    • D05B19/02 Sewing machines having electronic memory or microprocessor control unit
    • D05B19/12 Sewing machines having electronic memory or microprocessor control unit characterised by control of operation of machine
    • D05B19/16 Control of workpiece movement, e.g. modulation of travel of feed dog
    • D TEXTILES; PAPER
    • D05 SEWING; EMBROIDERING; TUFTING
    • D05B SEWING
    • D05B21/00 Sewing machines with devices for automatically controlling movement of work-carrier relative to stitch-forming mechanism in order to obtain particular configuration of seam, e.g. programme-controlled for sewing collars, for attaching pockets
    • D TEXTILES; PAPER
    • D05 SEWING; EMBROIDERING; TUFTING
    • D05B SEWING
    • D05B33/00 Devices incorporated in sewing machines for supplying or removing the work

Definitions

  • the present invention relates to flexible robotic sewing, and in particular a method and a system for flexible generation and implementation of sewing paths.
  • a method for creating a library of sewing instructions for an automated sewing station.
  • the sewing instructions are specific to respective workpieces, meaning that they are determined by the shape of the workpiece, and that workpieces of different shape will be subject to different sewing processes.
  • the method includes such steps as obtaining an image of a workpiece, processing the obtained image to create a shape representation, generating a sewing path with a defined relationship with the shape representation, and storing the shape representation together with the sewing path in a database. In that way sewing instructions are associated with shapes and they can be retrieved based on an identification of the shape with which they are associated.
  • the shape representation includes a set of points positioned along the edge of the shape. This set of points can be connected by line segments to create a polygon representation of the shape.
  • other shape representations are known in the art and may be used in other embodiments of the invention.
  • the sewing path can be represented as a set of points positioned with a predetermined distance from the edge of the shape. This set of points may be used to generate a set of line segments or, for example, curve splines.
  • the resulting curve can be used to control the sewing robots, as will be described in further detail below.
  • the sewing curve may be defined implicitly, as a predefined distance from the edge of the workpiece.
  • Some embodiments of the invention may allow manual input from an operator to define the sewing path.
  • the method may include displaying at least one of the obtained image and the shape representation generated from the image on a display of a computer and receiving a set of points representative of the sewing path as user input generated with a pointing device operated in conjunction with the display.
  • the method may further include generation and storage of additional sewing parameters in association with the shape representation.
  • additional sewing parameters may include stitch length, gathering rate etc., and they may be received as user input, or generated based on default values and rules.
  • a method for identifying a workpiece and providing sewing instructions to an automated sewing station may include obtaining an image of a workpiece and processing the obtained image to create a shape representation, much like the corresponding steps in the method according to the first aspect.
  • a search may then be performed in a database containing a library of sewing instructions associated with shape representations in order to find a closest matching shape representation.
  • the library of sewing instructions associated with shape representations may have been generated using a method corresponding to the method of the first aspect described above.
  • sewing instructions at least including a sewing path with a defined relationship with the closest matching shape representation can be retrieved and transferred to a controller computer configured to control a sewing operation in the automated sewing station.
  • a number of methods for performing shape based searches are known in the art, and the best method may depend on the method chosen for shape representation.
  • the closest matching shape representation is determined based on a metric that produces a numerical value representing a measurement of difference between two shape representations.
  • a method for performing a sewing operation in an automated sewing station.
  • Such a method may include identifying a workpiece that is delivered to the sewing station based on shape recognition, and transferring sewing instructions associated with the identified workpiece from a database of sewing instructions associated with shape representations to a controller computer configured to control a sewing operation in the automated sewing station.
  • a robot may be controlled to transfer the received workpiece from the area where it was delivered to the sewing station (which may be the area where it was positioned for identification) to a sewing machine, and a robot and the sewing machine may then be controlled to perform a sewing operation based on said sewing instructions associated with the identified workpiece.
  • the workpiece may be removed from the sewing machine.
  • the sewing machine will be ready to receive a new workpiece.
  • the method further includes utilization of a camera to continuously obtain images of the sewing operation, including a needle of the sewing machine and an edge of the workpiece, and use of the obtained images to measure the distance between the needle of the sewing machine and the edge of the workpiece. Upon determining that the distance deviates from a desired value by more than a predetermined amount, the instructions to at least one of said robot and said sewing machine may be updated.
  • the distance may be used directly in order to bring the seam gradually closer to the desired distance to the edge, or it may be used in an intermediate calculation of an angle between the sewing direction and the direction of the edge of the workpiece.
  • the automated sewing station is one of a plurality of automated sewing stations in a production cell, wherein each automated sewing station includes at least one sewing machine and one sewing robot.
  • the method may then include selection of one of said plurality of sewing stations, controlling a service robot to perform the transfer of the workpiece to the sewing machine that is part of the selected one of the plurality of sewing stations, and controlling the sewing robot and sewing machine that are included in the selected sewing station to perform the sewing operation.
  • a workstation for creating a library of sewing instructions for an automated sewing station is provided. This workstation may be used to implement or perform a method corresponding to the method according to the first aspect of the invention described above, and again the sewing instructions are specific to respective workpieces.
  • Such a workstation may include a camera configured to capture images of workpieces, a computer configured to receive images from the camera, generate shape representations and sewing paths from images of workpieces and store respective ones of said shape representations together with an associated sewing path in a database.
  • the shape representation may include a set of points positioned along the edge of the shape, and the sewing path may be represented as a set of points positioned with a predetermined distance from the edge of the shape.
  • a workstation may in some embodiments include a display and a pointing device, with the computer further configured to display at least one of the images received from said camera and the generated shape representations, and to receive user input from the pointing device representing coordinates defining points that are to be included in the sewing paths.
  • the workstation may include a contrasting surface, and the camera may then be directed towards the contrasting surface. The workpieces may then be placed on the contrasting surface when the images are obtained by the camera.
  • an automated sewing station is provided. The automated sewing station may be used to perform methods corresponding to the methods of the second and/or the third aspect of the invention described above.
  • such an automated sewing station may comprise a first camera configured to capture images of workpieces, an image processing computer configured to receive an image from said camera, generate a shape representation from said received image of a workpiece, perform a search in a database containing a library of sewing instructions associated with shape representations to find a closest matching shape representation, and upon finding the shape representation in the library of sewing instructions that is a closest match to the shape representation created from the obtained image of the workpiece, retrieving sewing instructions at least including a sewing path with a defined relationship with the closest matching shape representation.
  • a controller computer is configured to receive the sewing instructions from the image processing computer, and the sewing station also includes a sewing machine and a robot (104). The controller computer is configured to control a sewing operation in the automated sewing station by controlling the sewing machine and the robot based on the sewing instructions.
  • an automated sewing station may further include a second camera configured to continuously obtain images of the sewing operation including the needle of the sewing machine and the edge of the workpiece.
  • An edge tracking computer is configured to receive images from the second camera and process the received images to determine a distance between the needle of the sewing machine and the edge of the workpiece and to provide the result of this determination to the controller computer.
  • the controller computer is configured to update the sewing instructions based on a predetermined rule and the determined distance.
  • the image processing computer and the controller computer are implemented as respective software modules executed by one or more processors that are part of the same computer system.
  • the image processing computer and the controller computer may equally well be implemented as respective computer systems.
  • the automated sewing station may include a contrasting surface, and the camera may then be directed towards the contrasting surface.
  • the workpieces may then be placed on the contrasting surface when the images are obtained by the camera.
  • the contrasting surface which may, for example, be a light table, may also serve as the delivery area for workpieces that are delivered to the workstation.
  • a production cell substantially corresponds to an automated sewing station according to the fifth aspect, but in the production cell several automated sewing stations share certain common components, primarily the components responsible for identifying workpieces, providing controller computers with corresponding sewing instructions and distributing work between the several automated sewing stations.
  • a production cell may include a first camera connected to an image processing computer for identification of workpieces delivered to the production cell, a master controller computer configured to receive sewing instructions from a database based on an identification of the workpiece, a service robot, and a plurality of automated sewing stations.
  • Each sewing station may include a sewing machine, a sewing robot, and a sewing station controller computer.
  • the master controller computer is configured to select one of the plurality of automated sewing stations, control the service robot to transfer the workpiece to the selected automated sewing station, and also transfer the received sewing instructions to the sewing station controller computer of the selected automated sewing station.
  • the sewing station controller computer is configured to control the sewing operation at said selected automated sewing station.
  • the automated sewing stations further include a second camera connected to an edge tracking computer for determining a distance between the needle of the sewing machine and an edge of the workpiece.
  • the sewing station controller computer may then be further configured to receive the determined distance from the edge tracking computer and to update the sewing instructions based on a predetermined rule and said determined distance.
  • all computers may be implemented as separate systems. However, the image processing computer, service computer, sewing station controller computers and sewing station edge tracking computers may equally well be implemented as respective software modules installed on one or more computer systems.
  • FIG. 1 is an illustration of a sewing station according to the invention;
  • FIG. 2 is a flow chart illustrating a process of generating shape representations and sewing instructions and storing them in a shape library or database;
  • FIG. 3 shows an exemplary user interface for generating a sewing path from a shape image or shape representation shown on a display;
  • FIG. 4 is a flow chart illustrating a process of identifying a workpiece by shape and programming an automated sewing station with corresponding sewing instructions;
  • FIGS. 5A and 5B are flow charts illustrating a process of performing a sewing operation on a workpiece; and
  • FIG. 6 is an illustration of a production cell with a plurality of sewing stations.
  • FIG. 1 shows an overview of a sewing station including a light table 101, a registration camera 102, a sewing machine 103, a sewing robot 104, and a work surface table 105.
  • the sewing robot 104 has a manipulation arm 106 with a workpiece gripper 107.
  • the arm 106 can be controllably moved up and down in the Z direction. In addition, it can be rotated about the vertical axis (the Z-axis) by way of a motor 108, and moved linearly in the horizontal X and Y directions by way of rails 109, 110.
  • the workpiece gripper 107 may simply be a piston with a lower surface that is pressed down on the workpiece 111 such that the workpiece 111 will follow the workpiece gripper's movements due to higher friction between the workpiece 111 and the workpiece gripper 107 than between the workpiece and the surface upon which the workpiece 111 is placed.
  • Other alternatives may introduce claws, pincers or vacuum, enabling additional manipulation, including lifting of the workpiece 111 from the surface.
  • when a new workpiece 111 with a particular geometric shape is positioned on the light table 101, it is viewed from above by the shape registration camera 102 and an image of the shape against the contrasting background is registered.
  • the light table 101 provides a bright background, ensuring that there will be a high contrast between the image of the workpiece 111 and the background.
  • the shape of the workpiece 111 is registered and stored in an image processing computer 112. This image can later be used to recognize workpieces 111 that have been placed on the light table, and also to provide the path the robot 104 should follow when sewing, as will be described in further detail below.
  • the sewing path generated from the image of the workpiece 111 is transferred to controller computer 113.
  • the controller computer 113 controls the movement of the robot 104 and it also controls the sewing machine 103.
  • an edge tracking computer 114 is connected to an edge detection camera 115.
  • the edge detection camera 115 continuously sends an image of the needle of the sewing machine 103 and the workpiece to the edge tracking computer 114.
  • the edge tracking computer 114 measures the angle and distance between the edge of the workpiece 111 and the needle of the sewing machine 103 and provides the results to the controller computer 113.
  • the controller computer 113 may then adjust the sewing path in order to compensate for any deviation from the predetermined distance from the edge of the workpiece 111 to the seam caused by deformation of the workpiece 111 or by any other reason.
  • Some embodiments of the invention may include a display 116 and a pointing device 117 enabling manual user input for generation or adjustment of the sewing path.
  • the directions of movement for the robot 104 will be described based on a system of coordinates where the X-axis and Y-axis define the horizontal plane, the Z-axis is the vertical direction, and rotation will be referred to as the A-axis.
  • the rotational axis may, but does not have to be exactly in the Z direction, and in some embodiments additional degrees of freedom may be introduced, as those with skill in the art will readily understand.
  • the robot 104 manipulates the workpiece 111 by moving the manipulator arm 106 in the X and Y direction along the rails 109, 110 until it is immediately above the workpiece 111.
  • the arm 106 is then extended downward until the workpiece 111 is held between the surface of the light table 101 and the workpiece gripper 107 at the end of the manipulator arm 106.
  • the surfaces of the light table 101 and the work surface table 105 have a sufficiently low friction to allow the workpiece to follow the movement of the workpiece gripper 107 when the manipulator arm is moved in its extended position. In this manner the workpiece can be moved to any position on the work surface that is reachable by the manipulator arm 106.
  • the A-axis motor 108 makes it possible to rotate the workpiece 111 around a vertical axis.
  • the first phase of operations, before a new workpiece 111 can be subjected to a sewing process, is to create a representation of its geometrical shape in a library of registered shapes in the image processing computer 112.
  • the images stored in the image processing computer 112 may serve two purposes. They are used to identify workpieces during production, and they can be used to generate the sewing path associated with that type of workpiece.
  • a number of features of the robot 104, the rails 109, 110, and the manipulator arm 106 will not be described in detail, since they are well known and understood by those with skill in the art, who may choose from a number of options including electrical motors, servos, hydraulics and pneumatics. In the embodiment illustrated in FIG. 1, rail 109 may be moved in the Y-direction, for example on an additional rail inside the robot 104, pulled by belts or by rack and pinion.
  • rail 110 may be pulled in the X-direction by a belt or by rack and pinion inside rail 109.
  • the manipulator arm 106 may be moved up and down telescopically, for example using hydraulics or pneumatics, or by rack and pinion.
  • FIG. 2 is a flow chart showing an example of how a process of registering a new workpiece in the library of workpieces can be performed in some embodiments of the invention.
  • while registration of a new workpiece may be performed using the light table 101, camera 102 and image processing computer 112 of the production cell, this does not have to be the case.
  • Registration of components as well as generation of sewing paths may equally well be performed on a workstation particularly adapted for this purpose.
  • Such a special purpose workstation may, but does not have to, include all of the production equipment such as a sewing machine 103, a sewing robot 104, and a controller computer 113. Whether sewing paths and the library of geometric shapes are generated on special purpose equipment or on equipment that is part of a production cell, they may be transferred to other production cells to be used there.
  • in a first step 201, the workpiece is placed on a contrasting surface such as the light table 101.
  • in step 202, a camera 102, which is capable of obtaining undistorted images of the shape of the workpiece, for instance by being positioned directly above the contrasting surface, obtains one or more images of the workpiece. The image is then forwarded to the image processing computer 112 in step 203.
  • in step 204, the image is processed in the image processing computer 112 and a representation of the shape of the workpiece is generated.
  • a number of methods for representation of shapes are known in the art. Generally they can be classified as contour-based or region-based, and as global or structural. In contour-based methods, shape features are extracted only from the contour and not from the rest of the shape region. In structural methods, the shape is represented not as a whole, but as segments or sections referred to as primitives. In some embodiments of the invention the contour-based, structural method of representing shape segments as polygons is used, but other methods are consistent with the principles of the invention.
  • the shape is represented as the sum of edges connecting vertices (or points, or nodes).
  • the vertices are represented as the coordinates of respective points along the edge of a shape and the edges are represented as a straight line between two adjacent points.
  • An alternative method to the process described in steps 201 through 204 is to receive the same input data as that which is used to control a cutting machine used to cut the workpiece from a larger piece of e.g. cloth or hide.
  • the generated shape representation is stored in a library of workpiece shapes in step 204. This library is searchable and can be used to identify workpieces during production, as will be described in further detail below.
  • in step 205, a sewing path is generated.
  • in some embodiments, the path is generated manually from the image of the workpiece obtained in step 202.
  • the image, or a processed version of the image, for example one with enhanced contrast, or even a synthetic image based on the shape representation generated in step 204, is presented on a display of a monitor which is part of the image processing computer 112 and an operator uses a pointing device such as a computer mouse to mark control points along the sewing path.
  • the path can then be generated as line or curve segments between each control point. In most cases it will be sufficient to generate the segments as straight lines, but in some embodiments they may be generated as curves, for example as polynomial curves or B-splines.
  • in other embodiments, the sewing path is generated automatically from the shape representation generated in step 204.
  • the sewing path may then be generated as a path parallel to and a predetermined distance from the edge of the shape. It is, of course, also possible to use one shape representation method or algorithm for generation of the shape representation and a different method or algorithm for generating the sewing path.
  • Some embodiments may define the sewing path implicitly, for example by specifying a starting point, an ending point and a distance from the edge of the workpiece 111.
  • the sewing path may also be defined both by an explicit path definition and a rule, for example distance to the edge of the workpiece, and the rule may then be used to handle deformation of the workpiece during sewing, something that will require deviation from the predefined sewing path.
  • in a final step 206, the sewing path is stored in the shape library in a manner that associates it with the shape representation.
  • the exact structure and organization of the database that constitutes the shape library is not essential. The important point is that it should be possible to obtain a corresponding sewing path description based on an identified shape representation.
  • FIG. 3 shows an exemplary user interface for generating the sewing path from a shape image or shape representation shown on a display.
  • the example is somewhat simplified in that it does not show tools, controls or other user interface elements that are unnecessary for this description.
  • the user interface is shown as a viewport 301 that may be anything from the entirety of a display screen to the inside of one of a plurality of windows.
  • a window will typically include a frame with borders and widgets giving access to various menus, tools and other functions, but the chrome part of the window is not included in the drawing.
  • the user interface includes a ruler display element 302 which helps the operator determine distances in the displayed image.
  • the ruler scales according to zoom level and is therefore capable of showing correct distances independently of whether the user zooms into or out of the image.
  • a mouse pointer element 303 may be displayed as an arrow, which is a well-known convention in the art.
  • the display further shows a representation of the shape of a workpiece 311. As already mentioned this representation may be the image captured by camera 102, it may be a processed version of that image, for example with enhanced contrast, or it may be based on the shape representation generated and stored in the workpiece library.
  • by moving the mouse pointer 303, positioning it a predetermined distance from the edge of the workpiece 311, and providing user input, for example in the form of a mouse click, the operator can create a mark or point 304.
  • the mouse pointer may change appearance, for example to a cross cursor (also known as a precision cursor or crosshair cursor).
  • straight lines are generated to connect adjacent points 304, and collectively these straight lines represent the sewing path 305.
  • curves other than straight lines may also be used in some embodiments of the invention.
  • the points 304 can be positioned relatively far apart along sections of the workpiece edge that are relatively straight, while along sections with a sharper curvature the points 304 must be much closer to each other.
  • the resulting sewing path curve 305 may now be stored in association with its corresponding workpiece shape as described above.
  • the sewing path may be stored simply as a set of coordinates representing the points 304, since the rules for generating the lines or curve segments between the points are known and can be repeated. However, it may also be necessary to associate the sewing path with one or more points of reference on the workpiece.
  • the curve representing the sewing path alone may represent sufficient input to the controller computer 113.
  • the controller computer 113 may be programmed to map the sewing path curve 305 to the workpiece shape in accordance with general rules, such as a predetermined distance from the edge of the workpiece and shape comparisons. However, it may be more efficient to include additional data with the stored sewing path, such as one or more reference points on the workpiece.
  • FIG. 4 is a flowchart illustration of a process of receiving a workpiece for sewing, identifying the workpiece based on the information in the shape library, loading the sewing path and associated parameters and performing the sewing operation.
  • in a first step 401, a workpiece is placed on a contrasting surface such as the light table 101.
  • This step is similar to the first step of the registration process, but now the purpose is not to register the shape of the workpiece, but to identify the workpiece based on its shape.
  • An image is obtained by a camera 102 mounted above the light table 101 in step 402, again in a manner similar to the corresponding step of the registration process, and the image is transferred to the image processing computer 112 for shape recognition in step 403.
  • in step 404, the image processing computer 112 generates a shape representation using the same method as has been used to generate the shape representations stored in the shape library, as described above.
  • the image processing computer 112 searches for the closest match to this shape in the shape library.
  • a metric can be defined to measure the difference between two polygons as a distance (e.g. the sum of the distances between the positions of corresponding points) or as a cumulative measure of the angles through which a polygonal curve turns.
  • the sewing path and other parameters stored in association with the identified shape can be retrieved from the shape library in step 405, and these parameters can be transferred to the controller computer 113 in step 406.
  • in step 407, the sewing path and the other parameters that have been received by the controller computer 113 are loaded and the appropriate sewing operation can be performed.
  • FIG. 5A is a flowchart presenting a number of steps that may be performed in exemplary embodiments of the invention.
  • the embodiment illustrated in FIG. 1 and the sewing process described below include only one sewing machine and one robot.
  • several sewing machines may be included in one production cell.
  • Such embodiments may include one sewing robot per sewing machine and one service robot configured to move workpieces from the light table to the sewing machines.
  • Some of the steps described as performed by the robot in the following description may then be performed by the service robot while others may be performed by the sewing robot.
  • in a first step 501 of the sewing process, the position of the workpiece on the light table 101 is determined by the image processing computer 112 based on the image received from the camera 102. This position is sent to the controller computer 113.
  • in step 502, the controller computer 113 instructs the robot 104 to move the workpiece to the sewing machine 103. In production cells including several sewing machines, this step may also include a determination of which sewing machine should receive the workpiece.
  • in step 503, the position of the workpiece is adjusted to the correct starting position. This is the position where the needle of the sewing machine is directly above the point on the workpiece 111 from where the sewing path begins according to the data received from the shape library.
  • the workpiece must also be correctly oriented, as determined by rotation of the workpiece gripper 107, i.e. the position on the A-axis.
  • in step 504, the sewing operation is performed in accordance with the parameters from the shape library as received from the image processing computer 112. The sewing operation will be described in further detail below.
  • finally, the controller computer 113 instructs the robot 104 to remove the finished workpiece.
  • the robot 104 may be instructed to move the workpiece 111 to a specific place for removal or temporary storage, but such details are not illustrated in the drawing.
  • FIG. 5B is a flowchart illustrating further details of an exemplary method for performing the sewing operation in step 504. It should be noted that while this flowchart illustrates the process as one cycle of consecutive steps, several of the steps may be performed in parallel or may be reinitiated without waiting for the previous cycle to complete.
  • in a first step 5041, an image is obtained with the edge detection camera 115. This step is performed continuously, and as soon as they are available, images are transferred to the edge tracking computer 114 in step 5042.
  • when an image is received by the edge tracking computer 114, it is processed in step 5043 in order to provide information making it possible to determine whether the current sewing position and direction are consistent with the programmed sewing path.
  • in step 5044, the results obtained by the edge tracking computer 114 are transferred to the controller computer 113 for further processing.
  • the controller computer 113 compares the received values with the programmed sewing path and determines, in step 5045, whether an adjustment is necessary. If not, the sewing process continues in accordance with the programmed sewing path (which may involve adjustment of the robot axis if the programmed sewing path is curved). The process is repeated for the next image obtained by the edge detection camera 115, starting again with step 5041.
  • step 5041 is not necessarily initiated by the completion of step 5045, and processing of the next image may already be in progress.
  • if it is determined in step 5045 that adjustment of the sewing path is necessary, the necessary adjustments are calculated in step 5046, and in step 5047 the controller computer instructs the robot 104 accordingly. If the adjustments also require an update of the instructions to the sewing machine 103, such instructions are also generated and transferred in this step. The process is then repeated for the next image obtained by the edge detection camera 115, as described above.
  • FIG. 6 shows a production cell with a plurality of sewing stations. In order to avoid cluttering the drawing unnecessarily, communication connections between the various components are not included in the drawing, and certain other details are also left out, as will be explained below.
  • the reference numbers in FIG. 6 are the same as the reference numbers used for corresponding components in FIG. 1. The reference numbers do not distinguish between components of the same type, so for example all the sewing machines 103 have the same reference number.
  • the production cell has a contrasting surface such as a light table 101 which is used to receive workpieces 111 and which is observed from above by a registration camera 102.
  • there are four sewing stations, each including a sewing machine 103, a sewing robot 104, a work surface table 105 and an edge detection camera 115.
  • the manipulation arm 106, workpiece gripper 107, motor 108 and rails 109, 110 shown in FIG. 1 are collectively represented by robot arm 120.
  • a service robot 104' with a motor 108' and rails 109', 110' is positioned such that it can reach the light table 101 as well as all work surface tables 105, 105'.
  • the service robot 104' also has a manipulation arm and a workpiece gripper corresponding to, respectively, manipulation arm 106 and workpiece gripper 107 in FIG. 1, but not shown in FIG. 6.
  • An image processing computer 112 receives images from the registration camera and performs the processing described above.
  • the image processing computer 112 also includes the library of workpieces already described.
  • the image processing computer 112 is connected to a master controller computer 113'.
  • the master controller computer 113' receives information from the image processing computer 112 and determines which of the sewing stations in the production cell should receive the next workpiece 111. The determination may be based on whether a sewing station is currently idle, on which sewing station will be the first to finish its current task and become idle, on historic workload such that workloads may be distributed evenly over time, or on a combination of these and other factors.
  • the master controller computer 113' may then control the service robot 104' to move the workpiece 111 from the light table 101 over the common work surface table 105' and position it on the work surface table 105 of the designated sewing station.
  • the controller computer 113 of the designated sewing station receives the sewing instructions from the master controller computer 113' and the process continues under control of the sewing station controller computer 113.
  • the sewing station controller computer 113 receives edge tracking information from the sewing station edge tracking computer 114 based on images from the sewing station's edge detection camera 115, and the sewing station's controller computer 113 also controls the sewing station's sewing robot 104.
  • the computers will include one or more processors configured to operate in accordance with instructions written in computer code and stored in persistent memory in the respective computers.
  • the computers will further comprise working memory, one or more system buses for internal communication between various components of the computer, and interfaces for communication with external devices.
  • the database holding the shape library may be stored in the image processing computer 112 itself, or in a separate server (not shown).
  • the shape library may also be distributed, in whole or in part, over several computers.
  • the sewing machines 103 may be standard industrial type sewing machines capable of being controlled by the controller computer 113.

Landscapes

  • Engineering & Computer Science (AREA)
  • Textile Engineering (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Human Computer Interaction (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Sewing Machines And Sewing (AREA)

Abstract

Disclosed are methods, apparatuses and computer program products for performing automated sewing operations based on shape recognition of a workpiece. The invention thus covers the generation of sewing instructions as a function of shape, the programming of an automated sewing station as a function of shape, and the updating of sewing instructions during the sewing process.
PCT/NO2017/050214 2016-08-31 2017-08-31 Methods, systems and computer program products for shape recognition based programming of sewing robots WO2018044176A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
NO20161384 2016-08-31
NO20161384A NO344844B1 (en) 2016-08-31 2016-08-31 Methods, systems and computer program products for shape recognition based programming of sewing robots

Publications (1)

Publication Number Publication Date
WO2018044176A1 (fr) 2018-03-08

Family

ID=60081246

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NO2017/050214 WO2018044176A1 (fr) Methods, systems and computer program products for shape recognition based programming of sewing robots

Country Status (2)

Country Link
NO (1) NO344844B1 (fr)
WO (1) WO2018044176A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020092855A (ja) * 2018-12-13 2020-06-18 三菱電機株式会社 Sewing control device, sewing control system, sewing control method and program
WO2021096928A1 (fr) * 2019-11-12 2021-05-20 Softwear Automation, Inc. Methods and systems for making a sewn product using a robot
WO2023225862A1 (fr) * 2022-05-24 2023-11-30 Abb Schweiz Ag Robot and method for adjusting sewing threads
WO2024055202A1 (fr) * 2022-09-14 2024-03-21 Centre For Garment Production Limited Systems and methods for sewing and unwrinkling fabrics

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10744647B1 (en) * 2019-11-12 2020-08-18 Softwear Automation, Inc. Sensor systems and methods for sewn product processing apparatus
CN113031538B (zh) * 2019-12-25 2022-03-18 北京宜通华瑞科技有限公司 Management system and method for sewing machinery equipment

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4998489A (en) * 1988-04-28 1991-03-12 Janome Sewing Machine Industry Co., Ltd. Embroidering machines having graphic input means

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4876976A (en) * 1988-07-18 1989-10-31 Td Quilting Machinery Automatic quilting machine and method for specialized quilting of patterns which can be controlled by a remote joystick and monitored on a video screen including pattern duplication through a reprogrammable computer
IL99757A (en) * 1991-10-15 1995-06-29 Orisol Original Solutions Ltd Apparatus and method for automatic preparation of a sewing program
US5657710A (en) * 1995-07-21 1997-08-19 Sara Lee Corporation Automatic garment manufacture
US5664512A (en) * 1995-07-21 1997-09-09 Sara Lee Corporation Garment piece positioner and seamer
US6755141B2 (en) * 2001-04-03 2004-06-29 Otabo Llc Method for stitching a work piece using a computer controlled, vision-aided sewing machine
US7426302B2 (en) * 2003-11-28 2008-09-16 John Amico System and method for digitizing a pattern
US8573145B2 (en) * 2010-03-18 2013-11-05 Stephen Lang Dickerson Feed mechanism that advances fabric

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4998489A (en) * 1988-04-28 1991-03-12 Janome Sewing Machine Industry Co., Ltd. Embroidering machines having graphic input means

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DENGSHENG ZHANG; GUOJUN LU: "Review of shape representation and description techniques", PATTERN RECOGNITION, vol. 37, 2004

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020092855A (ja) * 2018-12-13 2020-06-18 三菱電機株式会社 Sewing control device, sewing control system, sewing control method and program
WO2021096928A1 (fr) * 2019-11-12 2021-05-20 Softwear Automation, Inc. Methods and systems for making a sewn product using a robot
US11421363B2 (en) 2019-11-12 2022-08-23 Softwear Automation, Inc. Methods and systems for making a sewn product using a robot
WO2023225862A1 (fr) * 2022-05-24 2023-11-30 Abb Schweiz Ag Robot and method for adjusting sewing threads
WO2024055202A1 (fr) * 2022-09-14 2024-03-21 Centre For Garment Production Limited Systems and methods for sewing and unwrinkling fabrics

Also Published As

Publication number Publication date
NO344844B1 (en) 2020-05-25
NO20161384A1 (en) 2018-03-01

Similar Documents

Publication Publication Date Title
WO2018044176A1 (fr) Methods, systems and computer program products for shape recognition based programming of sewing robots
US11667030B2 (en) Machining station, workpiece holding system, and method of machining a workpiece
CN110977931B (zh) Robot control device and display device using augmented reality and mixed reality
JP6497953B2 (ja) Offline teaching device, offline teaching method and robot system
CN109629122B (zh) Robot sewing method based on machine vision
JP5850004B2 (ja) Robot control device and robot control method
JP5939213B2 (ja) Robot control device and robot control method
US10228686B2 (en) Robot programming device for teaching robot program
US10534876B2 (en) Simulation device and simulation method that carry out simulation of operation of robot system, and recording medium that records computer program
US20150112482A1 (en) Teaching system and teaching method
Torgerson et al. Vision-guided robotic fabric manipulation for apparel manufacturing
KR101013749B1 (ko) CNC machine tool equipped with a vision system
CN105487481B (zh) Robot teaching device for offline teaching of robots
JP2014083610A (ja) Robot system and method of manufacturing machined products
US20200133231A1 (en) Program code generating method for tilted plane machining by multi-axis machine tool and device thereof
CN107984475B (zh) Simulation device and simulation method for simulating robot motion
CN107848117B (zh) Robot system and control method
JP2019089201A (ja) Teaching data creation device, control method of teaching data creation device, and robot system
JP2015222196A (ja) Coordinate measuring machine and shape measuring method using the same
JP2015200582A (ja) Image measuring machine
JP2019018250A (ja) Programming device for generating an operation program, and program generation method
JP6799614B2 (ja) Information processing device and information creation method
CN107921632B (zh) Machining path editing device, robot, article processing system and article manufacturing method
US11780080B2 (en) Robot teaching with scans and geometries
WO2024023934A1 (fr) Workpiece removal device, workpiece removal method and control device

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 17783602

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry into the European phase

Ref document number: 17783602

Country of ref document: EP

Kind code of ref document: A1