CN112779679A - Image processing device, sewing machine, and image processing method


Info

Publication number: CN112779679A
Authority: CN (China)
Prior art keywords: sewing, area, pixel, image, stitch
Legal status: Granted (Active)
Application number: CN202011224095.7A
Other languages: Chinese (zh)
Other versions: CN112779679B (en)
Inventors: 塚田丰, 山田和范, 横濑仁彦
Assignee: Juki Corp
Application filed by Juki Corp
Publication of CN112779679A
Application granted
Publication of CN112779679B

Classifications

    • D - TEXTILES; PAPER
    • D05 - SEWING; EMBROIDERING; TUFTING
    • D05B - SEWING
    • D05B19/00 - Programme-controlled sewing machines
    • D05B19/02 - Sewing machines having electronic memory or microprocessor control unit
    • D05B19/04 - Sewing machines having electronic memory or microprocessor control unit characterised by memory aspects
    • D05B19/10 - Arrangements for selecting combinations of stitch or pattern data from memory; Handling data in order to control stitch format, e.g. size, direction, mirror image


Abstract

The invention provides an image processing apparatus, a sewing machine, and an image processing method capable of distinguishing a region provided with a hole pattern from a region not provided with one. The image processing apparatus includes: an object image acquiring unit that acquires an object image representing an image of a sewing object; a scanning unit that scans the object image using a predetermined search area and determines whether or not each pixel of interest is a pixel of a predetermined color; and an area dividing unit that divides the surface of the sewing object into a texture area and a stitch area based on the determination results and calculates a boundary line between the texture area and the stitch area.

Description

Image processing device, sewing machine, and image processing method
Technical Field
The invention relates to an image processing apparatus, a sewing machine, and an image processing method.
Background
In order to improve the appearance of the sewing object, stitches may be formed on the sewing object. Patent document 1 discloses a technique for forming stitches on a skin material used for a vehicle seat.
Documents of the prior art
Patent document
Patent document 1: Japanese Patent Laid-Open Publication No. 2013-162957
Disclosure of Invention
(I) Technical problem to be solved
A skin material for a vehicle seat may be provided with holes. By arranging the holes in a pattern, the aesthetic appearance of the vehicle seat can be further improved. Stitches are formed in areas where the hole pattern is not provided. To form the stitches, it is therefore necessary to distinguish the regions where the pattern of holes is provided from the regions where it is not.
An object of an aspect of the present invention is to distinguish a region where a pattern of holes is provided from a region where a pattern of holes is not provided.
(II) Technical solution
According to an aspect of the present invention, there is provided an image processing apparatus including: an object image acquiring unit that acquires an object image representing an image of a sewing object; a scanning unit that scans the object image using a predetermined search area and determines whether or not each pixel of interest is a pixel of a predetermined color; and an area dividing unit that divides the surface of the sewing object into a texture area and a stitch area based on the determination results and calculates a boundary line between the texture area and the stitch area.
(III) Advantageous effects
According to this aspect of the present invention, a region provided with a pattern of holes can be distinguished from a region not provided with one.
Drawings
Fig. 1 is a perspective view showing a sewing machine according to the present embodiment.
Fig. 2 is a perspective view showing a part of the sewing machine of the present embodiment.
Fig. 3 is a cross-sectional view showing a part of the sewing object of the present embodiment.
Fig. 4 is a plan view showing a sewing object of the present embodiment.
Fig. 5 is a sectional view showing a part of the sewing object of the present embodiment.
Fig. 6 is a plan view showing a part of the sewing object of the present embodiment.
Fig. 7 is a plan view showing a part of the sewing object of the present embodiment.
Fig. 8 is a functional block diagram showing the sewing machine of the present embodiment.
Fig. 9 is a diagram for explaining the operation of the imaging apparatus according to the present embodiment.
Fig. 10 is a diagram for explaining a boundary line calculation method according to the present embodiment.
Fig. 11 is a diagram for explaining the boundary line of the present embodiment.
Fig. 12 is a diagram for explaining a correction point in the present embodiment.
Fig. 13 is a diagram for explaining an example of a method of calculating the feature point according to the present embodiment.
Fig. 14 is a diagram for explaining an example of a method of calculating the feature point according to the present embodiment.
Fig. 15 is a diagram for explaining a reference vector according to the present embodiment.
Fig. 16 is a diagram for explaining the reference vector of the present embodiment.
Fig. 17 is a diagram for explaining an example of a correction point calculation method according to the present embodiment.
Fig. 18 is a diagram for explaining an example of a correction point calculation method according to the present embodiment.
Fig. 19 is a diagram for explaining an example of a correction point calculation method according to the present embodiment.
Fig. 20 is a flowchart showing a sewing method according to the present embodiment.
Fig. 21 is a flowchart showing the area division process according to the present embodiment.
Fig. 22 is a flowchart showing correction point calculation processing according to the present embodiment.
Fig. 23 is a block diagram showing a computer system according to the present embodiment.
Description of the reference numerals
1-sewing machine; 2-table; 3-sewing machine needle; 4-surface material; 5-cushioning material; 6-lining material; 7-hole; 10-sewing machine body; 11-sewing machine frame; 11A-horizontal arm; 11B-bed; 11C-vertical arm; 11D-head; 12-needle bar; 13-needle plate; 14-support member; 15-holding member; 15A-presser foot member; 15B-lower plate; 16-actuator; 17-actuator; 17X-X-axis motor; 17Y-Y-axis motor; 18-actuator; 19-middle presser foot member; 20-operating device; 21-operation panel; 22-operating pedal; 30-imaging device; 31-drive amount sensor; 32-drive amount sensor; 32X-X-axis sensor; 32Y-Y-axis sensor; 40-image processing device; 41-object image acquiring unit; 42-scanning unit; 43-area dividing unit; 44-correction point setting unit; 45-feature point extraction unit; 46-reference vector calculation unit; 47-reference vector storage unit; 48-correction amount calculation unit; 49-area-divided image output unit; 50-control device; 60-storage device; 61-sewing data storage unit; 62-design data storage unit; 63-program storage unit; 70-input device; 80-output device; 1000-computer system; 1001-processor; 1002-main memory; 1003-storage; 1004-interface; AP-pixel of interest; APb-pixel of interest; APs-pixel of interest; APt-pixel of interest; AX-optical axis; BL-boundary line; BP-boundary point; CP-correction point; CH-stitch; DM-area-divided image; FA-imaging area; FP-feature point; HA-search area; KP-candidate point; Pf-imaging position; Ps-sewing position; RL-target stitch line; RS-predetermined pattern; RV-reference vector; S-sewing object; SM-object image; SA-stitch area; TA-texture area; US-reference pattern; XL-reference line.
Detailed Description
Embodiments of the present invention will be described below with reference to the drawings, but the present invention is not limited thereto. The constituent elements of the embodiments described below may be combined as appropriate. In addition, some of the components may not be used.
In the present embodiment, a local coordinate system is defined for the sewing machine 1. In the following description, the local coordinate system defined for the sewing machine 1 is referred to as the sewing machine coordinate system. The sewing machine coordinate system is defined as an XYZ orthogonal coordinate system, and the positional relationship of each part is described based on it. The direction parallel to the X axis in a predetermined plane is the X-axis direction. The direction parallel to the Y axis, which is orthogonal to the X axis in the predetermined plane, is the Y-axis direction. The direction parallel to the Z axis, which is orthogonal to the predetermined plane, is the Z-axis direction. The rotation or tilt direction about the X axis is the θX direction, the rotation or tilt direction about the Y axis is the θY direction, and the rotation or tilt direction about the Z axis is the θZ direction. A plane including the X axis and the Y axis is referred to as the XY plane, a plane including the X axis and the Z axis as the XZ plane, and a plane including the Y axis and the Z axis as the YZ plane. The XY plane is parallel to the predetermined plane, and the XY, XZ, and YZ planes are mutually orthogonal. In the present embodiment, the XY plane is parallel to the horizontal plane; the Z-axis direction is the up-down direction, with the +Z direction upward and the -Z direction downward. The XY plane may also be inclined with respect to the horizontal plane.
< Sewing machine >
Fig. 1 is a perspective view showing a sewing machine 1 according to the present embodiment. Fig. 2 is a perspective view showing a part of the sewing machine 1 of the present embodiment. In the present embodiment, the sewing machine 1 is an electronic cycle sewing machine. The sewing machine 1 includes: a sewing machine body 10, an operation device 20 operated by an operator, and an imaging device 30 capable of imaging a sewing object S.
The sewing machine body 10 is mounted on the upper surface of the table 2. The sewing machine body 10 includes: a sewing machine frame 11, a needle bar 12 supported by the sewing machine frame 11, a needle plate 13 supported by the sewing machine frame 11, a holding member 15 supported by the sewing machine frame 11 via a support member 14, an actuator 16 that generates power for moving the needle bar 12, an actuator 17 that generates power for moving the holding member 15, and an actuator 18 that generates power for moving at least a part of the holding member 15.
The sewing machine frame 11 has: a horizontal arm 11A extending in the Y-axis direction, a bed 11B disposed below the horizontal arm 11A, a vertical arm 11C connecting the +Y-side end of the horizontal arm 11A to the bed 11B, and a head 11D disposed on the -Y side of the horizontal arm 11A.
The needle bar 12 holds the sewing machine needle 3 so that the needle is parallel to the Z axis. The needle bar 12 is supported by the head 11D so as to be movable in the Z-axis direction.
The needle plate 13 supports the sewing object S. The needle plate 13 supports the holding member 15. The needle plate 13 is supported on the bed 11B. The needle plate 13 is disposed below the holding member 15.
The holding member 15 holds the sewing object S. The holding member 15 can hold and move the sewing object S in an XY plane including a sewing position Ps directly below the sewing needle 3. The holding member 15 can hold and move the sewing object S in an XY plane including an imaging position Pf directly below the imaging device 30. The holding member 15 moves in the XY plane including the sewing position Ps based on the sewing data in a state of holding the sewing object S, thereby forming a stitch CH on the sewing object S. The holding member 15 is supported by the horizontal arm 11A via the support member 14.
The holding member 15 includes: a presser foot member 15A, and a lower plate 15B opposed to the presser foot member 15A. The presser foot member 15A is a frame-shaped member. The presser foot member 15A is movable in the Z-axis direction. The lower plate 15B is disposed below the presser foot member 15A. The holding member 15 holds the sewing object S by sandwiching the sewing object S between the presser foot member 15A and the lower plate 15B.
When the presser foot member 15A moves in the +Z direction, it separates from the lower plate 15B. The operator can then place the sewing object S between the presser foot member 15A and the lower plate 15B. When the presser foot member 15A moves in the -Z direction with the sewing object S placed between the presser foot member 15A and the lower plate 15B, the sewing object S is clamped between them and is thus held by the holding member 15. The holding member 15 releases the sewing object S when the presser foot member 15A moves in the +Z direction again, allowing the operator to take the sewing object S out from between the presser foot member 15A and the lower plate 15B.
The actuator 16 generates a motive force that moves the needle bar 12 in the Z-axis direction. The actuator 16 comprises a pulse motor. The actuator 16 is disposed on the horizontal arm 11A.
A horizontal arm shaft extending in the Y-axis direction is disposed inside the horizontal arm 11A. The actuator 16 is coupled to the +Y-side end of the horizontal arm shaft. The -Y-side end of the horizontal arm shaft is connected to the needle bar 12 via a transmission mechanism disposed inside the head 11D. The horizontal arm shaft rotates when the actuator 16 operates. The power generated by the actuator 16 is transmitted to the needle bar 12 via the horizontal arm shaft and the transmission mechanism. Thereby, the sewing machine needle 3 held by the needle bar 12 reciprocates in the Z-axis direction.
A timing belt extending in the Z-axis direction is disposed inside the vertical arm 11C, and a bed shaft extending in the Y-axis direction is disposed inside the bed 11B. Pulleys are provided on the horizontal arm shaft and the bed shaft, respectively, and the timing belt runs over both pulleys. The horizontal arm shaft and the bed shaft are thus connected via a transmission mechanism including the timing belt.
A hook is disposed inside the bed 11B. The hook accommodates a bobbin case that houses a bobbin. The horizontal arm shaft and the bed shaft rotate when the actuator 16 operates. The power generated by the actuator 16 is transmitted to the hook via the horizontal arm shaft, the timing belt, and the bed shaft. Thereby, the hook rotates in synchronization with the reciprocating movement of the needle bar 12 in the Z-axis direction.
The actuator 17 generates a power to move the holding member 15 in the XY plane. The actuator 17 comprises a pulse motor. The actuator 17 includes: an X-axis motor 17X that generates power to move the holding member 15 in the X-axis direction, and a Y-axis motor 17Y that generates power to move the holding member 15 in the Y-axis direction. The actuator 17 is disposed inside the bed 11B.
The power generated by the actuator 17 is transmitted to the holding member 15 via the support member 14. Thereby, the holding member 15 can move in the X-axis direction and the Y-axis direction between the sewing machine needle 3 and the needle plate 13, respectively. The holding member 15 can hold and move the sewing object S in an XY plane including a sewing position Ps directly below the sewing needle 3 by the operation of the actuator 17.
The actuator 18 generates power to move the presser foot member 15A of the holding member 15 in the Z-axis direction. The actuator 18 includes a pulse motor. When the presser foot member 15A moves in the +Z direction, it separates from the lower plate 15B. When the presser foot member 15A moves in the -Z direction, the sewing object S is held between the presser foot member 15A and the lower plate 15B.
As shown in fig. 2, the sewing machine body 10 has a middle presser foot member 19 disposed around the sewing machine needle 3. The middle presser foot member 19 presses the sewing object S around the sewing machine needle 3. The middle presser foot member 19 is supported by the head 11D so as to be movable in the Z-axis direction. A middle presser foot motor that generates power for moving the middle presser foot member 19 in the Z-axis direction is disposed inside the head 11D. The middle presser foot member 19 moves in the Z-axis direction in synchronization with the needle bar 12 by the operation of the middle presser foot motor. The middle presser foot member 19 suppresses floating of the sewing object S caused by the movement of the sewing machine needle 3.
The operation device 20 is operated by an operator. The sewing machine 1 is operated by operating the operating device 20. In the present embodiment, the operation device 20 includes an operation panel 21 and an operation pedal 22. An operation panel 21 is mounted on the upper surface of the table 2. The operating pedal 22 is disposed below the table 2. The operator operates the operating pedal 22 with his foot. The sewing machine 1 is operated by an operator operating at least one of the operation panel 21 and the operation pedal 22.
The imaging device 30 images the sewing object S held by the holding member 15. The imaging device 30 includes an optical system and an image sensor that receives light incident via the optical system. The image sensor includes a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor.
The imaging device 30 is disposed above the needle plate 13 and the holding member 15. The imaging position Pf includes the position of the optical axis AX of the optical system of the imaging device 30. An imaging area FA is defined for the imaging device 30. The imaging area FA includes a field of view area of the optical system of the imaging device 30. The imaging area FA contains the imaging position Pf. The imaging device 30 acquires an image of at least a part of the sewing object S arranged in the imaging area FA. The imaging device 30 images at least a part of the sewing object S disposed inside the presser foot member 15A from above.
The position of the imaging device 30 is fixed. The relative position of the imaging device 30 and the sewing machine frame 11 is fixed, as is the relative position, in the XY plane, of the optical axis AX of the optical system of the imaging device 30 and the sewing machine needle 3. The relative position data indicating the relative position of the optical axis AX and the sewing machine needle 3 in the XY plane is known data that can be derived from the design data of the sewing machine 1.
The position of the image acquired by the imaging device 30 is specified in the camera coordinate system. The position of the image defined in the camera coordinate system is converted to the position of the image defined in the sewing machine coordinate system by a predetermined conversion formula or matrix.
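As an illustration of this coordinate conversion, the sketch below applies a 2D rotation, scale, and offset to a camera-space point. The function name and all parameter values are hypothetical; the actual conversion formula or matrix is determined by calibration of the sewing machine 1.

```python
import numpy as np

def camera_to_machine(p_cam, scale, theta, offset):
    """Map a camera-space point (u, v) to sewing machine coordinates (x, y)."""
    c, s = np.cos(theta), np.sin(theta)
    rotation = np.array([[c, -s],
                         [s,  c]])
    return scale * rotation @ np.asarray(p_cam, dtype=float) + offset

# Example: 0.1 mm per pixel, no rotation, optical axis AX offset 80 mm in +Y
# from the sewing machine needle 3 (all values hypothetical).
print(camera_to_machine((320, 240), scale=0.1, theta=0.0,
                        offset=np.array([0.0, 80.0])))
```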
Further, if a mounting error of the imaging device 30 causes a difference between its actual position and the position in the design data, the position of the sewing machine needle 3 in the XY plane is measured after the imaging device 30 is mounted. The measured needle position is shifted toward the imaging device 30 by the amount given by the known data, and the difference between the actual position of the imaging device 30 in the XY plane and the shifted needle position is calculated. The accurate relative position of the optical axis AX of the optical system of the imaging device 30 and the sewing machine needle 3 can then be calculated based on this difference.
[ Sewing objects ]
Fig. 3 is a sectional view showing a part of the sewing object S of the present embodiment. Fig. 4 is a plan view showing the sewing object S of the present embodiment. Fig. 3 and 4 show the sewing object S before the sewing process. In the present embodiment, the sewing object S is a skin material for a vehicle seat.
As shown in fig. 3, the sewing object S includes: surface material 4, cushioning material 5, lining material 6. Holes 7 are provided in the surface material 4.
The surface of the surface material 4 is a seating surface that is in contact with an occupant when the occupant is seated in the vehicle seat. The surface material 4 includes at least one of woven cloth, nonwoven cloth, and leather. The cushioning material 5 has elasticity. The cushioning material 5 contains, for example, a polyurethane resin. The lining material 6 comprises at least one of woven cloth, non-woven cloth, and leather.
As shown in fig. 4, a plurality of holes 7 are provided in the surface material 4. The holes 7 are arranged in a predetermined pattern RS. The predetermined pattern RS includes a plurality of reference patterns US, each formed by a plurality of holes 7. In the present embodiment, each reference pattern US consists of 25 holes 7.
As shown in fig. 4, the reference patterns US are arranged on the surface material 4 at intervals, equally spaced in the X-axis direction and the Y-axis direction. Between reference patterns US adjacent in the X-axis direction, further reference patterns US are arranged at offset positions in the Y-axis direction. No hole 7 is formed between adjacent reference patterns US.
In the following description, an area of the surface of the surface material 4 where a reference pattern US is provided is referred to as a texture area TA, and an area between the reference patterns US where no reference pattern US is provided is referred to as a stitch area SA.
A target stitch line RL for a stitch CH to be formed on the sewing object S is defined in the stitch area SA.
[ Displacement of surface of Sewing object ]
Fig. 5 is a sectional view showing a part of the sewing object S of the present embodiment, after the sewing process. The sewing object S has thickness and elasticity. When stitches CH are formed on such a sewing object S, the sewing object S contracts as shown in fig. 5.
Fig. 6 and 7 are plan views each showing a part of the sewing object S of the present embodiment. Fig. 6 shows the sewing object S before the sewing process. Fig. 7 shows the sewing object S after the sewing process.
As shown in fig. 6, the target stitch line RL is specified in the stitch area SA. When the stitches CH are formed on the sewing object S, the sewing object S contracts. When the sewing object S contracts, the surface of the sewing object S is displaced. As shown in fig. 7, when the stitch CH is formed on the sewing object S, the surface of the sewing object S is displaced in the XY plane with respect to the target stitch line RL.
When the surface of the sewing object S is displaced in the XY plane with respect to the target stitch line RL, moving the holding member 15 in accordance with the original target stitch line RL makes it difficult to form the stitch CH at the desired position on the surface of the sewing object S.
In the present embodiment, when the sewing object S contracts due to the formation of the stitch CH and its surface is displaced, the position of the target stitch line RL is corrected based on the displacement amount of the surface of the sewing object S. The holding member 15 then moves based on the corrected target stitch line RL.
[ Image processing apparatus ]
Fig. 8 is a functional block diagram showing the sewing machine 1 of the present embodiment. The sewing machine 1 includes: an image processing device 40, a control device 50, and a storage device 60.
The image processing apparatus 40 includes a computer system. As shown in fig. 8, the image processing device 40 is connected to the imaging device 30, the control device 50, the storage device 60, the input device 70, and the output device 80, respectively. The image processing device 40 processes the image of the sewing object S.
The input device 70 is operated by an operator to generate input data. The input device 70 may be exemplified by a computer keyboard, a mouse, and a touch panel.
The output device 80 outputs output data. Examples of the output device 80 include a display device and a printing device. The display device outputs display data as output data, and the printing device outputs print data as output data. Examples of the display device include flat panel displays such as a liquid crystal display (LCD) and an organic EL display (OELD). An inkjet printer can be exemplified as the printing device.
The control device 50 includes a computer system. As shown in fig. 8, the control device 50 is connected to an actuator 16 for moving the sewing needle 3 in the Z-axis direction, an actuator 17 for moving the holding member 15 in the XY plane, an actuator 18 for moving the presser foot member 15A of the holding member 15 in the Z-axis direction, the operation device 20, the image processing device 40, and the storage device 60, respectively. The control device 50 outputs a control command for controlling the actuator 17 that moves the holding member 15, based on the processing result of the image processing device 40.
The control device 50 is connected to a drive amount sensor 31 that detects the drive amount of the actuator 16 and a drive amount sensor 32 that detects the drive amount of the actuator 17.
The driving amount sensor 31 includes an encoder that detects the amount of rotation of a pulse motor as the actuator 16. The detection data of the driving amount sensor 31 is output to the control device 50.
The drive amount sensor 32 includes: an X-axis sensor 32X that detects the amount of rotation of the X-axis motor 17X, and a Y-axis sensor 32Y that detects the amount of rotation of the Y-axis motor 17Y. The X-axis sensor 32X includes an encoder that detects the amount of rotation of the X-axis motor 17X. The Y-axis sensor 32Y includes an encoder that detects the amount of rotation of the Y-axis motor 17Y. The detection data of the driving amount sensor 32 is output to the control device 50.
The driving amount sensor 32 functions as a position sensor that detects the position of the holding member 15 in the XY plane. The driving amount of the actuator 17 corresponds one-to-one to the moving amount of the holding member 15.
The X-axis sensor 32X can detect the amount of movement of the holding member 15 in the X-axis direction from the origin in the sewing machine coordinate system by detecting the amount of rotation of the X-axis motor 17X. The Y-axis sensor 32Y can detect the amount of movement of the holding member 15 in the Y-axis direction from the origin in the sewing machine coordinate system by detecting the amount of rotation of the Y-axis motor 17Y.
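As a minimal sketch of this conversion, assuming a hypothetical fixed ratio between motor pulses and holder travel (the real ratio depends on the pulse motors and drive mechanism), the accumulated encoder counts map to an XY position as follows:

```python
# PULSES_PER_MM is an assumed constant; the real ratio depends on the pulse
# motors and the drive mechanism that move the holding member 15.
PULSES_PER_MM = 100.0

def holder_position(x_counts, y_counts, origin=(0.0, 0.0)):
    """Convert accumulated X/Y encoder counts to a position in the XY plane (mm)."""
    return (origin[0] + x_counts / PULSES_PER_MM,
            origin[1] + y_counts / PULSES_PER_MM)

print(holder_position(2500, -1200))  # -> (25.0, -12.0)
```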
The control device 50 controls the actuator 16 based on the detection data of the driving amount sensor 31. The control device 50 determines, for example, the operation timing of the actuator 16 based on the detection data of the driving amount sensor 31.
The control device 50 controls the actuator 17 based on the detection data of the driving amount sensor 32. The control device 50 performs feedback control on the actuator 17 based on the detection data of the driving amount sensor 32 so that the holding member 15 moves to a desired position.
The control device 50 calculates the position of the holding member 15 in the XY plane based on the detection data of the driving amount sensor 32. The movement amount of the holding member 15 from the origin in the XY plane is detected based on the detection data of the driving amount sensor 32. The control device 50 calculates the position of the holding member 15 in the XY plane based on the detected amount of movement of the holding member 15.
The storage device 60 includes a nonvolatile memory such as a ROM (Read-Only Memory) or storage, and a volatile memory such as a RAM (Random Access Memory). As shown in fig. 8, the storage device 60 is connected to the image processing device 40 and the control device 50, respectively.
The storage device 60 includes: a sewing data storage section 61, a design data storage section 62, and a program storage section 63.
The sewing data storage section 61 stores sewing data referred to in the sewing process.
The sewing process is a process of forming a stitch CH on the sewing object S. In the present embodiment, the sewing process includes: a first sewing process for forming the first stitch CH1, and a second sewing process for forming the second stitch CH2. Similarly, the sewing process includes third to fourteenth sewing processes for forming the third to fourteenth stitches CH3 to CH14, respectively.
The sewing data includes: a target stitch line RL of the stitch CH formed on the sewing object S, and a moving condition of the holding member 15.
The target stitch line RL defines a target shape of the stitch CH formed on the sewing object S and a target position of the stitch CH in the sewing machine coordinate system.
As shown in fig. 4, the target stitch lines RL include: a first target stitch line RL1 for forming the first stitch CH1, and a second target stitch line RL2 for forming the second stitch CH2. Likewise, the target stitch lines RL include third to fourteenth target stitch lines RL3 to RL14 for forming the third to fourteenth stitches CH3 to CH14, respectively.
The moving conditions of the holding member 15 include the movement locus of the holding member 15 defined in the sewing machine coordinate system. The movement locus includes the locus of the holding member 15 in the XY plane. The moving conditions of the holding member 15 are determined based on the target stitch line RL.
The first sewing process includes a process of forming the first stitch CH1 on the sewing object S based on the first target stitch line RL1. The first sewing process is performed first after the sewing object S is held by the holding member 15.
The second sewing process includes a process of forming the second stitch CH2 on the sewing object S based on the second target stitch line RL2. The second sewing process is performed following the first sewing process.
Similarly, the third sewing process to the fourteenth sewing process include processes of forming the third stitch CH3 to the fourteenth stitch CH14 on the sewing object S based on the third target stitch RL3 to the fourteenth target stitch RL14, respectively. The third sewing process to the fourteenth sewing process are performed in sequence.
The design data storage 62 stores design data of the sewing object S. The design data of the sewing object S comprises: the position and the range of the texture area TA on the surface of the sewing object S, the position and the range of the stitch area SA, and the shape and the size of the reference pattern US. In the case of designing the sewing object S by CAD (Computer Aided Design), the Design data of the sewing object S includes CAD data.
The design data of the sewing object S is the design data of the sewing object S in the initial state. The initial state of the sewing object S is a state before the first sewing process. That is, the initial state of the sewing object S is a state where no stitch CH is formed on the sewing object S.
The program storage 63 stores a computer program for controlling the sewing machine 1. The computer program is read by the control device 50. The control device 50 controls the sewing machine 1 in accordance with the computer program stored in the program storage unit 63.
The image processing apparatus 40 includes: an object image acquiring unit 41, a scanning unit 42, an area dividing unit 43, a correction point setting unit 44, a feature point extraction unit 45, a reference vector calculation unit 46, a reference vector storage unit 47, a correction amount calculation unit 48, and an area-divided image output unit 49.
The object image acquiring unit 41 acquires an object image SM representing an image of the sewing object S. The imaging device 30 images the sewing object S and outputs an object image SM to the image processing device 40. The object image acquisition unit 41 acquires the object image SM from the imaging device 30.
Fig. 9 is a diagram for explaining the operation of the imaging device 30 according to the present embodiment. The imaging device 30 images the sewing object S held by the holding member 15. As shown in fig. 9, the imaging area FA of the imaging device 30 is smaller than the sewing object S. The plurality of object images SM can be acquired by the relative movement of the sewing object S and the imaging area FA of the imaging device 30. The holding member 15 holding the sewing object S moves in an XY plane including the imaging position Pf of the imaging device 30. The imaging device 30 images a part of the sewing object S disposed in the imaging area FA. The movement of the holding member 15 in the XY plane and the imaging process of a part of the sewing object S arranged in the imaging area FA are repeated, thereby sequentially acquiring a plurality of object images SM.
As shown in fig. 9, the sewing object S and the imaging device 30 are moved relative to each other while sequentially acquiring a plurality of object images SM so that the imaging area FA of the first imaging process of the imaging device 30 overlaps a part of the imaging area FA of the second imaging process. The object images SM acquired in the plurality of imaging processes are joined to each other, thereby generating the object image SM of the entire sewing object S.
The imaging device 30 images the sewing object S before the sewing process. That is, the imaging device 30 images the sewing object S before the first sewing process and before the second sewing process, respectively. Similarly, the imaging device 30 images the sewing object S before the third sewing process to the fourteenth sewing process, respectively. The object image acquiring unit 41 acquires the object images SM of the sewing object S photographed before the first sewing process to the fourteenth sewing process from the imaging device 30.
In addition, when the object image SM that the imaging device 30 would acquire before the first sewing process is identical to the design data of the sewing object S in the initial state stored in the design data storage unit 62, the object image acquiring unit 41 may acquire the object image SM for the first sewing process from the design data storage unit 62 instead.
The scanning unit 42 scans the object image SM in a predetermined search area HA, and determines the state of each of a plurality of pixels constituting the object image SM.
The area dividing unit 43 divides the surface of the sewing object S into the texture area TA and the stitch area SA based on the object image SM, and calculates the boundary line BL between the texture area TA and the stitch area SA.
Fig. 10 is a diagram for explaining a method of calculating the boundary line BL according to the present embodiment. As shown in fig. 10, the scanning unit 42 scans the object image SM in a predetermined search area HA, and acquires the states of a plurality of pixels constituting the object image SM.
The object image SM may be preprocessed before scanning. Examples of the preprocessing include smoothing filter processing and maximum value filter processing. Preprocessing allows the determination of the pixel states to be performed smoothly.
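A minimal sketch of such preprocessing, using SciPy's standard smoothing (mean) and maximum filters; the filter sizes are assumptions for illustration, not values specified in the patent:

```python
from scipy import ndimage

def preprocess(gray_image, smooth_size=3, max_size=5):
    """Apply a smoothing (mean) filter followed by a maximum value filter."""
    smoothed = ndimage.uniform_filter(gray_image.astype(float), size=smooth_size)
    return ndimage.maximum_filter(smoothed, size=max_size)
```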
The scanning unit 42 determines whether or not each of the plurality of pixels is a pixel of a predetermined color. In the present embodiment, the predetermined color is black, and the predetermined color pixel is a black pixel. The scanning unit 42 determines whether each of the plurality of pixels is a black pixel. In the following description, a pixel that is the target of this determination is referred to as a pixel of interest AP.
As shown in fig. 10, the scanning unit 42 determines that the pixel of interest APb, which belongs to the image of a hole 7, is a black pixel. The scanning unit 42 determines that the pixels of interest APt and APs, which belong to the image of the surface material 4, are not black pixels.
The area dividing unit 43 divides the surface of the sewing object S into the texture area TA and the stitch area SA based on the determination result of the scanning unit 42.
The area dividing unit 43 classifies each pixel of interest AP determined to be a black pixel into the texture area TA. In the example shown in fig. 10, the area dividing unit 43 classifies the pixel of interest APb, determined to be a black pixel, into the texture area TA.
The area dividing unit 43 determines whether the relative position between a pixel of interest AP determined not to be a black pixel and the black pixels around it satisfies a predetermined condition. The predetermined condition includes the condition that the relative distance between the pixel of interest AP and a black pixel is equal to or less than a predetermined threshold value. The threshold value is determined based on the spacing of adjacent holes 7 in the texture area TA and the width of the stitch area SA.
In the example shown in fig. 10, the pixel of interest APt belongs to the image of the surface material 4 and also to the image of the texture area TA, while the pixel of interest APs belongs to the image of the stitch area SA. The distance between the pixel of interest APt and the black pixels of the surrounding holes 7 is small, and is equal to or less than the threshold value. The distance between the pixel of interest APs and the black pixels of the surrounding holes 7 is large, and exceeds the threshold value.
When the area dividing unit 43 determines that the distance between a pixel of interest determined not to be a black pixel and the surrounding black pixels is equal to or less than the threshold value, as for APt, it classifies that pixel into the texture area TA. When it determines that the distance is greater than the threshold value, as for APs, it classifies that pixel into the stitch area SA.
The area dividing unit 43 classifies all the pixels of the object image SM into the texture area TA and the stitch area SA. By classifying the pixels of the object image SM in this way, the area dividing unit 43 can calculate the boundary line BL between the texture area TA and the stitch area SA.
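The classification rule above can be sketched compactly with a distance transform, which gives every pixel its distance to the nearest black pixel in one step instead of an explicit search-area scan. BLACK_LEVEL and THRESHOLD are hypothetical values; the patent derives the threshold from the hole spacing in the texture area TA and the width of the stitch area SA.

```python
import numpy as np
from scipy import ndimage

BLACK_LEVEL = 50   # 8-bit gray level at or below which a pixel counts as black
THRESHOLD = 12.0   # max distance (pixels) to a black pixel for the texture area

def divide_regions(gray_image):
    """Split an object image SM into texture area TA and stitch area SA masks."""
    is_black = gray_image <= BLACK_LEVEL
    # Distance from every pixel to its nearest black pixel (0 on black pixels),
    # standing in for the search-area scan described in the text.
    dist = ndimage.distance_transform_edt(~is_black)
    texture_area = dist <= THRESHOLD   # black pixels and pixels near a hole 7
    stitch_area = ~texture_area        # pixels far from every hole 7
    return texture_area, stitch_area
```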
Post-processing of the object image SM may be performed after the classification of the pixels of interest AP is completed, and the boundary line BL may be calculated after the post-processing. Examples of the post-processing include opening, closing, noise removal, and hole filling. After the post-processing, labeling and contour extraction are performed to calculate a plurality of reference points representing the boundary between the texture area TA and the stitch area SA. The area dividing unit 43 calculates a least-squares curve from the plurality of reference points and may use the calculated least-squares curve as the boundary line BL.
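A minimal sketch of the least-squares step, fitting a polynomial through the reference points; the patent specifies only a least-square curve, so the polynomial form and degree here are assumptions:

```python
import numpy as np

def fit_boundary(points, degree=2):
    """Fit a least-squares polynomial y = f(x) through boundary reference points."""
    pts = np.asarray(points, dtype=float)
    coeffs = np.polyfit(pts[:, 0], pts[:, 1], degree)
    return np.poly1d(coeffs)

# Example: five reference points along one edge of a texture area TA.
bl = fit_boundary([(0, 1.0), (1, 0.4), (2, 0.1), (3, 0.5), (4, 1.1)])
print(bl(1.5))  # boundary position between reference points
```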
Fig. 11 is a diagram for explaining the boundary line BL according to the present embodiment. As shown in fig. 11, the boundary line BL is a line passing through the edge of the texture area TA. When the texture area TA includes the holes 7, the boundary line BL connects the outer edges of the outermost holes 7 in the texture area TA.
The correction point setting unit 44 sets correction points CP for correcting the target stitch lines RL defined in the stitch area SA. Before the first sewing process, correction points CP for correcting the first to fourteenth target stitch lines RL1 to RL14 are set.
Fig. 12 is a diagram for explaining the correction point CP in the present embodiment. As shown in fig. 12, a correction point CP is set on the sewing object S and is used to correct the target stitch line RL. The operator can set the correction point CP at an arbitrary position on the sewing object S by operating the input device 70, and the correction point setting unit 44 sets the correction point CP based on the input data generated by that operation. In the example shown in fig. 12, the correction point CP is set so as to overlap the target stitch line RL to be corrected in the stitch area SA. The correction point CP may instead be set near the target stitch line RL at a position off the line, or in the texture area TA. The position of the correction point CP is defined in the sewing machine coordinate system.
When the stitch CH is formed, the surface of the sewing object S is displaced with respect to the correction point CP and the target stitch line RL.
The feature point extraction unit 45 extracts feature points FP of the texture area TA based on the boundary lines BL calculated by the area dividing unit 43. A feature point FP is a portion of the boundary line BL having a characteristic shape, and this characteristic shape is substantially maintained even when the surface of the sewing object S is displaced. Examples of the feature point FP include a corner point, a maximum point, a minimum point, and an inflection point of the boundary line BL.
Fig. 13 is a diagram for explaining an example of the method of calculating the feature point FP according to the present embodiment. As shown in fig. 13, the feature point extraction unit 45 defines, for example, a reference line XL, and calculates the distances between the reference line XL and each of a plurality of boundary points BP of the boundary line BL. The distance is a distance in a direction orthogonal to the reference line XL. The feature point extraction unit 45 may determine the boundary point BP having the longest distance to the reference line XL as the feature point FP.
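A sketch of this search, assuming the reference line XL is given by a point and a direction (both hypothetical here): the boundary point BP with the greatest perpendicular distance from XL is returned as the feature point FP.

```python
import numpy as np

def farthest_boundary_point(boundary_points, line_point, line_dir):
    """Return the boundary point BP with the greatest distance to the line XL."""
    bp = np.asarray(boundary_points, dtype=float)
    d = np.asarray(line_dir, dtype=float)
    d = d / np.linalg.norm(d)
    rel = bp - np.asarray(line_point, dtype=float)
    # Perpendicular distance of each boundary point from the reference line.
    dist = np.abs(rel[:, 0] * d[1] - rel[:, 1] * d[0])
    return bp[np.argmax(dist)]

fp = farthest_boundary_point([(0, 0.2), (1, 0.9), (2, 0.3)],
                             line_point=(0, 0), line_dir=(1, 0))
print(fp)  # -> [1.  0.9], the candidate feature point FP
```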
There may be no clear feature point FP, such as a corner point, maximum point, minimum point, or inflection point, on the boundary line BL. The feature point FP may also be calculated without using the reference line XL.
Fig. 14 is a diagram for explaining an example of the method of calculating the feature point FP according to the present embodiment. As shown in fig. 14, when there is no clear feature point FP such as a corner point, maximum point, minimum point, or inflection point on the boundary lines BL, the feature point extraction unit 45 extracts the feature point FP based on the relative distances between a plurality of boundary lines BL. When a pair of boundary lines BL face each other, the feature point extraction unit 45 calculates the distance between a boundary point BP on one boundary line BL and the corresponding boundary point BP on the other boundary line BL. In the example shown in fig. 14, each boundary point BP on one boundary line BL is at the same X-axis position as its counterpart on the other boundary line BL. The feature point extraction unit 45 calculates this distance for each pair of boundary points BP and may determine the boundary point BP with the longest distance as the feature point FP.
As described with reference to fig. 9, the plurality of object images SM are acquired while overlapping a part of the imaging area FA of each of the plurality of imaging processes. The object images SM of the entire sewing object S are generated by joining the plurality of object images SM. Therefore, even if there is no clear feature point FP, the feature point extraction unit 45 can search for a pair of opposing boundary lines BL as shown in fig. 14 from the joined object images SM.
The imaging device 30 may enlarge the imaging area FA to collectively acquire the object image SM of the entire sewing object S. When the object images SM of the entire sewing object S are acquired at once, even if there is no clear feature point FP, a pair of opposing boundary lines BL as shown in fig. 14 can be searched for.
The reference vector calculation unit 46 calculates a reference vector RV indicating the relative position between the correction point CP set by the correction point setting unit 44 and the boundary point BP set on the boundary line BL.
Fig. 15 and 16 are views for explaining the reference vector RV of the present embodiment. Fig. 16 is an enlarged view of a part of fig. 15. A correction point CP is set on the sewing object S before sewing processing. In fig. 15 and 16, the correction point CP is set at the sewing object S in the initial state before the first sewing process. The boundary point BP is set at the boundary line BL. The boundary points BP are set in the vicinity of the plurality of holes 7, respectively. The boundary point BP is set in the vicinity of the correction point CP. The reference vector RV represents the orientation of the correction point CP with respect to the boundary point BP and the distance between the boundary point BP and the correction point CP on the sewing object S before the first sewing process. A reference vector RV is defined in a sewing machine coordinate system.
A plurality of boundary points BP are set on the boundary line BL. The boundary point BP contains the feature point FP. In the examples shown in fig. 15 and 16, the feature point FP is a corner point (corner) of the boundary line BL. As shown in fig. 15, the reference vector calculation unit 46 calculates the reference vectors RV for the correction point CP and the boundary points BP including the feature point FP. A plurality of reference vectors RV are calculated. The number of boundary points BP is equal to the number of reference vectors RV.
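A minimal sketch of the reference vector calculation: each reference vector RV is simply the offset from a boundary point BP to the correction point CP in machine coordinates. The coordinate values below are illustrative only.

```python
import numpy as np

def reference_vectors(boundary_points, correction_point):
    """One reference vector RV per boundary point BP, pointing to the CP."""
    bp = np.asarray(boundary_points, dtype=float)
    cp = np.asarray(correction_point, dtype=float)
    return cp - bp

rvs = reference_vectors([(10, 5), (14, 5), (10, 9)], correction_point=(12, 7))
print(rvs)  # [[ 2.  2.] [-2.  2.] [ 2. -2.]]
```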
The reference vector storage unit 47 stores relative position data of the boundary point BP and the correction point CP represented by the reference vector RV calculated by the reference vector calculation unit 46.
The correction amount calculation unit 48 calculates the correction point CP after the first sewing process based on the boundary points BP of the sewing object S after the first sewing process and the reference vectors RV stored in the reference vector storage unit 47. Because the sewing process may displace the surface of the sewing object S with respect to the correction point CP, the position of the correction point CP after the first sewing process is corrected based on the displacement amount of the surface of the sewing object S.
Fig. 17 is a diagram for explaining an example of a method of calculating the correction point CP according to the present embodiment. The object image SM of the sewing object S after the first sewing process is acquired by the imaging device 30. The boundary line BL is calculated based on this object image SM, and a plurality of boundary points BP are set on the boundary line BL, each in the vicinity of a hole 7. The boundary points BP after the first sewing process correspond one-to-one to the boundary points BP before the first sewing process.
The correction amount calculation unit 48 calculates a plurality of candidate points KP of the correction point CP based on the plurality of boundary points BP of the sewing object S after the first sewing process and the plurality of reference vectors RV calculated before the first sewing process. The correction amount calculation unit 48 calculates the correction point CP after the first sewing process based on the plurality of candidate points KP.
That is, the correction amount calculation unit 48 adds the reference vectors RV calculated before the first sewing process to the boundary points BP after the first sewing process. Adding a reference vector RV to a post-sewing boundary point BP means calculating a candidate point KP, i.e., the point displaced from that boundary point BP by the distance and in the direction indicated by the reference vector RV.
The correction amount calculation unit 48 adds, to each boundary point BP after the first sewing process, the reference vector RV calculated for the corresponding boundary point BP before the first sewing process. For example, when a first reference vector RV1 was calculated for a first boundary point BP1 before the first sewing process, the correction amount calculation unit 48 adds the first reference vector RV1 to the corresponding first boundary point BP1 after the first sewing process. When a second reference vector RV2 was calculated for a second boundary point BP2 before the first sewing process, the correction amount calculation unit 48 adds the second reference vector RV2 to the corresponding second boundary point BP2 after the first sewing process. The same applies to every other boundary point BP and its reference vector RV.
The correction amount calculation unit 48 takes, as a candidate point KP, the point where the tips of the reference vectors RV added to the plurality of boundary points BP on one boundary line BL intersect. In the example shown in fig. 17, there are four texture areas TA and four boundary lines BL, three boundary points BP are set on each boundary line BL, and four candidate points KP are calculated.
The correction amount calculation unit 48 sets the correction point CP in a partial region surrounded by the four candidate points KP in the stitch area SA. In the present embodiment, the correction amount calculation unit 48 calculates the center of gravity (center point) of the four candidate points KP in the XY plane, and sets the center of gravity as the correction point CP.
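A minimal sketch of this update, assuming the stored reference vectors RV are ordered to match the post-sewing boundary points BP one-to-one: the candidate points KP are the vector sums, and the corrected correction point CP is their unweighted centroid, as described above.

```python
import numpy as np

def corrected_point(post_boundary_points, stored_vectors):
    """Add each stored RV to its post-sewing BP and average the candidates KP."""
    bp = np.asarray(post_boundary_points, dtype=float)
    rv = np.asarray(stored_vectors, dtype=float)
    candidates = bp + rv            # candidate points KP
    return candidates.mean(axis=0)  # unweighted center of gravity -> new CP

cp = corrected_point([(9, 5), (15, 4), (10, 10)],
                     [(2, 2), (-2, 2), (2, -2)])
print(cp)  # displaced correction point CP in machine coordinates
```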
Further, a weight may be set to at least one of the plurality of candidate points KP. For example, a weight may be added to the candidate point KP calculated based on the reference vector RV of the feature point FP closest to the correction point CP.
Further, in the calculation of the candidate points KP, only the feature points FP may be considered without considering the boundary points BP that are not the feature points FP. That is, the candidate point KP may be calculated based only on the reference vector RV added to the feature point FP.
Here, the reference vector RV is calculated in an initial state before the first sewing process, the candidate point KP is calculated based on the boundary point BP and the reference vector RV after the first sewing process, and the correction point CP is calculated based on the candidate point KP. The reference vector RV may be calculated before the second sewing process, the candidate point KP may be calculated based on the boundary point BP and the reference vector RV after the second sewing process, and the correction point CP after the second sewing process may be calculated based on the candidate point KP. The same processing is performed during each of the third sewing process to the fourteenth sewing process.
The position of the correction point CP after the sewing process may be calculated without using the reference vector RV.
Fig. 18 is a diagram for explaining an example of a method of calculating the correction point CP according to the present embodiment. Fig. 18(A) shows a part of the sewing object S before the sewing process arranged in the imaging area FA, and fig. 18(B) shows the same part after the sewing process. As shown in fig. 18(A), a pair of boundary lines BL face each other and extend in the Y-axis direction, with no clear feature point FP on either boundary line BL. As shown in fig. 18(B), when the boundary lines BL move in the -X direction due to the sewing process, it can be estimated that the correction point CP moves in the -X direction by the same amount as the boundary lines BL. In this way, when there is no clear feature point FP on the boundary line BL, the position of the correction point CP after the sewing process can be calculated from the movement amount of the boundary line BL. The same applies when the boundary lines BL move in the +X direction, and when a pair of boundary lines BL extending in the X-axis direction move in the +Y or -Y direction.
Fig. 19 is a diagram for explaining an example of a method of calculating the correction point CP according to the present embodiment. Fig. 19(A) shows a part of the sewing object S before the sewing process arranged in the imaging area FA, and fig. 19(B) shows the same part after the sewing process. As shown in fig. 19(A), a pair of boundary lines BL face each other, each extending in a direction inclined with respect to both the X-axis and Y-axis directions, with no clear feature point FP on either boundary line BL. As shown in fig. 19(B), when the boundary lines BL move due to the sewing process, it is difficult to determine whether the sewing object S has moved in the X-axis direction or the Y-axis direction within the imaging area FA, because the boundary lines BL are inclined with respect to both axes. In this case, the imaging device 30 may enlarge the imaging area FA and search for a pair of boundary lines BL from which a feature point FP can be calculated, as described with reference to fig. 14. The output device 80 may also output a warning when the direction of movement cannot be determined.
In the present embodiment, after dividing the surface of the sewing object S into the texture area TA and the stitch area SA, the area dividing unit 43 generates the area-divided image DM, an image including the texture area TA and the stitch area SA. The area-divided image output unit 49 outputs the area-divided image DM including the texture area TA and the stitch area SA divided by the area dividing unit 43 to the output device 80.
[ Sewing method ]
Fig. 20 is a flowchart showing a sewing method according to the present embodiment. In the present embodiment, the sewing method includes: a calibration process S0, an object image acquisition process S1, an area dividing process S2, a correction point calculation process S3, a target stitch correction process S4, a sewing process S5, and an end determination process S6.
The calibration process S0 is a process of associating the texture area TA and the stitch area SA of the sewing object S held by the holding member 15 with the sewing machine coordinate system. After the sewing object S before the sewing process is held by the holding member 15, the imaging device 30 images the sewing object S, for example a plurality of feature points FP of the sewing object S. In addition, when a calibration mark is provided on the sewing object S, the imaging device 30 may image the calibration mark. The position of the image of the sewing object S captured by the imaging device 30 is defined in the camera coordinate system. A position defined in the camera coordinate system is converted into a position defined in the sewing machine coordinate system by a predetermined conversion formula or matrix. Thus, the position of the texture area TA and the position of the stitch area SA of the sewing object S can be defined in the sewing machine coordinate system.
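As an illustration of such a conversion, the sketch below assumes an affine transform x' = Ax + t between the camera coordinate system and the sewing machine coordinate system; the patent speaks only of "a predetermined conversion formula or matrix", so the affine form, the Python/NumPy expression, and all names are assumptions.

    import numpy as np

    def camera_to_machine(points_cam, A, t):
        # Convert positions defined in the camera coordinate system into the
        # sewing machine coordinate system with the affine transform x' = A x + t.
        # A (2x2) and t (2,) would come out of the calibration process S0, e.g.
        # from imaged feature points FP or calibration marks whose machine
        # coordinates are known.
        pts = np.asarray(points_cam, dtype=float)
        return pts @ np.asarray(A, dtype=float).T + np.asarray(t, dtype=float)

    # Example: pure translation of (5.0, -3.0) between the two systems.
    A = [[1.0, 0.0],
         [0.0, 1.0]]
    t = [5.0, -3.0]
    camera_to_machine([[10.0, 20.0]], A, t)   # -> array([[15., 17.]])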
The object image acquisition process S1 is a process of acquiring the object image SM. Before the first sewing process, the object image acquiring unit 41 may acquire the object image SM in the initial state from the imaging device 30, or may acquire the object image SM in the initial state from the design data storage unit 62.
After the first sewing process, the object image SM is acquired by the imaging device 30. The imaging device 30 acquires the object image SM after the first sewing process and before the second sewing process, and acquires the object image SM after the second sewing process and before the third sewing process. Similarly, the imaging device 30 acquires the object image SM during each of the third sewing process to the fourteenth sewing process.
The area dividing process S2 is a process of dividing the surface of the sewing object S into the texture area TA and the stitch area SA based on the object image SM, and of calculating the boundary line BL between the texture area TA and the stitch area SA. The area dividing process S2 is performed before the first sewing process, and is performed again after the first sewing process and before the second sewing process. Similarly, the area dividing process S2 is performed during each of the second sewing process to the fourteenth sewing process.
Fig. 21 is a flowchart showing the area dividing process according to the present embodiment. The scanning unit 42 scans the object image SM with the predetermined search area HA (step S21).
The scanning unit 42 determines whether or not the pixel of interest AP of the object image SM is a black pixel (step S22).
When the pixel of interest AP is determined to be a black pixel in step S22 (yes in step S22), the area dividing unit 43 classifies the pixel of interest AP determined to be a black pixel into the texture area TA (step S23).
When it is determined in step S22 that the pixel of interest AP is not a black pixel (no in step S22), the area dividing unit 43 determines whether or not the relative position between the pixel of interest AP determined not to be a black pixel and the black pixels around it satisfies a predetermined condition (step S24).
When it is determined in step S24 that the pixel of interest AP satisfies the predetermined condition (yes in step S24), the area dividing unit 43 classifies the pixel of interest AP determined to satisfy the predetermined condition into the texture area TA (step S23).
When it is determined in step S24 that the pixel of interest AP does not satisfy the predetermined condition (no in step S24), the area dividing unit 43 classifies the pixel of interest AP determined not to satisfy the predetermined condition into the stitch area SA (step S25).
The area dividing unit 43 calculates the boundary line BL between the texture area TA and the stitch area SA based on the pixels classified into the texture area TA and the pixels classified into the stitch area SA (step S26).
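Steps S21 to S25 can be summarized as a pixel-classification pass. The following sketch is hypothetical: it assumes a binarized input image, and it stands in for the patent's unspecified relative-position condition with a simple "black pixel within a small window" test whose window size is arbitrary.

    import numpy as np

    def divide_regions(image, window=2):
        # Classify each pixel of a binarized object image SM into the texture
        # area TA (True) or the stitch area SA (False), following steps S21-S25:
        # a black pixel goes to TA (step S23); a non-black pixel goes to TA only
        # when the assumed proximity condition holds, i.e. at least one black
        # pixel lies within `window` pixels of it (steps S24-S25).
        black = np.asarray(image) == 0                # black pixels (step S22)
        h, w = black.shape
        ta = black.copy()                             # black pixels -> TA
        for y in range(h):
            for x in range(w):
                if black[y, x]:
                    continue
                y0, y1 = max(0, y - window), min(h, y + window + 1)
                x0, x1 = max(0, x - window), min(w, x + window + 1)
                if black[y0:y1, x0:x1].any():         # relative-position condition
                    ta[y, x] = True                   # -> texture area TA
                # otherwise ta[y, x] stays False       # -> stitch area SA
        return ta

The boundary line BL of step S26 could then be extracted from the returned mask, for example by tracing the contour between pixels classified into the texture area TA and pixels classified into the stitch area SA.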
The correction point calculation process S3 is a process in which a reference vector RV indicating the relative position between a correction point CP set on the sewing object S before the sewing process and a boundary point BP set on the boundary line BL is calculated, and the correction point CP after the sewing process is calculated based on the boundary point BP of the sewing object S after the sewing process and the reference vector RV. In the initial state before the first sewing process, the operator sets the correction point CP by operating the input device 70. The correction point setting unit 44 sets the correction point CP in the initial state based on the input data of the input device 70. The correction point calculation process S3 is not performed before the first sewing process. It is performed after the first sewing process and before the second sewing process, and after the second sewing process and before the third sewing process.
Similarly, the correction point calculation process S3 is performed during each of the fourth sewing process to the fourteenth sewing process.
Fig. 22 is a flowchart showing the correction point calculation process S3 according to the present embodiment. The area dividing unit 43 sets boundary points BP on the boundary line BL around the correction point CP, that is, in the vicinity of the correction point CP (step S31).
The feature point extraction unit 45 extracts the feature points FP based on the boundary lines BL (step S32).
The feature points FP exist on the boundary line BL, and the boundary points BP include the feature points FP.
The correction amount calculation unit 48 acquires a plurality of reference vectors RV from the reference vector storage unit 47 (step S33).
As described with reference to fig. 15 and 16, the reference vector RV is calculated by the reference vector calculation unit 46 before the correction point calculation process (before the sewing process), and stored in the reference vector storage unit 47. Therefore, the correction amount calculation unit 48 can acquire the plurality of reference vectors RV calculated before the sewing process from the reference vector storage unit 47.
The correction amount calculation unit 48 calculates a plurality of candidate points KP based on the plurality of boundary points BP including the feature point FP extracted in step S32 and the plurality of reference vectors RV acquired in step S33 (step S34).
The correction amount calculation unit 48 calculates the correction point CP after the sewing process based on the plurality of candidate points KP calculated in step S34 (step S35).
The correction amount calculation unit 48 sets the center of gravity of the plurality of candidate points KP as the correction point CP after the sewing process.
The correction point CP after the sewing process is displaced from the correction point CP before the sewing process. The correction amount calculation unit 48 calculates the displacement amount from the correction point CP before the sewing process to the correction point CP after the sewing process based on the position of the correction point CP calculated in step S35 (step S36).
There are a plurality of correction points CP. The correction amount calculation unit 48 calculates the displacement amount of each of the plurality of correction points CP.
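Steps S33 to S36 amount to vector addition followed by a centroid and a difference. A minimal sketch, with assumed names and array shapes:

    import numpy as np

    def correction_point_after_sewing(boundary_points, reference_vectors):
        # Steps S33-S35: add each stored reference vector RV to its boundary
        # point BP to obtain a candidate point KP, then take the centroid of
        # the candidate points as the correction point CP after sewing.
        bp = np.asarray(boundary_points, dtype=float)    # shape (n, 2)
        rv = np.asarray(reference_vectors, dtype=float)  # shape (n, 2)
        candidates = bp + rv                             # candidate points KP
        return candidates.mean(axis=0)                   # centroid -> CP

    def cp_displacement(cp_before, cp_after):
        # Step S36: displacement amount from CP before sewing to CP after sewing.
        return np.asarray(cp_after, dtype=float) - np.asarray(cp_before, dtype=float)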
The target stitch correction process S4 is a process of correcting the target stitch line RL based on the correction point CP calculated in the correction point calculation process S3. The correction point CP set by the operator and the target stitch line RL specified by the sewing data are displaced by the surface displacement of the sewing object S caused by the sewing process. The displacement amount of the correction point CP is calculated in the correction point calculation process S3. The control device 50 corrects the target stitch line RL based on the displacement amount of the correction point CP calculated by the correction amount calculation unit 48, for example by displacing the target stitch line RL by the same displacement amount as that of the correction point CP.
A plurality of correction points CP are set for the target stitch line RL, and the target stitch line RL is corrected based on the displacement amount of each of the correction points CP. The position of the target stitch line RL in the sewing machine coordinate system is thereby corrected.
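How the displacements of the plurality of correction points CP are distributed over the target stitch line RL is not specified beyond the example of displacing RL by the same displacement amount as a correction point CP. The sketch below uses a nearest-correction-point policy purely for illustration; the policy, names, and shapes are assumptions.

    import numpy as np

    def correct_target_stitch(target_stitch, cp_positions, cp_displacements):
        # Shift each point of the target stitch line RL by the displacement of
        # its nearest correction point CP (an illustrative policy only).
        #   target_stitch    : (m, 2) points of RL in machine coordinates
        #   cp_positions     : (k, 2) correction points CP before sewing
        #   cp_displacements : (k, 2) displacement of each CP
        rl = np.asarray(target_stitch, dtype=float)
        cps = np.asarray(cp_positions, dtype=float)
        dxy = np.asarray(cp_displacements, dtype=float)
        corrected = rl.copy()
        for i, pt in enumerate(rl):
            nearest = int(np.argmin(np.linalg.norm(cps - pt, axis=1)))
            corrected[i] = pt + dxy[nearest]
        return corrected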
The target stitch correction process S4 is not performed before the first sewing process; the first sewing process is performed based on the target stitch line RL in the initial state defined by the sewing data. The target stitch correction process S4 is performed after the first sewing process and before the second sewing process, and after the second sewing process and before the third sewing process. Similarly, the target stitch correction process S4 is performed during each of the fourth sewing process to the fourteenth sewing process.
The sewing process S5 is a process of forming the stitches CH based on the target stitch line RL. The sewing process includes the first to fourteenth sewing processes. The first sewing process is performed based on the target stitch line RL in the initial state defined by the sewing data. The second to fourteenth sewing processes are performed based on the target stitch line RL corrected by the target stitch correction process S4. The control device 50 outputs a control command to the actuator 17 so as to form the stitch CH in accordance with the target stitch line RL.
The end determination process S6 is a process of determining whether or not the sewing process of the sewing object S is finished. The control device 50 determines, based on the sewing data, whether the sewing process of the sewing object S is finished. In a state where the first to thirteenth sewing processes are finished, the control device 50 determines in the end determination process S6 that the sewing process is not finished. When the fourteenth sewing process is finished, the control device 50 determines in the end determination process S6 that the sewing process is finished.
[ Computer system ]
Fig. 23 is a block diagram showing an example of the computer system 1000. The image processing apparatus 40 and the control apparatus 50 each include a computer system 1000. The computer system 1000 includes a processor 1001 such as a CPU (Central Processing Unit), a main memory 1002 including a nonvolatile memory such as a ROM (Read Only Memory) and a volatile memory such as a RAM (Random Access Memory), a storage device 1003, and an interface 1004 including an input/output circuit. The functions of the image processing apparatus 40 and the functions of the control apparatus 50 are each stored in the storage device 1003 as computer programs. The processor 1001 reads a computer program from the storage device 1003, loads it into the main memory 1002, and executes the above-described processing in accordance with the program. The computer program may also be transmitted to the computer system 1000 via a network.
In accordance with the above-described embodiment, the computer program can cause the computer system 1000 to execute: scanning an object image SM representing an image of the sewing object S with a predetermined search area HA and determining whether or not a pixel of interest AP is a predetermined color pixel; and dividing the surface of the sewing object S into a texture area TA and a stitch area SA based on the determination result and calculating a boundary line BL between the texture area TA and the stitch area SA.
[ Effect ]
As described above, according to the present embodiment, the object image SM is scanned, and it is determined whether or not each of the plurality of pixels of the object image SM is a black pixel. From this determination, the texture area TA provided with the holes 7 and the stitch area SA not provided with the holes 7 are distinguished. Even if the surface of the sewing object S is displaced by the formation of the stitch CH, the texture area TA and the stitch area SA can be distinguished by determining whether or not the pixel of interest AP is a black pixel.
The pixel of interest AP determined not to be a black pixel is classified into the texture area TA when its relative position to the surrounding black pixels satisfies a predetermined condition, and is classified into the stitch area SA when the relative position does not satisfy the predetermined condition. Thus, even a pixel of interest AP that is not a black pixel is appropriately classified into one of the texture area TA and the stitch area SA.
[ Other embodiments ]
In the above embodiment, it is assumed that, when the object image SM is acquired by the imaging device 30, the holding member 15 holding the sewing object S moves in the XY plane while the position of the imaging device 30 is fixed. Alternatively, the imaging area FA of the imaging device 30 may move in the XY plane while the position of the sewing object S is fixed, or both the imaging area FA and the sewing object S may move in the XY plane.
In the above embodiment, the boundary line BL is formed so as to pass through the edge of the texture area TA located outside the outermost holes 7. Alternatively, the boundary line BL may be formed so as to pass through the outermost holes 7 in the texture area TA, and the boundary point BP may be set at a hole 7.

Claims (5)

1. An image processing apparatus includes:
an object image acquiring unit that acquires an object image representing an image of a sewing object;
a scanning unit that scans the object image with a predetermined search area and determines whether or not a pixel of interest is a predetermined color pixel; and
an area dividing unit that divides the surface of the sewing object into a texture area and a stitch area based on the determination result, and calculates a boundary line between the texture area and the stitch area.
2. The image processing apparatus according to claim 1,
the area dividing unit classifies the pixel of interest determined to be the predetermined color pixel into the texture area.
3. The image processing apparatus according to claim 2,
the area dividing unit classifies the pixel of interest into the texture area when the relative position between the pixel of interest that is not the predetermined color pixel and the predetermined color pixels around the pixel of interest satisfies a predetermined condition, and classifies the pixel of interest into the stitch area when the relative position does not satisfy the predetermined condition.
4. A sewing machine is provided with:
a holding member capable of holding and moving a sewing object in a predetermined plane including a sewing position right below a sewing machine needle;
an actuator that generates a power to move the holding member;
the image processing apparatus of any one of claims 1 to 3; and
a control device that outputs a control command to control the actuator based on a processing result of the image processing apparatus.
5. An image processing method, comprising: scanning an object image representing an image of a sewing object with a predetermined search area, and determining whether or not a pixel of interest is a predetermined color pixel; and dividing the surface of the sewing object into a texture area and a stitch area based on the determination result, and calculating a boundary line between the texture area and the stitch area.
CN202011224095.7A 2019-11-06 2020-11-05 Image processing device, sewing machine, and image processing method Active CN112779679B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019201419A JP7405565B2 (en) 2019-11-06 2019-11-06 Image processing device, sewing machine, and image processing method
JP2019-201419 2019-11-06

Publications (2)

Publication Number Publication Date
CN112779679A true CN112779679A (en) 2021-05-11
CN112779679B CN112779679B (en) 2024-02-02

Family

ID=75750370

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011224095.7A Active CN112779679B (en) 2019-11-06 2020-11-05 Image processing device, sewing machine, and image processing method

Country Status (2)

Country Link
JP (1) JP7405565B2 (en)
CN (1) CN112779679B (en)

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0838756A (en) * 1994-07-27 1996-02-13 Brother Ind Ltd Embroidery data generating device
JPH08218266A (en) * 1995-02-15 1996-08-27 Janome Sewing Mach Co Ltd Embroidery pattern-making device for machine capable of embroidery sewing and its method
JPH105465A (en) * 1996-06-24 1998-01-13 Japan Small Corp Quilting method
JPH11123289A (en) * 1997-10-22 1999-05-11 Brother Ind Ltd Embroidery data processing device, embroidering machine, and recording medium
JPH11207061A (en) * 1998-01-19 1999-08-03 Dan:Kk Method and device for detecting overcasting stitch of circularly knitted cloth
JP2002197459A (en) * 2000-12-27 2002-07-12 Fuji Photo Film Co Ltd Image processor, image processing method and recording medium
US20110146553A1 (en) * 2007-12-27 2011-06-23 Anders Wilhelmsson Sewing machine having a camera for forming images of a sewing area
JP2013162957A (en) * 2012-02-13 2013-08-22 Toyota Boshoku Corp Structural member of vehicle
JP2015070382A (en) * 2013-09-27 2015-04-13 オリンパス株式会社 Image processing apparatus
CN104933681A (en) * 2014-03-20 2015-09-23 株式会社岛津制作所 Image processing apparatus and an image processing program
CN105447847A (en) * 2014-09-24 2016-03-30 Juki株式会社 Form detection means and sewing machine
JP2016063894A (en) * 2014-09-24 2016-04-28 Juki株式会社 Shape recognition device and sewing machine
JP2016135163A (en) * 2015-01-23 2016-07-28 蛇の目ミシン工業株式会社 Embroidery pattern arrangement system, embroidery pattern arrangement device, embroidery pattern arrangement method for embroidery pattern arrangement device, program for embroidery pattern arrangement device, and sewing machine
CN108729036A (en) * 2017-04-21 2018-11-02 Juki株式会社 Sewing machine and method of sewing
JP2018183576A (en) * 2017-04-21 2018-11-22 Juki株式会社 Sewing machine and sewing method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨磊: "数字媒体技术概论", vol. 1, 中国铁道出版社, pages: 24 - 25 *

Also Published As

Publication number Publication date
JP7405565B2 (en) 2023-12-26
JP2021074075A (en) 2021-05-20
CN112779679B (en) 2024-02-02

Similar Documents

Publication Publication Date Title
JP5049975B2 (en) 3D model data generation method and 3D model data generation apparatus
EP2366824B1 (en) Sewing machine and sewing machine control program
US8527083B2 (en) Sewing machine and non-transitory computer-readable medium storing sewing machine control program
JP5993283B2 (en) Sewing device
JP2009172119A (en) Sewing machine
US8700200B2 (en) Sewing machine and non-transitory computer-readable medium storing sewing machine control program
CN112941733B (en) Image processing device, sewing machine and image processing method
WO2017090294A1 (en) Sewing machine and storage medium storing program
CN112779680B (en) Image processing device, sewing machine, and image processing method
CN108729036B (en) Sewing machine and sewing method
US10450682B2 (en) Sewing machine and non-transitory computer-readable medium
JP7079132B2 (en) Sewing machine and sewing method
CN112779679B (en) Image processing device, sewing machine, and image processing method
US10619278B2 (en) Method of sewing a fabric piece onto another fabric piece based on image detection
JP6427332B2 (en) Image measuring machine
JP7075246B2 (en) Seam inspection device
US11286597B2 (en) Sewing machine and sewing method
JP7156832B2 (en) Sewing machine and sewing method
CN110616511B (en) Sewing machine and sewing method
JP2011005180A (en) Sewing machine
JP3853507B2 (en) Line width measuring method and apparatus
CN114599476A (en) Metal plate processing system, laser processing machine, metal plate processing method, and processing area setting program by laser processing
CN212603446U (en) 3D prints supplementary detection device
US11869215B2 (en) Computer-readable storage medium, image processing apparatus, and method for image processing
WO2023112337A1 (en) Teaching device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant