CN112779679B - Image processing device, sewing machine, and image processing method

Info

Publication number: CN112779679B
Application number: CN202011224095.7A
Authority: CN (China)
Legal status: Active
Other versions: CN112779679A
Inventors: 塚田丰; 山田和范; 横濑仁彦
Assignee (original and current): Juki Corp

Classifications

    • D: TEXTILES; PAPER
    • D05: SEWING; EMBROIDERING; TUFTING
    • D05B: SEWING
    • D05B 19/00: Programme-controlled sewing machines
    • D05B 19/02: Sewing machines having electronic memory or microprocessor control unit
    • D05B 19/04: Sewing machines having electronic memory or microprocessor control unit, characterised by memory aspects
    • D05B 19/10: Arrangements for selecting combinations of stitch or pattern data from memory; handling data in order to control stitch format, e.g. size, direction, mirror image

Abstract

The invention provides an image processing device, a sewing machine, and an image processing method capable of distinguishing a region provided with a pattern of holes from a region not provided with such a pattern. The image processing device includes: an object image acquisition unit that acquires an object image of a sewing object; a scanning unit that scans the object image with a predetermined search area and determines whether or not each pixel of interest is a pixel of a predetermined color; and a region dividing unit that, based on the determination result, divides the surface of the sewing object into a texture region and a stitch region and calculates a boundary line between the texture region and the stitch region.

Description

Image processing device, sewing machine, and image processing method
Technical Field
The present invention relates to an image processing apparatus, a sewing machine, and an image processing method.
Background
Stitches may be formed on a sewing object to improve its aesthetic appearance. Patent document 1 discloses a technique for forming stitches on a skin material for a vehicle seat.
Prior art literature
Patent literature
Patent document 1: Japanese Patent Laid-Open Publication No. 2013-162957
Disclosure of Invention
(I) Technical problem to be solved
Holes are provided in the skin material for a vehicle seat. By arranging the holes in a pattern, the aesthetic appearance of the vehicle seat can be further improved. Stitches are formed in the region where the pattern of holes is not provided. To form the stitches, it is necessary to distinguish the region where the pattern of holes is provided from the region where it is not.
An object of an embodiment of the present invention is to identify a region where a pattern of holes is provided and a region where a pattern of holes is not provided.
(II) Technical solution
According to an aspect of the present invention, there is provided an image processing apparatus including: an object image acquisition unit that acquires an object image of a sewing object; a scanning unit that scans the object image with a predetermined search area and determines whether or not each pixel of interest is a pixel of a predetermined color; and a region dividing unit that, based on the determination result, divides the surface of the sewing object into a texture region and a stitch region and calculates a boundary line between the texture region and the stitch region.
(III) Advantageous effects
According to this aspect of the present invention, a region where a pattern of holes is provided and a region where no pattern of holes is provided can be distinguished.
Drawings
Fig. 1 is a perspective view showing a sewing machine according to the present embodiment.
Fig. 2 is a perspective view showing a part of the sewing machine according to the present embodiment.
Fig. 3 is a cross-sectional view showing a part of the sewing object according to the present embodiment.
Fig. 4 is a plan view showing the sewing object of the present embodiment.
Fig. 5 is a cross-sectional view showing a part of the sewing object according to the present embodiment.
Fig. 6 is a plan view showing a part of the sewing object according to the present embodiment.
Fig. 7 is a plan view showing a part of the sewing object according to the present embodiment.
Fig. 8 is a functional block diagram showing the sewing machine according to the present embodiment.
Fig. 9 is a diagram for explaining the operation of the imaging device according to the present embodiment.
Fig. 10 is a diagram for explaining a boundary line calculation method according to the present embodiment.
Fig. 11 is a diagram for explaining boundary lines in the present embodiment.
Fig. 12 is a diagram for explaining the correction points of the present embodiment.
Fig. 13 is a diagram for explaining an example of a method of calculating feature points according to the present embodiment.
Fig. 14 is a diagram for explaining an example of a method of calculating feature points according to the present embodiment.
Fig. 15 is a diagram for explaining the reference vector of the present embodiment.
Fig. 16 is a diagram for explaining the reference vector of the present embodiment.
Fig. 17 is a diagram for explaining an example of a method for calculating the correction point according to the present embodiment.
Fig. 18 is a diagram for explaining an example of a method for calculating the correction point according to the present embodiment.
Fig. 19 is a diagram for explaining an example of a method for calculating the correction point according to the present embodiment.
Fig. 20 is a flowchart showing the sewing method according to the present embodiment.
Fig. 21 is a flowchart showing the region division processing according to the present embodiment.
Fig. 22 is a flowchart showing correction point calculation processing according to the present embodiment.
Fig. 23 is a block diagram showing a computer system according to the present embodiment.
Description of the reference numerals
1-sewing machine; 2-table; 3-sewing machine needle; 4-surface material; 5-cushion material; 6-lining material; 7-hole; 10-sewing machine body; 11-sewing machine frame; 11A-horizontal arm; 11B-bed; 11C-vertical arm; 11D-head; 12-needle bar; 13-needle plate; 14-support member; 15-holding member; 15A-presser foot member; 15B-lower plate; 16-actuator; 17-actuator; 17X-X-axis motor; 17Y-Y-axis motor; 18-actuator; 19-middle presser foot member; 20-operation device; 21-operation panel; 22-operation pedal; 30-imaging device; 31-drive amount sensor; 32-drive amount sensor; 32X-X-axis sensor; 32Y-Y-axis sensor; 40-image processing device; 41-object image acquisition unit; 42-scanning unit; 43-region dividing unit; 44-correction point setting unit; 45-feature point extraction unit; 46-reference vector calculation unit; 47-reference vector storage unit; 48-correction amount calculation unit; 49-region-divided image output unit; 50-control device; 60-storage device; 61-sewing data storage unit; 62-design data storage unit; 63-program storage unit; 70-input device; 80-output device; 1000-computer system; 1001-processor; 1002-main memory; 1003-storage; 1004-interface; AP-pixel of interest; APb-pixel of interest; APs-pixel of interest; APt-pixel of interest; AX-optical axis; BL-boundary line; BP-boundary point; CP-correction point; CH-stitch; DM-region-divided image; FA-imaging area; FP-feature point; HA-search area; KP-candidate point; Pf-imaging position; Ps-sewing position; RL-target stitch line; RS-prescribed pattern; S-sewing object; SM-object image; SA-stitch region; TA-texture region; US-reference pattern.
Detailed Description
Embodiments of the present invention will be described below with reference to the drawings, but the present invention is not limited thereto. The constituent elements of the embodiments described below may be appropriately combined. In addition, some of the components may not be used.
In the present embodiment, a local coordinate system is defined for the sewing machine 1. In the following description, a local coordinate system defined for the sewing machine 1 is appropriately referred to as a sewing machine coordinate system. The coordinate system of the sewing machine is defined by an XYZ orthogonal coordinate system. In the present embodiment, the positional relationship of each part will be described based on a sewing machine coordinate system. The direction parallel to the X-axis in the predetermined plane is set as the X-axis direction. A direction parallel to a Y axis in a predetermined plane orthogonal to the X axis is set as a Y axis direction. A direction parallel to the Z axis orthogonal to the predetermined plane is set as a Z axis direction. In addition, a rotation direction or an inclination direction about the X axis is set as θx direction. The rotation direction or the tilt direction about the Y axis is set as θy direction. The rotation direction or the tilt direction about the Z axis is set as θz direction. In the present embodiment, a plane including the X axis and the Y axis is appropriately referred to as an XY plane. The plane including the X axis and the Z axis is appropriately referred to as the XZ plane. The plane including the Y axis and the Z axis is appropriately referred to as YZ plane. The XY plane is parallel to the predetermined plane. The XY plane, XZ plane and YZ plane are orthogonal. In the present embodiment, the XY plane is set to be parallel to the horizontal plane. The Z-axis direction is the up-down direction. The +Z direction is above and the-Z direction is below. Further, the XY plane may be inclined with respect to the horizontal plane.
< Sewing machine >
Fig. 1 is a perspective view showing a sewing machine 1 according to the present embodiment. Fig. 2 is a perspective view showing a part of the sewing machine 1 according to the present embodiment. In the present embodiment, the sewing machine 1 is an electronic circulation sewing machine. The sewing machine 1 includes: a sewing machine body 10, an operation device 20 operated by an operator, and an imaging device 30 capable of imaging a sewing object S.
The sewing machine body 10 is mounted on the upper surface of the table 2. The sewing machine body 10 includes: the sewing machine includes a sewing machine frame 11, a needle bar 12 supported by the sewing machine frame 11, a needle plate 13 supported by the sewing machine frame 11, a holding member 15 supported by the sewing machine frame 11 via a supporting member 14, an actuator 16 for generating power for moving the needle bar 12, an actuator 17 for generating power for moving the holding member 15, and an actuator 18 for generating power for moving at least a part of the holding member 15.
The sewing machine frame 11 has: the horizontal arm 11A extending in the Y-axis direction, the bed 11B disposed below the horizontal arm 11A, the vertical arm 11C connecting the +Y side end of the horizontal arm 11A to the bed 11B, and the head 11D disposed on the-Y side of the horizontal arm 11A.
The needle bar 12 holds the sewing machine needle 3. The needle bar 12 holds the sewing needle 3 so that the sewing needle 3 is parallel to the Z axis. The needle bar 12 is supported on the head 11D so as to be movable in the Z-axis direction.
The needle plate 13 supports the sewing object S. The needle plate 13 supports the holding member 15. The needle plate 13 is supported by the bed 11B. The needle plate 13 is disposed below the holding member 15.
The holding member 15 holds the sewing object S. The holding member 15 can hold and move the object S to be sewn in an XY plane including the sewing position Ps directly below the sewing machine needle 3. The holding member 15 can hold and move the sewing object S in an XY plane including the imaging position Pf immediately below the imaging device 30. The holding member 15 moves in the XY plane including the sewing position Ps based on the sewing data while holding the object S, thereby forming the stitch CH on the object S. The holding member 15 is supported by the horizontal arm 11A via the supporting member 14.
The holding member 15 includes: a presser foot member 15A, and a lower plate 15B opposed to the presser foot member 15A. The presser foot member 15A is a frame-shaped member. The presser foot member 15A is movable in the Z-axis direction. The lower plate 15B is disposed below the presser foot member 15A. The holding member 15 holds the object S by sandwiching the object S between the presser foot member 15A and the lower plate 15B.
When the presser foot member 15A moves in the +Z direction, the presser foot member 15A is separated from the lower plate 15B. Thereby, the operator can place the sewing object S between the presser foot member 15A and the lower plate 15B. When the presser foot member 15A moves in the -Z direction with the sewing object S placed between the presser foot member 15A and the lower plate 15B, the sewing object S is sandwiched between them. Thereby, the sewing object S is held by the holding member 15. Further, the holding member 15 releases the sewing object S when the presser foot member 15A moves in the +Z direction. Thereby, the operator can take the sewing object S out from between the presser foot member 15A and the lower plate 15B.
The actuator 16 generates a power to move the needle bar 12 in the Z-axis direction. The actuator 16 comprises a pulse motor. The actuator 16 is disposed on the horizontal arm 11A.
A horizontal arm shaft extending in the Y-axis direction is disposed inside the horizontal arm 11A. The actuator 16 is connected to the +Y side end of the horizontal arm shaft. The -Y side end of the horizontal arm shaft is connected to the needle bar 12 via a transmission mechanism disposed inside the head 11D. The horizontal arm shaft is rotated by the operation of the actuator 16. The power generated by the actuator 16 is transmitted to the needle bar 12 via the horizontal arm shaft and the transmission mechanism. Thereby, the sewing machine needle 3 held by the needle bar 12 reciprocates in the Z-axis direction.
A timing belt extending in the Z-axis direction is disposed inside the vertical arm 11C. In addition, a bed shaft extending in the Y-axis direction is disposed inside the bed 11B. Pulleys are disposed on the horizontal arm shaft and the bed shaft, respectively. The timing belt is stretched over the pulley disposed on the horizontal arm shaft and the pulley disposed on the bed shaft. The horizontal arm shaft and the bed shaft are thus connected via a transmission mechanism including the timing belt.
A hook is disposed inside the bed 11B. The bobbin, housed in the bobbin case, is accommodated in the hook. The horizontal arm shaft and the bed shaft are each rotated by the operation of the actuator 16. The power generated by the actuator 16 is transmitted to the hook via the horizontal arm shaft, the timing belt, and the bed shaft. Thus, the hook rotates in synchronization with the reciprocation of the needle bar 12 in the Z-axis direction.
The actuator 17 generates a motive force that moves the holding member 15 in the XY plane. The actuator 17 comprises a pulse motor. The actuator 17 includes: an X-axis motor 17X that generates power to move the holding member 15 in the X-axis direction, and a Y-axis motor 17Y that generates power to move the holding member 15 in the Y-axis direction. The actuator 17 is disposed inside the bed 11B.
The power generated by the actuator 17 is transmitted to the holding member 15 via the supporting member 14. Thereby, the holding member 15 can move between the sewing needle 3 and the needle plate 13 along the X-axis direction and the Y-axis direction, respectively. The holding member 15 can hold and move the object S to be sewn in an XY plane including the sewing position Ps directly below the sewing machine needle 3 by the operation of the actuator 17.
The actuator 18 generates power to move the presser foot member 15A of the holding member 15 in the Z-axis direction. The actuator 18 includes a pulse motor. When the presser foot member 15A moves in the +Z direction, it separates from the lower plate 15B. When the presser foot member 15A moves in the -Z direction, the sewing object S is held between the presser foot member 15A and the lower plate 15B.
As shown in fig. 2, the sewing machine body 10 has a middle presser foot member 19 disposed around the sewing machine needle 3. The middle presser foot member 19 presses the sewing object S around the sewing machine needle 3. The middle presser foot member 19 is supported on the head 11D so as to be movable in the Z-axis direction. A middle presser foot motor that generates power to move the middle presser foot member 19 in the Z-axis direction is disposed inside the head 11D. The middle presser foot member 19 moves in the Z-axis direction in synchronization with the needle bar 12 by the operation of the middle presser foot motor. The middle presser foot member 19 can suppress floating of the sewing object S caused by the movement of the sewing machine needle 3.
The operation device 20 is operated by an operator. By operating the operating device 20, the sewing machine 1 is operated. In the present embodiment, the operation device 20 includes an operation panel 21 and an operation pedal 22. The operation panel 21 is mounted on the upper surface of the table 2. The operation pedal 22 is disposed below the table 2. The operator operates the operation pedal 22 with his foot. The operator operates at least one of the operation panel 21 and the operation pedal 22 to operate the sewing machine 1.
The imaging device 30 images the sewing object S held by the holding member 15. The imaging device 30 has an optical system and an image sensor that receives light incident via the optical system. The image sensor includes a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
The imaging device 30 is disposed above the needle plate 13 and the holding member 15. The imaging position Pf includes the position of the optical axis AX of the optical system of the imaging device 30. An imaging area FA is defined for the imaging device 30. The imaging area FA includes a field of view of the optical system of the imaging device 30. The imaging area FA includes an imaging position Pf. The image pickup device 30 acquires an image of at least a part of the sewing object S arranged in the image pickup area FA. The imaging device 30 images at least a part of the sewing object S disposed inside the presser foot member 15A from above.
The position of the imaging device 30 is fixed. The relative position between the imaging device 30 and the sewing machine frame 11 is fixed. The optical axis AX of the optical system of the imaging device 30 is fixed in the XY plane relative to the sewing machine needle 3. The relative position data indicating the relative position between the optical axis AX of the optical system of the imaging device 30 and the sewing machine needle 3 in the XY plane is known data that can be derived from the design data of the sewing machine 1.
The position of an image acquired by the imaging device 30 is specified in the camera coordinate system. A position specified in the camera coordinate system is converted into a position specified in the sewing machine coordinate system by using a predetermined conversion formula or matrix.
In addition, when a difference arises between the actual position of the imaging device 30 and its position in the design data due to a mounting error, the position of the sewing machine needle 3 in the XY plane may be measured after the imaging device 30 is mounted, the measured needle position may be shifted toward the imaging device 30 by an amount equivalent to the known data, and the difference between the actual position of the imaging device 30 in the XY plane and the shifted needle position may be calculated; the exact relative position between the optical axis AX of the optical system of the imaging device 30 and the sewing machine needle 3 can then be calculated based on this difference.
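As one concrete picture of this conversion, the minimal Python sketch below maps a camera pixel to machine coordinates with a single 2x3 affine matrix. The matrix values, names, and mm-per-pixel scale are assumptions for illustration; the text only states that a predetermined conversion formula or matrix is used.

```python
import numpy as np

# Hypothetical 2x3 affine map from camera pixels (u, v) to machine
# coordinates (x, y) in mm: the scale terms come from the optics
# (mm per pixel), the offset terms from the known relative position
# of the optical axis AX and the sewing machine needle 3.
CAM_TO_MACHINE = np.array([
    [0.05, 0.00, 120.0],
    [0.00, 0.05,  80.0],
])

def camera_to_machine(u: float, v: float) -> tuple[float, float]:
    """Convert a camera-coordinate pixel (u, v) to machine coordinates."""
    x, y = CAM_TO_MACHINE @ np.array([u, v, 1.0])
    return float(x), float(y)
```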
[ Object to be sewn ]
Fig. 3 is a cross-sectional view showing a part of the sewing object S according to the present embodiment. Fig. 4 is a plan view showing the sewing object S according to the present embodiment. Fig. 3 and 4 show the object S to be sewn before the sewing process. In the present embodiment, the sewing object S is a skin material for a vehicle seat.
As shown in fig. 3, the sewing object S includes: the surface material 4, the cushion material 5, and the lining material 6. Holes 7 are provided in the surface material 4.
The surface of the surface material 4 is a seating surface that contacts an occupant seated on the vehicle seat. The surface material 4 includes at least one of woven cloth, nonwoven cloth, and leather. The cushion material 5 has elasticity and contains, for example, polyurethane resin. The lining material 6 includes at least one of woven cloth, nonwoven cloth, and leather.
As shown in fig. 4, a plurality of holes 7 are provided in the surface material 4. The holes 7 are arranged in a prescribed pattern RS. The predetermined pattern RS includes a plurality of reference patterns US. The reference pattern US is formed using a plurality of holes 7. In the present embodiment, the reference pattern US is composed of 25 holes 7.
As shown in fig. 4, the reference patterns US are arranged on the surface material 4 at intervals. The reference patterns US are arranged at equal intervals in the X-axis direction and the Y-axis direction. The reference patterns US having different positions in the Y-axis direction are arranged between the reference patterns US adjacent in the X-axis direction. No holes 7 are formed between adjacent reference patterns US.
In the following description, a region of the surface of the surface material 4 where the reference pattern US is provided is appropriately referred to as a texture region TA, and a region between the reference patterns US where no reference pattern US is provided is appropriately referred to as a stitch region SA.
The target stitch RL of the stitch CH formed on the sewing object S is defined in the stitch area SA.
[ Displacement of surface of sewn object ]
Fig. 5 is a cross-sectional view showing a part of the sewing object S according to the present embodiment. Fig. 5 shows the object S after the sewing process. The sewing object S has a thickness and elasticity. By forming the stitch CH on the object S having a thickness and elasticity, the object S is contracted as shown in fig. 5.
Fig. 6 and 7 are plan views each showing a part of the object S to be sewn according to the present embodiment. Fig. 6 shows the object S to be sewn before the sewing process. Fig. 7 shows the object S after the sewing process.
As shown in fig. 6, a target stitch RL is defined in the stitch area SA. When the stitch CH is formed on the object S, the object S contracts. When the object S is contracted, the surface of the object S is displaced. As shown in fig. 7, when the stitch CH is formed on the sewing object S, the surface of the sewing object S is displaced in the XY plane with respect to the target stitch RL.
When the surface of the sewing object S is displaced in the XY plane with respect to the target stitch line RL, if the holding member 15 is moved in accordance with the target stitch line RL, it is difficult to form the stitch line CH at a desired position on the surface of the sewing object S.
In the present embodiment, when the object S is contracted by forming the stitch CH and the surface of the object S is displaced, the position of the target stitch RL is corrected based on the displacement amount of the surface of the object S. The holding member 15 moves based on the corrected target stitch line RL.
[ Image processing apparatus ]
Fig. 8 is a functional block diagram showing the sewing machine 1 according to the present embodiment. The sewing machine 1 includes: an image processing device 40, a control device 50, and a storage device 60.
The image processing apparatus 40 includes a computer system. As shown in fig. 8, the image processing apparatus 40 is connected to the image pickup apparatus 30, the control apparatus 50, the storage apparatus 60, the input apparatus 70, and the output apparatus 80, respectively. The image processing device 40 processes an image of the sewing object S.
The input device 70 is operated by an operator to generate input data. Examples of the input device 70 include a keyboard, a mouse, and a touch panel for a computer.
The output device 80 outputs output data. Examples of the output device 80 include a display device and a printing device. The display device outputs display data as the output data. The printing device outputs print data as the output data. The display device may be a flat panel display such as a liquid crystal display (LCD: Liquid Crystal Display) or an organic EL display (OELD: Organic Electroluminescence Display). An inkjet printer can be exemplified as the printing device.
The control device 50 comprises a computer system. As shown in fig. 8, the control device 50 is connected to the actuator 16 for moving the sewing needle 3 in the Z-axis direction, the actuator 17 for moving the holding member 15 in the XY plane, the actuator 18 for moving the presser foot member 15A of the holding member 15 in the Z-axis direction, the operation device 20, the image processing device 40, and the storage device 60, respectively. The control device 50 outputs a control instruction for controlling the actuator 17 that moves the holding member 15 based on the processing result of the image processing device 40.
The control device 50 is connected to a drive amount sensor 31 that detects the drive amount of the actuator 16 and a drive amount sensor 32 that detects the drive amount of the actuator 17.
The driving amount sensor 31 includes an encoder that detects the rotation amount of the pulse motor as the actuator 16. The detection data of the drive amount sensor 31 is output to the control device 50.
The driving amount sensor 32 includes: an X-axis sensor 32X that detects the rotation amount of the X-axis motor 17X, and a Y-axis sensor 32Y that detects the rotation amount of the Y-axis motor 17Y. The X-axis sensor 32X includes an encoder that detects the rotation amount of the X-axis motor 17X. The Y-axis sensor 32Y includes an encoder that detects the rotation amount of the Y-axis motor 17Y. The detection data of the drive amount sensor 32 is output to the control device 50.
The driving amount sensor 32 functions as a position sensor that detects the position of the holding member 15 in the XY plane. The driving amount of the actuator 17 corresponds one-to-one to the movement amount of the holding member 15.
The X-axis sensor 32X can detect the movement amount of the holding member 15 in the X-axis direction from the origin in the sewing machine coordinate system by detecting the rotation amount of the X-axis motor 17X. The Y-axis sensor 32Y can detect the movement amount of the holding member 15 in the Y-axis direction from the origin in the sewing machine coordinate system by detecting the rotation amount of the Y-axis motor 17Y.
The control device 50 controls the actuator 16 based on the detection data of the drive amount sensor 31. The control device 50 determines, for example, the operation timing of the actuator 16 based on the detection data of the drive amount sensor 31.
The control device 50 controls the actuator 17 based on the detection data of the drive amount sensor 32. The control device 50 feedback-controls the actuator 17 based on the detection data of the drive amount sensor 32 so that the holding member 15 moves to a desired position.
The control device 50 calculates the position of the holding member 15 in the XY plane based on the detection data of the drive amount sensor 32. The movement amount of the holding member 15 in the XY plane from the origin is detected based on the detection data of the drive amount sensor 32. The control device 50 calculates the position of the holding member 15 in the XY plane based on the detected movement amount of the holding member 15.
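Expressed as code, this position calculation is a direct scaling of the encoder counts. A minimal sketch, where PULSES_PER_MM is a hypothetical machine constant relating pulse-motor steps to frame travel; the real conversion factor depends on the motors and drive train and is not given in the text:

```python
PULSES_PER_MM = 100.0  # assumed steps per mm of the X/Y drive train

def holder_position(x_counts: int, y_counts: int) -> tuple[float, float]:
    """XY position of the holding member 15, in mm from the machine origin."""
    return x_counts / PULSES_PER_MM, y_counts / PULSES_PER_MM
```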
The storage device 60 includes a nonvolatile memory such as a ROM (Read Only Memory) or storage, and a volatile memory such as a RAM (Random Access Memory). As shown in fig. 8, the storage device 60 is connected to the image processing device 40 and the control device 50, respectively.
The storage device 60 includes: a sewing data storage 61, a design data storage 62, and a program storage 63.
The sewing data storage 61 stores sewing data referred to in the sewing process.
The sewing process is a process of forming a stitch CH on the sewing object S. In the present embodiment, the sewing process includes: a first sewing process for forming the first stitch CH1, and a second sewing process for forming the second stitch CH2. Likewise, the sewing process includes third to fourteenth sewing processes for forming third to fourteenth stitches CH3 to CH14, respectively.
The sewing data comprises: the target stitch line RL of the stitch line CH formed on the sewing object S, and the moving condition of the holding member 15.
The target stitch line RL defines a target shape of the stitch CH formed on the sewing object S and a target position of the stitch CH in the sewing machine coordinate system.
As shown in fig. 4, the target stitch line RL includes: a first target stitch line RL1 for forming the first stitch CH1, and a second target stitch line RL2 for forming the second stitch CH2. Likewise, the target stitch lines RL include third to fourteenth target stitch lines RL3 to RL14, which are used to form the third to fourteenth stitches CH3 to CH14, respectively.
The movement conditions of the holding member 15 include: a movement locus of the holding member 15 defined in a sewing machine coordinate system. The movement locus of the holding member 15 includes the movement locus of the holding member 15 in the XY plane. The movement condition of the holding member 15 is determined based on the target stitch line RL.
The first sewing process includes a process of forming a first stitch CH1 on the sewing object S based on the first target stitch line RL 1. The first sewing process is performed first after the object S to be sewn is held by the holding member 15.
The second sewing process includes a process of forming a second stitch CH2 on the sewing object S based on the second target stitch line RL2. The second sewing process is performed subsequent to the first sewing process.
Similarly, the third to fourteenth sewing processes include processes of forming third to fourteenth stitches CH3 to CH14 on the object S to be sewn based on the third to fourteenth target stitch lines RL3 to RL14, respectively. The third sewing process to the fourteenth sewing process are sequentially performed.
The design data storage 62 stores design data of the sewing object S. The design data of the sewing object S includes: the position and range of the texture region TA, the position and range of the stitch region SA, and the shape and size of the reference pattern US on the surface of the sewing object S. When the sewing object S is designed by CAD (Computer Aided Design), the design data of the sewing object S includes CAD data.
The design data of the sewing object S is the design data of the sewing object S in the initial state. The initial state of the sewing object S is a state before the first sewing process. That is, the initial state of the object S is a state in which the stitch CH is not formed on the object S.
The program storage section 63 stores a computer program for controlling the sewing machine 1. The computer program is read by the control device 50. The control device 50 controls the sewing machine 1 in accordance with a computer program stored in the program storage unit 63.
The image processing apparatus 40 includes: the object image acquiring unit 41, the scanning unit 42, the region dividing unit 43, the correction point setting unit 44, the feature point extracting unit 45, the reference vector calculating unit 46, the reference vector storing unit 47, the correction amount calculating unit 48, and the region divided image outputting unit 49.
The object image acquisition unit 41 acquires an object image SM representing an image of the sewing object S. The imaging device 30 captures an image of the object S to be sewn, and outputs an object image SM to the image processing device 40. The object image acquisition unit 41 acquires the object image SM from the imaging device 30.
Fig. 9 is a diagram for explaining the operation of the imaging device 30 according to the present embodiment. The imaging device 30 images the sewing object S held by the holding member 15. As shown in fig. 9, an imaging area FA of the imaging device 30 is smaller than the sewing object S. The plurality of object images SM can be obtained by the relative movement of the sewn object S and the imaging area FA of the imaging device 30. The holding member 15 holding the sewing object S moves in the XY plane including the imaging position Pf of the imaging device 30. The imaging device 30 images a part of the sewing object S disposed in the imaging area FA. By repeating the movement of the holding member 15 in the XY plane and the image capturing process of a part of the sewing object S arranged in the image capturing area FA, a plurality of object images SM are sequentially obtained.
As shown in fig. 9, a plurality of object images SM are sequentially acquired while the sewing object S and the image pickup device 30 are relatively moved so that the image pickup area FA of the first image pickup process of the image pickup device 30 overlaps with a part of the image pickup area FA of the second image pickup process. The object images SM obtained in the plurality of image capturing processes are joined together to generate an object image SM of the entire sewing object S.
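The joining step can be pictured with the sketch below: because the holding member position is known in machine coordinates at each capture, every partial image can be pasted onto a full-size canvas at its known offset. The function, its arguments, and the rule that later tiles overwrite the overlap are illustrative assumptions, not the stated method.

```python
import numpy as np

def join_tiles(tiles, offsets_mm, mm_per_px, full_shape):
    """Join overlapping captures into one object image SM of the whole object.

    tiles: list of 2-D uint8 arrays, one per imaging process
    offsets_mm: (x, y) position of the holding member at each capture
    """
    canvas = np.zeros(full_shape, dtype=np.uint8)
    for img, (ox, oy) in zip(tiles, offsets_mm):
        r = int(round(oy / mm_per_px))  # row offset of this tile on the canvas
        c = int(round(ox / mm_per_px))  # column offset
        h, w = img.shape
        canvas[r:r + h, c:c + w] = img  # overlapping areas are simply overwritten
    return canvas
```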
The imaging device 30 captures an image of the object S before the sewing process. That is, the imaging device 30 captures images of the object S before the first sewing process and before the second sewing process, respectively. Similarly, the imaging device 30 images the object S before the third to fourteenth sewing processes, respectively. The object image acquisition unit 41 acquires, from the imaging device 30, object images SM of the sewn object S captured before the first to fourteenth sewing processes, respectively.
Further, when the object image SM that would be acquired by the imaging device 30 before the first sewing process is identical to the design data of the sewing object S in the initial state stored in the design data storage 62, the object image acquisition unit 41 may instead acquire the object image SM before the first sewing process from the design data storage 62.
The scanning unit 42 scans the object image SM with the predetermined search area HA and determines the states of the plurality of pixels constituting the object image SM.
The region dividing unit 43 divides the surface of the sewing object S into a texture region TA and a stitch region SA based on the object image SM, and calculates a boundary line BL between the texture region TA and the stitch region SA.
Fig. 10 is a diagram for explaining a method of calculating the boundary line BL according to the present embodiment. As shown in fig. 10, the scanning unit 42 scans the object image SM with the predetermined search area HA and acquires the states of the plurality of pixels constituting the object image SM.
Further, the preprocessing of the object image SM may be performed before the object image SM is scanned. As the preprocessing, smoothing filter processing and maximum value filter processing can be exemplified. By performing the preprocessing, the determination of the pixel state can be smoothly performed.
The scanning unit 42 determines whether or not each of the plurality of pixels is a pixel of a predetermined color. In the present embodiment, the predetermined color is black, and a pixel of the predetermined color is a black pixel. The scanning unit 42 determines whether each of the plurality of pixels is a black pixel. In the following description, a pixel subject to the determination of whether or not it is a black pixel is appropriately referred to as a pixel of interest AP.
As shown in fig. 10, the scanning unit 42 determines that the pixel of interest APb constituting the image of the hole 7 is a black pixel. The scanning unit 42 determines that the pixel of interest APt and the pixel of interest APs, which constitute the image of the surface material 4, are not black pixels.
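A minimal sketch of this scan step in Python with OpenCV, combining the preprocessing mentioned above (a smoothing filter and a maximum value filter) with the per-pixel black decision. The kernel sizes and the gray-level threshold of 60 are assumptions that would be tuned to the material and lighting:

```python
import cv2
import numpy as np

BLACK_THRESHOLD = 60  # assumed gray level separating hole pixels from cloth

def black_pixel_mask(object_image_gray: np.ndarray) -> np.ndarray:
    """True where the pixel of interest AP is judged to be a black pixel."""
    smoothed = cv2.GaussianBlur(object_image_gray, (5, 5), 0)   # smoothing filter
    filtered = cv2.dilate(smoothed, np.ones((3, 3), np.uint8))  # maximum value filter
    return filtered < BLACK_THRESHOLD
```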
The region dividing unit 43 divides the surface of the sewing object S into a texture region TA and a stitch region SA based on the determination result of the scanning unit 42.
The region dividing unit 43 classifies the pixel of interest AP determined to be a black pixel into the texture region TA. In the example shown in fig. 10, the region dividing unit 43 classifies the target pixel APb determined to be a black pixel into the texture region TA.
The region dividing unit 43 determines whether or not the relative position between a pixel of interest AP determined not to be a black pixel and the black pixels around that pixel satisfies a predetermined condition. The predetermined condition includes a condition that the relative distance between the pixel of interest AP and a black pixel is equal to or less than a predetermined threshold value. The threshold value is determined based on the interval between adjacent holes 7 in the texture region TA and the size (width) of the stitch region SA.
In the example shown in fig. 10, the pixel of interest APt forms part of the image of the surface material 4 and also part of the image of the texture region TA. The pixel of interest APs forms part of the image of the stitch region SA. The distance between the pixel of interest APt and the black pixels forming the image of the holes 7 around it is short. On the other hand, the distance between the pixel of interest APs and the black pixels forming the image of the holes 7 around it is large. The distance between the pixel of interest APt and the black pixels around it is equal to or less than the threshold value, whereas the distance between the pixel of interest APs and the black pixels around it is larger than the threshold value.
When the region dividing unit 43 determines that the distance between a pixel of interest APt determined not to be a black pixel and the black pixels around it is equal to or less than the threshold value, it classifies the pixel of interest APt into the texture region TA. When the region dividing unit 43 determines that the distance between a pixel of interest APs determined not to be a black pixel and the black pixels around it is larger than the threshold value, it classifies the pixel of interest APs into the stitch region SA.
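The same rule can be evaluated for every pixel at once with a distance transform, which gives each pixel's distance to the nearest black pixel in a single pass. A hedged sketch: the per-pixel condition is as described above, while the distance-transform formulation and the threshold parameter are implementation assumptions.

```python
import cv2
import numpy as np

def divide_regions(black_mask: np.ndarray, threshold_px: float) -> np.ndarray:
    """True for texture region TA pixels, False for stitch region SA pixels."""
    # distanceTransform measures each non-zero pixel's distance to the
    # nearest zero pixel, so encode black pixels as 0 and all others as 255.
    not_black = np.where(black_mask, 0, 255).astype(np.uint8)
    dist = cv2.distanceTransform(not_black, cv2.DIST_L2, 5)
    return dist <= threshold_px  # black pixels have distance 0, so TA includes them
```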
The region dividing unit 43 classifies all pixels of the object image SM into the texture region TA and the stitch region SA. The region dividing unit 43 classifies the plurality of pixels of the object image SM into the texture region TA and the stitch region SA, thereby calculating the boundary line BL between the texture region TA and the stitch region SA.
Further, post-processing of the object image SM may be performed after the classification of the pixels of interest AP is completed, and the boundary line BL may be calculated after the post-processing. Examples of the post-processing include opening processing, closing processing, noise removal processing, and hole filling processing. After the post-processing, labeling processing and contour extraction processing are performed, whereby a plurality of reference points representing the boundary between the texture region TA and the stitch region SA are calculated. The region dividing unit 43 calculates a least squares curve from the plurality of reference points and may use the calculated least squares curve as the boundary line BL.
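A minimal sketch of the final fitting step, assuming the reference points are given as an (N, 2) array of (x, y) coordinates and that a low-degree polynomial least squares fit is an acceptable form for the boundary line BL (the degree is an assumption):

```python
import numpy as np

def fit_boundary_line(reference_points: np.ndarray, degree: int = 2):
    """Least squares curve y = f(x) through the boundary reference points."""
    x, y = reference_points[:, 0], reference_points[:, 1]
    return np.poly1d(np.polyfit(x, y, degree))  # evaluate as BL(x)
```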
Fig. 11 is a diagram for explaining the boundary line BL according to the present embodiment. As shown in fig. 11, the boundary line BL is a line passing through the edge of the texture area TA. When the texture area TA includes the holes 7, the boundary line BL is formed so as to connect the outer edges of the holes 7 disposed on the outermost side in the texture area TA.
The correction point setting unit 44 sets a correction point CP for correcting the target stitch line RL defined in the stitch region SA. Before the first sewing process, a correction point CP for correcting the first to fourteenth target stitch lines RL1 to RL14 is set.
Fig. 12 is a diagram for explaining the correction point CP of the present embodiment. As shown in fig. 12, a correction point CP is set on the sewing object S. The correction point CP is used to correct the target stitch line RL. The correction point CP is set at an arbitrary position by the operator. The operator can set the correction point CP at an arbitrary position of the sewing object S by operating the input device 70. The correction point setting unit 44 sets the correction point CP based on the input data generated by operating the input device 70. In the example shown in fig. 12, the correction point CP is set so as to overlap the target stitch RL to be corrected in the stitch area SA. The correction point CP may be set near the target stitch line RL to be corrected, at a position offset from the target stitch line RL, or in the texture area TA. The position of the correction point CP is specified in the sewing machine coordinate system.
When the stitch CH is formed, the surface of the sewing object S is displaced with respect to the correction point CP and the target stitch RL.
The feature point extraction unit 45 extracts feature points FP of the texture region TA based on the boundary line BL calculated by the region dividing unit 43. A feature point FP is a portion of the boundary line BL having a characteristic shape. The characteristic shape of a feature point FP is substantially maintained even if the surface of the sewing object S is displaced. Corner points, maximum points, minimum points, and inflection points of the boundary line BL can be exemplified as feature points FP.
Fig. 13 is a diagram for explaining an example of a method of calculating the feature point FP according to the present embodiment. As shown in fig. 13, the feature point extracting unit 45 specifies, for example, a reference line XL, and calculates the distances between the reference line XL and the plurality of boundary points BP of the boundary line BL, respectively. The distance is a distance in a direction orthogonal to the reference line XL. The feature point extraction unit 45 may determine, as the feature point FP, the boundary point BP having the longest distance from the reference line XL.
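A sketch of this farthest-point search, assuming the boundary points BP form an (N, 2) array and the reference line XL is given by two points; all names are illustrative:

```python
import numpy as np

def farthest_boundary_point(boundary_points: np.ndarray,
                            xl_p0: np.ndarray, xl_p1: np.ndarray) -> np.ndarray:
    """Boundary point BP with the largest perpendicular distance to line XL."""
    d = xl_p1 - xl_p0
    normal = np.array([-d[1], d[0]]) / np.linalg.norm(d)  # unit normal of XL
    distances = np.abs((boundary_points - xl_p0) @ normal)
    return boundary_points[np.argmax(distances)]  # candidate feature point FP
```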
Further, there may be no definite feature point FP such as a corner point, a maximum point, a minimum point, or an inflection point on the boundary line BL. The feature point FP may be calculated without using the reference line XL.
Fig. 14 is a diagram for explaining an example of a method of calculating the feature point FP according to the present embodiment. As shown in fig. 14, when no clear feature point FP such as a corner point, a maximum point, a minimum point, or an inflection point exists on the boundary line BL, the feature point extraction unit 45 extracts the feature point FP based on the relative distances of the plurality of boundary lines BL. As shown in fig. 14, when the boundary lines BL are defined so as to face each other, the feature point extracting unit 45 calculates the distance between the boundary point BP of one boundary line BL and the boundary point BP of the other boundary line BL. In the example shown in fig. 14, the position of the boundary point BP of one boundary line BL in the X-axis direction is the same as the position of the boundary point BP of the other boundary line BL. The feature point extraction unit 45 calculates distances from each of the plurality of boundary points BP of the boundary line BL. The feature point extraction unit 45 may determine the boundary point BP of the boundary line BL having the longest distance as the feature point FP.
As described with reference to fig. 9, a plurality of object images SM are acquired while overlapping a part of each of the imaging areas FA of the plurality of imaging processes. By joining the plurality of object images SM, an object image SM is generated in which the entire object S is sewn. Therefore, even if there is no clear feature point FP, the feature point extraction unit 45 can search for a pair of boundary lines BL facing each other as shown in fig. 14 from the plurality of object images SM to be joined.
The image pickup device 30 may also be configured to enlarge the image pickup area FA to collectively acquire the object image SM of the entire sewing object S. When the object image SM of the entire sewing object S is acquired together, even if the feature point FP is not clear, the pair of opposing boundary lines BL as shown in fig. 14 can be searched for.
The reference vector calculation unit 46 calculates a reference vector RV indicating the relative position between the correction point CP set by the correction point setting unit 44 and the boundary point BP set on the boundary line BL.
Fig. 15 and 16 are diagrams for explaining the reference vector RV according to the present embodiment. Fig. 16 is an enlarged view of a portion of fig. 15. A correction point CP is set on the sewing object S before the sewing process. In fig. 15 and 16, the correction point CP is set to the object S in the initial state before the first sewing process. The boundary point BP is set to the boundary line BL. The boundary points BP are respectively set in the vicinity of the plurality of holes 7. The boundary point BP is set in the vicinity of the correction point CP. The reference vector RV indicates the orientation of the correction point CP with respect to the boundary point BP and the distance between the boundary point BP and the correction point CP on the object S before the first sewing process. The reference vector RV is defined in the sewing machine coordinate system.
A plurality of boundary points BP are set on the boundary line BL. The boundary points BP include the feature points FP. In the example shown in fig. 15 and 16, the feature point FP is a corner point of the boundary line BL. As shown in fig. 15, the reference vector calculation unit 46 calculates a reference vector RV between the correction point CP and each of the plurality of boundary points BP including the feature points FP. A plurality of reference vectors RV are calculated; the number of boundary points BP is equal to the number of reference vectors RV.
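In coordinates, each reference vector is simply the offset from a boundary point to the correction point. A minimal sketch, assuming (N, 2) machine-coordinate arrays:

```python
import numpy as np

def reference_vectors(boundary_points: np.ndarray, cp: np.ndarray) -> np.ndarray:
    """One reference vector RV per boundary point BP: RV = CP - BP."""
    return cp - boundary_points  # shape (N, 2); recorded before sewing
```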
The reference vector storage unit 47 stores relative position data of the boundary point BP and the correction point CP indicated by the reference vector RV calculated by the reference vector calculation unit 46.
The correction amount calculating unit 48 calculates the correction point CP after the first sewing process based on the boundary point BP of the object S after the first sewing process and the reference vector RV stored in the reference vector storing unit 47. Due to the sewing process, the surface of the sewing object S may be displaced with respect to the correction point CP. Therefore, the position of the correction point CP after the first sewing process is corrected based on the displacement amount of the surface of the sewing object S. The correction amount calculation unit 48 calculates the correction point CP after the first sewing process based on the boundary point BP of the object S after the first sewing process and the reference vector RV.
Fig. 17 is a diagram for explaining an example of a method of calculating the correction point CP according to the present embodiment. The image pickup device 30 acquires the object image SM of the sewn object S after the first sewing process. The boundary line BL is calculated based on the object image SM after the first sewing process, and a plurality of boundary points BP are set on the boundary line BL. The boundary point BP is set in the vicinity of the hole 7. The boundary points BP after the first sewing process are in one-to-one correspondence with the boundary points BP before the first sewing process.
The correction amount calculation unit 48 calculates a plurality of candidate points KP of the correction point CP based on a plurality of boundary points BP of the object S after the first sewing process and a plurality of reference vectors RV calculated before the first sewing process. The correction amount calculation unit 48 calculates the correction point CP after the first sewing process based on the plurality of candidate points KP.
That is, the correction amount calculation unit 48 adds the reference vector RV calculated before the first sewing process to the boundary point BP after the first sewing process. Adding a reference vector RV to a boundary point BP after the sewing process means calculating a candidate point KP, that is, a point offset from the boundary point BP by the distance, and in the direction, indicated by the reference vector RV.
The correction amount calculation unit 48 adds the reference vector RV calculated for the boundary point BP before the first sewing process to the boundary point BP after the first sewing process. For example, when calculating the first reference vector RV1 for the first boundary point BP1 before the first sewing process, the correction amount calculation unit 48 adds the first reference vector RV1 to the first boundary point BP1 after the first sewing process corresponding to the first boundary point BP1 before the first sewing process. When calculating the second reference vector RV2 for the second boundary point BP2 before the first sewing process, the correction amount calculating unit 48 adds the second reference vector RV2 to the second boundary point BP2 after the first sewing process corresponding to the second boundary point BP2 before the first sewing process. Similarly, when calculating the predetermined reference vector RV for the predetermined boundary point BP before the first sewing process, the correction amount calculating unit 48 adds the predetermined reference vector RV to the boundary point BP after the first sewing process corresponding to the predetermined boundary point BP before the first sewing process.
The correction amount calculation unit 48 sets, as a candidate point KP, the point where the tips of the reference vectors RV added to the plurality of boundary points BP set on one boundary line BL intersect. In the example shown in fig. 17, there are four texture regions TA and four boundary lines BL. Three boundary points BP are set on each boundary line BL, and four candidate points KP are calculated.
The correction amount calculation unit 48 sets the correction point CP in the partial region of the stitch region SA surrounded by the four candidate points KP. In the present embodiment, the correction amount calculation unit 48 calculates the centroid (center of gravity) of the four candidate points KP in the XY plane and sets the centroid as the correction point CP.
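Taken together, the post-sewing update amounts to two array operations. A minimal sketch, assuming the stored reference vectors are row-aligned with the corresponding boundary points measured after sewing and that all candidates are weighted equally:

```python
import numpy as np

def corrected_cp(bp_after: np.ndarray, stored_rv: np.ndarray) -> np.ndarray:
    """Candidate points KP = BP(after sewing) + RV; CP = centroid of the KPs."""
    candidates = bp_after + stored_rv  # one candidate point KP per boundary point
    return candidates.mean(axis=0)     # centroid = corrected correction point CP
```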
Further, a weight may be set to at least one of the plurality of candidate points KP. For example, a weight may be added to the candidate point KP calculated based on the reference vector RV of the feature point FP closest to the correction point CP.
In addition, in the calculation of the candidate points KP, only the feature points FP may be considered without considering the boundary points BP that are not the feature points FP. That is, the candidate points KP may be calculated based on only the reference vector RV added to the feature points FP.
Here, the reference vector RV is calculated in an initial state before the first sewing process, the candidate point KP is calculated based on the boundary point BP and the reference vector RV after the first sewing process, and the correction point CP is calculated based on the candidate point KP. The reference vector RV may be calculated before the second sewing process, the candidate point KP may be calculated based on the boundary point BP and the reference vector RV after the second sewing process, and the correction point CP after the second sewing process may be calculated based on the candidate point KP. The same process is also performed during each of the third to fourteenth sewing processes.
The position of the correction point CP after the sewing process may be calculated without using the reference vector RV.
Fig. 18 is a diagram for explaining an example of a method of calculating the correction point CP according to the present embodiment. Fig. 18(A) shows a part of the sewing object S before the sewing process, placed in the imaging area FA. Fig. 18(B) shows a part of the sewing object S after the sewing process, placed in the imaging area FA. As shown in fig. 18(A), a pair of boundary lines BL face each other and extend in the Y-axis direction, and no distinct feature point FP exists on the boundary lines BL. As shown in fig. 18(B), when the boundary lines BL move in the -X direction due to the sewing process, the correction point CP can be estimated to move in the -X direction by the same amount as the movement amount of the boundary lines BL. In this way, when there is no clear feature point FP on the boundary line BL, the position of the correction point CP after the sewing process can be calculated from the movement amount of the boundary line BL. The same applies when the boundary lines BL move in the +X direction, and when a pair of boundary lines BL extend in the X-axis direction and move in the +Y or -Y direction.
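A sketch of this fallback, assuming corresponding boundary points are sampled on the boundary line before and after sewing, and that the lines run along the Y axis so that only the X component of the shift is meaningful:

```python
import numpy as np

def translate_cp(cp: np.ndarray,
                 bl_before: np.ndarray, bl_after: np.ndarray) -> np.ndarray:
    """Shift CP by the mean X displacement of corresponding boundary points."""
    dx = float((bl_after[:, 0] - bl_before[:, 0]).mean())
    return cp + np.array([dx, 0.0])
```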
Fig. 19 is a diagram for explaining an example of a method of calculating the correction point CP according to the present embodiment. Fig. 19(A) shows a part of the sewing object S before the sewing process, placed in the imaging area FA. Fig. 19(B) shows a part of the sewing object S after the sewing process, placed in the imaging area FA. As shown in fig. 19(A), a pair of boundary lines BL face each other and extend in a direction inclined to both the X-axis and Y-axis directions, and no distinct feature point FP exists on the boundary lines BL. As shown in fig. 19(B), when the boundary lines BL move due to the sewing process, it is difficult to determine whether the sewing object S has moved in the X-axis direction or the Y-axis direction within the imaging area FA, because the boundary lines BL are inclined to both axes. In this case, the imaging device 30 may enlarge the imaging area FA, as described with reference to fig. 14, and search for a pair of boundary lines BL from which a feature point FP can be calculated. Alternatively, when it is difficult to determine whether the sewing object S has moved in the X-axis direction or the Y-axis direction within the imaging area FA, the output device 80 may output a warning.
The region divided image output unit 49 outputs the region divided image DM to the output device 80. In the present embodiment, after dividing the surface of the sewing object S into the texture region TA and the stitch region SA, the region dividing unit 43 generates a region divided image DM representing an image that includes the texture region TA and the stitch region SA. The region divided image output unit 49 outputs the region divided image DM, including the texture region TA and the stitch region SA divided by the region dividing unit 43, to the output device 80.
[ Sewing method ]
Fig. 20 is a flowchart showing the sewing method according to the present embodiment. In this embodiment, the sewing method includes: the calibration process S0, the object image acquisition process S1, the region division process S2, the correction point calculation process S3, the target stitch line correction process S4, the sewing process S5, and the end determination process S6.
The calibration process S0 is a process of associating the texture area TA and the stitch area SA of the sewing object S held by the holding member 15 with the sewing machine coordinate system. After the sewing object S before the sewing process is held by the holding member 15, the imaging device 30 captures an image of the sewing object S, for example of a plurality of its feature points FP. When a calibration mark is provided on the sewing object S, the imaging device 30 may image the calibration mark. The position of the image captured by the imaging device 30 is defined in the camera coordinate system, and is converted into a position defined in the sewing machine coordinate system by a predetermined conversion formula or conversion matrix. Thus, the position of the texture area TA and the position of the stitch area SA of the sewing object S can be defined in the sewing machine coordinate system.
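The conversion from the camera coordinate system to the sewing machine coordinate system can be illustrated with the sketch below, which assumes a plain affine model (a 2x2 matrix plus a translation obtained from calibration). The matrix values and the function name are hypothetical; the embodiment specifies only that a predetermined conversion formula or matrix is used.

```python
import numpy as np

def camera_to_machine(points_camera, A, t):
    """Map positions defined in the camera coordinate system to the sewing
    machine coordinate system: p_machine = A @ p_camera + t (affine model,
    assumed here; A and t would come from the calibration process S0)."""
    points_camera = np.asarray(points_camera, dtype=float)
    return points_camera @ A.T + t

# Hypothetical calibration result: 0.1 mm per pixel, no rotation,
# and a (25.0, 40.0) mm offset between the two coordinate origins.
A = np.array([[0.1, 0.0],
              [0.0, 0.1]])
t = np.array([25.0, 40.0])

feature_points_px = [(120, 80), (640, 80), (120, 480)]
print(camera_to_machine(feature_points_px, A, t))
```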
The object image acquisition process S1 is a process of acquiring the object image SM. Before the first sewing process, the object image acquisition unit 41 may acquire the object image SM in the initial state from the imaging device 30, or may acquire it from the design data storage unit 62.
After the first sewing process and before the second sewing process, the imaging device 30 acquires the object image SM, and after the second sewing process and before the third sewing process, it acquires the object image SM again. Similarly, the imaging device 30 acquires the object image SM for each of the third to fourteenth sewing processes.
The region dividing process S2 is a process of dividing the surface of the sewing object S into a texture region TA and a stitch region SA based on the object image SM, and calculating the boundary line BL between the texture region TA and the stitch region SA. The region dividing process S2 is performed before the first sewing process, and again after the first sewing process and before the second sewing process. Similarly, the region dividing process S2 is performed for each of the second to fourteenth sewing processes.
Fig. 21 is a flowchart showing the region dividing process S2 according to the present embodiment. The scanning unit 42 scans the object image SM with the predetermined search area HA (step S21).

The scanning unit 42 determines whether or not the pixel of interest AP of the object image SM is a black pixel (step S22).
When it is determined in step S22 that the pixel of interest AP is a black pixel (yes in step S22), the region dividing unit 43 classifies the pixel of interest AP determined to be a black pixel into the texture region TA (step S23).
When it is determined in step S22 that the pixel of interest AP is not a black pixel (step S22: No), the region dividing unit 43 determines whether or not the relative position between the pixel of interest AP and the black pixels around it satisfies a predetermined condition (step S24).

When it is determined in step S24 that the predetermined condition is satisfied (step S24: Yes), the region dividing unit 43 classifies the pixel of interest AP into the texture region TA (step S23).

When it is determined in step S24 that the predetermined condition is not satisfied (step S24: No), the region dividing unit 43 classifies the pixel of interest AP into the stitch region SA (step S25).
The region dividing unit 43 calculates a boundary line BL between the texture region TA and the stitch region SA based on the pixels classified into the texture region TA and the pixels classified into the stitch region SA (step S26).
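The flow of steps S21 to S26 can be sketched as follows. The grayscale threshold for a "black" pixel, the square search area HA of half-size `radius`, the concrete condition (at least one black pixel inside the search area around the pixel of interest AP), and the simple per-row boundary extraction are all assumptions for illustration; the embodiment does not fix these details here.

```python
import numpy as np

def divide_regions(gray, black_thresh=64, radius=3):
    """Divide the object image SM into texture region TA and stitch region SA.

    gray         : 2-D uint8 array (the object image, grayscale)
    black_thresh : intensity below which a pixel counts as black (assumed)
    radius       : half-size of the search area HA (assumed)

    Returns a boolean mask: True = texture region TA, False = stitch region SA.
    """
    black = gray < black_thresh            # step S22: black pixel?
    texture = black.copy()                 # step S23: black pixels -> TA
    h, w = gray.shape
    for y, x in zip(*np.where(~black)):    # step S24: non-black pixels AP
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        if black[y0:y1, x0:x1].any():      # assumed predetermined condition
            texture[y, x] = True           # step S23: classify into TA
        # otherwise it stays False -> stitch region SA (step S25)
    return texture

def boundary_per_row(texture):
    """Step S26, simplified: the first stitch-region pixel in each row
    approximates the boundary line BL (-1 if the row is all texture)."""
    return [int(np.argmin(row)) if (~row).any() else -1 for row in texture]
```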
The correction point calculation process S3 calculates the reference vector RV indicating the relative position between the correction point CP set on the sewing object S before the sewing process and a boundary point BP set on the boundary line BL, and calculates the correction point CP after the sewing process based on the boundary points BP of the sewing object S after the sewing process and the reference vectors RV. In the initial state before the first sewing process, the correction point CP is set by the operator, who operates the input device 70; the correction point setting unit 44 sets the initial correction point CP based on the input data from the input device 70. The correction point calculation process S3 is not performed before the first sewing process. It is performed after the first sewing process and before the second sewing process, and after the second sewing process and before the third sewing process.

Similarly, the correction point calculation process S3 is performed before each of the fourth to fourteenth sewing processes.
Fig. 22 is a flowchart showing the correction point calculation process S3 according to the present embodiment. The region dividing unit 43 sets boundary points BP on the boundary line BL around the correction point CP, in other words, in the vicinity of the correction point CP (step S31).
The feature point extraction unit 45 extracts the feature point FP based on the boundary line BL (step S32).
Feature points FP exist on the boundary line BL, and the boundary points BP include the feature points FP.
The correction amount calculation unit 48 acquires a plurality of reference vectors RV from the reference vector storage unit 47 (step S33).
As described with reference to Figs. 15 and 16, the reference vectors RV are calculated by the reference vector calculation unit 46 before the correction point calculation process (that is, before the sewing process) and stored in the reference vector storage unit 47. The correction amount calculation unit 48 can therefore acquire the plurality of reference vectors RV calculated before the sewing process from the reference vector storage unit 47.
The correction amount calculation unit 48 calculates a plurality of candidate points KP based on the plurality of boundary points BP including the feature point FP extracted in step S32 and the plurality of reference vectors RV acquired in step S33 (step S34).
The correction amount calculation unit 48 calculates the correction point CP after the sewing process based on the plurality of candidate points KP calculated in step S34 (step S35). Specifically, the correction amount calculation unit 48 sets the centroid of the plurality of candidate points KP as the correction point CP after the sewing process.

The correction point CP after the sewing process is displaced from the correction point CP before the sewing process. The correction amount calculation unit 48 calculates the displacement amount from the correction point CP before the sewing process to the correction point CP after the sewing process, based on the position calculated in step S35 (step S36).
There are a plurality of correction points CP. The correction amount calculation unit 48 calculates the displacement amounts of the plurality of correction points CP.
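Steps S31 to S36 can be illustrated as follows, treating each reference vector RV as the 2-D offset from a boundary point BP to the correction point CP and taking the centroid of the candidate points KP, as described above. The function names, data layout, and numeric values are assumptions for illustration.

```python
import numpy as np

def reference_vectors(cp_before, bps_before):
    """Reference vectors RV: relative positions from the boundary points BP
    set before the sewing process to the correction point CP."""
    return np.asarray(cp_before, dtype=float) - np.asarray(bps_before, dtype=float)

def corrected_point(bps_after, rvs):
    """Candidate points KP = post-sewing boundary points BP + reference
    vectors RV (step S34); the new CP is their centroid (step S35)."""
    candidates = np.asarray(bps_after, dtype=float) + rvs
    return candidates.mean(axis=0)

cp0 = np.array([10.0, 5.0])
bps_before = [(8.0, 4.0), (12.0, 4.5), (10.0, 7.0)]   # step S31 (hypothetical)
rvs = reference_vectors(cp0, bps_before)

# After the sewing process the boundary points have shifted slightly.
bps_after = [(8.4, 4.1), (12.5, 4.6), (10.3, 7.2)]
cp1 = corrected_point(bps_after, rvs)
displacement = cp1 - cp0                               # step S36
print(cp1, displacement)
```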
The target stitch line correction process S4 corrects the target stitch line RL based on the correction points CP calculated in the correction point calculation process S3. The surface displacement of the sewing object S caused by the sewing process displaces both the correction points CP set by the operator and the target stitch line RL defined by the sewing data. The displacement amount of each correction point CP is obtained in the correction point calculation process S3, and the control device 50 corrects the target stitch line RL based on the displacement amounts calculated by the correction amount calculation unit 48, for example by displacing the target stitch line RL by the same amount as the correction points CP.
A plurality of correction points CP are set for the target stitch line RL. The target stitch line RL is corrected based on the displacement amounts of the plurality of correction points CP. The position of the target stitch line RL in the sewing machine coordinate system is corrected.
The target stitch line correction process S4 is not performed before the first sewing process; the first sewing process is performed based on the target stitch line RL in the initial state specified by the sewing data. The target stitch line correction process S4 is performed after the first sewing process and before the second sewing process, and after the second sewing process and before the third sewing process. Similarly, the target stitch line correction process S4 is performed before each of the fourth to fourteenth sewing processes.
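A minimal sketch of the correction is given below. Displacing each vertex of the target stitch line RL by the displacement of its nearest correction point CP is an assumption for illustration, since the embodiment states only that the target stitch line is corrected based on the displacement amounts of the plurality of correction points CP.

```python
import numpy as np

def correct_stitch_line(target_rl, cps_before, displacements):
    """Shift each vertex of the target stitch line RL by the displacement
    of its nearest correction point CP (nearest-neighbour choice assumed)."""
    rl = np.asarray(target_rl, dtype=float)
    cps = np.asarray(cps_before, dtype=float)
    dps = np.asarray(displacements, dtype=float)
    out = rl.copy()
    for i, p in enumerate(rl):
        nearest = int(np.argmin(np.linalg.norm(cps - p, axis=1)))
        out[i] = p + dps[nearest]
    return out

target_rl = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]     # from the sewing data
cps = [(5.0, 1.0), (15.0, 1.0)]                        # operator-set CPs
disp = [(0.3, -0.1), (0.5, 0.0)]                       # from step S36
print(correct_stitch_line(target_rl, cps, disp))
```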
The sewing process S5 forms the stitch CH based on the target stitch line RL. The sewing process includes the first to fourteenth sewing processes. The first sewing process is performed based on the target stitch line RL in the initial state specified by the sewing data; the second to fourteenth sewing processes are performed based on the target stitch line RL corrected by the target stitch line correction process S4. The control device 50 outputs a control command to the actuator 17 so that the stitch CH is formed along the target stitch line RL.
The end determination process S6 determines whether or not the sewing of the sewing object S is finished. The control device 50 makes this determination based on the sewing data: while only the first to thirteenth sewing processes have been completed, the control device 50 determines in the end determination process S6 that the sewing is not finished; once the fourteenth sewing process has been completed, it determines that the sewing is finished.
[ Computer system ]
Fig. 23 is a block diagram showing an example of the computer system 1000. The image processing apparatus 40 and the control apparatus 50 each include a computer system 1000. The computer system 1000 includes a processor 1001 such as a CPU (Central Processing Unit), a main memory 1002 including a nonvolatile memory such as a ROM (Read Only Memory) and a volatile memory such as a RAM (Random Access Memory), a storage device 1003, and an interface 1004 including an input/output circuit. The functions of the image processing apparatus 40 and of the control apparatus 50 are each stored in the storage device 1003 as computer programs. The processor 1001 reads a computer program from the storage device 1003, loads it into the main memory 1002, and executes the above-described processing in accordance with the program. The computer program may also be delivered to the computer system 1000 via a network.
According to the above embodiment, the computer program causes the computer system 1000 to execute: scanning the object image SM representing an image of the sewing object S with the predetermined search area HA and determining whether or not a pixel of interest AP is a predetermined color pixel; and dividing the surface of the sewing object S into the texture area TA and the stitch area SA based on the determination result and calculating the boundary line BL between the texture area TA and the stitch area SA.
[ Effect ]
As described above, according to the present embodiment, the object image SM is scanned and each of its plurality of pixels is judged to be a black pixel or not. Thereby, the texture area TA in which the holes 7 are provided and the stitch area SA in which no holes 7 are provided are distinguished. Moreover, even if the surface of the sewing object S is displaced by the formation of the stitch CH, the texture area TA and the stitch area SA can still be identified by determining whether or not each pixel of interest AP is a black pixel.

A pixel of interest AP determined not to be a black pixel is classified into the texture area TA when its relative position to the surrounding black pixels satisfies the predetermined condition, and into the stitch area SA when it does not. Thus, even a pixel that is not black is appropriately classified into either the texture area TA or the stitch area SA.
[ Other embodiments ]
In the above embodiment, when the object image SM is acquired by the imaging device 30, the holding member 15 holding the sewing object S is moved in the XY plane while the position of the imaging device 30 is fixed. Alternatively, the imaging area FA of the imaging device 30 may be moved in the XY plane while the position of the sewing object S is fixed, or both the imaging area FA and the sewing object S may be moved in the XY plane.
In the above embodiment, the boundary line BL is generated so as to pass through the outer edges of the outermost holes 7 in the texture region TA. Alternatively, the boundary line BL may be generated so as to pass through the outermost holes 7 themselves, and a boundary point BP may be set inside a hole 7.

Claims (3)

1. An image processing device is provided with:
an object image acquisition unit that acquires an object image representing an image of a sewing object;
a scanning unit that scans the object image with a predetermined search area and determines whether or not a pixel of interest is a pixel of a predetermined color; and
and a region dividing unit that divides a surface of the sewing object into a texture region and a stitch region based on a result of the determination, classifies a pixel of interest determined to be the predetermined color pixel into the texture region, classifies a pixel of interest determined not to be the predetermined color pixel into the texture region when a relative position between the pixel of interest and the predetermined color pixels around the pixel of interest satisfies a predetermined condition and into the stitch region when the relative position does not satisfy the predetermined condition, and calculates a boundary line between the texture region and the stitch region.
2. A sewing machine is provided with:
a holding member that holds a sewing object and is movable in a predetermined plane including a sewing position immediately below a sewing machine needle;
an actuator that generates power to move the holding member;
the image processing apparatus of claim 1; and
and a control device that outputs a control instruction for controlling the actuator based on a processing result of the image processing device.
3. An image processing method, comprising:
scanning an object image representing an image of a sewing object with a predetermined search area, and determining whether a pixel of interest is a predetermined color pixel; and
dividing a surface of the sewing object into a texture region and a stitch region based on a result of the determination, classifying a pixel of interest determined to be the predetermined color pixel into the texture region, classifying a pixel of interest determined not to be the predetermined color pixel into the texture region when a relative position between the pixel of interest and the predetermined color pixels around the pixel of interest satisfies a predetermined condition and into the stitch region when the relative position does not satisfy the predetermined condition, and calculating a boundary line between the texture region and the stitch region.
CN202011224095.7A 2019-11-06 2020-11-05 Image processing device, sewing machine, and image processing method Active CN112779679B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019201419A JP7405565B2 (en) 2019-11-06 2019-11-06 Image processing device, sewing machine, and image processing method
JP2019-201419 2019-11-06

Publications (2)

Publication Number Publication Date
CN112779679A CN112779679A (en) 2021-05-11
CN112779679B true CN112779679B (en) 2024-02-02

Family

ID=75750370

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011224095.7A Active CN112779679B (en) 2019-11-06 2020-11-05 Image processing device, sewing machine, and image processing method

Country Status (2)

Country Link
JP (1) JP7405565B2 (en)
CN (1) CN112779679B (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8606390B2 (en) * 2007-12-27 2013-12-10 Vsm Group Ab Sewing machine having a camera for forming images of a sewing area

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0838756A (en) * 1994-07-27 1996-02-13 Brother Ind Ltd Embroidery data generating device
JPH08218266A (en) * 1995-02-15 1996-08-27 Janome Sewing Mach Co Ltd Embroidery pattern-making device for machine capable of embroidery sewing and its method
JPH105465A (en) * 1996-06-24 1998-01-13 Japan Small Corp Quilting method
JPH11123289A (en) * 1997-10-22 1999-05-11 Brother Ind Ltd Embroidery data processing device, embroidering machine, and recording medium
JPH11207061A (en) * 1998-01-19 1999-08-03 Dan:Kk Method and device for detecting overcasting stitch of circularly knitted cloth
JP2002197459A (en) * 2000-12-27 2002-07-12 Fuji Photo Film Co Ltd Image processor, image processing method and recording medium
JP2013162957A (en) * 2012-02-13 2013-08-22 Toyota Boshoku Corp Structural member of vehicle
JP2015070382A (en) * 2013-09-27 2015-04-13 オリンパス株式会社 Image processing apparatus
CN104933681A (en) * 2014-03-20 2015-09-23 株式会社岛津制作所 Image processing apparatus and an image processing program
CN105447847A (en) * 2014-09-24 2016-03-30 Juki株式会社 Form detection means and sewing machine
JP2016063894A (en) * 2014-09-24 2016-04-28 Juki株式会社 Shape recognition device and sewing machine
JP2016135163A (en) * 2015-01-23 2016-07-28 蛇の目ミシン工業株式会社 Embroidery pattern arrangement system, embroidery pattern arrangement device, embroidery pattern arrangement method for embroidery pattern arrangement device, program for embroidery pattern arrangement device, and sewing machine
CN108729036A (en) * 2017-04-21 2018-11-02 Juki株式会社 Sewing machine and method of sewing
JP2018183576A (en) * 2017-04-21 2018-11-22 Juki株式会社 Sewing machine and sewing method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Yang Lei. Introduction to Digital Media Technology. China Railway Publishing House, 2017 (1st ed.), pp. 24-25. *

Also Published As

Publication number Publication date
CN112779679A (en) 2021-05-11
JP7405565B2 (en) 2023-12-26
JP2021074075A (en) 2021-05-20


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant