CN111797824B - Video identification positioning system and method applied to garment thermoprint - Google Patents

Video identification positioning system and method applied to garment thermoprint

Info

Publication number
CN111797824B
CN111797824B (application CN202010665441.9A)
Authority
CN
China
Prior art keywords
image information
clothes
garment
points
calculating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010665441.9A
Other languages
Chinese (zh)
Other versions
CN111797824A (en)
Inventor
赵田力
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beta Technology Suzhou Co ltd
Original Assignee
Beta Technology Suzhou Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beta Technology Suzhou Co ltd filed Critical Beta Technology Suzhou Co ltd
Priority to CN202010665441.9A priority Critical patent/CN111797824B/en
Publication of CN111797824A publication Critical patent/CN111797824A/en
Priority to PCT/CN2020/136299 priority patent/WO2022011952A1/en
Application granted granted Critical
Publication of CN111797824B publication Critical patent/CN111797824B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G06V10/22: Image preprocessing by selection of a specific region containing or referencing a pattern; locating or processing of specific regions to guide the detection or recognition
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06V10/25: Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components
    • G06V10/56: Extraction of image or video features relating to colour
    • Y02P90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

A video recognition and positioning system and method for garment hot stamping. The system comprises: a placing table, with a calibration plate arranged at its upper-left corner, on which the garment is laid flat; a camera mounted above the placing table for collecting image information containing both the garment pattern and the calibration plate pattern; and a computer device connected to the camera, which processes the image information to determine the garment's actual size, its orientation, and the actual hot-stamping coordinate origin. The method can locate the hot-stamping coordinate origin on garments of different sizes; once the origin is confirmed, the hot-stamping process becomes data-driven and visual, and the finished garments are more standardized. Besides meeting the production requirements of different customers, the method improves production efficiency and yield and reduces labor cost.

Description

Video identification positioning system and method applied to garment thermoprint
Technical Field
The invention relates to the field of machine vision recognition, and in particular to a video recognition and positioning system and method applied to garment hot stamping (thermoprint).
Background
Garment hot stamping is a great challenge for inexperienced operators because a garment carries no reference lines. Workers must place the transfer material in the proper location by hand, and manual placement easily drifts or skews. For parameters that are not explicitly specified (e.g., a print on the back), a worker cannot guarantee proper placement, and keeping the many garments of one order consistent is even harder, let alone when several workers process the same order together.
In addition, customers' requirements for the hot-stamping position are often vague: a customer may only say roughly which area should be printed, without specifying the exact spot. Because the garment has no reference line, a worker needs time to make sure the material is centered, placed at the right height, free of distortion, and attractive overall. This is certainly difficult for a novice. Meanwhile, the worker must also check that the materials, garment size, and so on match the order sheet, which makes the whole procedure time-consuming: a skilled worker takes about 2 minutes for a garment with five transfers (the most common printing case), while a novice often needs 5 minutes or even longer.
At present, the applicant intends to design a set of equipment that can confirm the hot-stamping coordinate origin on a garment, so that the stamping can be positioned accurately; the difficulty lies in locating that origin.
Therefore, to solve the problems of the prior art, it is necessary to design a video recognition and positioning system and method for garment hot stamping.
Disclosure of Invention
In order to overcome the defects in the prior art, the invention aims to provide a video identification positioning system and method applied to garment thermoprinting.
To achieve the above and other related objects, the present invention provides the following technical solution: a garment hot-stamping video recognition and positioning method, characterized by comprising the following steps:
s1, laying clothes on a placing table with a calibration plate;
s2, acquiring image information comprising a clothes pattern and a calibration plate pattern;
s3, identifying a calibration plate pattern in the image information, and processing and calculating to obtain an actual size K represented by a single pixel;
s4, calculating to obtain the outline of the clothing pattern in the image information based on a gray value difference algorithm;
s5, defining left and right waist points and left and right neck points of the garment pattern in the image information based on the contour obtained in step S4, and calculating the midline and the hot-stamping coordinate origin of the garment pattern in the image information;
s6, counting the number Q of pixels on the line connecting the left and right waist points defined in step S5 and, based on the actual size K represented by a single pixel obtained in step S3, calculating the hem width M of the garment; then calculating the actual position of the hot-stamping coordinate origin on the garment according to its relative position in the image information;
s7, calculating the midline angle based on the midline obtained in step S5, and determining the orientation of the garment.
The preferable technical scheme is as follows: in step S3, the calibration plate is known to be a square with side length L; color recognition is performed on the calibration-plate pattern in the image information to obtain the number N of pixels it occupies, and the actual size represented by a single pixel is calculated as K = L/sqrt(N).
The preferable technical scheme is as follows: in step S4, n transverse scan lines are taken in the image information; for each pixel on these lines, the gray-value difference between its left and right neighbouring pixels is computed, and the maximum difference on each line is found; a fixed proportion of the smallest of these maxima is taken as the mutation reference for the whole image; all mutation points in the image are then found against this reference and defined as the contour of the garment pattern in the image information.
The preferable technical scheme is as follows: in step S4, three transverse scan lines are taken from top to bottom in the image information; for each pixel on the three lines, the gray-value difference between its left and right neighbouring pixels is computed, and the maximum difference on each line is found; 2/3 of the smallest of the three maxima is taken as the mutation reference for the whole image; all mutation points in the image are then found against this reference and defined as the contour of the garment pattern in the image information.
The preferable technical scheme is as follows: in step S5, two contour points are defined as the left and right waist points, and two as the left and right neck points; the midpoint of the line joining the left and right neck points is connected to the midpoint of the line joining the left and right waist points to obtain the midline of the contour, and the midpoint of the midline is defined as the hot-stamping coordinate origin of the contour.
The preferable technical scheme is as follows: in step S6, the hem width of the garment is given by M = K × Q.
The preferable technical scheme is as follows: after step S6, the method further includes:
recognizing the garment's bar-code information in the image to obtain its labeled size, and comparing the calculated hem width with the hem width of the labeled size to report an error on mismatch.
A video identification positioning system for garment thermoprinting, comprising:
a placing table, with a calibration plate arranged at its upper-left corner, on which the garment is laid flat;
the camera is arranged above the placing table and is used for collecting image information comprising the clothes patterns and the calibration plate patterns;
the calibration plate information identification unit is used for identifying the calibration plate pattern in the image information and obtaining the actual size K represented by a single pixel through calculation;
the outline judging unit is used for judging the outline of the clothing pattern in the image information;
the thermoprinting coordinate origin defining unit is used for defining left and right waist points and left and right neck points of the outline and calculating to obtain a central line of the garment pattern in the image information and the thermoprinting coordinate origin;
a size calculating unit for calculating an actual size of the garment and confirming a position of a hot stamping coordinate origin on the garment based on the obtained actual size K represented by the single pixel;
an orientation recognition unit for recognizing the orientation of the garment;
the actual size and orientation of the garment and its actual hot-stamping coordinate origin are obtained by processing the image information, and are used to accurately position the material for hot stamping during production.
The preferable technical scheme is as follows: the clothes are provided with bar codes, the image information is provided with bar code information, the clothes further comprise a bar code identification unit and an error reporting unit, the bar code identification unit is used for reading the bar code information in the image information to obtain the calibrated size of the clothes, and the error reporting unit is used for comparing the calibrated size with the actual size to report errors.
Due to the application of the technical scheme, the invention has the following beneficial effects:
according to the invention, the calibration plate is sampled to determine the standard, then a plurality of calibration points on the clothes are found through a gray value difference value positioning method, and then the thermoprint coordinate origin of the clothes is found according to the calibration points. By adopting the method, the hot stamping coordinate origins on clothes with different sizes can be positioned, and after the hot stamping coordinate origins are confirmed, the hot stamping process becomes data-based and visual, and the produced clothes are more standardized. Besides meeting the production requirements of different clients, the production efficiency and the production yield are improved, and the labor cost is reduced.
Drawings
FIG. 1 is a flow chart of a method for identifying and positioning a garment hot stamping video according to the present invention.
Fig. 2 is a schematic diagram of image information collected by the present invention.
In the drawings: 1. left neck point; 2. right neck point; 3. left waist point; 4. right waist point; 5. midline; 6. hot-stamping coordinate origin; 7. calibration plate; 8. bar code.
Detailed Description
Further advantages and effects of the present invention will become apparent to those skilled in the art from the disclosure of the present invention, which is described by the following specific examples.
Please refer to figs. 1-2. It should be noted that, in the description of the present invention, directions or positional relationships indicated by terms such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on those shown in the drawings, or on the orientation in which the inventive product is conventionally placed in use. They are merely for convenience and simplicity of description and do not indicate or imply that the apparatus or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the invention. Furthermore, the terms "first", "second", "third", and the like are used merely to distinguish between descriptions and should not be construed as indicating or implying relative importance. The terms "horizontal", "vertical", "overhang", and the like do not require that a component be absolutely horizontal or overhanging; it may be slightly inclined. "Horizontal" merely means that the direction is more horizontal than "vertical", not that the structure must be perfectly horizontal.
In the description of the present invention, it should also be noted that, unless explicitly specified and limited otherwise, the terms "disposed", "mounted", "connected", and "coupled" are to be construed broadly: for example, fixedly connected, detachably connected, or integrally connected; mechanically or electrically connected; directly connected, or indirectly connected through an intermediary, or in communication between two elements. The specific meaning of the above terms in the present invention will be understood by those of ordinary skill in the art on a case-by-case basis.
Examples:
referring to fig. 1, which is a flowchart of a method for identifying and positioning a video for garment thermoprint according to an embodiment of the present application, the present embodiment provides a method for identifying and positioning a video for garment thermoprint, including the following steps:
s1, laying clothes on a placing table with a calibration plate 7 (shown by referring to FIG. 2);
In this step, a person lays the garment flat on the placing table, making sure the bar code (not shown) on the garment faces upward.
S2, acquiring image information comprising clothes patterns and calibration plate patterns;
In this step, a camera mounted above the placing table collects the image information, which also includes the bar-code information.
S3, the camera coordinate system, the extrinsic parameters relative to the table surface, and the camera's intrinsic parameters are calibrated via the calibration plate lying on the placing table directly below the camera; the camera image containing the calibration plate is then processed to obtain the actual size K represented by a single pixel. The specific calculation is as follows:
The calibration plate is known to be a black square with side length L. Color recognition is performed on the calibration-plate pattern in the image information to obtain the number N of pixels it occupies, and the actual size represented by a single pixel is calculated as K = L/sqrt(N).
This step is realized by a calibration plate information identification unit, which recognizes the calibration-plate pattern in the image information and calculates the actual size represented by a single pixel.
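As a minimal sketch of this calculation (not from the patent), assuming the image is a grayscale NumPy array in which the black plate is the only dark region; the binarisation cut-off and all names are illustrative:

```python
import numpy as np

def size_per_pixel(image: np.ndarray, plate_side_mm: float, dark_cutoff: int = 60) -> float:
    """Actual size K (mm) represented by one pixel, per K = L / sqrt(N).

    Counts the pixels darker than `dark_cutoff` as the black calibration
    plate (N).  `dark_cutoff` is an assumed binarisation value; the patent
    only says the plate is recognised by colour.
    """
    n = int(np.count_nonzero(image < dark_cutoff))
    if n == 0:
        raise ValueError("calibration plate not found")
    return plate_side_mm / np.sqrt(n)

# synthetic check: a 20x20-pixel black plate of side 10 mm in a white image
img = np.full((100, 100), 255, dtype=np.uint8)
img[40:60, 40:60] = 0
k = size_per_pixel(img, plate_side_mm=10.0)  # 10 / sqrt(400) = 0.5 mm per pixel
```

In practice the plate region would first be isolated (e.g. by its known corner position) so that dark garment fabric cannot be miscounted as plate pixels.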
S4, a number of transverse scan lines in the image information are taken as calculation units. For each pixel on these lines, the gray-value difference between its left and right neighbouring pixels is computed, and the maximum difference on each line is found. A fixed proportion of the smallest of these maxima is taken as the mutation reference for the whole image. Then, with every transverse line of the image as a calculation unit, the mutation points on each line are found against this reference and defined as the contour of the garment pattern in the image information;
the step is realized by a contour judging unit, and the contour of the clothing pattern in the image information is judged based on a gray value difference algorithm. The specific calculation method provided in this embodiment is as follows:
(1) The contour judging unit takes three transverse scan lines at 2/5, 1/2, and 3/5 of the image height as calculation units and computes, for each pixel on these lines, the gray-value difference between its left and right neighbouring pixels;
(2) The maximum gray-value difference on each of the three scan lines is found, and 2/3 of the smallest of the three maxima is taken as the mutation reference for the whole image;
(3) Taking each transverse line of the image as a calculation unit, the contour judging unit finds the mutation points on every line against the reference from step (2) and defines them as the contour of the garment pattern in the image information (as an optimization, the contour may be denoised).
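The scan-line procedure above can be sketched as follows. This is an illustrative NumPy reading of the rule, assuming a grayscale image and a two-pixel-wide left/right neighbour difference; it is not the patented implementation:

```python
import numpy as np

def contour_points(gray: np.ndarray):
    """Contour pixels by the gray-difference rule of step S4.

    Scan lines at 2/5, 1/2 and 3/5 of the image height give three maxima
    of |left neighbour - right neighbour|; 2/3 of the smallest maximum is
    the mutation reference for the whole image.  Only horizontal
    differences are used, as in the text, so only near-vertical edges of
    the garment are picked up here.
    """
    h = gray.shape[0]
    g = gray.astype(np.int32)                    # avoid uint8 wrap-around
    rows = [int(h * 2 / 5), int(h / 2), int(h * 3 / 5)]
    maxima = [np.abs(g[r, 2:] - g[r, :-2]).max() for r in rows]
    reference = min(maxima) * 2 / 3              # mutation reference
    diff = np.abs(g[:, 2:] - g[:, :-2])          # same difference, all rows
    ys, xs = np.nonzero(diff > reference)        # mutation points
    return set(zip(ys.tolist(), (xs + 1).tolist()))

# synthetic check: a bright garment-like patch on a dark table
demo = np.full((50, 50), 20, dtype=np.uint8)
demo[10:40, 15:35] = 200
pts = contour_points(demo)
```

The denoising mentioned in step (3) (e.g. discarding isolated mutation points) is omitted here for brevity.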
S5, two contour points are defined as the left waist point 3 and the right waist point 4, and two as the left neck point 1 and the right neck point 2. The midpoint of the line joining the left neck point 1 and the right neck point 2 is connected to the midpoint of the line joining the left waist point 3 and the right waist point 4 to obtain the midline 5 of the contour, and the midpoint of the midline 5 is taken as the hot-stamping coordinate origin 6 of the contour (see fig. 2);
This step is realized by a hot-stamping coordinate origin defining unit, which defines the left and right waist points, the left and right neck points, the midline, and the hot-stamping coordinate origin of the contour.
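A sketch of the midline construction, assuming the four calibration points have already been picked off the contour as (x, y) pixel coordinates (the function and argument names are illustrative, not from the patent):

```python
def midline_and_origin(left_neck, right_neck, left_waist, right_waist):
    """Midline endpoints and hot-stamping origin per step S5.

    Each argument is an (x, y) calibration point taken from the contour.
    The midline joins the midpoint of the neck-point line to the midpoint
    of the waist-point line; the origin is the midline's own midpoint.
    """
    def midpoint(a, b):
        return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

    neck_mid = midpoint(left_neck, right_neck)
    waist_mid = midpoint(left_waist, right_waist)
    origin = midpoint(neck_mid, waist_mid)
    return neck_mid, waist_mid, origin

# a symmetric garment: origin lands on the vertical centre line, halfway down
neck_mid, waist_mid, origin = midline_and_origin((10, 0), (30, 0), (0, 100), (40, 100))
```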
S6, the number of pixels on the line connecting the left and right waist points in the image information is counted and combined with the actual size represented by a single pixel obtained in step S3 to give the actual size of the garment; the actual position of the hot-stamping coordinate origin on the garment is then calculated from its relative position in the contour;
This step is implemented by a size calculating unit, which calculates the actual size of the garment and confirms the position of the hot-stamping coordinate origin on the garment based on the actual size represented by a single pixel.
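The conversion to real units can be sketched as below, assuming the waist points lie on one horizontal scan line so that Q is simply the pixel distance between them (names are illustrative):

```python
def to_real_units(k, left_waist, right_waist, origin_px):
    """Step S6: pixel measurements to real units.

    k is the mm-per-pixel factor K from step S3.  Q is the pixel count
    between the waist points (assumed here to lie on one horizontal
    line), so the hem width is M = K * Q; the origin's real position is
    its pixel coordinates scaled by k.
    """
    q = abs(right_waist[0] - left_waist[0])
    hem_width = k * q                              # M = K * Q
    origin_real = (origin_px[0] * k, origin_px[1] * k)
    return hem_width, origin_real

# 0.5 mm/pixel, waist points 400 pixels apart: hem width 200 mm
hem, origin_mm = to_real_units(0.5, (0, 100), (400, 100), (200, 50))
```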
S7, the angle of the midline obtained in step S5 is calculated to determine the orientation of the garment.
This step is implemented by an orientation recognition unit, which recognizes the orientation of the garment.
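The angle computation can be sketched with `atan2`, assuming image coordinates with y growing downward; for a garment laid straight, the midline points straight down and the angle is 90 degrees (the patent does not give the exact angle convention, so this is one plausible reading):

```python
import math

def midline_angle(neck_mid, waist_mid) -> float:
    """Step S7: midline angle in degrees from the positive x-axis,
    in image coordinates (y grows downward)."""
    dx = waist_mid[0] - neck_mid[0]
    dy = waist_mid[1] - neck_mid[1]
    return math.degrees(math.atan2(dy, dx))

angle = midline_angle((20, 0), (20, 100))  # garment laid straight: 90 degrees
```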
In the above method, after step S6, the method further includes comparing the actual size with the labeled size and reporting an error on mismatch: the garment's bar-code information in the image is recognized to obtain the garment's labeled size, and the actual size is compared against it.
These steps are realized by a bar-code identification unit and an error-reporting unit: the bar-code identification unit reads the bar-code information in the image to obtain the garment's labeled size, and the error-reporting unit compares the labeled size with the actual size and reports an error on mismatch.
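A sketch of the comparison step, with an assumed tolerance (the patent does not specify one):

```python
def size_matches(measured_mm: float, labeled_mm: float, tolerance_mm: float = 10.0) -> bool:
    """Compare the measured hem width with the bar-code labeled size.

    Returns False (i.e. report an error) on mismatch.  The tolerance is
    an assumed value; the patent only says the two sizes are compared.
    """
    return abs(measured_mm - labeled_mm) <= tolerance_mm
```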
Principle of:
according to the invention, the actual size represented by a single pixel in the image information is obtained by sampling the calibration plate 7, then a plurality of calibration points (waist points and neck points) on the clothes are found by a gray value difference positioning method, and then the actual size, the actual thermoprint coordinate origin and the orientation of the clothes are calculated according to the plurality of calibration points.
Therefore, the invention has the following advantages:
the method for positioning the hot stamping coordinate origin can position the hot stamping coordinate origins on clothes with different sizes, and after the hot stamping coordinate origins are confirmed, the hot stamping process becomes datamation and visualization, and the produced clothes are more standardized. Besides meeting the production requirements of different clients, the production efficiency and the production yield are improved, and the labor cost is reduced.
The above embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Those skilled in the art may modify or vary the above embodiments without departing from the spirit and scope of the invention. Accordingly, all equivalent modifications and variations accomplished by persons of ordinary skill in the art without departing from the spirit and technical ideas disclosed herein shall be covered by the appended claims.

Claims (7)

1. The garment thermoprint video identification positioning method is characterized by comprising the following steps of:
s1, laying clothes on a placing table with a calibration plate;
s2, acquiring image information comprising a clothes pattern and a calibration plate pattern;
s3, identifying a calibration plate pattern in the image information, and processing and calculating to obtain an actual size K represented by a single pixel;
s4, calculating to obtain the outline of the clothing pattern in the image information based on a gray value difference algorithm;
s5, defining left and right waist points and left and right neck points of the garment pattern in the image information based on the contour obtained in the step S4, and calculating to obtain a central line and a hot stamping coordinate origin of the garment pattern in the image information;
s6, counting the number Q of pixels on the line connecting the left and right waist points defined in step S5 and, based on the actual size K represented by a single pixel obtained in step S3, calculating the hem width M of the garment; then calculating the actual position of the hot-stamping coordinate origin on the garment according to its relative position in the image information;
s7, calculating a midline angle based on the midline calculated in the step S5, and determining the orientation of the clothes;
step S4 comprises: taking three transverse scan lines from top to bottom in the image information; computing, for each pixel on the three lines, the gray-value difference between its left and right neighbouring pixels; finding the maximum difference on each line; taking 2/3 of the smallest of the three maxima as the mutation reference for the whole image; finding all mutation points in the image against this reference; and defining the mutation points as the contour of the garment pattern in the image information.
2. The garment hot-stamping video recognition and positioning method according to claim 1, wherein: in step S3, the calibration plate is known to be a square with side length L; color recognition is performed on the calibration-plate pattern in the image information to obtain the number N of pixels it occupies, and the actual size represented by a single pixel is calculated as K = L/sqrt(N).
3. The garment hot-stamping video recognition and positioning method according to claim 1, wherein: in step S5, two contour points are defined as the left and right waist points, and two as the left and right neck points; the midpoint of the line joining the left and right neck points is connected to the midpoint of the line joining the left and right waist points to obtain the midline of the contour, and the midpoint of the midline is defined as the hot-stamping coordinate origin of the contour.
4. The garment hot-stamping video recognition and positioning method according to claim 1, wherein: in step S6, the hem width of the garment is given by M = K × Q.
5. The method for identifying and positioning a garment hot stamping video according to claim 4, further comprising, after step S6:
recognizing the garment's bar-code information in the image to obtain its labeled size, and comparing the calculated hem width with the hem width of the labeled size to report an error on mismatch.
6. A video identification positioning system for garment thermoprinting, comprising:
a placing table, with a calibration plate arranged at its upper-left corner, on which the garment is laid flat;
the camera is arranged above the placing table and is used for collecting image information comprising clothes patterns and calibration plate patterns;
the calibration plate information identification unit is used for identifying the calibration plate pattern in the image information and obtaining the actual size K represented by the single pixel through calculation;
a contour judging unit, used for taking three transverse scan lines from top to bottom in the image information, computing for each pixel on them the gray-value difference between its left and right neighbouring pixels, finding the maximum difference on each line, taking 2/3 of the smallest of the three maxima as the mutation reference for the whole image, finding all mutation points in the image against this reference, and defining them as the contour of the garment pattern in the image information;
a hot-stamping coordinate origin defining unit, used for defining the left and right waist points and left and right neck points of the contour and calculating the midline and the hot-stamping coordinate origin of the garment pattern in the image information;
a size calculating unit for calculating an actual size of the garment and confirming a position of a hot stamping coordinate origin on the garment based on the obtained actual size K represented by the single pixel;
a direction recognition unit for recognizing a direction of the clothing;
wherein the actual size and orientation of the garment and its actual hot-stamping coordinate origin are obtained by processing the image information, and are used to accurately position the material for hot stamping during production.
7. The video identification and location system for garment thermoprinting according to claim 6, wherein: the clothes are provided with bar codes, the image information is provided with bar code information, the clothes further comprise a bar code identification unit and an error reporting unit, the bar code identification unit is used for reading the bar code information in the image information to obtain the calibrated size of the clothes, and the error reporting unit is used for comparing the calibrated size with the actual size to report errors.
CN202010665441.9A 2020-07-11 2020-07-11 Video identification positioning system and method applied to garment thermoprint Active CN111797824B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010665441.9A CN111797824B (en) 2020-07-11 2020-07-11 Video identification positioning system and method applied to garment thermoprint
PCT/CN2020/136299 WO2022011952A1 (en) 2020-07-11 2020-12-15 Video recognition positioning system and method applied to garment hot stamping

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010665441.9A CN111797824B (en) 2020-07-11 2020-07-11 Video identification positioning system and method applied to garment thermoprint

Publications (2)

Publication Number Publication Date
CN111797824A CN111797824A (en) 2020-10-20
CN111797824B true CN111797824B (en) 2023-12-26

Family

ID=72808210

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010665441.9A Active CN111797824B (en) 2020-07-11 2020-07-11 Video identification positioning system and method applied to garment thermoprint

Country Status (2)

Country Link
CN (1) CN111797824B (en)
WO (1) WO2022011952A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111797824B (en) * 2020-07-11 2023-12-26 贝塔科技(苏州)有限公司 Video identification positioning system and method applied to garment thermoprint
CN114345753B (en) * 2021-12-13 2024-05-10 贝塔科技(苏州)有限公司 Production method for connecting printing garment production raw materials into Internet of things through thermal sublimation technology
CN115444355B (en) * 2022-10-28 2023-06-23 四川大学华西医院 Endoscope lesion size information determining method, electronic equipment and storage medium
CN115983308B (en) * 2023-03-20 2023-06-16 湖南半岛医疗科技有限公司 Information code generation method and reading method for intelligent medical treatment
CN116486116B (en) * 2023-06-16 2023-08-29 济宁大爱服装有限公司 Machine vision-based method for detecting abnormality of hanging machine for clothing processing

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110378959A (en) * 2019-07-15 2019-10-25 杭州恢弘科技有限公司 A kind of clothes auxiliary print is boiling hot to position setting method, localization method and auxiliary print ironing process

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10467454B2 (en) * 2017-04-26 2019-11-05 Mashgin Inc. Synchronization of image data from multiple three-dimensional cameras for image recognition
CN110111381A (en) * 2019-03-13 2019-08-09 中山易裁剪网络科技有限公司 A kind of long-range determining suit length system and its determining method
CN110595355A (en) * 2019-08-27 2019-12-20 东莞市精致自动化科技有限公司 Clothes size measuring equipment and measuring method
CN111797824B (en) * 2020-07-11 2023-12-26 贝塔科技(苏州)有限公司 Video identification positioning system and method applied to garment thermoprint


Also Published As

Publication number Publication date
CN111797824A (en) 2020-10-20
WO2022011952A1 (en) 2022-01-20

Similar Documents

Publication Publication Date Title
CN111797824B (en) Video identification positioning system and method applied to garment thermoprint
CN107504896B Point-matching-based positioning algorithm for spherical pin components
CN109558871B (en) Pointer instrument reading identification method and device
CN106651857B Printed circuit board patch defect detection method
JP2008036918A (en) Screen printing equipment, and method for image recognition and alignment
CN114220757B (en) Wafer detection alignment method, device and system and computer medium
CN110260818B (en) Electronic connector robust detection method based on binocular vision
CN113030123B (en) AOI detection feedback system based on Internet of things
CN108709500B (en) Circuit board element positioning and matching method
CN111784674A (en) Component detection method, component detection device, computer equipment and storage medium
CN114187253A (en) Circuit board part installation detection method
CN105136818B (en) The image detection method of printed base plate
CN114612423A (en) Chip packaging defect detection method
CN109060799A Finished-product detection and determination method for an assembly line
CN112053333B (en) Square billet detection method, system, equipment and medium based on machine vision
CN111968104B (en) Machine vision-based steel coil abnormity identification method, system, equipment and medium
KR100837119B1 (en) A camera calibration method for measuring the image
JPH11175150A (en) Stop position deviation amount detecting device for moving body
CN111380474A (en) Aperture detection method based on binocular vision
CN114998571A (en) Image processing and color detection method based on fixed-size marker
US10853683B2 (en) Systems and methods to determine size and color of a fashion apparel
CN113220924A (en) Product model visual identification method and visual identification system
JP3520758B2 (en) Image recognition method
CN109345607B (en) Method for automatically marking EPC picture
JP2000207557A (en) Method for measuring quantity of positional deviation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant