CN110992425A - Image calibration method and device, electronic equipment and storage medium - Google Patents

Image calibration method and device, electronic equipment and storage medium

Info

Publication number
CN110992425A
CN110992425A (application CN201911269742.3A)
Authority
CN
China
Prior art keywords
image
pixel
difference
pixel point
calibration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201911269742.3A
Other languages
Chinese (zh)
Inventor
陈可
许鹏远
郑艳伟
徐斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dahao Industrial Sewing Intelligent Control Technology Co Ltd
Beijing Dahao Technology Co Ltd
Original Assignee
Beijing Dahao Industrial Sewing Intelligent Control Technology Co Ltd
Beijing Dahao Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dahao Industrial Sewing Intelligent Control Technology Co Ltd, Beijing Dahao Technology Co Ltd filed Critical Beijing Dahao Industrial Sewing Intelligent Control Technology Co Ltd
Priority to CN201911269742.3A
Publication of CN110992425A
Legal status: Withdrawn

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30108: Industrial image inspection
    • G06T2207/30124: Fabrics; Textile; Paper
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30204: Marker

Abstract

The present application provides an image calibration method and device, electronic equipment, and a storage medium. The method acquires the ratio of pixel length to actual length for the image of each calibration point in a calibration pattern; for any pixel point in an image acquired by the image acquisition device, the ratio of pixel length to actual length is then set by taking, as a reference value, the ratio of a nearby pixel point corresponding to the image of a calibration point. The image calibration method is highly automatic, simple to operate, and not error-prone; the time and space complexity of the algorithm are low; it does not depend strongly on external conditions such as illumination; and it is particularly suitable for sewing machines in real working environments.

Description

Image calibration method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of sewing technology, and in particular, to an image calibration method, device, electronic device, and storage medium.
Background
Sewing generally refers to a production process that forms stitches on materials such as cloth (hereinafter, fabrics); sewing machines, embroidery machines, and the like are all sewing equipment. Modern sewing machines are controlled by electronic equipment and produce according to a pattern, which automates the sewing process. A pattern is a data file containing production-related data, such as image data for sewing; a pattern that a sewing machine can recognize may be created with plate-making software or other software.
With the development of sewing technology, the form of sewing production has also changed. For example, a sewing machine may be equipped with an image acquisition device such as a camera; a pattern may be generated from an image acquired by that device, and embroidery or sewing may then be performed from the pattern. An image acquired by the image acquisition device is generally a digital image composed of pixel points. A pixel value is the numerical value associated with each pixel point, representing information such as its color and brightness; storing an image actually means storing the pixel value of each of its pixel points. To generate a pattern from an image, the ratio of pixel length to actual length must be obtained, that is, the actual length on the photographed pattern corresponding to the pixel length calculated from a pixel point's pixel coordinates relative to the pixel coordinate origin. Under absolutely ideal conditions, the image is the actual pattern reduced by a fixed proportion: for any two pixel points chosen on the image, the ratios of pixel length to actual length under the same pixel coordinate system are exactly equal. In practice, however, the image captured by the image acquisition device is always distorted to some degree relative to the photographed pattern, and this distortion cannot be completely eliminated. Because of distortion, two pixel points chosen arbitrarily on the image generally have different ratios of pixel length to actual length, even within the same pixel coordinate system.
If distortion is ignored and the same ratio of pixel length to actual length is used for every pixel point when generating a pattern from the acquired image, the manufactured product differs greatly from the intended effect, fails the product quality requirements, and may even be scrapped as unusable. To avoid this, before generating patterns from images acquired by the image acquisition device installed on a given sewing machine, the ratio between the pixel length and the actual length for each pixel point on the acquired image must be known; this process may be called image calibration.
Image calibration is needed in many fields, and several image calibration methods have been disclosed, the most common being the Zhang Zhengyou calibration method (Zhang's method), named after its inventor. Its main steps are: 1. make a checkerboard pattern with known black-and-white spacing (generally the size of A4 paper) and stick it on a flat plate; 2. photograph several (generally 10-20) images of the checkerboard with the image acquisition device; 3. detect feature points (Harris features) in the images; 4. calculate 5 intrinsic parameters and 6 extrinsic parameters; 5. design an optimization objective using a maximum-likelihood estimation strategy and optimize the parameters.
Zhang's calibration method is widely applied in many fields but has significant limitations in the sewing field: 1. It requires sticking the checkerboard onto a flat plate of non-negligible thickness, whereas the thickness of the fabric in actual sewing production is usually negligible; this thickness difference can make the calibration inaccurate or entirely unusable. 2. Although the electronic device controlling a sewing machine also contains a chip, memory, and other components, its computing power and memory are far smaller than those of a high-performance computer, so it cannot adequately support algorithms with high time or space complexity. Calibration methods that run properly on a computer often fail to produce a result on a sewing machine within a reasonable time, and may even interfere with the machine's normal operation. 3. Zhang's method has many steps; an unskilled person easily makes mistakes, and once a mistake occurs the whole procedure must be restarted. Sewing-machine operators are usually unfamiliar with image calibration, so repeated rework and serious waste of time are likely. 4. In many cases the working environment of sewing machines is far from ideal, with problems such as insufficient illumination, all of which affect the accuracy of Zhang's method.
Disclosure of Invention
The application provides an image calibration method and device, electronic equipment, and a storage medium. The image calibration method is highly automatic, simple to operate, and not error-prone; the time and space complexity of the algorithm are low; it does not depend strongly on external conditions such as illumination; and it is particularly suitable for sewing machines in real working environments.
In a first aspect, an image calibration method includes:
moving the calibration pattern so that the image of the calibration pattern is located at a set position within the visual field range of the image acquisition device, with the images of all calibration points of the calibration pattern located within the visual field range of the image acquisition device;
acquiring the ratio of the pixel length to the actual length for the image of each calibration point in the calibration pattern;
for any pixel point in an image acquired by the image acquisition device, setting its ratio of pixel length to actual length by taking, as a reference value, the ratio of pixel length to actual length of a nearby pixel point corresponding to the image of a calibration point.
Further, the method further comprises: and manufacturing a calibration pattern on the fabric, wherein the calibration pattern comprises a plurality of calibration points, and the calibration pattern comprises a plurality of regular geometric figures with the same shape.
Furthermore, the set position includes: the geometric center of the image of the calibration pattern coincides with the geometric center of the visual field range of the image acquisition device, and the sides of the regular geometric figures in the calibration pattern are parallel or perpendicular to the boundary of the visual field range of the image acquisition device.
Further, setting the ratio by taking, as a reference value, the ratio of pixel length to actual length of a nearby pixel point corresponding to the image of a calibration point includes:
setting the pixel point's ratio of pixel length to actual length to be the same as that of the nearest pixel point corresponding to the image of a calibration point;
alternatively,
setting the pixel point's ratio of pixel length to actual length to be a weighted average of the ratios of pixel length to actual length of two or more nearby pixel points corresponding to the images of calibration points.
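By way of illustration only (not part of the claims), the two assignment strategies above may be sketched as follows; the function names, the data layout, and the choice of inverse-distance weights are illustrative assumptions:

```python
import math

def nearest_ratio(px, py, calib):
    """calib: list of (x, y, ratio) for pixel points corresponding to
    calibration-point images. Return the ratio of the closest one."""
    return min(calib, key=lambda c: math.hypot(c[0] - px, c[1] - py))[2]

def weighted_ratio(px, py, calib, k=2):
    """Inverse-distance weighted average of the ratios of the k nearest
    calibration-point pixels; if the pixel coincides with a calibration
    point, its own ratio is returned directly."""
    nearest = sorted(calib, key=lambda c: math.hypot(c[0] - px, c[1] - py))[:k]
    weights = []
    for x, y, r in nearest:
        d = math.hypot(x - px, y - py)
        if d == 0:  # the pixel is itself a calibration-point pixel
            return r
        weights.append((1.0 / d, r))
    total = sum(w for w, _ in weights)
    return sum(w * r for w, r in weights) / total
```

The inverse-distance weighting shown is only one plausible choice of weights; the claim requires only some weighted average over nearby calibration-point pixels.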
Further, the method further comprises: obtaining the actual distance from the initial operation point of the pattern to the projection of the sewing machine needle according to the ratio of pixel length to actual length of the pixel points in the image and the visual field range of the image acquisition device.
Further, the method further comprises a target image recognition method, which includes the following steps:
acquiring a first image and a second image, wherein the first image contains only background image information, the second image contains the background image and target image information, and the pixel points of the first image and the second image correspond one to one;
combining the first image and the second image into a pixel difference image, and separating the pixel difference image into a plurality of monochromatic pixel difference images;
acquiring the average value and the limit value of the pixel values of all pixel points of each monochromatic pixel difference image, and judging whether a pixel point belongs to a difference region by taking a set multiple of the ratio of the average value to the limit value as a threshold;
acquiring the difference region of the pixel difference image from the difference regions of all the monochromatic pixel difference images, and taking the boundary of that difference region as the boundary of the target image in the second image.
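As a non-limiting sketch of the per-channel thresholding step, assuming the limit value is the channel maximum and that pixel values are normalized by it before comparison (both are assumptions; the claim does not fix these details):

```python
def difference_mask(channel, multiple=2.0):
    """channel: 2-D list of absolute pixel differences for one monochrome
    difference image. A pixel is marked as belonging to the difference
    region when its value, normalized by the channel limit (here taken as
    the maximum), exceeds the threshold multiple * (mean / limit)."""
    flat = [v for row in channel for v in row]
    mean = sum(flat) / len(flat)
    limit = max(flat) or 1  # avoid dividing by zero on an all-zero channel
    threshold = multiple * (mean / limit)
    return [[(v / limit) > threshold for v in row] for row in channel]
```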
Further, acquiring the difference region of the pixel difference image from the difference regions of all the monochromatic pixel difference images includes:
if a pixel point belongs to the difference region on one monochromatic pixel difference image, and the pixel points corresponding to it on all the other monochromatic pixel difference images also belong to the difference regions of the images on which they lie, then the corresponding pixel point on the pixel difference image belongs to the difference region; if a pixel point does not belong to the difference region on some monochromatic pixel difference image, the corresponding pixel point on the pixel difference image does not belong to the difference region;
or, if a pixel point belongs to the difference region on some monochromatic pixel difference image, the corresponding pixel point on the pixel difference image belongs to the difference region; if a pixel point does not belong to the difference region on one monochromatic pixel difference image, and the pixel points corresponding to it on all the other monochromatic pixel difference images also do not belong to the difference regions of the images on which they lie, then the corresponding pixel point on the pixel difference image does not belong to the difference region;
or, if the number of monochromatic pixel difference images on which a pixel point and its corresponding pixel points belong to the difference region exceeds a set proportion of the total number of monochromatic pixel difference images, the corresponding pixel point on the pixel difference image belongs to the difference region; otherwise, it does not.
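The three combination rules above (intersection, union, and majority vote over the monochromatic difference regions) may be sketched as follows; this is an illustrative reading of the claim, not the claimed implementation:

```python
def combine_masks(masks, mode="and", ratio=0.5):
    """masks: list of same-sized boolean 2-D masks, one per monochromatic
    difference image. mode 'and': a pixel is in the combined difference
    region only if it is in the region on every mask; 'or': on at least
    one mask; 'vote': on more than `ratio` of the masks."""
    h, w = len(masks[0]), len(masks[0][0])
    n = len(masks)
    out = [[False] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            count = sum(m[i][j] for m in masks)
            if mode == "and":
                out[i][j] = count == n
            elif mode == "or":
                out[i][j] = count > 0
            else:  # majority vote against the set proportion
                out[i][j] = count > ratio * n
    return out
```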
In a second aspect, the present application provides an image calibration apparatus comprising a pattern moving device, a ratio acquisition device, and a ratio setting device. The pattern moving device is used for moving the calibration pattern so that the image of the calibration pattern is located at a set position within the visual field range of the image acquisition device, with the images of all calibration points of the calibration pattern located within that visual field range. The ratio acquisition device is used for acquiring the ratio of pixel length to actual length for the image of each calibration point in the calibration pattern. The ratio setting device is used for setting, for any pixel point in an image acquired by the image acquisition device, the ratio of pixel length to actual length by taking, as a reference value, the ratio of pixel length to actual length of a nearby pixel point corresponding to the image of a calibration point.
Further, the image calibration device further comprises a calibration pattern making device and a distance obtaining device; the calibration pattern making device is used for making a calibration pattern on the fabric, and the calibration pattern comprises a plurality of calibration points; the distance acquisition device is used for acquiring the actual distance from the initial operation point of the pattern to the projection of the sewing machine needle according to the ratio value of the pixel length and the actual length of the pixel point in the image and the visual field range of the image acquisition equipment.
Further, the image calibration apparatus also comprises a target image recognition device used for target image recognition; the target image recognition device comprises an image acquisition device, an image merging and separating device, a difference region judging device, and a boundary determining device;
the image acquisition device is used for acquiring a first image and a second image, the first image only comprises background image information, the second image comprises a background image and target image information, and pixel points of the first image and the second image are in one-to-one correspondence;
the image merging and separating device is used for merging the first image and the second image into a pixel difference image and separating the pixel difference image to obtain a plurality of monochromatic pixel difference images;
the difference region judging device is used for acquiring the average value and the limit value of the pixel values of all pixel points of each monochromatic pixel difference image, and judging whether a pixel point belongs to the difference region by taking a set multiple of the ratio of the average value to the limit value as a threshold;
the boundary determining device is used for acquiring the difference region of the pixel difference image from the difference regions of all the monochromatic pixel difference images, and taking the boundary of that difference region as the boundary of the target image in the second image.
In a third aspect, the present application provides an electronic device comprising a processor and a memory; the memory is used for storing computer instructions; the processor is configured to execute the computer instructions stored in the memory, so as to enable the electronic device to execute the image calibration method according to any one of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed, implements the image calibration method according to any one of the above first aspects.
The image calibration method, device, electronic equipment, and storage medium of the present application are particularly suitable for sewing machines in real working environments, with the following advantages: 1. Image calibration can be performed by making the calibration pattern on the fabric once; the calibration process is highly automatic, simple to operate, and not error-prone, and places low demands on the professionalism and proficiency of operators. 2. The time and space complexity of the algorithm are low, so it still runs normally on electronic equipment with limited computing power and storage space, and produces a result within a reasonable time. 3. It does not depend strongly on external conditions such as illumination, and maintains good performance in less-than-ideal environments.
Drawings
For a clearer explanation of the technical solutions in the present application or the prior art, the drawings used in describing them are briefly introduced below. A person skilled in the art can derive other figures from these figures without inventive effort.
Fig. 1 is a flowchart of an image calibration method according to an embodiment of the present application.
Fig. 2 is a flowchart of a target image recognition method according to a second embodiment of the present application.
Fig. 3 is a schematic diagram of a conventional calibration pattern according to an embodiment of the present application.
Fig. 4 is a schematic diagram illustrating an architecture of an image calibration apparatus according to a third embodiment of the present application.
Fig. 5 is a schematic diagram illustrating an architecture of a target image recognition device according to a third embodiment of the present application.
Fig. 6 is a schematic diagram of a hardware structure of an electronic device according to a fourth embodiment of the present application.
Fig. 7 is a schematic overall structure diagram of an embroidery machine according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more clear, the technical solutions of the present application will be described in detail and completely with reference to the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the scope of protection granted by the present application.
The terms "first," "second," "third," and the like in the claims, the description, and the drawings of the specification, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order, but rather the terms "first," "second," "third," and the like may be used interchangeably without affecting the semantic accuracy. Moreover, the terms "comprises," "comprising," "includes," "including," "has," "having," and any variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical solution of the present application is suitable for sewing equipment such as embroidery machines and sewing machines, or other equipment of similar overall structure. A sewing machine generally refers to production equipment capable of forming stitches on a fabric such as cloth. Since the overall structures are similar, the embroidery machine is taken as the representative sewing machine in the embodiments of the present application. Fig. 7 is a schematic view of the overall structure of an embroidery machine according to an embodiment of the present application. The machine head is the core component of the embroidery machine; a machine needle mounted on the head embroiders the fabric. The bedplate is arranged below the machine head; a component called the embroidery frame is mounted on the bedplate, and the fabric is fixed to the bedplate by the embroidery frame. During embroidering, the needle moves only vertically, while the embroidery frame presses the fabric and drives it horizontally across the bedplate. An image acquisition device, such as a camera, capable of acquiring an image of the material or the pattern on it, is mounted at a suitable position above the bedplate. To ensure image quality, the image acquisition device is in many cases fixedly mounted and does not move during operation.
The first embodiment is as follows:
fig. 1 is a flowchart of an image calibration method according to this embodiment, and the image calibration method includes the following steps.
S101, making a calibration pattern on the fabric, wherein the calibration pattern comprises a plurality of calibration points.
In this application, a pattern refers to a pattern formed by a real object, and an image refers to an image acquired by the image acquisition device, generally a digital image. To calibrate the image correctly, a calibration pattern needs to be made on the fabric; the embroidery machine generally embroiders the calibration pattern on the fabric in flat-embroidery mode. Considering both the accuracy of image calibration and the convenience of manufacture, a calibration pattern usually consists of several regular geometric figures of the same shape, with specific points of those figures, such as vertices or geometric centers, serving as calibration points; if calibration points of two or more regular geometric figures coincide, they are regarded as a single calibration point. Fig. 3 is a schematic diagram of a conventional calibration pattern. The calibration pattern shown in fig. 3 consists of nine rectangles of the same shape, with the vertices of the rectangles as the calibration points. Where sides and vertices of different rectangles coincide, the coincident vertices of two or more rectangles are regarded as one calibration point. The calibration points are circled in fig. 3; the calibration pattern contains 16 calibration points in total.
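For illustration, the calibration points of a grid of identical rectangles such as the pattern in fig. 3 can be enumerated as follows (a sketch with assumed dimensions; coincident vertices are counted once):

```python
def calibration_points(rows=3, cols=3, w=10.0, h=10.0):
    """Vertices of a rows x cols grid of identical w x h rectangles, with
    coincident vertices of adjacent rectangles counted as one calibration
    point. For 3 x 3 rectangles this yields (rows+1)*(cols+1) = 16 points."""
    return [(c * w, r * h) for r in range(rows + 1) for c in range(cols + 1)]
```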
S102, moving the calibration pattern to enable the image of the calibration pattern to be located at a set position within the visual field range of the image acquisition equipment, and enabling the images of all calibration points of the calibration pattern to be located within the visual field range of the image acquisition equipment.
Generally, the image acquisition device is fixedly mounted on the sewing machine and does not move during operation, so its visual field range, generally rectangular, is determined once the device is installed. For image calibration, the calibration pattern must be moved so that its image lies within the visual field range of the image acquisition device. The embroidery frame of the embroidery machine drives the fabric to move horizontally on the bedplate, and the calibration pattern moves with it. The size of the calibration pattern is designed according to the visual field range so that, when the image of the calibration pattern is at the set position within the visual field range, all calibration points of the calibration pattern lie inside it. The set position is usually the position where the geometric center of the image of the calibration pattern coincides with the geometric center of the visual field range, and the sides of the regular geometric figures in the calibration pattern are parallel or perpendicular to the boundary of the visual field range. In the calibration pattern shown in fig. 3, the intersection of the two diagonals of the rectangle in the central region of the calibration pattern (covered by a black solid dot in fig. 3) is the geometric center of the calibration pattern, and the image of that point is the geometric center of the image of the calibration pattern. This set position helps improve the accuracy of image calibration.
S103, obtaining a ratio value of the pixel length and the actual length of the image of each calibration point in the calibration pattern.
After the image of the calibration pattern is located at the set position within the visual field range of the image acquisition equipment, the step acquires the ratio value of the pixel length and the actual length of the image of each calibration point in the calibration pattern.
The ratio of pixel length to actual length is the actual length on the photographed pattern corresponding to the pixel length calculated from a pixel point's pixel coordinates relative to the pixel coordinate origin. The image acquired by the image acquisition device and its visual field range are in a mathematically isomorphic relationship: each pixel point on the image corresponds to an actual point on the bedplate of the embroidery machine. Taking one pixel point on the image as the pixel coordinate origin, the actual point on the bedplate corresponding to it is taken as the actual coordinate origin, so the ratio of pixel length to actual length can be calculated for every pixel point on the image. The image of each calibration point in the calibration pattern is essentially a pixel point in the image of the calibration pattern. The actual length corresponding to a pixel point is the distance from the corresponding actual point on the bedplate to the actual coordinate origin, which can be obtained by direct measurement. Once the pixel coordinate origin of the image of the calibration pattern is chosen, the corresponding actual coordinate origin is determined; the actual length for the image of each calibration point can then be measured, its pixel length relative to the pixel coordinate origin can be calculated from its pixel coordinates, and the ratio of the two can be determined.
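As a minimal sketch of this computation, assuming straight-line (Euclidean) distances from the two origins; the function name and argument layout are illustrative:

```python
import math

def pixel_to_actual_ratio(pixel_xy, actual_xy):
    """Ratio of pixel length to actual length for one calibration point.
    pixel_xy: pixel coordinates relative to the pixel coordinate origin;
    actual_xy: measured coordinates of the corresponding point on the
    bedplate relative to the actual coordinate origin."""
    pixel_len = math.hypot(*pixel_xy)
    actual_len = math.hypot(*actual_xy)
    return pixel_len / actual_len
```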
S104, setting the ratio value of the pixel length and the actual length of any pixel point in the image acquired by the image acquisition equipment by taking the ratio value of the pixel length and the actual length of the pixel point which is adjacent to the pixel point and corresponds to the image of the calibration point as a reference value.
Because of distortion, a pixel point chosen arbitrarily from the image acquired by the image acquisition device often has a ratio of pixel length to actual length different from those of other pixel points, which complicates subsequent operations on the acquired image, such as pattern generation.
If the field of view of the image acquisition device does not change, any two images acquired by the device are in a mathematically isomorphic relationship: their pixel points correspond one to one, and two corresponding pixel points have the same ratio of pixel length to actual length. The image acquisition device is generally fixed to the sewing machine, so its field of view does not change. Any image acquired by the device is therefore isomorphic to the image containing the calibration pattern in step S102; the image of each calibration point has a corresponding pixel point in the acquired image, and that pixel point has the same ratio of pixel length to actual length as the image of the calibration point. Consequently, after step S103, the ratio of pixel length to actual length is known for a number of pixel points in any image acquired by the image acquisition device.
For any pixel point in the image acquired by the image acquisition device, its ratio of pixel length to actual length is set by taking as reference values the ratios of nearby pixel points that correspond to images of calibration points. If the pixel point itself corresponds to the image of some calibration point, it is counted among its own nearby calibration-point pixels, so its ratio is already known. The range regarded as nearby may be set according to the actual situation of the image: for example, only the pixel point corresponding to the image of the calibration point closest to the pixel point may be considered, or all pixel points corresponding to images of calibration points whose distance from the pixel point is smaller than a set threshold may be regarded as nearby.
Similarly, how the ratio of pixel length to actual length of any pixel point is specifically set is handled flexibly according to the situation of the image. A relatively simple setting method considers only the pixel point corresponding to the image of the calibration point closest to the pixel point, and sets the ratio of the pixel point equal to the ratio of that calibration-point pixel. If a pixel point is equally distant from two or more pixel points corresponding to images of calibration points, its ratio may be set equal to the ratio of whichever of those pixel points is closest to, farthest from, or intermediate in distance from the pixel coordinate origin. A relatively complicated setting method considers, for each pixel point, at least two nearby pixel points corresponding to images of calibration points, assigns different weights to those calibration-point pixels according to the position of the pixel point in the image, and sets the ratio of the pixel point to the weighted average of their ratios.
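The two setting methods described above can be sketched together as inverse-distance weighting over the k nearest calibration-point pixels, where k = 1 reduces to the simple nearest-point method. The weighting scheme and all names here are assumptions for illustration, not prescribed by the method.

```python
import numpy as np

def interpolate_ratio(px, py, calib_points, ratios, k=4):
    """Estimate the pixel-length/actual-length ratio at pixel (px, py)
    by inverse-distance weighting over the k nearest calibration points.
    calib_points: (N, 2) pixel coordinates of calibration-point images.
    ratios: (N,) known ratio values at those points."""
    calib_points = np.asarray(calib_points, dtype=float)
    ratios = np.asarray(ratios, dtype=float)
    d = np.hypot(calib_points[:, 0] - px, calib_points[:, 1] - py)
    hit = int(np.argmin(d))
    if d[hit] == 0.0:
        # The pixel itself corresponds to a calibration point: ratio known.
        return float(ratios[hit])
    idx = np.argsort(d)[:k]          # k nearest calibration points
    w = 1.0 / d[idx]                 # weights fall off with distance
    return float(np.sum(w * ratios[idx]) / np.sum(w))
```

A pixel equidistant from two calibration points receives the plain average of their ratios, which is one reasonable resolution of the tie-breaking cases discussed above.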
S105, obtaining the actual distance from the initial operation point of the pattern to the projection of the sewing machine needle according to the ratio value of the pixel length and the actual length of the pixel point in the image and the visual field range of the image acquisition equipment.
The image calibration performed in steps S101 to S104 yields, for the image acquired by the image acquisition device, the ratio of pixel length to actual length of every pixel point in the image. The image acquired by the device can therefore be restored to the real pattern within the field of view of the device; although the acquired image is distorted, the distortion does not affect the restoration of the real pattern.
One important application of images acquired by an image acquisition device mounted on a sewing machine is the generation of patterns from the images, on the basis of which the sewing machine performs production operations. A pattern needs to be provided with a starting working point; for an embroidery machine, the starting working point of a pattern is the position of the first needle that penetrates the fabric when embroidery begins. The starting working point plays an important role in the pattern: a pattern usually consists of needle points, each corresponding to a specific embroidery operation; the starting working point corresponds to the coordinate origin of the pattern, and the positions of the other needle points are expressed as coordinate values in a coordinate system with the starting working point as origin. The starting working point is set according to the specific situation of the pattern, and different patterns have different starting working points.
The image capture device of a sewing machine typically captures an image of the fabric or of a pattern on the fabric; if the thickness of the fabric is negligible, the field of view of the image capture device may be considered to lie on the platen of the sewing machine. If the platen is regarded as a continuous plane and the holes, grooves, and other structures on it are ignored, the needle of the sewing machine also has a projection on the platen, which can be regarded as a point on the platen. For a pattern generated from the image acquired by the image acquisition device, a certain pixel point in the image is set as the starting working point. From the image calibration result, the actual distance between the starting working point on the fabric and the geometric center of the field of view of the image acquisition device can be calculated. The distance between the geometric center of the field of view and the projection of the needle on the platen can be obtained by actual measurement, and the angles that the line from the starting working point to the geometric center of the field of view, and the line from the geometric center to the needle projection, make with the boundary of the field of view can be calculated or measured. From these quantities, the actual distance from the starting working point on the fabric to the projection of the needle on the platen can be obtained while the fabric is within the field of view of the image acquisition device, and this distance can be decomposed into two mutually perpendicular components corresponding to the two movement directions of the embroidery frame or the like. According to the decomposition result, before production of the specific pattern begins, the embroidery frame moves on the platen to drive the fabric to the proper position, so that the starting working point coincides exactly with the projection of the needle.
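The final decomposition can be sketched as follows, under the assumptions that the frame's two movement directions align with the image axes and that per-axis pixel-to-actual ratios are available from the calibration; the function and parameter names are illustrative only.

```python
import math

def frame_move(start_px, needle_px, ratio_x, ratio_y):
    """Decompose the displacement from the starting working point to the
    needle projection into the two perpendicular movement directions of
    the embroidery frame. Pixel offsets are converted to actual lengths
    with per-axis pixel-length/actual-length ratios (assumed known from
    the calibration of steps S101-S104)."""
    dx_px = needle_px[0] - start_px[0]
    dy_px = needle_px[1] - start_px[1]
    dx = dx_px / ratio_x          # actual component along frame axis 1
    dy = dy_px / ratio_y          # actual component along frame axis 2
    return dx, dy, math.hypot(dx, dy)
```

The two returned components are exactly the frame movements needed to bring the starting working point under the needle projection.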
The pattern in this step is not necessarily a pattern generated from the image acquired by the image acquisition device; it may be a pattern generated by another method. For such a pattern, the image corresponding to the pattern can be moved into the field of view of the image acquisition device as displayed on the electronic device controlling the sewing machine. The operator moves the image corresponding to the pattern within the displayed field of view; once the operator confirms its position, the starting working point of the pattern falls on a certain pixel point of the image acquired by the image acquisition device, and the actual distance from the starting working point on the fabric to the projection of the needle on the platen can be obtained by the same method.
Accurate positioning of the starting working point of a pattern has always been a difficult problem in sewing production, and traditional manual positioning cannot guarantee the accuracy and consistency of the results. The method provided in this step not only obtains accurate position information of the starting working point but is also independent of the way the pattern was generated; it can be used to reprocess existing patterns and has a wide range of application.
The image calibration method provided by this embodiment is particularly suitable for sewing machines in actual working environments and has the following advantages. 1. Image calibration can be carried out by making the calibration pattern on the fabric once. The calibration process is highly automated, simple to operate, and resistant to error, and places low demands on the expertise and proficiency of the operator: a sewing machine operator can complete all steps of the method with simple operations such as pressing keys on the electronic device controlling the sewing machine. In addition, the steps of the method are not coupled; if a step goes wrong, only that step needs to be repeated, and the whole method does not have to start over. 2. The time and space complexity of the algorithm is low; it runs normally on electronic devices with limited computing capacity and storage space and produces a result in reasonable time. When the method was actually run on a single-head embroidery machine, the whole method (excluding the time to make the calibration pattern) was completed within one minute, the data calculation took only a few seconds, and the normal operation of the other functions of the embroidery machine was not affected. 3. The method depends little on external conditions such as illumination and maintains high accuracy even in less than ideal environments.
Example two:
The embodiment provides a method for identifying a target image. The image acquired by the image acquisition device is a whole image in which often only a part is the image of the target pattern, while the remainder is an unneeded background image. Identifying the target image means extracting the target image from the whole image and marking the boundary between the target image and the background image.
Many methods for identifying a target image have been reported, such as the widely used GrabCut algorithm. The general steps of the GrabCut algorithm are as follows: 1. define in the whole image one or more rectangles containing the target image, with everything outside the rectangles automatically defined as background image; 2. model the background image and the target image with a Gaussian Mixture Model (GMM) and mark undefined pixel points as probable background image or probable target image; 3. regard any pixel point within a rectangle as connected to its surrounding pixel points by virtual edges, each virtual edge being assigned a probability of belonging to the background image or the target image based on the color similarity of the pixel points it connects; 4. after all pixel points in the rectangle are connected, delete any virtual edge whose two connected pixel points fall one in the background image and one in the target image. Through these steps, the background image and the target image in the whole image can be segmented.
The GrabCut algorithm performs well in many application scenarios but has certain shortcomings when applied to the technical field of sewing. First, for complex situations such as interleaving of the background image and the target image, the recognition effect of the GrabCut algorithm is not ideal and a certain error rate exists. Patterns generated from some images based on GrabCut results do not meet product quality requirements, cannot be used directly, and must be modified manually, which wastes considerable time. Second, the time complexity of the GrabCut algorithm is too high: when it was actually run on a single-head embroidery machine, 30 to 45 seconds were needed to obtain a result, which is unacceptable in actual production.
The present embodiment provides a target image recognition method particularly suitable for use with a sewing machine in an actual working environment. Fig. 2 is a flowchart of a target image recognition method according to the present embodiment, and the target image recognition method includes the following steps.
S201, acquiring a first image and a second image, wherein the first image includes only background image information, the second image includes background image information and target image information, and the pixel points of the first image and the second image are in one-to-one correspondence.
In the technical field of sewing, a typical target pattern is a pattern sewn, drawn, or stuck onto a fabric, and the corresponding background pattern is the fabric itself. An image of the fabric alone, acquired by an image acquisition device such as a still or video camera mounted on the sewing machine, may be regarded as the first image. After the pattern is sewn, drawn, or stuck onto the fabric, the image acquired by the image acquisition device includes both the image of the target pattern and the image of the fabric and may be regarded as the second image.
Acquiring the first and second images generally requires that the position of the fabric relative to the image acquisition device remain constant. The embroidery frame of the embroidery machine presses the fabric and drives it to move, and by controlling the embroidery frame the position of the fabric relative to the image acquisition device can be kept unchanged while the first and second images are acquired. The pixel points of the first and second images are then in one-to-one correspondence and conform to a mathematically isomorphic relationship; if the pixel values of the individual pixel points are disregarded, the first and second images can be regarded as the same image, which greatly simplifies the subsequent steps.
S202, combining the first image and the second image into a pixel difference image, and separating the pixel difference image to obtain a plurality of single-color pixel difference images.
Because the pixel points of the first image and the second image are in one-to-one correspondence, the pixel values of two corresponding pixel points can be subtracted to obtain a pixel difference value. If two corresponding pixel points in the first and second images are regarded as one new pixel point whose pixel value is the difference of their pixel values, the first and second images are merged into a new image, called the pixel difference image. The color of a digital image is usually composed of several single colors, and a digital image can be separated into several single-color images. Each single-color image corresponds pixel-for-pixel to the original digital image; a pixel point of a single-color image displays only one color, while a pixel point of the digital image displays the color composed of the several single colors. This operation is often called color channel separation. By applying color channel separation to the pixel difference image, several single-color images are obtained, referred to as monochromatic pixel difference images.
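The merging and separation of step S202 can be sketched with NumPy alone (OpenCV's `absdiff` and `split` would serve equally). Taking the absolute difference is one plausible reading of "subtracting pixel values", chosen here to keep the result non-negative; the function name is illustrative.

```python
import numpy as np

def pixel_difference_channels(first, second):
    """Merge the first image (background only) and the second image
    (background plus target) into a pixel difference image, then
    separate it into one single-color difference image per channel.
    Both inputs: (H, W, C) uint8 arrays in pixel-wise correspondence."""
    a = first.astype(np.int16)
    b = second.astype(np.int16)
    diff = np.abs(b - a).astype(np.uint8)   # pixel difference image
    # Color channel separation: one monochromatic image per channel
    return [diff[:, :, c] for c in range(diff.shape[2])]
```

Pixels where the two images agree come out as zero in every channel, so only the target pattern region carries large values.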
S203, obtaining the average value and the limit value of the pixel values of all pixel points of each monochromatic pixel difference image, and judging whether each pixel point belongs to the difference region by taking a set multiple of the ratio of the average value to the limit value as a threshold.
For each monochromatic pixel difference image, the pixel values of all pixel points can be obtained, and from them the average value and the limit value of the pixel values can be determined. Depending on the characteristics of the image, the average may be an arithmetic mean, a geometric mean, or another type of average, and the limit value may be the maximum, the minimum, or another type of extreme. For every pixel point in the monochromatic pixel difference image, whether the pixel point belongs to the difference region is judged by taking a set multiple of the ratio of the average value to the limit value as a threshold. Specifically, for each pixel point, the average of the pixel values of all pixel points is divided by the pixel value of that point to obtain a ratio; if the ratio exceeds, or falls below, the threshold, the pixel point is judged to belong to the difference region, and otherwise it is judged not to belong to the difference region. The threshold, that is, the set multiple of the ratio of the average value to the limit value, is not fixed; different values generally need to be set according to the characteristics of the sewing machine, the target pattern, the fabric, and so on, and a reasonable value can be chosen from experience or small-batch tests. The multiple need not be an integer; it may also be a decimal or a fraction.
The judgment result may be marked with data in an appropriate format. For example, a pixel point belonging to the difference region is marked 1, and a pixel point not belonging to it is marked 0. This makes it convenient to identify the difference region later and to solve for the contour coordinates of the difference region.
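A sketch of the judgment of step S203, taking the arithmetic mean as the average, the maximum as the limit value, and marking a pixel 1 when its mean-to-value ratio falls below the threshold (that is, when its difference value is large); the multiple `k` is an assumed tuning parameter, and the whole choice of branches is one of the configurations the text allows.

```python
import numpy as np

def difference_mask(channel, k=2.0):
    """Mark pixels of one monochromatic pixel difference image as
    belonging (1) or not belonging (0) to the difference region.
    Threshold = k * (mean / max); a pixel whose mean/value ratio is
    below the threshold carries a large difference value."""
    vals = channel.astype(float)
    mean = vals.mean()
    limit = vals.max()
    if limit == 0:
        # Identical images: no difference region at all.
        return np.zeros(channel.shape, dtype=np.uint8)
    thresh = k * (mean / limit)
    with np.errstate(divide="ignore"):
        ratio = np.where(vals > 0, mean / vals, np.inf)
    return (ratio < thresh).astype(np.uint8)
```

On a mostly-zero channel with one bright pixel, only the bright pixel is marked, matching the 0/1 marking described above.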
S204, acquiring the difference region of the pixel difference image from the difference regions of all the monochromatic pixel difference images, and taking the boundary of the difference region of the pixel difference image as the boundary of the target image in the second image.
After the difference regions of all the monochromatic pixel difference images are obtained, the difference region of the pixel difference image can be derived from them. Each monochromatic pixel difference image corresponds pixel-for-pixel to the pixel difference image, so any two monochromatic pixel difference images also correspond pixel-for-pixel. A pixel point may therefore belong to the difference region in one monochromatic pixel difference image while its corresponding pixel point in another does not, and a judgment rule must be established for this case. Under a common judgment rule, a pixel point of the pixel difference image belongs to the difference region only if the corresponding pixel points in all the monochromatic pixel difference images belong to their respective difference regions; if the corresponding pixel point in any one monochromatic pixel difference image does not belong to the difference region, the pixel point of the pixel difference image does not belong to the difference region.
Alternatively, the opposite judgment rule may be established: if the corresponding pixel point in any one monochromatic pixel difference image belongs to the difference region, the pixel point of the pixel difference image belongs to the difference region; only when the corresponding pixel points in all the monochromatic pixel difference images lie outside the difference region does the pixel point of the pixel difference image lie outside it. Similarly, a voting rule may be established: if the number of monochromatic pixel difference images in which the corresponding pixel point belongs to the difference region exceeds a set proportion of the total number of monochromatic pixel difference images, the pixel point of the pixel difference image belongs to the difference region; otherwise it does not.
Suppose a pixel difference image is separated by color channel into 3 monochromatic pixel difference images, and call the rules above the first, second, and third judgment rules. Under the first judgment rule, if a pixel point belongs to the difference region in the 1st monochromatic pixel difference image and its corresponding pixel points in the 2nd and 3rd monochromatic pixel difference images also belong to the difference region, the corresponding pixel point of the pixel difference image belongs to the difference region; if the corresponding pixel point in either the 2nd or the 3rd monochromatic pixel difference image does not belong to the difference region, the corresponding pixel point of the pixel difference image does not belong to the difference region. Under the second judgment rule, if the corresponding pixel point belongs to the difference region in any one of the 3 monochromatic pixel difference images, the corresponding pixel point of the pixel difference image belongs to the difference region; only when it belongs to the difference region in none of the 3 does it not.
Under the third judgment rule, with the set proportion taken as one half, if the corresponding pixel point belongs to the difference region in at least 2 of the 3 monochromatic pixel difference images, the corresponding pixel point of the pixel difference image belongs to the difference region; otherwise it does not. Which rule to apply is not absolute; it must be determined from the actual situation of the pixel difference image and related images, and necessary tests may be added.
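The three judgment rules can be sketched over 0/1 difference masks (one per monochromatic pixel difference image) as follows; the function names are illustrative.

```python
import numpy as np

def combine_and(masks):
    """First judgment rule: a pixel belongs to the difference region of
    the pixel difference image only if every monochromatic mask marks it."""
    return np.minimum.reduce(masks)

def combine_or(masks):
    """Second judgment rule: one marking monochromatic mask suffices."""
    return np.maximum.reduce(masks)

def combine_vote(masks, min_fraction=0.5):
    """Third judgment rule: the fraction of marking masks must reach
    min_fraction (one half gives 'at least 2 of 3' for three channels)."""
    votes = np.stack(masks).astype(float).mean(axis=0)
    return (votes >= min_fraction).astype(np.uint8)
```

For three channels, `combine_and` is the strictest rule, `combine_or` the most permissive, and `combine_vote` with one half sits in between.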
The purpose of acquiring the difference region of the pixel difference image is to take its boundary as the boundary of the target image in the second image. Specifically, after the difference region of the pixel difference image is obtained, the contour coordinates of the difference region, which form the contour of the target image in the second image, are obtained.
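As a minimal stand-in for contour extraction (in practice a routine such as OpenCV's `findContours` would typically be used), the boundary of a 0/1 difference-region mask can be taken as those region pixels with at least one 4-neighbor outside the region:

```python
import numpy as np

def boundary_of_mask(mask):
    """Return a 0/1 mask of boundary pixels: difference-region pixels
    (value 1) that have at least one 4-neighbor outside the region."""
    padded = np.pad(mask, 1, constant_values=0)
    core = padded[1:-1, 1:-1]
    # Minimum over the four direct neighbors of each pixel
    neigh_min = np.minimum.reduce([
        padded[:-2, 1:-1], padded[2:, 1:-1],
        padded[1:-1, :-2], padded[1:-1, 2:]])
    return ((core == 1) & (neigh_min == 0)).astype(np.uint8)
```

The coordinates of the nonzero entries of the returned mask are the contour coordinates of the difference region, hence the contour of the target image in the second image.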
For steps S202, S203, and S204, publicly available image processing software such as OpenCV can implement the technical solution described in each step. Such software generally provides functions with different capabilities, and the operations described in the technical solution can be realized by calling the appropriate functions.
The target image identification method provided by this embodiment has simple steps and is easy for sewing machine operators unfamiliar with image processing to carry out correctly. The method has high accuracy: patterns generated from its results can be used directly in actual production, and the products produced meet quality requirements. The method also has low time complexity and runs fast. When it was actually run on a single-head embroidery machine, a result was obtained in only about 1 second; compared with the 30 to 45 second running time of the GrabCut algorithm, this can greatly improve production efficiency.
Example three:
fig. 4 is a schematic diagram of an architecture of the image calibration apparatus according to the present embodiment. The image calibration device 40 includes a calibration pattern making device 41, a pattern moving device 42, a scale value acquisition device 43, a scale value setting device 44, and a distance acquisition device 45. The pattern shifting device 42 is used to shift the calibration pattern so that the image of the calibration pattern is located at a set position within the field of view of the image capturing device, and the images of all the calibration points of the calibration pattern are located within the field of view of the image capturing device. The scale value acquiring means 43 is used for acquiring the scale value of the pixel length and the actual length of the image of each calibration point in the calibration pattern. The proportional value setting means 44 is configured to set the proportional value of the pixel length and the actual length of any pixel point in the image acquired by the image acquisition device, using the proportional value of the pixel length and the actual length of a pixel point corresponding to the image of the calibration point adjacent to the pixel point as a reference value.
The calibration pattern making device 41 is used for making a calibration pattern on the fabric, and the calibration pattern comprises a plurality of calibration points. The distance acquiring device 45 is used for acquiring the actual distance from the initial working point of the pattern to the projection of the sewing machine needle according to the proportional value of the pixel length and the actual length of the pixel point in the image and the visual field range of the image acquiring equipment.
The image calibration apparatus 40 may further include a target image recognition apparatus 50, where the target image recognition apparatus 50 is used for target image recognition, and fig. 5 is a schematic diagram of an architecture of the target image recognition apparatus 50 according to this embodiment. The target image recognition means 50 includes image acquisition means 51, image merging and separating means 52, difference region discrimination means 53, and boundary determination means 54. The image acquiring device 51 is configured to acquire a first image and a second image, where the first image includes only background image information, the second image includes background image information and target image information, and pixel points of the first image and the second image are in one-to-one correspondence. The image merging and separating device 52 is used for merging the first image and the second image into a pixel difference image, and separating the pixel difference image to obtain a plurality of monochromatic pixel difference images. The difference region determining device 53 is configured to obtain an average value and a limit value of pixel values of all pixel points of each monochromatic pixel difference image, and determine whether a pixel point belongs to a difference region by using a set multiple of a ratio of the average value and the limit value as a threshold. The boundary determining means 54 is used for acquiring the difference region of the pixel difference image according to the difference regions of all the monochromatic pixel difference images, and taking the boundary of the difference region of the pixel difference image as the boundary of the target image in the second image.
For a specific implementation manner of the apparatus in this embodiment, reference may be made to the contents described in the first embodiment or the second embodiment, and the implementation principle and the technical effect are similar, which are not described herein again.
The apparatus described in this embodiment is understood as a functional module framework mainly implemented by a computer program or the like. The division of the apparatus described in this embodiment corresponds to the method steps described in the first embodiment or the second embodiment, which is only a logical division, and there may be another division in actual implementation, for example, a plurality of apparatuses may be combined or integrated into another apparatus, or some apparatuses may be omitted or not executed.
The physical units that carry the devices in this embodiment can take many forms: all devices may reside in one physical unit, or one or several devices may be distributed across different physical units. The physical units carrying the devices may be connected electrically, for example through cables or wireless networks, and need not be in direct physical contact or mechanically connected.
Example four:
fig. 6 is a schematic diagram of a hardware structure of the electronic device according to the embodiment. As shown in fig. 6, the electronic device 60 includes: at least one processor 61 and a memory 62. Optionally, the electronic device 60 further comprises a bus 63, and the processor 61 and the memory 62 are connected via the bus 63.
During operation of the electronic device, the memory 62 stores computer instructions, and the at least one processor 61 executes the computer instructions stored in the memory 62 to cause the electronic device 60 to perform the method according to the first embodiment or the second embodiment.
For a specific implementation process of the electronic device 60, reference may be made to the contents described in the first embodiment or the second embodiment, which have similar implementation principles and technical effects, and details of this embodiment are not described herein again.
In this embodiment, it should be understood that the processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), or the like. The general-purpose processor may be a microprocessor or another conventional processor. The computer instructions stored in the memory 62 may be executed directly by a hardware processor, or by a combination of hardware and software modules within the processor.
The memory may include high-speed RAM and may also include non-volatile memory (NVM), such as at least one magnetic disk memory.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, the bus in the figures of the present application is drawn as a single line, but this does not mean that there is only one bus or only one type of bus.
Embodiment five:
the present application further provides a computer-readable storage medium having a computer program stored thereon, which, when executed, implements the method as described in embodiment one or embodiment two.
The computer-readable storage medium may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks, and so forth. Readable storage media can be any available media that can be accessed by a general purpose or special purpose computer or similar electronic device.
A computer-readable storage medium may be coupled to the processor such that the processor can read information from, and write information to, the medium. Of course, the medium may also be part of the processor. The processor and the readable storage medium may reside in an Application Specific Integrated Circuit (ASIC). Of course, the processor and the readable storage medium may also reside as discrete components in an electronic device.
If the technical solution of the present application is implemented in software and sold or used as a product, it may be stored in a computer-readable storage medium. Based on this understanding, all or part of the technical solution of the present application may be embodied in the form of a software product stored in a storage medium and comprising a computer program or several instructions. The computer software product enables a computer device (which may be a personal computer, a server, a network device, or a similar electronic device) to perform all or part of the steps of the method described in the first or second embodiment of the present application. The storage medium may be a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
Those skilled in the art will appreciate that all or part of the steps of the first or second embodiment may be implemented by hardware instructed by a program. The program may be stored in a computer-readable storage medium and, when executed, performs all or part of the steps of the first or second embodiment. The storage medium includes any medium that can store program code, such as a ROM, a RAM, or a magnetic or optical disk.
Finally, it should be noted that the embodiments of the present application are only used for illustrating the technical solutions of the present application, and not for limiting the same. Although the present application has been described in detail with reference to the embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention.

Claims (10)

1. An image calibration method, characterized by comprising:
moving a calibration pattern so that the image of the calibration pattern is located at a set position within the field of view of an image acquisition device, wherein the images of all calibration points of the calibration pattern are located within the field of view of the image acquisition device;
acquiring the ratio of the pixel length to the actual length of the image of each calibration point in the calibration pattern;
and setting the ratio of the pixel length to the actual length of any pixel point in an image acquired by the image acquisition device, using as a reference value the ratio of the pixel length to the actual length of an adjacent pixel point corresponding to the image of a calibration point.
2. The method of claim 1, further comprising: forming the calibration pattern on a fabric, wherein the calibration pattern comprises a plurality of calibration points and a plurality of regular geometric figures of the same shape.
3. The method according to claim 2, wherein the set position comprises: the geometric center of the image of the calibration pattern coinciding with the geometric center of the field of view of the image acquisition device, and the edges of the regular geometric figures in the calibration pattern being parallel or perpendicular to the boundary of the field of view of the image acquisition device.
4. The method according to claim 1, wherein using, as a reference value, the ratio of the pixel length to the actual length of an adjacent pixel point corresponding to the image of a calibration point comprises:
the ratio of the pixel length to the actual length of the pixel point is the same as that of the nearest pixel point corresponding to the image of a calibration point;
alternatively,
the ratio of the pixel length to the actual length of the pixel point is a weighted average of the ratios of two or more adjacent pixel points corresponding to the images of calibration points.
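The two alternatives of claim 4 (copying the ratio of the nearest calibration point, or averaging over several neighbouring calibration points) can be sketched as follows. This is a non-limiting illustration; the helper names and the inverse-distance weighting are assumptions, since the claim only requires some weighted average:

```python
import math

def nearest_ratio(pixel, cal_points):
    """Alternative 1: take the ratio of the closest calibration point.
    cal_points: list of ((x, y), ratio) pairs measured during calibration."""
    px, py = pixel
    return min(cal_points,
               key=lambda p: math.hypot(p[0][0] - px, p[0][1] - py))[1]

def weighted_ratio(pixel, cal_points, n=3):
    """Alternative 2: inverse-distance weighted average over the n nearest
    calibration points (the specific weighting scheme is an assumption)."""
    px, py = pixel
    nearest = sorted(cal_points,
                     key=lambda p: math.hypot(p[0][0] - px, p[0][1] - py))[:n]
    weights = []
    for (x, y), ratio in nearest:
        d = math.hypot(x - px, y - py)
        if d == 0:                  # pixel sits exactly on a calibration point
            return ratio
        weights.append((1.0 / d, ratio))
    total = sum(w for w, _ in weights)
    return sum(w * r for w, r in weights) / total
```

With the ratio known at every pixel, pixel distances anywhere in the acquired image can be converted to actual lengths.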
5. The method of claim 1, further comprising: obtaining the actual distance from the initial operation point of the pattern to the projection of the sewing machine needle according to the ratio of the pixel length to the actual length of the pixel points in the image and the field of view of the image acquisition device.
6. The method of claim 1, further comprising a target image recognition method, the target image recognition method comprising:
acquiring a first image and a second image, wherein the first image comprises only background image information, the second image comprises background image information and target image information, and the pixel points of the first image and the second image are in one-to-one correspondence;
combining the first image and the second image into a pixel difference image, and separating the pixel difference image to obtain a plurality of monochromatic pixel difference images;
acquiring the average value and the limit value of the pixel values of all pixel points of each monochromatic pixel difference image, and judging whether a pixel point belongs to the difference region by taking a set multiple of the ratio of the average value to the limit value as a threshold;
and acquiring the difference region of the pixel difference image from the difference regions of all the monochromatic pixel difference images, and taking the boundary of the difference region of the pixel difference image as the boundary of the target image in the second image.
7. The method of claim 6, wherein the acquiring of the difference region of the pixel difference image from the difference regions of all the monochromatic pixel difference images comprises:
if a certain pixel point belongs to the difference region on a certain monochromatic pixel difference image, and the pixel points corresponding to it on all other monochromatic pixel difference images also belong to the difference regions of the images on which they are located, then the corresponding pixel point on the pixel difference image belongs to the difference region; if a certain pixel point does not belong to the difference region on a certain monochromatic pixel difference image, the corresponding pixel point on the pixel difference image does not belong to the difference region;
alternatively,
if a certain pixel point belongs to the difference region on a certain monochromatic pixel difference image, the corresponding pixel point on the pixel difference image belongs to the difference region; if a certain pixel point does not belong to the difference region on a certain monochromatic pixel difference image, and the pixel points corresponding to it on all other monochromatic pixel difference images do not belong to the difference regions of the images on which they are located, then the corresponding pixel point on the pixel difference image does not belong to the difference region;
alternatively,
if a certain pixel point belongs to the difference region on a certain monochromatic pixel difference image, there exist other monochromatic pixel difference images on which the corresponding pixel points also belong to the difference region, and the number of monochromatic pixel difference images on which the pixel point or its corresponding pixel points belong to the difference region exceeds a set proportion of the total number of monochromatic pixel difference images, then the corresponding pixel point on the pixel difference image belongs to the difference region; otherwise, the corresponding pixel point on the pixel difference image does not belong to the difference region.
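The three combination rules of claim 7 amount to an AND, an OR, and a majority vote over the per-channel difference masks. A non-limiting sketch (the function name and the boolean-mask representation are assumptions):

```python
import numpy as np

def combine_masks(masks, mode="all", fraction=0.5):
    """Combine per-channel difference masks per claim 7's three alternatives.

    masks: list of equal-shape boolean arrays, one per monochromatic image.
    mode: "all"  - a pixel differs only if it differs in every channel;
          "any"  - a pixel differs if it differs in at least one channel;
          "vote" - a pixel differs if it differs in more than `fraction`
                   (the claim's "set proportion") of the channels.
    """
    stack = np.stack(masks)
    if mode == "all":
        return stack.all(axis=0)
    if mode == "any":
        return stack.any(axis=0)
    if mode == "vote":
        return stack.mean(axis=0) > fraction
    raise ValueError("mode must be 'all', 'any' or 'vote'")
```

The "all" rule suppresses single-channel noise at the cost of missing targets that differ from the background in only one channel; the "any" rule does the opposite; the vote rule sits between them.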
8. An image calibration apparatus, characterized in that the apparatus comprises: a pattern moving device, a ratio acquisition device, and a ratio setting device; the pattern moving device is configured to move a calibration pattern so that the image of the calibration pattern is located at a set position within the field of view of an image acquisition device, with the images of all calibration points of the calibration pattern located within the field of view of the image acquisition device; the ratio acquisition device is configured to acquire the ratio of the pixel length to the actual length of the image of each calibration point in the calibration pattern; and the ratio setting device is configured to set the ratio of the pixel length to the actual length of any pixel point in an image acquired by the image acquisition device, using as a reference value the ratio of the pixel length to the actual length of an adjacent pixel point corresponding to the image of a calibration point.
9. An electronic device, characterized in that the electronic device comprises a processor and a memory, wherein the memory is configured to store computer instructions, and the processor is configured to execute the computer instructions stored in the memory to cause the electronic device to perform the image calibration method according to any one of claims 1-7.
10. A storage medium, characterized in that the storage medium stores a computer program which, when executed, implements the image calibration method according to any one of claims 1-7.
CN201911269742.3A 2019-12-11 2019-12-11 Image calibration method and device, electronic equipment and storage medium Withdrawn CN110992425A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911269742.3A CN110992425A (en) 2019-12-11 2019-12-11 Image calibration method and device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN110992425A true CN110992425A (en) 2020-04-10

Family

ID=70092555

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911269742.3A Withdrawn CN110992425A (en) 2019-12-11 2019-12-11 Image calibration method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110992425A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111707587A (en) * 2020-06-04 2020-09-25 核工业北京地质研究院 Particle size statistical method
CN112991742A (en) * 2021-04-21 2021-06-18 四川见山科技有限责任公司 Visual simulation method and system for real-time traffic data
CN113718436A (en) * 2021-09-15 2021-11-30 汝州玛雅机电科技有限公司 Embroidery machine control method and electronic equipment
CN116071429A (en) * 2023-03-29 2023-05-05 天津市再登软件有限公司 Method and device for identifying outline of sub-pattern, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201643290A (en) * 2015-06-02 2016-12-16 Zeng Hsing Ind Co Ltd Compensation method of fabric feeding amount for sewing machine
CN110004600A (en) * 2018-09-20 2019-07-12 浙江大学台州研究院 Intelligent sewing device and method based on machine vision
CN110345875A (en) * 2018-04-04 2019-10-18 灵动科技(北京)有限公司 Calibration and distance measuring method, device, electronic equipment and computer readable storage medium
CN110390278A (en) * 2019-07-05 2019-10-29 北京大豪科技股份有限公司 Sewing Boundary Recognition method, apparatus, electronic equipment and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Jiansong: "A New Distortion Correction Method for Digital Cameras", Proceedings of the 13th National Conference on Image and Graphics *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111707587A (en) * 2020-06-04 2020-09-25 核工业北京地质研究院 Particle size statistical method
CN112991742A (en) * 2021-04-21 2021-06-18 四川见山科技有限责任公司 Visual simulation method and system for real-time traffic data
CN113718436A (en) * 2021-09-15 2021-11-30 汝州玛雅机电科技有限公司 Embroidery machine control method and electronic equipment
CN113718436B (en) * 2021-09-15 2022-07-19 诸暨玛雅电器机械有限公司 Embroidery machine control method and electronic equipment
CN116071429A (en) * 2023-03-29 2023-05-05 天津市再登软件有限公司 Method and device for identifying outline of sub-pattern, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN110992425A (en) Image calibration method and device, electronic equipment and storage medium
US9773302B2 (en) Three-dimensional object model tagging
CN110189322B (en) Flatness detection method, device, equipment, storage medium and system
US9534326B2 (en) Sewing machine and computer-readable medium storing program
CN109712162B (en) Cable character defect detection method and device based on projection histogram difference
WO2017092427A1 (en) Electronic element positioning method and apparatus
US10597806B2 (en) Sewing machine and non-transitory computer-readable storage medium
JP2022542573A (en) Method and computer program product for generating three-dimensional model data of clothing
JP7214432B2 (en) Image processing method, image processing program, recording medium, image processing apparatus, production system, article manufacturing method
CN111145091A (en) Image splicing method and device, electronic equipment and storage medium
CN110390278B (en) Sewing material boundary identification method and device, electronic equipment and storage medium
CN114926385A (en) Panel defect detection method, storage medium and terminal equipment
CN115078365A (en) Soft package printing quality defect detection method
CN111982933B (en) Coating defect detection system and device
CN110956147A (en) Method and device for generating training data
US20220028055A1 (en) Defect detection method, computer device and storage medium
CN109712115A (en) A kind of pcb board automatic testing method and system
CN111325106B (en) Method and device for generating training data
CN112634259A (en) Automatic modeling and positioning method for keyboard keycaps
CN110033507B (en) Method, device and equipment for drawing internal trace of model map and readable storage medium
CN116703887A (en) Embroidery defect detection method, system and storage medium for visual comparison
CN113128499B (en) Vibration testing method for visual imaging device, computer device and storage medium
TW201522949A (en) Inspection method for image data
TW202034132A (en) Mouse cursor image detection and comparison and feedback state determination method using an image processing unit to read the mouse cursor image in the operation screen to conduct detection and comparison
CN114324405B (en) Data labeling system based on film detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20200410