CN112464950B - Pattern recognition positioning method based on flexible material - Google Patents

Publication number
CN112464950B
Authority
CN
China
Prior art keywords
positioning
pattern
flexible material
images
point
Prior art date
Legal status
Active
Application number
CN202011320735.4A
Other languages
Chinese (zh)
Other versions
CN112464950A (en
Inventor
陈琮林
Current Assignee
Wuhan Shunchen Technology Co ltd
Original Assignee
Wuhan Shunchen Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Wuhan Shunchen Technology Co ltd filed Critical Wuhan Shunchen Technology Co ltd
Priority to CN202011320735.4A priority Critical patent/CN112464950B/en
Publication of CN112464950A publication Critical patent/CN112464950A/en
Application granted granted Critical
Publication of CN112464950B publication Critical patent/CN112464950B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 - Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

A pattern recognition positioning method based on flexible material: search for the characteristic points of the lines on the image of the flexible material, generate characteristic point descriptors, and make a pattern positioning template according to the characteristic points; calculate the feature point descriptors and search for feature points with similar descriptors in the two images; aim the camera at the flexible material and take a picture to obtain an image of the flexible material, and obtain the position and the included angle of the matching result; translate and rotate the positioning pattern of the pattern positioning template, then perform coarse positioning, image processing, and smoothing to obtain smooth curves; sort the smooth curves, and traverse from the starting point of the edge line graph to find the first reference point and the second reference point on the edge lines Line1 and Line2 of the edge line graph; calculate the included angle of the line connecting the second reference point to the first reference point; add this included angle to that of the positioning pattern, then translate and rotate the positioning pattern; discretize the positioning pattern into lines; repeat until the first reference point or the second reference point can no longer be found on the edge line graph.

Description

Pattern recognition positioning method based on flexible material
[ field of technology ]
The invention relates to the technical field of pattern recognition and positioning, in particular to a pattern recognition and positioning method based on flexible materials.
[ background Art ]
The existing recognition and positioning technology uses the template matching method from image processing: the material is placed in the field of view of a camera, a frame of image is acquired, the position with the most distinctive features in the image is found, the image at that position is made into a feature template, and the pattern to be positioned is associated with the feature template. In the actual production process, one frame of image is acquired each time, the similarity is then calculated by comparing the features in the target image with those in the template image, the position can be located once the set similarity is reached, and the associated pattern is placed at the specific position in the image. However, flexible material stretches and deforms very easily; with this method the recognized position is inaccurate, errors caused by the stretching and deformation of the material arise, the method has no way to calculate such errors, and pattern positioning and recognition cannot be performed on material without obvious features.
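As a rough illustration of the template matching comparison described above, the following Python sketch scores candidate positions with zero-mean normalized cross-correlation. The patent does not name its similarity measure, so NCC, the helper names, and the brute-force scan are assumptions for illustration only:

```python
import numpy as np

def ncc(patch, tmpl):
    # Zero-mean normalized cross-correlation: 1.0 means identical up to brightness/contrast.
    a = patch - patch.mean()
    b = tmpl - tmpl.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def match_template(img, tmpl):
    # Slide the template over every position in the image and keep the best score.
    th, tw = tmpl.shape
    best, pos = -2.0, (0, 0)
    for y in range(img.shape[0] - th + 1):
        for x in range(img.shape[1] - tw + 1):
            s = ncc(img[y:y + th, x:x + tw], tmpl)
            if s > best:
                best, pos = s, (x, y)
    return pos, best
```

In practice this exhaustive scan would be replaced by an FFT-based or pyramid-accelerated search; the sketch only shows the "reach the set similarity, then locate" idea the background section describes.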
In view of the foregoing, it is desirable to provide a new pattern recognition positioning method based on flexible materials to overcome the above-mentioned drawbacks.
[ invention ]
The invention aims to provide a pattern recognition positioning method based on a flexible material, which can position the edge of the flexible material against the edge line pattern of a pre-made pattern positioning template and can recognize whether the flexible material has problems of position deviation, deformation, or wrinkles.
In order to achieve the above object, the present invention provides a pattern recognition positioning method based on a flexible material, comprising the steps of,
s1: searching the characteristic points of the lines on the image of the flexible material, generating characteristic point descriptors, and manufacturing a graph positioning template according to the characteristic points;
s2: after finding out the characteristic points, calculating characteristic point descriptors, and finding out characteristic points with similar characteristic point descriptors in the two images;
s3: tiling the flexible material on a platform, then erecting a camera on the platform and photographing the flexible material by using a lens of the camera;
s4: acquiring an image of a flexible material, and performing template matching on the image to acquire a position pos and an Angle of a matching result;
s5: translating and rotating a positioning pattern of the pattern positioning template, and performing coarse positioning;
s6: performing image processing on an image obtained by photographing a camera by using an edge extraction method;
s7: carrying out smoothing treatment on the line after the image processing in the step S6, and carrying out smoothing treatment on the line by using a B spline curve equation to obtain a smooth curve;
s8: judging the y value of the starting point and the end point of each curve, and if the y value of the starting point is smaller than the y value of the end point, exchanging the sequence of all points of the smooth curve;
s9: sequencing the smooth curves, and sequencing the x values of the midpoints center of the circumscribed rectangles Rect of the 4 smooth curves;
s10: starting from the starting point of the first edge line graph, traversing to find a first reference point on the wavy Line1 of the edge line graph whose distance to the starting point equals Dis;
s11: traversing all points of the edge line graph in the step S10 from the first reference point, and determining a second reference point on the straight Line2 of the edge line graph whose distance to the first reference point equals Dis;
s12: calculating an included Angle1 of a connecting line from the second reference point to the first reference point;
s13: adding the included Angle1 to the included angle tan of the positioning pattern, then translating the positioning pattern and rotating it by the sum of Angle1 and the included angle tan of the positioning pattern;
s14: calculating whether the positioning patterns are intersected with other patterns which are already arranged, discretizing the positioning patterns in the step S12 into lines, and judging whether the lines among the positioning patterns are intersected or not;
s15: if the positioning patterns are intersected, moving the starting points of the positioning patterns, and repeating the steps S11 to S14 until the positioning patterns are not intersected;
s16: repeating the steps S11 to S15 for each positioning pattern until the first reference point is not found on the edge line pattern or the second reference point is not found, and obtaining the positioning pattern of the pattern positioning template and the pattern of the edge line pattern positioning identification flexible material.
Preferably, a threshold is set manually according to the color difference between the flexible material and the platform background, and the image is binarized, so that the color of the flexible material is changed into white, the color of the platform background is changed into black, and the edge Line1 is obtained according to the jump of gray level.
Preferably, an image area surrounded by the obtained edge Line1 is set as an area Region, the area Region is subjected to open operation to remove waves, then the straight Line of the area Region is contracted to the trough of the edge Line1, and then the edge Line2 of the area Region is extracted.
Preferably, the total equation of the B-spline curve is:
P(t) = Σ_{i=0}^{n} P_i · F_{i,k}(t)
wherein P_i is a characteristic point of the control curve and F_{i,k}(t) is the B-spline basis function of order k;
the basis functions in the B-spline curve equation are:
F_{i,k}(t) = (1/k!) · Σ_{j=0}^{k-i} (-1)^j · C(k+1, j) · (t + k - i - j)^k, t ∈ [0, 1], i = 0, 1, …, k
wherein ! represents the factorial, so that C(k+1, j) = (k+1)!/(j!·(k+1-j)!); bringing the basis functions into the total equation with k = 3, the B-spline curve equation can be converted to:
P(t) = P_0·F_{0,3}(t) + P_1·F_{1,3}(t) + P_2·F_{2,3}(t) + P_3·F_{3,3}(t).
Compared with the prior art, the pattern recognition positioning method based on the flexible material has the following beneficial effects: by making the pattern positioning template in advance, the edge of the flexible material can be positioned against the edge line pattern of the template, and problems of position deviation, deformation, and wrinkles of the flexible material can be recognized, thereby reducing the error produced by deformation; accurate positioning can also be achieved through the feature points of the lines on the captured image.
[ description of the drawings ]
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 (including fig. 1a and 1 b) is a flow chart of a pattern recognition positioning method based on flexible materials according to the present invention.
FIG. 2 is a schematic illustration of a flexible material used in an embodiment of the present invention.
FIG. 3 is a schematic diagram of a graphic positioning template made in accordance with an embodiment of the present invention.
FIG. 4 is a schematic diagram of a positioning pattern and edge line pattern positioning recognition flexible material of a pattern positioning template according to an embodiment of the present invention.
[ detailed description ] of the invention
In order to make the objects, technical solutions and advantageous technical effects of the present invention more apparent, the present invention will be further described in detail with reference to the accompanying drawings and detailed description. It should be understood that the detailed description is intended to illustrate the invention, and not to limit the invention.
It should be noted that, in the embodiment of the present invention, the flexible material has obvious characteristics (such as the edge of the flexible material is in a straight line or a wavy line, and the surface has lines), and the characteristic points need to be set for coarse positioning.
Referring to fig. 1, the present invention provides a pattern recognition positioning method based on flexible material, comprising the following steps,
s1: searching the characteristic points of the lines on the image of the flexible material, generating a characteristic point descriptor, manufacturing a graph positioning template by using a CAD drawing tool according to the characteristic points, and drawing a positioning graph and an edge line graph on the graph positioning template (shown in figure 3);
specifically, the feature points are calculated using a gaussian differential pyramid,
(1) Firstly, construct the Gaussian pyramid: enlarge the original image to twice its size to form the first layer of the first group of the Gaussian pyramid, and apply Gaussian convolution to the first-group first-layer image to obtain the first-group second layer, wherein the Gaussian convolution function is:
G(x, y, σ) = (1/(2πσ²)) · e^(-(x² + y²)/(2σ²))
(2) The parameter σ takes the fixed value 1.6, as in the SIFT operator;
(3) Multiply σ by a proportion coefficient k to obtain a new smoothing factor kσ, and smooth the first-group second-layer image with this new smoothing factor; the resulting image is used as the first-group third-layer image of the pyramid;
(4) Repeat the above steps to obtain N layers of images; within the same group the images of each layer have the same size but different smoothing coefficients, which are respectively: 0, σ, kσ, k^2σ, …, k^(N-2)σ.
(5) Downsample the third-from-last image of the first group with a scale factor of 2 and take the obtained image as the first layer of the second group; apply Gaussian convolution to the second-group first layer to obtain the second-group second image; repeat the smoothing of step (3) to obtain the second-group third-layer image, and repeat in turn to obtain N layers of the second group, wherein the images of each layer in the group have the same size, but the second-group images are half the size of the first-group images.
Repeat the above steps to obtain M groups of N-layer pyramids, i.e. M × N images in total, forming the Gaussian pyramid; after the Gaussian pyramid is constructed, subtract adjacent images within the same group to construct the Gaussian differential pyramid.
The feature points are formed by local extreme points of Gaussian differential pyramid space, the preliminary detection of the feature points is completed through comparison between two adjacent layers of images of each Gaussian differential space in the same group, for example, one point is compared with 8 adjacent points of the same scale and 9 multiplied by 2 points corresponding to the upper and lower adjacent scales, and 26 points are compared in total to ensure that the extreme points are detected in both the scale space and the two-dimensional space. The pixel points detected in this way are discrete, the positioning of sub-pixel level is realized near the characteristic points, then the characteristic points are screened, noise and edge effect are removed, and the rest points are the characteristic points.
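The difference-of-Gaussian pyramid and the 26-neighbour extremum test described above can be sketched in pure NumPy. The layer count, σ₀ = 1.6, the scale factor k, and the threshold value below are illustrative assumptions, not the patent's exact parameters:

```python
import numpy as np
from math import ceil

def gaussian_kernel(sigma):
    # 1-D Gaussian kernel, truncated at 3 sigma; separable convolution keeps the sketch simple.
    radius = int(ceil(3 * sigma))
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x * x / (2 * sigma * sigma))
    return k / k.sum()

def gaussian_blur(img, sigma):
    # Separable blur with edge padding: convolve rows, then columns.
    k = gaussian_kernel(sigma)
    r = len(k) // 2
    padded = np.pad(img, r, mode="edge")
    tmp = np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 0, tmp)

def dog_octave(img, n_layers=5, sigma0=1.6):
    # One octave: blur with sigma0 * k^i, then subtract adjacent layers
    # to build the difference-of-Gaussian stack.
    k = 2 ** (1.0 / (n_layers - 3)) if n_layers > 3 else 2 ** 0.5
    gauss = [gaussian_blur(img, sigma0 * k ** i) for i in range(n_layers)]
    return [g2 - g1 for g1, g2 in zip(gauss, gauss[1:])]

def local_extrema(dog, thresh=0.03):
    # Compare each interior pixel with its 26 neighbours:
    # 8 in-scale plus 9 x 2 in the adjacent scales above and below.
    pts = []
    for s in range(1, len(dog) - 1):
        stack = np.stack(dog[s - 1:s + 2])          # shape (3, H, W)
        h, w = dog[s].shape
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                cube = stack[:, y - 1:y + 2, x - 1:x + 2]
                v = stack[1, y, x]
                if abs(v) > thresh and (v == cube.max() or v == cube.min()):
                    pts.append((s, y, x))
    return pts
```

A blob whose radius matches one of the scales produces a scale-space extremum at its centre; subsequent sub-pixel refinement and edge-response filtering (as the text notes) are omitted here.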
Specifically, irregular polygons (blue lines are used) are drawn to represent positioning graphics, which are physical lines; drawing two wavy lines and two straight lines (green lines are adopted) on a graph positioning template to represent an edge line graph, wherein the edge line graph is a simulated positioning line, and the position is an (x, y) coordinate;
grouping the positioning patterns and the edge line patterns in the pattern positioning template; if a positioning pattern close to the first line (No. 1 wavy line) exists, placing the positioning pattern close to the first line (No. 1 wavy line); if a positioning pattern close to the second line (No. 2 wavy line) is arranged, the positioning pattern is arranged close to the second line (No. 2 wavy line), and according to the arrangement rule, the positioning patterns are not overlapped and are arranged close to the lower end of the edge line pattern.
S2: after finding out the characteristic points, calculating characteristic point descriptors, and finding out characteristic points with similar characteristic point descriptors in the two images;
each group has 1 edge line graph (i.e. a straight line) and 2 positioning graphs; then the position relation between each positioning graph and the corresponding edge line graph is calculated to obtain the starting point of the edge line graph as p3 (x3, y3) and the starting point of each positioning graph as p4 (x4, y4), and the length from the starting point p3 of the edge line graph to the starting point p4 of the positioning graph is Dis = √((x4 - x3)² + (y4 - y3)²) (1);
Calculated according to the formula (1), the 4 positioning patterns obtained are all semi-closed irregular polygons; the distance between the starting point p0 (x0, y0) and the end point p1 (x1, y1) of the pattern positioning template is Dis = √((x1 - x0)² + (y1 - y0)²), and the included angle of the line from the start point p0 to the end point p1 is obtained from the line slopes as tan = |(k2 - k1)/(1 + k2·k1)|, where k1 is the slope of the line and k2 is the slope of the coordinate axis.
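Step S2's search for feature points with similar descriptors in the two images can be sketched as brute-force nearest-neighbour matching. The ratio test below is an assumption borrowed from common SIFT practice; the patent itself only requires "similar" descriptors:

```python
import numpy as np

def match_descriptors(desc1, desc2, ratio=0.75):
    # For each descriptor in image 1, find its nearest neighbour in image 2 and
    # accept the match only if it is clearly better than the second-best candidate.
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

The accepted (index1, index2) pairs would then feed the template-matching position and angle estimate of step S4.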
S3: the flexible material (as shown in fig. 2) is tiled on the platform, then the camera is mounted on top of the platform and the lens of the camera is aimed at the flexible material to take a picture. It should be noted that the flexible material must be tiled on the platform without wrinkles or curls; if the tiled flexible material has wrinkles or curls, the calculation will be inaccurate.
S4: acquiring an image of a flexible material, and performing template matching on the image to acquire a position pos and an included angle tan of a matching result;
s5: translating and rotating a positioning pattern of the pattern positioning template, and performing coarse positioning;
s6: performing image processing on an image obtained by photographing a camera by using an edge extraction method;
specifically, a threshold value (the gray value range is 0-255) is manually set according to the color difference between the flexible material and the platform background, and the image is binarized, so that the color of the flexible material is changed into white (the gray value 255), the color of the platform background is changed into black (the gray value 0), and then an edge Line1 (a wavy Line) is obtained according to the jump of gray level;
setting an image area surrounded by the obtained edge Line1 as an area Region, performing open operation on the area Region to remove waves, then contracting the straight Line of the area Region to the trough of the edge Line1 (wave Line), and then extracting the edge Line2 of the area Region.
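The manual threshold binarization and gray-level-jump edge extraction of step S6 might look like the following sketch. A per-column scan is assumed for brevity; the real edge Line1 is a wavy line traced from the full material contour:

```python
import numpy as np

def binarize(img, thresh):
    # Manual threshold (gray range 0-255): material -> 255 (white), background -> 0 (black).
    return np.where(img > thresh, 255, 0).astype(np.uint8)

def edge_by_jump(binary):
    # For each column, record the first row where the gray level jumps from 0 to 255.
    edge = []
    for x in range(binary.shape[1]):
        rows = np.nonzero(binary[:, x] == 255)[0]
        if rows.size:
            edge.append((x, int(rows[0])))
    return edge
```

Morphological opening to remove the waves of the Region (as described above) would follow on the binary image before Line2 is extracted.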
S7: smoothing the line after the image processing in S6, smoothing the line by using a B spline curve equation,
specifically, the total equation of the B-spline curve is:
P(t) = Σ_{i=0}^{n} P_i · F_{i,k}(t)
wherein P_i is a characteristic point of the control curve and F_{i,k}(t) is the B-spline basis function of order k;
the basis functions in the B-spline curve equation are:
F_{i,k}(t) = (1/k!) · Σ_{j=0}^{k-i} (-1)^j · C(k+1, j) · (t + k - i - j)^k, t ∈ [0, 1], i = 0, 1, …, k
wherein ! represents the factorial, so that C(k+1, j) = (k+1)!/(j!·(k+1-j)!); substituting the basis functions into the total equation with k = 3 yields the B-spline curve equation:
P(t) = P_0·F_{0,3}(t) + P_1·F_{1,3}(t) + P_2·F_{2,3}(t) + P_3·F_{3,3}(t), obtaining 4 smooth curves;
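A minimal numeric check of the cubic case k = 3: the four basis functions F_{i,3}(t) sum to 1 for every t, so sampling them over sliding windows of four control points produces the smooth curve of step S7. The sliding-window sampling scheme below is an assumed illustration, not the patent's exact procedure:

```python
def cubic_bspline_point(p0, p1, p2, p3, t):
    # Uniform cubic B-spline basis F_{i,3}(t) for t in [0, 1]; the four weights sum to 1.
    f0 = (1 - t) ** 3 / 6.0
    f1 = (3 * t ** 3 - 6 * t ** 2 + 4) / 6.0
    f2 = (-3 * t ** 3 + 3 * t ** 2 + 3 * t + 1) / 6.0
    f3 = t ** 3 / 6.0
    x = f0 * p0[0] + f1 * p1[0] + f2 * p2[0] + f3 * p3[0]
    y = f0 * p0[1] + f1 * p1[1] + f2 * p2[1] + f3 * p3[1]
    return x, y

def smooth_polyline(points, samples=8):
    # Slide a window of 4 control points along the extracted line and sample each span.
    out = []
    for i in range(len(points) - 3):
        for s in range(samples):
            out.append(cubic_bspline_point(*points[i:i + 4], s / samples))
    return out
```

Because the weights sum to 1, collinear control points stay on their line, i.e. smoothing never pulls a straight edge off course.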
s8: judging the y value of the starting point p2 (x 2, y 2) and the end point p3 (x 3, y 3) of each smooth curve, and if the y2 value of the starting point is smaller than the y3 value of the end point, exchanging the sequence of all points of the smooth curves;
s9: sequencing the 4 smooth curves, and sequencing the x values of the midpoints center of the circumscribed rectangles Rect of the 4 smooth curves to obtain a well-arranged graphic list;
specifically, group the positioning graphs: obtain the midpoint center (x, y) of the circumscribed rectangle of each positioning graph and calculate the distance from each midpoint center (x, y) to a straight line; positioning graphs whose distances are small are divided into the same group. It should be noted that each edge line pattern (i.e. 1 wavy line, 1 straight line) is a simulated positioning line, and the positioning pattern is positioned according to the edge line pattern;
after grouping, each group has 1 edge line graph (i.e. 1 straight line and 1 wavy line) and 2 positioning graphs (including lines); then the position relation between each positioning graph and the corresponding edge line graph is calculated to obtain the starting point of the edge line graph as p4 (x4, y4) and the starting point of each positioning graph as p5 (x5, y5), and the length from the starting point p4 of the edge line graph to the starting point p5 of the positioning graph is Dis = √((x5 - x4)² + (y5 - y4)²) (1);
The positioning patterns calculated according to the formula (1) are all semi-closed irregular polygons; the distance between the starting point p0 (x0, y0) and the end point p1 (x1, y1) of the pattern positioning template is Dis = √((x1 - x0)² + (y1 - y0)²), and the included angle of the line from the start point p0 to the end point p1 is obtained from the line slopes as tan = |(k2 - k1)/(1 + k2·k1)|, where k1 is the slope of the line and k2 is the slope of the coordinate axis.
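The length Dis from formula (1) and the slope-based angle can be computed directly. Note that the angle formula is undefined when 1 + k2·k1 = 0 (perpendicular lines); handling that degenerate case is not specified in the patent and is left out of this sketch, and the helper names are assumptions:

```python
from math import hypot, atan, degrees

def dist(p, q):
    # Dis = sqrt((xq - xp)^2 + (yq - yp)^2), formula (1).
    return hypot(q[0] - p[0], q[1] - p[1])

def line_angle_tan(k1, k2):
    # Tangent of the included angle between a line of slope k1 and an axis of slope k2.
    return abs((k2 - k1) / (1 + k2 * k1))

def line_angle_degrees(k1, k2):
    # Convenience wrapper: the included angle itself, in degrees.
    return degrees(atan(line_angle_tan(k1, k2)))
```

For example, a horizontal line (k1 = 0) against an axis of slope 1 gives tan = 1, i.e. a 45-degree included angle.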
S10: the 2 edge lines are a group (1 wavy Line, 1 straight Line), and a first reference point (a trough point wp 1) with equal distance from the starting point p4 of the first edge Line graph to the ending point p1 of the wavy Line of the edge Line1 and Dis is traversed from the starting point p4 of the first edge Line graph;
s11: traversing all points of the edge Line graph in the step S10 from the first reference point, and according to the distance Dis between the starting point p0 and the end point p1 of the positioning graph, determining a second reference point with equal distance between the starting point p4 on the straight Line of the edge Line2 of the edge Line graph and the end point p1 and Dis of the wave Line of the edge Line; it should be noted that, all trough points wp1 are calculated in this way;
s12: calculating an included Angle1 of a connecting line from the second reference point to the corresponding first reference point (trough point wp 1);
s13: adding the included Angle1 to the included angle tan of the positioning pattern, then translating the positioning pattern and rotating it by the sum of Angle1 and the included angle tan of the positioning pattern;
s14: calculating whether the positioning patterns are intersected with other arranged positioning patterns, discretizing the positioning patterns in the step S12 into lines, and judging whether the lines among the positioning patterns are intersected;
s15: if the positioning patterns are intersected, moving a starting point p0 of the positioning patterns, and repeating the steps S11 to S14 until the positioning patterns are not intersected;
s16: repeating the steps S11 to S15 for each positioning pattern until the first reference point (the trough point wp 1) is not found on the edge line pattern or the second reference point is not found, and obtaining the positioning pattern of the pattern positioning template and the pattern of the edge line pattern positioning identification flexible material (as shown in fig. 4).
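Once the positioning patterns of step S14 are discretized into lines, the overlap check reduces to testing whether any two segments cross. A standard orientation-based test is sketched below; collinear overlaps are ignored for brevity, an assumption, since the patent does not specify degenerate handling:

```python
def orient(a, b, c):
    # Sign of the cross product (b - a) x (c - a): +1 left turn, -1 right turn, 0 collinear.
    v = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    return (v > 0) - (v < 0)

def segments_intersect(p1, p2, p3, p4):
    # Proper-intersection test: each segment's endpoints straddle the other segment.
    return (orient(p1, p2, p3) != orient(p1, p2, p4) and
            orient(p3, p4, p1) != orient(p3, p4, p2))

def polylines_intersect(a, b):
    # Two discretized positioning patterns intersect if any pair of their segments crosses.
    return any(segments_intersect(a[i], a[i + 1], b[j], b[j + 1])
               for i in range(len(a) - 1) for j in range(len(b) - 1))
```

In step S15, a True result would trigger moving the starting point p0 of the positioning pattern and retrying.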
The present invention is not limited to the details and embodiments described herein, and thus additional advantages and modifications may readily be made by those skilled in the art, without departing from the spirit and scope of the general concepts defined in the claims and the equivalents thereof, and the invention is not limited to the specific details, representative apparatus and examples shown and described herein.

Claims (4)

1. A pattern recognition positioning method based on flexible material is characterized by comprising the following steps,
s1: searching the characteristic points of the lines on the image of the flexible material, generating characteristic point descriptors, and manufacturing a graph positioning template according to the characteristic points;
specifically, a Gaussian differential pyramid is used for calculating the characteristic points;
(1) Constructing a Gaussian pyramid: enlarging the original image to twice its size to form the first layer of the first group of the Gaussian pyramid, and applying Gaussian convolution to the first-group first-layer image to obtain the first-group second layer, wherein the Gaussian convolution function is:
G(x, y, σ) = (1/(2πσ²)) · e^(-(x² + y²)/(2σ²))
(2) The parameter σ takes the fixed value 1.6, as in the SIFT operator;
(3) Multiplying σ by a scaling factor k to obtain a new smoothing factor σ=k×σ, and smoothing the first group of second layer images with the new smoothing factor σ=k×σ, where the resulting image is used as the pyramid first group of third layer images;
(4) Repeating the steps to obtain N layers of images, wherein the sizes of the images in the same group are the same, and the corresponding smoothing coefficients are respectively: 0, σ, kσ, k^2σ, k^3σ, …, k^(N-2)σ;
(5) Downsampling the third-from-last image of the first group with a scale factor of 2, taking the obtained image as the first layer of the second group, applying Gaussian convolution to the second-group first layer to obtain the second-group second image, repeating the smoothing of step (3) to obtain the second-group third-layer image, and repeating in turn to obtain N layers of the second group, wherein the size of each layer of images in the group is the same, and the size of the second-group images is half that of the first-group images;
repeatedly executing to obtain M groups of N-layer pyramids, i.e. M × N images in total, forming a Gaussian pyramid, and subtracting adjacent pictures in the same group of pyramids to form a Gaussian differential pyramid;
the characteristic points consist of local extreme points of Gaussian differential pyramid spaces, and the preliminary detection of the characteristic points is completed through the comparison between two adjacent layers of images of each Gaussian differential space in the same group;
s2: after finding out the characteristic points, calculating characteristic point descriptors, and finding out characteristic points with similar characteristic point descriptors in the two images;
s3: tiling the flexible material on a platform, then erecting a camera on the platform and photographing the flexible material by using a lens of the camera;
s4: acquiring an image of a flexible material, and performing template matching on the image to acquire the position and the included angle of a matching result;
s5: translating and rotating a positioning pattern of the pattern positioning template, and performing coarse positioning;
s6: performing image processing on an image obtained by photographing a camera by using an edge extraction method;
s7: carrying out smoothing treatment on the lines subjected to the image processing in the step S6, and carrying out smoothing treatment on the lines by using a B spline curve equation to obtain a smooth curve;
s8: judging the y value of the starting point and the end point of each smooth curve, and if the y value of the starting point is smaller than the y value of the end point, exchanging the sequence of all points of the smooth curves;
s9: sequencing the smooth curves, and sequencing the x value of the midpoint center of the circumscribed rectangle Rect of the smooth curves;
s10: starting from the starting point of the first edge line graph, traversing to find a first reference point on the wavy Line1 of the edge line graph whose distance to the starting point equals Dis;
s11: traversing all points of the edge line graph in the step S10 from the first reference point, and determining a second reference point on the straight Line2 of the edge line graph whose distance to the first reference point equals Dis;
s12: calculating an included Angle1 of a connecting line from the second reference point to the first reference point;
s13: adding the included Angle1 to the included angle tan of the positioning pattern, then translating the positioning pattern and rotating it by the sum of Angle1 and the included angle tan of the positioning pattern;
s14: calculating whether the positioning patterns are intersected with other positioning patterns or not, discretizing the positioning patterns in the step S12 into lines, and judging whether the lines among the positioning patterns are intersected or not;
s15: if the positioning patterns are intersected, moving the starting points of the positioning patterns, and repeating the steps S11 to S14 until the positioning patterns are not intersected;
s16: repeating the steps S11 to S15 for each positioning pattern until the first reference point is not found on the edge line pattern or the second reference point is not found, and obtaining the positioning pattern of the pattern positioning template and the pattern of the edge line pattern positioning identification flexible material.
2. The pattern recognition positioning method based on flexible material according to claim 1, wherein a threshold is manually set according to the color difference between the flexible material and the platform background, and the image is binarized, so that the color of the flexible material is changed to white, the platform background is changed to black, and the edge Line1 is obtained according to the jump of gray level.
3. The pattern recognition positioning method based on flexible material according to claim 1, wherein the image area surrounded by the obtained edge Line1 is set as an area Region, the area Region is subjected to open operation to remove waves, then the straight Line of the area Region is contracted to the trough of the edge Line1, and then the edge Line2 of the area Region is extracted.
4. The flexible material-based pattern recognition positioning method as claimed in claim 1, wherein the total equation of the B-spline curve is:

P(t) = Σ_{i=0}^{n} Pi · Fi,k(t), t ∈ [0, 1],

wherein Pi is a characteristic point controlling the curve and Fi,k(t) is the B-spline basis function of order k;

the basis functions in the B-spline curve equation are:

Fi,k(t) = (1/k!) · Σ_{j=0}^{k-i} (-1)^j · C(k+1, j) · (t + k - i - j)^k, i = 0, 1, …, k,

wherein C(k+1, j) = (k+1)! / (j! · (k+1-j)!) denotes the combination number; expanding for k = 3 gives:

F0,3(t) = (-t³ + 3t² - 3t + 1)/6
F1,3(t) = (3t³ - 6t² + 4)/6
F2,3(t) = (-3t³ + 3t² + 3t + 1)/6
F3,3(t) = t³/6;

substituting the basis functions into the total equation, the B-spline curve equation can be converted to:

P(t) = P0·F0,3(t) + P1·F1,3(t) + P2·F2,3(t) + P3·F3,3(t).
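The cubic case of claim 4 can be checked numerically. The helper below is an illustrative sketch (its name is not from the patent); it sums the four control points weighted by the standard uniform cubic B-spline basis polynomials.

```python
def cubic_bspline_point(P0, P1, P2, P3, t):
    """Evaluate P(t) = P0*F0,3(t) + P1*F1,3(t) + P2*F2,3(t) + P3*F3,3(t)
    for the uniform cubic B-spline basis, t in [0, 1]; control points
    are (x, y) tuples."""
    F = (
        (1 - t) ** 3 / 6.0,                            # F0,3
        (3 * t ** 3 - 6 * t ** 2 + 4) / 6.0,           # F1,3
        (-3 * t ** 3 + 3 * t ** 2 + 3 * t + 1) / 6.0,  # F2,3
        t ** 3 / 6.0,                                  # F3,3
    )
    pts = (P0, P1, P2, P3)
    return (sum(f * p[0] for f, p in zip(F, pts)),
            sum(f * p[1] for f, p in zip(F, pts)))
```

At t = 0 the curve passes through (P0 + 4·P1 + P2)/6, the familiar start condition of a uniform cubic B-spline segment; the four basis weights sum to 1 for every t, so identical control points reproduce that point exactly.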
CN202011320735.4A 2020-11-23 2020-11-23 Pattern recognition positioning method based on flexible material Active CN112464950B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011320735.4A CN112464950B (en) 2020-11-23 2020-11-23 Pattern recognition positioning method based on flexible material


Publications (2)

Publication Number Publication Date
CN112464950A CN112464950A (en) 2021-03-09
CN112464950B true CN112464950B (en) 2023-08-08

Family

ID=74798474


Country Status (1)

Country Link
CN (1) CN112464950B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112464779B (en) * 2020-11-23 2022-10-11 武汉舜陈技术有限公司 Flexible material based pattern recognition positioning method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2504009C1 (en) * 2012-07-10 2014-01-10 Dressformer LLC Method of facilitating remote fitting and/or selection of clothes
CN103605979A (en) * 2013-12-03 2014-02-26 Zhangjiagang Institute of Industrial Technologies, Soochow University Object identification method and system based on shape fragments
CN104339946A (en) * 2014-07-28 2015-02-11 Yang Liqun Charting and drawing template
WO2015184764A1 (en) * 2014-11-17 2015-12-10 ZTE Corporation Pedestrian detection method and device
WO2016011433A2 (en) * 2014-07-17 2016-01-21 Origin Wireless, Inc. Wireless positioning systems
CN105705430A (en) * 2013-11-06 2016-06-22 The Procter & Gamble Company Flexible containers for use with short shelf-life products, and methods for accelerating distribution of flexible containers
CN108921175A (en) * 2018-06-06 2018-11-30 Southwest Petroleum University FAST-based improved SIFT image registration method
CN111950230A (en) * 2020-08-20 2020-11-17 Guangdong University of Technology Flexible material intelligent continuous processing control method and equipment


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Minutiae Triangle Graphs: A New Fingerprint Representation with Invariance Properties; Akmal-Jahan Mohamed-Abdul-Cader et al.; IEEE; full text *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant