CN111882570A - Edge positioning method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN111882570A
CN111882570A (application number CN202010741422.XA)
Authority
CN
China
Prior art keywords
caliper
image
determining
point
edge line
Prior art date
Legal status
Pending
Application number
CN202010741422.XA
Other languages
Chinese (zh)
Inventor
费晨
张太鹏
陈龙
姜豪
刘风雷
Current Assignee
Zhejiang Crystal Optech Co Ltd
Original Assignee
Zhejiang Crystal Optech Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Crystal Optech Co Ltd filed Critical Zhejiang Crystal Optech Co Ltd
Priority to CN202010741422.XA
Publication of CN111882570A
Status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/66 Analysis of geometric attributes of image moments or centre of gravity
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The application provides an edge positioning method, an edge positioning device, a storage medium and an electronic device, wherein the method comprises the following steps: obtaining an image to be positioned, wherein the image to be positioned comprises an edge line of an object; generating a plurality of calipers on the edge line according to the image to be positioned, wherein the edge line penetrates through two opposite sides of the rectangle represented by each caliper, the edge line is spaced from the adjacent sides of the rectangle, and the adjacent sides are the two sides of the rectangle adjacent to the opposite sides; for each caliper, determining a target point of the caliper according to the caliper and the image within the caliper, wherein the target point represents the position of the edge line within the caliper; and determining the position of the edge line in the image to be positioned according to the target point of each caliper. In this way, the amount of data to be processed is reduced, which improves processing efficiency and saves time and cost.

Description

Edge positioning method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of machine vision technologies, and in particular, to an edge positioning method, an edge positioning device, a storage medium, and an electronic device.
Background
Machine vision is an integrated technology including image processing, mechanical engineering, control, electrical light source illumination, optical imaging, sensors, analog and digital video technology, computer hardware and software technology (image enhancement and analysis algorithms, image cards, I/O cards, etc.). A typical machine vision application system comprises an image capture module, a light source system, an image digitization module, a digital image processing module, an intelligent judgment decision module, a mechanical control execution module and the like. The most basic feature of machine vision systems is to increase the flexibility and automation of production.
Machine vision can effectively replace human vision in dangerous working environments that are unsuitable for manual operation, or in situations where the requirements are difficult for human vision to meet. Moreover, in mass repetitive industrial production, machine vision inspection can greatly improve production efficiency and the degree of automation.
In order to implement machine vision positioning, the prior art usually adopts a machine learning method: a model closest to the real target is trained on a large amount of pre-collected target data, and a result is then output by feeding an actual target into the trained model.
This approach offers high recognition and positioning efficiency and strong universality, and is the mainstream application method in the current market. However, it requires an extremely long initial deployment time and relatively powerful operating hardware, cannot cope with cases that have short cycles and fast product update iterations, and is not conducive to reducing production costs.
In addition, existing machine vision technology generally relies on image enhancement to highlight edge features during processing. A typical scheme performs convolution filtering for horizontal and vertical edges on the input original image and then synthesizes an edge map. The gradient magnitude and gradient direction of each pixel in the image are then calculated using the Canny algorithm formulas, and local gradient extrema are searched for as salient edge candidate points, based on the property that edge pixels have higher gradient values.
This approach has the following problems, which prevent it from meeting current high-precision positioning requirements:
1. The existing method preprocesses the image; although this enhances the edge characteristics of the image, some edge information is lost, and the loss can neither be accurately estimated nor avoided, which ultimately brings a risk of positioning distortion.
2. The existing method usually processes the whole image. Although rigorous, this requires traversing every pixel of the image and consumes a lot of unnecessary computing resources. In some cases, such as large-format images or images with complex backgrounds, the amount of computation also increases greatly, which is uneconomical in terms of both time and money.
3. The existing method generally computes gradients in only four directions (0° or 180°, 45° or 225°, 90° or 270°, 135° or 315°) for each feature pixel, which cannot provide accurate angle positioning in real situations.
Disclosure of Invention
An object of the embodiments of the present application is to provide an edge positioning method, an edge positioning apparatus, a storage medium, and an electronic device, which can improve image positioning accuracy and efficiency in a low-cost manner.
In order to achieve the above object, embodiments of the present application are implemented as follows:
in a first aspect, an embodiment of the present application provides an edge positioning method, including: obtaining an image to be positioned, wherein the image to be positioned comprises an edge line of an object; generating a plurality of calipers on the edge line according to the image to be positioned, wherein the edge line penetrates through two opposite sides of a rectangle represented by the calipers, the edge line is spaced from adjacent sides of the rectangle, and the adjacent sides represent two sides of the rectangle adjacent to the opposite sides; for each caliper, determining a target point of the caliper according to the caliper and the image in the caliper, wherein the target point is used for representing the position of the edge line in the caliper; and determining the position of the edge line in the image to be positioned according to the target point of each caliper.
In the embodiment of the application, a plurality of calipers are generated on the edge line of the object contained in the image to be positioned, and the target point (representing the position of the edge line in the caliper) of the calipers is determined according to the calipers and the image in the calipers, so that the position of the edge line in the image to be positioned is determined according to the target point of each caliper. By the mode, the whole image does not need to be processed, so that the data processing amount can be reduced, the processing efficiency is improved, and the time and the cost are saved. In addition, in this way, the accurate positioning of the object edge can be realized without preprocessing the image (for example, performing convolution filtering on the horizontal edge and the vertical edge, and then synthesizing the edge map), on one hand, the accuracy of the positioning can be improved due to the fact that the information loss of the edge is avoided, on the other hand, the time and the cost can be further saved without performing such processing, and therefore, the high-precision positioning of the image can be realized with low cost and high efficiency.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the generating, according to the image to be positioned, a plurality of calipers on the edge line includes: estimating the position range of the edge line according to the image to be positioned; determining the arrangement range of the calipers according to the position range and the acquired parameter information of the calipers, wherein the parameter information comprises the length and the width of a rectangle represented by the calipers; generating a plurality of calipers within the arrangement range.
In this implementation, the arrangement range of the calipers is determined by estimating the position range of the edge line and combining the parameter information (such as the length and width of the rectangle) of the calipers, and a plurality of calipers are generated in the arrangement range. Through the mode, the calipers can be accurately and efficiently arranged on the edge line of the object in the image, so that the positioning accuracy is ensured.
With reference to the first aspect, in a second possible implementation manner of the first aspect, the determining, for each caliper, a target point of the caliper according to the caliper and the image in the caliper includes: extracting a caliper image within the caliper range according to parameter information of the caliper, wherein the parameter information comprises the length, the width, the arrangement included angle and the coordinate of the caliper, and the caliper image comprises a part of the edge line; and determining a target point of the caliper according to the parameter information and the caliper image.
In the implementation mode, according to the parameters of the caliper, the target point of the caliper image in the caliper range can be extracted, so that the target point representing the edge line position in the caliper image can be accurately determined. By the method, the target point of the caliper image can be simply and accurately extracted, and therefore the operation efficiency of the method is improved.
With reference to the second possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, the parameter information includes a coordinate of a center point of the caliper, and the extracting, according to a range represented by the caliper, a caliper image within the caliper range includes: determining a minimum circumscribed rectangle of the caliper and a coordinate of the minimum circumscribed rectangle according to the parameter information of the caliper; extracting a first image within the range of the minimum circumscribed rectangle according to the coordinates of the minimum circumscribed rectangle; rotating the first image according to the coordinates of the central point and the arrangement included angles so as to enable the calipers in the first image to be kept in a horizontal state; and cutting the rotated first image to obtain a caliper image within the caliper range.
In this implementation, the minimum circumscribed rectangle of the caliper is determined, the first image within the range of the minimum circumscribed rectangle is extracted according to the coordinates of the minimum circumscribed rectangle, and the first image is rotated according to the center point coordinates and the arrangement included angle of the caliper so that the caliper image is in a horizontal state and can be cropped to extract the caliper image. In this way, the caliper image can be extracted from the image to be positioned simply and conveniently, so that the accuracy and efficiency of the method are ensured.
With reference to the second possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, when the caliper image is in a horizontal state, the determining a target point of the caliper according to the parameter information and the caliper image includes: carrying out gray processing on the caliper image to determine a pixel gray value horizontal arrangement matrix map of the caliper image; determining step points in the caliper image according to the matrix diagram, wherein the step points represent points with the maximum pixel gray value change; and determining the target point according to the parameter information and the jump point.
In this implementation manner, a step point (a point at which the pixel gray value changes the most) in the caliper image can be further determined by determining the pixel gray value horizontal arrangement matrix map of the caliper image, so that the target point is obtained by combining the parameter information and the step point. The target point determined in this way may well represent the position of the edge line. Moreover, the mode of determining the target point is simple and convenient, and the operation efficiency of the method can be improved.
With reference to the fourth possible implementation manner of the first aspect, in a fifth possible implementation manner of the first aspect, the determining, according to the matrix map, a step point in the caliper image includes: determining a single-column vector according to the matrix diagram, wherein each element in the single-column vector represents the mean value of the row element of the element in the matrix diagram; deriving the single-column vector to determine a point with the maximum change of the column elements and determine the row number of the point as the row coordinate of the jump point; determining the number of columns of the matrix according to the matrix diagram, and taking the product of the number of columns and a preset value as the column coordinate of the jump point; and determining the step points in the caliper image according to the row coordinates of the step points and the column coordinates of the step points.
In this implementation, a single-column vector (each element represents an average value of row elements where the element is located in the matrix diagram) may be determined through the matrix diagram, and then, based on a derivation manner of the single-column vector, a point where a column element changes most may be determined quickly and accurately, so as to determine a row coordinate where the row where the point is located is a step point; in addition, the step point in the caliper image can be determined by taking the product of the column number of the matrix and the preset value as the column coordinate of the step point, so that on one hand, the position where the pixel change is the largest can be accurately determined by the mode, the position of the edge line is indicated by taking the point as the step point, and the accuracy can be ensured. On the other hand, when determining the column coordinates, the column coordinates can be determined by taking the coordinates of the preset points (i.e. taking the product of the column number and the preset value), and the efficiency can be ensured while considering the accuracy.
With reference to the fourth possible implementation manner of the first aspect, in a sixth possible implementation manner of the first aspect, the determining, according to the matrix map, a step point in the caliper image includes: determining a single-column vector according to the matrix diagram, wherein each element in the single-column vector represents the mean value of the row element of the element in the matrix diagram; deriving the single-column vector to determine a point with the maximum change of the column elements and determine the row number of the point as the row coordinate of the jump point; determining a single-row vector according to the matrix map, wherein each element in the single-row vector represents the mean value of the column element where the element is located in the matrix map; obtaining a derivative of the single-row vector, determining a point with the maximum change of the row elements, and determining the number of columns where the point is located as the column coordinates of the jump points; and determining the step points in the caliper image according to the row coordinates of the step points and the column coordinates of the step points.
In this implementation, a single-column vector (each element represents an average value of row elements where the element is located in the matrix) may be determined through the matrix diagram, and then, based on a derivation manner of the single-column vector, a point where the column element changes most may be determined quickly and accurately, so as to determine a row coordinate where the row where the point is located is a step point; and determining a single row vector (each element represents the mean value of the row elements where the element is located in the matrix diagram), and then quickly and accurately determining the point with the maximum row element change based on the single row vector derivation mode, so as to determine that the column number where the point is located is the column coordinate of the jump point. By the method, the accuracy of the determined step point can be further improved, so that the accuracy of the method is ensured.
With reference to the first aspect, or with reference to any one of the first to sixth possible implementation manners of the first aspect, in a seventh possible implementation manner of the first aspect, the determining, according to a target point of each of the calipers, a position of the edge line in the image to be positioned includes: determining a corresponding position point in the image to be positioned according to the parameter information of each caliper and the step point of the caliper; and determining a fitting curve and positioning information of the fitting curve in the image to be positioned according to each corresponding position point, wherein the positioning information of the fitting curve represents the position of the edge line in the image to be positioned.
In the implementation mode, the edge line is determined by fitting each step point, so that the position of the edge line of the object in the image can be simply, conveniently and accurately determined, and the image can be accurately positioned.
In a second aspect, an embodiment of the present application provides an edge positioning apparatus, including: the image acquisition module is used for acquiring an image to be positioned, wherein the image to be positioned comprises an edge line of an object; a caliper generating module, configured to generate a plurality of calipers on the edge line according to the image to be positioned, where the edge line penetrates through two opposite sides of a rectangle represented by the calipers, the edge line is spaced from an adjacent side of the rectangle, and the adjacent side represents two sides of the rectangle adjacent to the opposite side; a target point determining module, configured to determine, for each caliper, a target point of the caliper according to the caliper and the image inside the caliper, where the target point is used to represent a position of the edge line inside the caliper; and the edge positioning module is used for determining the position of the edge line in the image to be positioned according to the target point of each caliper.
In a third aspect, an embodiment of the present application provides a storage medium, where one or more programs are stored, and the one or more programs are executable by one or more processors to implement the first aspect or the edge positioning method according to any one of the first to sixth aspects.
In a fourth aspect, an embodiment of the present application provides an electronic device, including a memory and a processor, where the memory is configured to store information including program instructions, and the processor is configured to control execution of the program instructions, where the program instructions are loaded and executed by the processor to implement the first aspect or the edge positioning method according to any one of the first to sixth aspects.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a flowchart of an edge positioning method according to an embodiment of the present disclosure.
Fig. 2 is a schematic diagram of arranging calipers on an edge line in an image to be positioned according to an embodiment of the present disclosure.
Fig. 3 is a schematic diagram of extracting a caliper image within a caliper range according to an embodiment of the present disclosure.
Fig. 4 is a schematic diagram of a pixel projection according to an embodiment of the present disclosure.
Fig. 5 is a block diagram of an edge positioning device according to an embodiment of the present disclosure.
Fig. 6 is a block diagram of an electronic device according to an embodiment of the present disclosure.
Icon: 10-edge positioning means; 11-an image acquisition module; 12-a caliper generation module; 13-target point determination module; 14-an edge positioning module; 20-an electronic device; 21-a memory; 22-a communication module; 23-a bus; 24-a processor.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
In order to locate an object in an image, the edge line of the object in the image can be located. In order to improve the efficiency and accuracy of locating the edge line of an object in an image, the embodiment of the application provides an edge positioning method. The method may be performed by an electronic device; the following description takes an electronic device performing the edge positioning method as an example, which should not be construed as limiting the present application. In addition, the method can be applied to various scenarios, such as conventional image processing scenarios like image positioning and edge detection. It is particularly suitable for industrial scenarios with short cycles and fast product update iterations, and is equally suitable for other scenarios with similar characteristics; the specific application scenario of the method is not limited, and whichever scenario the method can actually be applied to shall prevail.
Referring to fig. 1, fig. 1 is a flowchart illustrating an edge positioning method according to an embodiment of the present disclosure. The edge positioning method may include step S10, step S20, step S30, and step S40.
To simply and efficiently achieve the positioning of the edge line, the electronic device may perform step S10.
Step S10: and obtaining an image to be positioned, wherein the image to be positioned comprises the edge line of the object.
In this embodiment, the electronic device may obtain an image to be positioned that contains the edge line of an object. For example, the image to be positioned may contain the complete edge line of the object, or only part of the edge line. With the complete edge line, the position of the object in the image can be positioned accurately; with only part of the edge line, the object can still be positioned based on a preset relationship between the edge line and the object. This is not specifically limited here, and a suitable mode (i.e., whether the image to be positioned needs to contain the complete edge line) can be selected adaptively according to actual needs.
After obtaining the image to be located, the electronic device may perform step S20.
Step S20: and generating a plurality of calipers on an edge line according to the image to be positioned, wherein the edge line penetrates through two opposite sides of the rectangle represented by the calipers, the edge line is spaced from adjacent sides of the rectangle, and the adjacent sides represent two sides adjacent to the opposite sides in the rectangle.
In this embodiment, the electronic device may generate a plurality of calipers on the edge line according to the image to be positioned. Before describing the manner in which the plurality of calipers are generated on the edge line, the calipers will be described herein.
In this embodiment, the caliper may be represented as a rectangular area, and the parameter information of the caliper may include a length and a width of the rectangle represented by the caliper, an arrangement angle (for example, an angle with a horizontal direction), coordinates (for example, a center point coordinate, a vertex coordinate, and the like, and the center point coordinate is taken as an example in this embodiment, but not limited thereto), and the like.
For example, the electronic device generates a plurality of calipers on an edge line according to the image to be positioned, where the edge line needs to intersect two opposite sides (e.g., two long sides or two wide sides) of the rectangle represented by the calipers, and the edge line is spaced from an adjacent side (e.g., two sides adjacent to the two opposite sides that intersect, when the edge line intersects the two long sides, the adjacent side is the two wide sides) of the rectangle.
Referring to fig. 2, fig. 2 is a schematic diagram of arranging calipers on an edge line in an image to be positioned according to an embodiment of the present disclosure. It should be noted that fig. 2 shows a rectangular frame of the caliper, which is only for convenience of observation and explanation, and the actual frame line does not need to be generated in the actual operation process. Of course, in some possible implementations, there may be a way to generate the border line of the caliper, and this is not limited specifically here.
In this embodiment, in order to accurately and efficiently arrange the calipers on the edge line of the object in the image, the electronic device may estimate the position range of the edge line according to the image to be positioned.
For example, for repetitive image positioning (i.e., a plurality of images to be positioned contain the same object and may have different specific positions, for example, positioning of images of products or materials in industrial applications), the position range of the edge line of the object can be estimated by roughly detecting the images and determining the approximate range of the edge line. Alternatively, for a case where the object is within a certain fixed range in the image and there may be a difference in specific positions within the range, the range where the edge line of the object may be located may be determined in advance, so as to estimate the position range of the edge line. These are exemplary only and should not be construed as limiting the present application.
After the position range of the edge line is determined, the electronic equipment can determine the arrangement range of the caliper according to the position range and the acquired parameter information of the caliper.
For example, the electronic device may determine, in combination with parameter information of the caliper (for example, the length and width of the rectangle represented by the caliper) and the position range of the edge line, an arrangement range that can be used for arranging the caliper (i.e., generating the caliper), so that the generated caliper satisfies the relationship condition between the edge line and the caliper as much as possible (i.e., the edge line needs to penetrate through two opposite sides of the rectangle represented by the caliper and is spaced from the adjacent side of the rectangle).
After determining the arrangement range, the electronic device can generate a plurality of calipers within the arrangement range. It should be noted that the arrangement angles (i.e., arrangement included angles), lengths, widths and other parameters of the generated calipers may be the same or different, depending on actual needs and without limitation here. For example, for convenience of calculation, the long sides and wide sides of all calipers in the same image may be the same while the arrangement angles are the same or different; or the long sides may be the same, the wide sides different and the arrangement angles different, and so on. Giving the calipers a certain arrangement angle (which may differ between calipers) helps avoid, as far as possible, the adjacent sides of the rectangle represented by a caliper coinciding with the edge line, so that the arrangement of the calipers better meets the requirements.
In addition, the arrangement of the calipers shown in fig. 2 is only an example and should not be construed as limiting the present application. The plurality of calipers generated on the edge line may have the same or different spacing, and each caliper may be arranged closely adjacent to the next, at a certain distance, or even partially overlapping, according to actual needs, which is not limited here. For example, when higher positioning accuracy is needed, closely adjacent or overlapping calipers can be used; when the requirement on positioning accuracy is relatively low, fewer calipers with larger spacing can be generated. Alternatively, for the parts of the object's edge line with higher positioning-accuracy requirements (or parts with higher distinguishability, through which the whole edge line or the object can be located more easily and accurately), the calipers can be arranged closely adjacent, while only a small number of calipers are arranged at parts with lower accuracy requirements, thereby further improving the efficiency of edge positioning.
The arrangement range of the calipers is determined by estimating the position range of the edge line and combining it with the parameter information of the calipers, and a plurality of calipers are generated within that range. In this way, the calipers can be arranged accurately and efficiently on the edge line of the object in the image, so that the positioning accuracy is ensured. Moreover, since the arrangement range determined in this way also takes the parameter information of the calipers (such as length and width) into account, the accuracy of the caliper arrangement is improved and the arrangement can be more flexible.
In this embodiment, for repeated image positioning, for example when the object always lies within a certain fixed range of the image while its specific position within that range may vary, the arrangement range of the calipers may also be determined in advance and the calipers generated within that range, which is more convenient and efficient. These are merely exemplary ways and should not be considered as limitations of the present application.
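As a simple illustration of generating calipers within an estimated range, the sketch below places equally spaced caliper centers along a rough straight-line estimate of the edge and orients each caliper across it; the straight-line assumption, the spacing parameter and all names are illustrative choices, not requirements of the method.

```python
# Hypothetical caliper generation along a roughly known straight edge segment.
import numpy as np

def generate_calipers(p_start, p_end, length, width, spacing):
    """p_start, p_end: rough endpoints of the edge; length/width: caliper rectangle size."""
    p_start, p_end = np.asarray(p_start, float), np.asarray(p_end, float)
    direction = p_end - p_start
    edge_angle = np.arctan2(direction[1], direction[0])
    n = max(int(np.linalg.norm(direction) // spacing), 1)
    calipers = []
    for i in range(n + 1):
        center = p_start + direction * i / n
        # Orient the caliper so that the edge line crosses two opposite sides of its
        # rectangle: the arrangement angle is the edge direction rotated by 90 degrees.
        calipers.append({"center": center,
                         "angle": edge_angle + np.pi / 2,
                         "length": length,
                         "width": width})
    return calipers
```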
After generating the plurality of calipers on the edge line of the image, the electronic device may perform step S30.
Step S30: and determining a target point of each caliper according to the caliper and the image in the caliper, wherein the target point is used for representing the position of the edge line in the caliper.
In this embodiment, the electronic device may determine a target point representing the position of the edge line within the caliper according to the caliper and the image within the caliper.
For example, the electronic device may extract a caliper image within the caliper range according to parameter information of the caliper, where the parameter information includes a length, a width, an arrangement angle, and coordinates (e.g., a center coordinate, a vertex coordinate, etc.) of the caliper, and the caliper image includes a portion of an edge line (i.e., a portion of the edge line that penetrates through the caliper).
Referring to fig. 3, fig. 3 is a schematic diagram of extracting a caliper image within a caliper range according to an embodiment of the present disclosure. For example, the electronic device extracts a caliper image within a caliper range, which may be implemented as follows:
as shown in part a of fig. 3, the electronic device may determine the minimum circumscribed rectangle of the caliper and the coordinates of the minimum circumscribed rectangle according to the parameter information (length, width, coordinates, and arrangement included angle of the rectangle) of the caliper.
Specifically, the minimum circumscribed rectangle of the caliper may be determined as follows:
determining the length, the width and the arrangement included angle of the caliper to be calculated, and calculating the width (width_rect) and the length (height_rect) of the minimum circumscribed rectangle by the following formulas:
width_rect = width·|cos θ| + height·|sin θ|
height_rect = width·|sin θ| + height·|cos θ|
wherein θ is the arrangement included angle and takes a value between 0 and 2π; height denotes the long side of the caliper (i.e., the long side of the rectangle represented by the caliper); width denotes the short side of the caliper (i.e., the wide side of the rectangle represented by the caliper).
Then, the electronic device can calculate the starting point coordinate and the end point coordinate of the minimum circumscribed rectangle in the image coordinate system according to the calculated minimum circumscribed rectangle and the coordinate (center point coordinate) of the caliper. In this way, the minimum circumscribed rectangle of the caliper and its coordinates can be determined.
After the minimum circumscribed rectangle of the caliper and the coordinates of the minimum circumscribed rectangle are determined, the electronic device may extract a first image (see part a in fig. 3) within the range of the minimum circumscribed rectangle according to the coordinates of the minimum circumscribed rectangle.
Then, the electronic equipment can rotate the first image according to the central point coordinate and the arrangement included angle so that the caliper in the first image keeps a horizontal state. For example, as shown in part B in fig. 3, after the center of the first image (i.e., the center of the caliper) is used as the rotation center and the first image is rotated by the arrangement angle, the caliper in the first image can be kept in a horizontal state, so that the rotated first image can be cropped to obtain a caliper image in the caliper range. The caliper image obtained after cutting is shown in fig. 3, part C.
By determining the minimum circumscribed rectangle of the caliper, extracting the first image within the range of the minimum circumscribed rectangle according to its coordinates, and rotating the first image according to the center point coordinates and the arrangement included angle of the caliper so that the caliper in the first image is horizontal, the rotated image can be cropped to extract the caliper image. In this way, the caliper image can be extracted from the image to be positioned simply and conveniently, so that the accuracy and efficiency of the method are ensured.
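The extraction just described can be sketched with OpenCV and NumPy as follows; the cropping convention (long side horizontal after rotation), the rotation sign and the absence of border handling are assumptions made to keep the example short.

```python
# Hypothetical extraction of the caliper image: bounding rectangle -> crop -> rotate -> crop.
import cv2
import numpy as np

def extract_caliper_image(gray, center, angle, length, width):
    cx, cy = center
    # Minimum circumscribed (axis-aligned) rectangle of the rotated caliper.
    bw = int(np.ceil(length * abs(np.cos(angle)) + width * abs(np.sin(angle))))
    bh = int(np.ceil(length * abs(np.sin(angle)) + width * abs(np.cos(angle))))
    x0, y0 = int(round(cx - bw / 2)), int(round(cy - bh / 2))
    first = gray[y0:y0 + bh, x0:x0 + bw]                 # first image (no border checks)
    # Rotate about the first image's center (the caliper center) so the caliper lies horizontally.
    rot = cv2.getRotationMatrix2D((bw / 2, bh / 2), np.degrees(angle), 1.0)
    upright = cv2.warpAffine(first, rot, (bw, bh))
    # Crop the now-horizontal caliper rectangle out of the rotated first image.
    x1 = int(round(bw / 2 - length / 2))
    y1 = int(round(bh / 2 - width / 2))
    return upright[y1:y1 + int(width), x1:x1 + int(length)]
```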
After the caliper image within the caliper range is extracted, the electronic device can determine the target point of the caliper according to the parameter information and the caliper image. According to the parameters of the caliper, the target point of the caliper image in the caliper range can be extracted, so that the target point representing the edge line position in the caliper image can be accurately determined. By the method, the target point of the caliper image can be simply and accurately extracted, and therefore the operation efficiency of the method is improved.
In this embodiment, when the image of the caliper is in a horizontal state, the electronic device may determine the target point of the caliper by:
for example, the electronic device may perform gray processing on the caliper image, determine a pixel gray value horizontal arrangement matrix map of the caliper image, and determine the pixel gray value horizontal arrangement matrix map of the caliper image as shown in fig. 4. Of course, the gray processing on the image may also be performed before obtaining the caliper image, for example, the gray processing is performed after obtaining the image to be positioned, or the gray processing is performed after generating the caliper on the edge line, or the gray processing is performed during the extraction process of the caliper image, and the like, which is not limited herein, only the gray processing is performed before determining the pixel gray value level arrangement matrix map of the caliper image.
After determining the pixel gray value horizontally arranged matrix map of the caliper image, the electronic device may determine the step points in the caliper image according to the matrix map, so as to determine the target points (i.e., the target points are obtained after determining the coordinates of the step points) based on the parameter information and the step points, where the step points represent the points with the maximum pixel gray value variation.
And further determining a step point (a point with the maximum pixel gray value change) in the caliper image by determining the pixel gray value horizontal arrangement matrix map of the caliper image, so as to obtain a target point by combining the parameter information and the step point. The target point determined in this way may well represent the position of the edge line. Moreover, the mode of determining the target point is simple and convenient, and the operation efficiency of the method can be improved.
For example, the electronic device may determine a single-column vector from the matrix map, where each element in the single-column vector represents the mean value of the row elements of the row in which that element is located (as shown in fig. 4). On the one hand, this approach does not require preprocessing of the edges in the image, so the complete edge information in the image can be retained; on the other hand, because filtering noise in the caliper area in this way has little influence on the actual edge, a single-column pixel profile (i.e. the single-column vector) can be used, which simplifies the method flow and improves operation efficiency while still maintaining high precision.
It should be noted that each element in the single-column vector here may be determined by an average value of row elements in which the element is located in the matrix diagram, but is not limited to this, and may also be determined by taking a highest value, or an average value of the highest value and the lowest value, for example.
After determining the single-column vector, the electronic device may derive the single-column vector, determine a point where the column element changes the most, and determine a row coordinate where the row where the point is located is the step point.
And the electronic equipment can also determine a single-row vector according to the matrix map, wherein each element in the single-row vector represents the mean value of the column element of the element in the matrix map. And then, deriving the single-row vector, determining a point with the maximum change of the row element, and determining the number of columns where the point is located as the column coordinate of the step point, so as to determine the step point in the caliper image according to the row coordinate of the step point and the column coordinate of the step point.
Determining a single-column vector through a matrix diagram (each element represents the mean value of the row element of the element in the matrix diagram), and then quickly and accurately determining the point with the maximum change of the column element based on the derivation mode of the single-column vector, so as to determine the row number of the point as the row coordinate of the step point; and determining a single row vector (each element represents the mean value of the row elements where the element is located in the matrix diagram), and then quickly and accurately determining the point with the maximum row element change based on the single row vector derivation mode, so as to determine that the column number where the point is located is the column coordinate of the jump point. By the method, the accuracy of the determined step point can be further improved, so that the accuracy of the method is ensured.
Of course, because of the small size of the caliper, the edge line within the caliper is typically simple in shape, such as a gently curved segment. Therefore, the column coordinate of the step point in the caliper image may be determined by taking the product of the number of columns of the matrix and a preset value (for example, 0.5 or 0.6; if the product is not an integer, it may be rounded) as the column coordinate of the step point. On the one hand, this approach can accurately determine the position where the pixel change is the largest, so that taking this point as the step point indicates the position of the edge line and the accuracy can be ensured. On the other hand, when determining the column coordinate, it can simply be taken at the preset position (i.e. the product of the column number and the preset value), so that the operation efficiency of the method is ensured while accuracy is taken into account.
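A minimal NumPy sketch of the step-point search described above (row-mean projection, discrete difference as the derivative, preset column position); the use of np.diff and the default factor of 0.5 are illustrative assumptions.

```python
# Hypothetical step-point detection inside a horizontal caliper image.
import numpy as np

def find_step_point(caliper_gray, col_factor=0.5):
    """caliper_gray: 2-D array of gray values of one caliper image."""
    # Single-column vector: mean gray value of each row of the matrix map.
    row_means = caliper_gray.mean(axis=1)
    # The row where the gray value changes the most is the row coordinate of the step point.
    row = int(np.argmax(np.abs(np.diff(row_means))))
    # Column coordinate: number of columns times a preset value, rounded to an integer.
    col = int(round(caliper_gray.shape[1] * col_factor))
    # Alternatively, the column coordinate could be found the same way from column means:
    # col = int(np.argmax(np.abs(np.diff(caliper_gray.mean(axis=0)))))
    return row, col
```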
In this embodiment, the electronic device may determine, based on the position (row coordinate and column coordinate) of the step point in the caliper image, the coordinate of the center point of the caliper in the image to be positioned, and the parameters such as the length and the width of the caliper, the coordinate of the step point in the image to be positioned to obtain the target point (the target point includes the coordinate position of the target point in the image to be positioned).
For example, the corresponding position coordinates of the step point position information (row coordinates and column coordinates) in the original image (i.e., the image to be positioned) can be calculated through the arrangement and the direction of the calipers. The conversion method comprises the following steps:
The coordinate relationship can be described as follows: the coordinates of the step point in the caliper image are rotated anticlockwise by the arrangement included angle and then translated to obtain the coordinates of the step point in the image to be positioned:
x_o = x_c·cos θ − y_c·sin θ + x_0
y_o = x_c·sin θ + y_c·cos θ + y_0
wherein (x_o, y_o) represents the original coordinates (i.e. the coordinates in the image to be positioned), (x_c, y_c) represents the caliper coordinates (i.e. the coordinates within the caliper image, taken relative to the caliper center), (x_0, y_0) represents the center coordinates of the caliper, and θ represents the included angle between the caliper and the horizontal direction (i.e. the arrangement included angle).
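A sketch of this conversion is given below; expressing the step point relative to the caliper-image center before rotating is an assumption made so that the translation by the caliper center is consistent, not something fixed by the text above.

```python
# Hypothetical mapping of a step point from the caliper image to the image to be positioned.
import numpy as np

def caliper_to_image(step_rc, caliper_shape, center, angle):
    """step_rc: (row, col) of the step point; caliper_shape: (rows, cols) of the caliper image."""
    rows, cols = caliper_shape
    # Express the step point relative to the caliper-image center (x to the right, y down).
    xc = step_rc[1] - cols / 2.0
    yc = step_rc[0] - rows / 2.0
    # Rotate anticlockwise by the arrangement angle, then translate by the caliper center.
    xo = xc * np.cos(angle) - yc * np.sin(angle) + center[0]
    yo = xc * np.sin(angle) + yc * np.cos(angle) + center[1]
    return xo, yo
```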
In this way, the target point for each caliper can be determined. The electronic device may then perform step S40.
Step S40: and determining the position of the edge line in the image to be positioned according to the target point of each caliper.
In this embodiment, the electronic device may determine a corresponding position point in the image to be positioned according to the parameter information of each caliper and the step point of the caliper; and determining a fitting curve and positioning information of the fitting curve in the image to be positioned according to each corresponding position point, wherein the positioning information of the fitting curve represents the position of the edge line in the image to be positioned.
For example, the electronic device may fit a curve matching the previously known geometry of the target (the edge line of the object) by using a least squares method:
given the function y = f(x), with x1, x2, ..., xn corresponding to y1, y2, ..., yn, a fitting polynomial is taken as
p(x) = a0 + a1·x + a2·x² + ... + ak·x^k,
and the polynomial should satisfy the condition that the sum of squared residuals
S = [p(x1) − y1]² + [p(x2) − y2]² + ... + [p(xn) − yn]²
is minimized.
The expected curve can be obtained preliminarily by the above method.
Then, points deviating from the curve are removed by an iterative point-removal method and the fitting step is repeated until a preset score is met; finally, the geometric information and the position information of the curve are output, thereby determining the fitted curve and the positioning information of the fitted curve in the image to be positioned.
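One way to realize the least-squares fit with iterative removal of deviating points is sketched below; the polynomial degree, the pixel threshold and the iteration limit stand in for the "preset score" and are illustrative assumptions only.

```python
# Hypothetical least-squares fit of the edge line with iterative outlier removal.
import numpy as np

def fit_edge_line(points, degree=1, threshold=1.0, max_iters=5):
    """points: (N, 2) array of target points (x, y) in the image to be positioned."""
    pts = np.asarray(points, dtype=float)
    for _ in range(max_iters):
        # Least-squares polynomial fit (degree 1 corresponds to a straight edge line).
        coeffs = np.polyfit(pts[:, 0], pts[:, 1], degree)
        residuals = np.abs(np.polyval(coeffs, pts[:, 0]) - pts[:, 1])
        keep = residuals <= threshold
        if keep.all() or keep.sum() <= degree + 1:
            break
        pts = pts[keep]   # remove points deviating from the curve and refit
    return coeffs

# Usage: for degree=1, coeffs describes y = a*x + b for the located edge line.
```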
The edge positioning method provided by the embodiment of the application can accurately extract the contour edge position of the target, and the geometric center position and geometric parameters of the target can be fitted through the edge. Experiments show that the error between the fitted geometric center position of the target and its actual position is about 0.6 pixel, with a maximum error of no more than 1 pixel. Converted to actual spatial size, the error is about 0.0096 mm, i.e. less than 0.01 mm, and the positioning precision is higher than that of common machine vision solutions on the market. Meanwhile, a single run of the method takes less than 30 ms, an operating efficiency far higher than that of common machine vision programs on the market.
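Tying the per-step sketches above together, a hypothetical driver for the whole method could look as follows; it reuses the illustrative helper functions defined in the earlier sketches and is, like them, an assumption-laden illustration rather than the patented reference implementation.

```python
# Hypothetical end-to-end driver reusing the illustrative helpers sketched above.
import cv2
import numpy as np

def locate_edge(image_path, p_start, p_end, length=20, width=10, spacing=15):
    # Step S10: obtain the image to be positioned (gray processing done up front here).
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # Step S20: generate a plurality of calipers on the (roughly known) edge line.
    calipers = generate_calipers(p_start, p_end, length, width, spacing)
    # Step S30: determine the target point of each caliper in image coordinates.
    targets = []
    for c in calipers:
        cal_img = extract_caliper_image(gray, c["center"], c["angle"], c["length"], c["width"])
        step_rc = find_step_point(cal_img)
        targets.append(caliper_to_image(step_rc, cal_img.shape, c["center"], c["angle"]))
    # Step S40: determine the position of the edge line from the target points.
    return fit_edge_line(np.asarray(targets))
```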
Referring to fig. 5, fig. 5 is a block diagram illustrating an edge positioning device according to an embodiment of the present disclosure.
The embodiment of the present application provides an edge positioning device 10, including:
the image obtaining module 11 is configured to obtain an image to be positioned, where the image to be positioned includes an edge line of an object.
A caliper generating module 12, configured to generate a plurality of calipers on the edge line according to the image to be positioned, where the edge line penetrates through two opposite sides of a rectangle represented by the calipers, the edge line is spaced from an adjacent side of the rectangle, and the adjacent side represents two sides of the rectangle adjacent to the opposite side.
And a target point determining module 13, configured to determine, for each caliper, a target point of the caliper according to the caliper and the image inside the caliper, where the target point is used to represent a position of the edge line inside the caliper.
And the edge positioning module 14 is configured to determine a position of the edge line in the image to be positioned according to the target point of each caliper.
In this embodiment, the caliper generating module 12 is further configured to estimate a position range of the edge line according to the image to be positioned; determining the arrangement range of the calipers according to the position range and the acquired parameter information of the calipers, wherein the parameter information comprises the length and the width of a rectangle represented by the calipers; generating a plurality of calipers within the arrangement range.
In this embodiment, the target point determining module 13 is further configured to extract a caliper image within the caliper range according to parameter information of the caliper, where the parameter information includes a length, a width, an arrangement included angle, and a coordinate of the caliper, and the caliper image includes a part of the edge line; and determining a target point of the caliper according to the parameter information and the caliper image.
In this embodiment, the parameter information includes a coordinate of a center point of the caliper, and the target point determining module 13 is further configured to determine a minimum circumscribed rectangle of the caliper and a coordinate of the minimum circumscribed rectangle according to the parameter information of the caliper; extracting a first image within the range of the minimum circumscribed rectangle according to the coordinates of the minimum circumscribed rectangle; rotating the first image according to the coordinates of the central point and the arrangement included angles so as to enable the calipers in the first image to be kept in a horizontal state; and cutting the rotated first image to obtain a caliper image within the caliper range.
In this embodiment, when the caliper image is in a horizontal state, the target point determining module 13 is further configured to perform gray processing on the caliper image to determine a pixel gray value horizontal arrangement matrix map of the caliper image; determining step points in the caliper image according to the matrix diagram, wherein the step points represent points with the maximum pixel gray value change; and determining the target point according to the parameter information and the jump point.
In this embodiment, the target point determining module 13 is further configured to determine a single-column vector according to the matrix map, where each element in the single-column vector represents a mean value of row elements in the matrix map where the element is located; deriving the single-column vector to determine a point with the maximum change of the column elements and determine the row number of the point as the row coordinate of the jump point; determining the number of columns of the matrix according to the matrix diagram, and taking the product of the number of columns and a preset value as the column coordinate of the jump point; and determining the step points in the caliper image according to the row coordinates of the step points and the column coordinates of the step points.
In this embodiment, the target point determining module 13 is further configured to determine a single-column vector according to the matrix map, where each element in the single-column vector represents a mean value of row elements in the matrix map where the element is located; deriving the single-column vector to determine a point with the maximum change of the column elements and determine the row number of the point as the row coordinate of the jump point; determining a single-row vector according to the matrix map, wherein each element in the single-row vector represents the mean value of the column element where the element is located in the matrix map; obtaining a derivative of the single-row vector, determining a point with the maximum change of the row elements, and determining the number of columns where the point is located as the column coordinates of the jump points; and determining the step points in the caliper image according to the row coordinates of the step points and the column coordinates of the step points.
In this embodiment, the parameter information of the caliper includes the length, the width, the arrangement included angle, and the coordinates of the caliper, and the edge positioning module 14 is further configured to determine a corresponding position point in the image to be positioned according to the parameter information of each caliper and the step point of the caliper; and determining a fitting curve and positioning information of the fitting curve in the image to be positioned according to each corresponding position point, wherein the positioning information of the fitting curve represents the position of the edge line in the image to be positioned.
Referring to fig. 6, fig. 6 is a block diagram of an electronic device 20 according to an embodiment of the present disclosure. In this embodiment, the electronic device 20 may be a server, such as a network server, a cloud server, a server cluster formed by a plurality of servers, and the like; the electronic device 20 may also be a terminal, such as a smart phone, a tablet computer, a personal computer, etc., and is not limited herein.
Illustratively, the electronic device 20 may include: a communication module 22 connected to the outside world via a network, one or more processors 24 for executing program instructions, a bus 23, a Memory 21 of different form, such as a magnetic disk, a ROM (Read-only Memory), or a RAM (Random Access Memory), or any combination thereof. The memory 21, the communication module 22 and the processor 24 are connected by a bus 23.
Illustratively, the memory 21 has stored therein a program. The processor 24 may call and run these programs from the memory 21 so that the edge location method can be performed by running the programs.
Also, a storage medium is provided in an embodiment of the present application, where one or more programs are stored, and the one or more programs are executable by one or more processors to implement the edge positioning method in the embodiment of the present application.
To sum up, the embodiments of the present application provide an edge positioning method, an edge positioning device, a storage medium, and an electronic device, where a plurality of calipers are generated on an edge line of an object included in an image to be positioned, and a target point of the calipers (indicating a position of the edge line in the calipers) is determined according to the calipers and the image in the calipers, so that a position of the edge line in the image to be positioned is determined according to the target point of each caliper. By the mode, the whole image does not need to be processed, so that the data processing amount can be reduced, the processing efficiency is improved, and the time and the cost are saved. In addition, in this way, the accurate positioning of the object edge can be realized without preprocessing the image (for example, performing convolution filtering on the horizontal edge and the vertical edge, and then synthesizing the edge map), on one hand, the accuracy of the positioning can be improved due to the fact that the information loss of the edge is avoided, on the other hand, the time and the cost can be further saved without performing such processing, and therefore, the high-precision positioning of the image can be realized with low cost and high efficiency.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division into modules is only a logical division, and other divisions may be used in practice, and some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection of devices or units through communication interfaces, and may be electrical, mechanical, or in another form.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (11)

1. An edge positioning method, comprising:
obtaining an image to be positioned, wherein the image to be positioned comprises an edge line of an object;
generating a plurality of calipers on the edge line according to the image to be positioned, wherein the edge line penetrates through two opposite sides of the rectangle represented by each caliper, the edge line is spaced from adjacent sides of the rectangle, and the adjacent sides represent two sides of the rectangle adjacent to the opposite sides;
for each caliper, determining a target point of the caliper according to the caliper and the image in the caliper, wherein the target point is used for representing the position of the edge line in the caliper;
and determining the position of the edge line in the image to be positioned according to the target point of each caliper.
2. The edge positioning method according to claim 1, wherein the generating a plurality of calipers on the edge line according to the image to be positioned comprises:
estimating the position range of the edge line according to the image to be positioned;
determining the arrangement range of the calipers according to the position range and the acquired parameter information of the calipers, wherein the parameter information comprises the length and the width of a rectangle represented by the calipers;
generating a plurality of calipers within the arrangement range.
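A minimal sketch of this caliper generation step is given below; it assumes the estimated position range is reduced to the start and end points of a coarse edge segment, and the dictionary with 'center', 'size', and 'angle' keys is an illustrative data structure rather than anything specified in the claim.

    import numpy as np

    def generate_calipers(p_start, p_end, num_calipers, length, width):
        """Place caliper rectangles at even intervals along a coarsely
        estimated edge segment from p_start to p_end (both (x, y) points)."""
        p_start = np.asarray(p_start, dtype=float)
        p_end = np.asarray(p_end, dtype=float)
        direction = p_end - p_start
        edge_angle = np.degrees(np.arctan2(direction[1], direction[0]))
        centers = [p_start + t * direction
                   for t in np.linspace(0.0, 1.0, num_calipers)]
        # Each caliper is turned 90 degrees relative to the estimated edge so
        # that the edge line crosses its two opposite sides (a sketch-level choice).
        return [{'center': tuple(c), 'size': (length, width),
                 'angle': edge_angle + 90.0} for c in centers]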
3. The edge positioning method of claim 1, wherein determining the target point of the caliper from the caliper and the image within the caliper comprises:
extracting a caliper image within the caliper range according to the parameter information of the caliper, wherein the parameter information comprises the length, the width, the arrangement included angle, and the coordinates of the caliper, and the caliper image comprises a part of the edge line;
and determining a target point of the caliper according to the parameter information and the caliper image.
4. The edge positioning method according to claim 3, wherein the parameter information includes center point coordinates of the caliper, and the extracting the caliper image within the caliper range according to the parameter information of the caliper comprises:
determining a minimum circumscribed rectangle of the caliper and a coordinate of the minimum circumscribed rectangle according to the parameter information of the caliper;
extracting a first image within the range of the minimum circumscribed rectangle according to the coordinates of the minimum circumscribed rectangle;
rotating the first image according to the center point coordinates and the arrangement included angle, so that the caliper in the first image is kept in a horizontal state;
and cutting the rotated first image to obtain a caliper image within the caliper range.
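An OpenCV sketch of this extraction step is given below; it assumes the illustrative caliper dictionary used in the other sketches, and the sign convention used with cv2.getRotationMatrix2D to bring the caliper to the horizontal state is an assumption of the sketch rather than something fixed by the claim.

    import cv2
    import numpy as np

    def extract_caliper_image(image, caliper):
        """Crop the minimum circumscribed rectangle of the caliper, rotate it so
        the caliper lies horizontally, then cut out the caliper area."""
        cx, cy = caliper['center']
        length, width = caliper['size']
        angle = caliper['angle']
        # Minimum circumscribed (axis-aligned) rectangle of the rotated caliper.
        box = cv2.boxPoints(((cx, cy), (length, width), angle))
        x, y, w, h = cv2.boundingRect(box.astype(np.int32))
        first_image = image[y:y + h, x:x + w]
        # Rotate the first image about the caliper center so it becomes horizontal.
        rot = cv2.getRotationMatrix2D((cx - x, cy - y), angle, 1.0)
        rotated = cv2.warpAffine(first_image, rot, (w, h))
        # Cut out the horizontal caliper area around the (now axis-aligned) center.
        x0 = max(int(round(cx - x - length / 2.0)), 0)
        y0 = max(int(round(cy - y - width / 2.0)), 0)
        return rotated[y0:y0 + int(width), x0:x0 + int(length)]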
5. The edge positioning method according to claim 3, wherein when the caliper image is in a horizontal state, the determining the target point of the caliper according to the parameter information and the caliper image includes:
performing grayscale processing on the caliper image to determine a matrix map in which the pixel gray values of the caliper image are arranged horizontally;
determining a step point in the caliper image according to the matrix map, wherein the step point represents the point with the maximum change in pixel gray value;
and determining the target point according to the parameter information and the step point.
6. The edge positioning method of claim 5, wherein the determining the step point in the caliper image according to the matrix map comprises:
determining a single-column vector according to the matrix map, wherein each element of the single-column vector represents the mean of the row of the matrix map in which that element is located;
differentiating the single-column vector, determining the point at which the column elements change the most, and determining the row number of that point as the row coordinate of the step point;
determining the number of columns of the matrix map, and taking the product of the number of columns and a preset value as the column coordinate of the step point;
and determining the step point in the caliper image according to the row coordinate of the step point and the column coordinate of the step point.
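A sketch of this variant, in which only the row coordinate is searched and the column coordinate is fixed by the preset value, is shown below; using 0.5 as the default preset (the middle column of the caliper image) is an illustrative assumption.

    import numpy as np

    def find_step_point_fixed_col(caliper_gray, preset=0.5):
        """Step-point variant: search the row coordinate only and take the
        number of columns multiplied by a preset value as the column coordinate."""
        row_means = caliper_gray.mean(axis=1)                  # single-column vector
        step_row = int(np.argmax(np.abs(np.diff(row_means))))  # row of maximum change
        step_col = int(caliper_gray.shape[1] * preset)         # preset column coordinate
        return step_row, step_col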
7. The edge positioning method of claim 5, wherein the determining the step point in the caliper image according to the matrix map comprises:
determining a single-column vector according to the matrix map, wherein each element of the single-column vector represents the mean of the row of the matrix map in which that element is located;
differentiating the single-column vector, determining the point at which the column elements change the most, and determining the row number of that point as the row coordinate of the step point;
determining a single-row vector according to the matrix map, wherein each element of the single-row vector represents the mean of the column of the matrix map in which that element is located;
differentiating the single-row vector, determining the point at which the row elements change the most, and determining the column number of that point as the column coordinate of the step point;
and determining the step point in the caliper image according to the row coordinate of the step point and the column coordinate of the step point.
8. The edge positioning method according to any one of claims 1 to 7, wherein the parameter information of the calipers includes a length, a width, an arrangement included angle, and coordinates of the calipers, and the determining the position of the edge line in the image to be positioned according to the target point of each of the calipers includes:
determining, for each caliper, a corresponding position point in the image to be positioned according to the parameter information of the caliper and the step point of the caliper;
and determining a fitted curve and positioning information of the fitted curve in the image to be positioned according to each corresponding position point, wherein the positioning information of the fitted curve represents the position of the edge line in the image to be positioned.
9. An edge positioning device, comprising:
the image acquisition module is used for acquiring an image to be positioned, wherein the image to be positioned comprises an edge line of an object;
a caliper generating module, configured to generate a plurality of calipers on the edge line according to the image to be positioned, wherein the edge line penetrates through two opposite sides of a rectangle represented by the calipers, the edge line is spaced from the adjacent sides of the rectangle, and the adjacent sides represent the two sides of the rectangle adjacent to the opposite sides;
a target point determining module, configured to determine, for each caliper, a target point of the caliper according to the caliper and the image inside the caliper, where the target point is used to represent a position of the edge line inside the caliper;
and the edge positioning module is used for determining the position of the edge line in the image to be positioned according to the target point of each caliper.
10. A storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the edge positioning method of any one of claims 1 to 8.
11. An electronic device comprising a memory for storing information including program instructions and a processor for controlling execution of the program instructions, characterized in that the program instructions, when loaded and executed by the processor, implement the edge positioning method of any one of claims 1 to 8.
CN202010741422.XA 2020-07-28 2020-07-28 Edge positioning method and device, storage medium and electronic equipment Pending CN111882570A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010741422.XA CN111882570A (en) 2020-07-28 2020-07-28 Edge positioning method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN111882570A true CN111882570A (en) 2020-11-03

Family

ID=73200937

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010741422.XA Pending CN111882570A (en) 2020-07-28 2020-07-28 Edge positioning method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111882570A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1797470A (en) * 2004-12-28 2006-07-05 北京航空航天大学 Quick method for picking up stepped edge in sub pixel level
US20140029815A1 (en) * 2011-04-05 2014-01-30 Miranda Medical Limited Measurement system for medical images
CN103759672A (en) * 2014-01-15 2014-04-30 陈涛 Vision measurement method for ice cream stick plane contour dimensions
CN108197642A (en) * 2017-12-25 2018-06-22 山东浪潮云服务信息科技有限公司 A kind of seal discrimination method and device
CN109427066A (en) * 2017-08-31 2019-03-05 中国科学院微电子研究所 Edge detection method at any angle
CN110110697A (en) * 2019-05-17 2019-08-09 山东省计算中心(国家超级计算济南中心) More fingerprint segmentation extracting methods, system, equipment and medium based on direction correction
CN110906875A (en) * 2019-11-26 2020-03-24 湖北工业大学 Visual processing method for aperture measurement

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
吴禄慎, 项桔敏, 胡贇: "Real-time nut inspection system using the caliper tool method based on machine vision", 《仪表技术与传感器》, pages 1-6 *
崔灿, 景文博, 闫娜, 王晓曼: "Bearing inner and outer diameter measurement based on edge mapping", 《轴承》, pages 1-4 *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination