CN114913527A - Calibration method and system for camera calibration of scanning translation pen - Google Patents

Calibration method and system for camera calibration of scanning translation pen

Info

Publication number
CN114913527A
Authority
CN
China
Prior art keywords
image
point
calibration
value
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210411161.4A
Other languages
Chinese (zh)
Inventor
张远雄
肖仁彪
刘丙洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Kaixin Microelectronics Technology Co ltd
Original Assignee
Zhuhai Kaixin Microelectronics Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Kaixin Microelectronics Technology Co ltd filed Critical Zhuhai Kaixin Microelectronics Technology Co ltd
Priority to CN202210411161.4A priority Critical patent/CN114913527A/en
Publication of CN114913527A publication Critical patent/CN114913527A/en
Pending legal-status Critical Current

Landscapes

  • Position Input By Displaying (AREA)

Abstract

The invention provides a calibration method and a calibration system for the camera of a scanning translation pen, and relates to the technical field of scanning translation pens. The calibration process is completed by starting a calibration program preset in the pen, preprocessing the acquired images and selecting an image to be detected, capturing the boundary point coordinates of the image to be detected, continuously selecting a plurality of pixel points along the Y-axis and X-axis directions starting from the central point of the image to be detected, and assigning the pixel points height or width values from small to large so as to detect, determine and store the coordinate data of the effective window boundary line. The method can calibrate the effective window boundary line of the camera in a single pass after the scanning translation pen is produced, and the calibration process is simple and efficient.

Description

Calibration method and system for camera calibration of scanning translation pen
Technical Field
The invention relates to the technical field of scanning translation pens, and in particular to a camera calibration method and a camera calibration system for a scanning translation pen.
Background
With the continuous development of image processing technology, image character recognition on embedded processors can reach high accuracy, which has given rise to intelligent scanning translation pens and dictionary pens based on it: a user can conveniently look up unknown words, or even a passage of text or an article, simply by scanning. Defining the effective visual boundary of the image captured by the camera of a scanning translation pen is a core requirement of the product; it directly affects subsequent character recognition and algorithmic processing, and is a key factor in product quality.
In the prior art, the effective window boundary line of the camera of a scanning translation pen is calibrated during actual production, and the existing calibration approaches mainly have the following defects:
1. Manual calibration: in actual production, camera calibration can be completed only after dedicated training is arranged, and because each operator understands the calibration method and the judgment criteria differently, the effective window boundary line of the camera is easily misjudged and production efficiency is low;
2. A simple black-white comparison algorithm places strict requirements on the ambient light and the mould; in actual operation it is difficult to guarantee fully consistent ambient light and mould material, so deviations easily arise;
3. The conventional judgment method needs paper with a fine mesh that appears black to the naked eye; if the paper carries black stains or dust points, they cannot be spotted by eye, which easily causes misjudgment of the effective window boundary line of the camera.
Therefore, the traditional calibration method often fails because of uneven light, deviations in the mould material or the camera-module material, stains on the calibration paper and the like, so that the production efficiency of the scanning translation pen is low; meanwhile, the effective window boundary of the camera is easily misjudged, which harms both product quality and the user experience.
Disclosure of Invention
The invention aims to provide a calibration method and a calibration system for the camera of a scanning translation pen, which can calibrate the effective window boundary line of the camera in a single pass after the pen is produced, with a calibration process that is simple and efficient.
The embodiment of the invention is realized by the following steps:
in a first aspect, an embodiment of the present application provides a calibration method for calibrating a camera of a scanning translation pen, which includes the following steps:
starting a calibration program preset in the scanning translation pen, and causing the pen point of the scanning translation pen to slide on flat, pure-white calibration paper to acquire a plurality of continuous images;
preprocessing all continuous images to obtain a series of sample images, and selecting one of the sample images as an image to be detected;
acquiring coordinate data of boundary points in an image to be detected, and establishing a reference data table according to the coordinate data;
starting from the position coordinates of the central point of the image to be detected, continuously selecting a plurality of pixel points along the Y-axis direction and the X-axis direction respectively according to the coordinates of the boundary points in the reference data table, assigning a width value from small to large to the pixel points selected on the Y axis, and a height value from small to large to the pixel points selected on the X axis;
comparing the pixel points assigned width values and the pixel points assigned height values with the boundary points in the reference data table in real time, and determining the coordinate data of the effective window boundary line corresponding to each pixel point;
calculating and obtaining coordinate values (x1, y1) and (x2, y2) of the final effective window boundary line according to the detected coordinate data of the effective window boundary line, and storing the coordinate values;
when the scanning translation pen is started again, the coordinate values (x1, y1) and (x2, y2) are read directly, and the effective area range of the image captured by the camera is calibrated according to the coordinate values (x1, y1) and (x2, y2).
Based on the first aspect, in some embodiments of the present invention, the step of preprocessing all the continuous images to obtain a series of sample images specifically includes:
carrying out noise reduction processing on all continuous images;
carrying out graying processing on the noise-reduced image;
and carrying out binarization processing on the grayed image to obtain a series of sample images.
In some embodiments of the present invention, the step of preprocessing all the continuous images further comprises: and carrying out edge detection on the grayed image to obtain a series of sample images.
In some embodiments of the present invention, the step of selecting one of the series of sample images as the image to be detected specifically includes:
acquiring the brightness value of each pixel point in each sample image;
calculating the brightness difference value of each image to obtain difference value data, and calculating the median value of the difference value data;
and taking the sample image with the brightness difference value closest to the median as the image to be detected.
In some embodiments of the present invention, the step of continuously selecting a plurality of pixel points along the Y-axis and X-axis directions respectively, starting from the position coordinates of the central point of the image to be detected and according to the coordinates of the boundary points in the reference data table, and assigning a width value from small to large to the pixel points selected on the Y axis and a height value from small to large to the pixel points selected on the X axis specifically includes:
from the central point (x_a, y_a) of the image to be detected, selecting N1 pixel points at equal intervals upward along the Y axis and N2 pixel points at equal intervals downward, according to the coordinates of the boundary points in the reference data table, and assigning each pixel point a width value w_n from small to large;
taking x_a - w_n/2 as the left frame value of the pixel point and x_a + w_n/2 as the right frame value, to obtain the left and right frame values of the pixel point;
starting from the central point of the image to be detected, selecting M1 pixel points at equal intervals to the left along the X axis and M2 pixel points at equal intervals to the right, according to the coordinates of the boundary points in the reference data table, and assigning each pixel point a height value h_n from small to large;
taking y_a + h_n/2 as the upper frame value of the pixel point and y_a - h_n/2 as the lower frame value, to obtain the upper and lower frame values of the pixel point.
In some embodiments of the present invention, the step of comparing the pixel points assigned width values and the pixel points assigned height values with the boundary points in the reference data table in real time to determine the coordinate data of the effective window boundary line corresponding to each pixel point specifically includes:
for the pixel points selected along the Y-axis direction, comparing the left and right frame values in real time with the abscissas of the left and right boundary points corresponding to the pixel point in the reference data table; when x_a - w_n/2 or x_a + w_n/2 equals the abscissa of a boundary point, determining the coordinate data of the left and right effective window boundary lines corresponding to the pixel point;
for the pixel points selected along the X-axis direction, comparing the upper and lower frame values in real time with the ordinates of the upper and lower boundary points corresponding to the pixel point in the reference data table; when y_a + h_n/2 or y_a - h_n/2 equals the ordinate of a boundary point, determining the coordinate data of the upper and lower effective window boundary lines corresponding to the pixel point.
In some embodiments of the present invention, the step of calculating and storing the coordinate values (x1, y1) and (x2, y2) of the final valid window boundary line according to the detected coordinate data of the valid window boundary line includes:
calculating the average value of the coordinate data of the left and right effective window boundary lines of the pixel points selected along the Y-axis direction to obtain the left and right effective window boundary line values x1 and x2;
calculating the average value of the coordinate data of the upper and lower effective window boundary lines of the pixel points selected along the X-axis direction to obtain the upper and lower effective window boundary line values y1 and y2;
the coordinate values (x1, y1) and (x2, y2) are stored as coordinate values of the effective window boundary line.
In a second aspect, an embodiment of the present application provides a camera calibration system for a scanning translation pen, which includes:
the image acquisition module is used for starting a calibration program preset in the scanning translation pen to enable the pen point of the scanning translation pen to slide on the flat pure white calibration paper so as to acquire a plurality of continuous images;
the image acquisition module to be detected is used for preprocessing all the continuous images to obtain a series of sample images, and selecting one sample image as an image to be detected;
the reference data table establishing module is used for acquiring coordinate data of boundary points in the image to be detected and establishing a reference data table according to the coordinate data;
the boundary detection module is used for continuously selecting a plurality of pixel points along the Y-axis direction and the X-axis direction respectively, starting from the position coordinates of the central point of the image to be detected and according to the coordinates of the boundary points in the reference data table, assigning a width value from small to large to the pixel points selected on the Y axis, and a height value from small to large to the pixel points selected on the X axis;
the comparison module is used for comparing the pixel points assigned width values and the pixel points assigned height values with the boundary points in the reference data table in real time, and determining the coordinate data of the effective window boundary line corresponding to each pixel point;
the effective window borderline calculation module is used for calculating and obtaining and storing coordinate values (x1, y1) and (x2, y2) of the final effective window borderline according to the coordinate data of the effective window borderline obtained by detection;
and the effective window boundary line reading module is used for directly reading the coordinate values (x1, y1) and (x2, y2) when the scanning translation pen is started again, and calibrating the effective area range of the image captured by the camera according to the coordinate values (x1, y1) and (x2, y2).
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory for storing one or more programs, and a processor; when the one or more programs are executed by the processor, the method described in any one of the first aspects above is implemented.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the method as described in any one of the above first aspects.
Compared with the prior art, the embodiment of the invention has at least the following advantages or beneficial effects:
the embodiment of the invention provides a camera calibration method and a camera calibration system for a scanning translation pen. Then, starting from the position coordinate of the central point of the image to be detected, respectively and continuously selecting a plurality of pixel points along the Y-axis direction and the X-axis direction according to the coordinate of the boundary point in the reference data table, giving a width value from small to large to the pixel points selected on the Y-axis, giving a height value from small to large to the pixel points selected on the X-axis, and determining the coordinate data of the effective window boundary line corresponding to the pixel points by comparing the pixel points giving the width values and the pixel points giving the width values with the boundary point in the reference data table in real time. Finally, the coordinate values (x1, y1) and (x2, y2) of the final effective window boundary line are calculated and stored according to the detected coordinate data of the effective window boundary line. When the scanning translation pen is started again, the effective area range of the image intercepted by the camera is directly calibrated according to the coordinate values of (x1, y1) and (x2, y 2). The method is simple in operation process and easy to realize, not only avoids the condition of calibration deviation caused by environmental light, mold materials and the like in the scanning translation pen production process, but also avoids the condition of misjudging the effective window boundary line of the camera during manual calibration and calibration, and effectively improves the calibration and calibration efficiency of the camera of the scanning translation pen.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
FIG. 1 is a flow block diagram of an embodiment of the camera calibration method for a scanning translation pen according to the present invention;
FIG. 2 is an image acquired by the camera in an embodiment of the camera calibration method for a scanning translation pen according to the present invention;
FIG. 3 is the image obtained after preprocessing an image acquired by the camera in an embodiment of the camera calibration method for a scanning translation pen according to the present invention;
FIG. 4 is a schematic diagram of assigning width values to the pixel points selected along the Y-axis direction in an embodiment of the camera calibration method for a scanning translation pen according to the present invention;
FIG. 5 is a schematic diagram of assigning height values to the pixel points selected along the X-axis direction in an embodiment of the camera calibration method for a scanning translation pen according to the present invention;
FIG. 6 is a schematic diagram of detecting and determining the left and right effective window boundary line data corresponding to the pixel points selected along the Y-axis direction in an embodiment of the camera calibration method for a scanning translation pen according to the present invention;
FIG. 7 is a schematic diagram of detecting and determining the upper and lower effective window boundary line data corresponding to the pixel points selected along the X-axis direction in an embodiment of the camera calibration method for a scanning translation pen according to the present invention;
FIG. 8 is a schematic diagram of determining the coordinate values (x1, y1) and (x2, y2) of the effective window boundary line in an embodiment of the camera calibration method for a scanning translation pen according to the present invention;
FIG. 9 is a diagram illustrating the calibrated effective window boundary line in an embodiment of the camera calibration method for a scanning translation pen according to the present invention;
FIG. 10 is a block diagram of a camera calibration system for a scan translation pen according to an embodiment of the present invention;
fig. 11 is a block diagram of an electronic device according to an embodiment of the present invention.
Icon: 11. an image acquisition module; 12. an image acquisition module to be detected; 13. a reference data table establishing module; 14. a boundary detection module; 15. a comparison module; 16. an effective window boundary line calculation module; 17. an effective window boundary line reading module; 101. a memory; 102. a processor; 103. a communication interface.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Examples
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the individual features of the embodiments can be combined with one another without conflict.
Referring to fig. 1, fig. 1 is a flow block diagram of the camera calibration method for a scanning translation pen provided by an embodiment of the present application. The calibration method for the camera of the scanning translation pen specifically includes the following steps:
step S1: and starting a calibration and calibration program preset in the scanning translation pen, so that the pen point of the scanning translation pen slides on the flat pure white calibration paper to acquire a plurality of continuous images.
In the above steps, for example, a non-reflective pure white paper may be prepared as the calibration paper and placed flat. And then starting a calibration program preset in the scanning translation pen, and inclining the scanning translation pen by 60-80 degrees to enable the scanning translation pen to slide on the calibration paper for a certain distance, wherein in the process, a camera of the scanning translation pen collects an image at intervals. In order to improve the accuracy of calibration, more than 20 continuous images can be collected, so that the reference images can be conveniently selected for analysis. As shown in fig. 2, fig. 2 is a captured image.
Step S2: and preprocessing all the continuous images to obtain a series of sample images, and selecting one of the sample images as an image to be detected.
In the steps, the collected continuous images are preprocessed, so that the subsequent analysis of the brightness change trend of the pixel points in each image is facilitated, and a more appropriate image to be detected can be selected. The preprocessing method includes graying processing, binarization processing, and the like, and specifically, the steps of preprocessing all the continuous images are as follows:
step S2-1: carrying out noise reduction processing on all continuous images; because the image may be polluted by noise in the processes of acquisition and transmission, and the quality of the image is affected, filtering processing can be performed before the image is analyzed, so that noise components in the image are eliminated, and a smoother image is obtained. Illustratively, median filtering may be performed using a median filter. The median filter can protect sharp edges of the image while removing noise, and the blurring caused by the median filter is less than that caused by a linear smoothing filter with the same size, so that a basis is provided for carrying out subsequent graying processing and edge detection processing.
Step S2-2: carrying out graying processing on the noise-reduced image; because the acquired image adopts an RGB color mode, the RGB components are required to be processed respectively when the image is processed. In fact, RGB does not reflect morphological features of an image, and only color matching is performed in an optical principle, so that graying processing can be performed on the image to improve the operation speed, and meanwhile, the subsequent analysis of the brightness change trend of the image is facilitated. Illustratively, an average value method can be adopted for graying processing, and the gray value of each pixel point is obtained by calculating the average value of the brightness of three components of RGB of each pixel point in the image, so that a grayed image is obtained.
Step S2-3: and carrying out binarization processing on the grayed image to obtain a series of sample images. In this step, the gray value of each pixel point in the grayed image is compared with a preset threshold value, the gray value becomes 0 (black) when the gray value is smaller than or equal to the threshold value, and becomes 255 (white) when the gray value is larger than the threshold value, so as to obtain the binarized image (as shown in fig. 3), therefore, the aggregate property of the image is only related to the position of the point with the pixel value of 0 or 255, the multi-level value of the pixel is not related, the processing is simple, and the processing and compression amount of the data is smaller.
The step of preprocessing all the consecutive images further comprises:
step S2-4: and carrying out edge detection on the grayed image. The purpose of edge detection is to detect and identify a set formed by pixels with severe brightness change in an image, greatly reduce data volume, eliminate information which can be considered irrelevant, retain important structural attributes of the image, such as a boundary line of the image, and conveniently calibrate an effective window boundary line of a camera of a scanning translation pen in the follow-up process.
After all the continuous images are preprocessed through the steps, the step of selecting one image from the continuous images as the image to be detected is as follows:
step S2-5: acquiring the brightness value of each pixel point in each sample image; during the graying process of the image, the gray value of each pixel point in the image can be obtained, namely the gray value can be directly used as the brightness value of the pixel point, and basic data are provided for the subsequent calculation of the brightness difference value and the analysis of the brightness change condition.
Step S2-6: calculating the brightness difference value of each image to obtain difference value data, and calculating the median value of the difference value data; the general brightness change condition of each image can be obtained by calculating the brightness difference of each image, then the image with the minimum brightness difference and the image with the maximum brightness difference are removed, and the median (median) of the brightness differences of the rest images is calculated, so that a data base is provided for subsequently selecting the image to be detected.
Step S2-7: and taking the sample image with the brightness difference value closest to the median as the image to be detected. The sample image with the brightness difference value closest to the median is used as the image to be detected, so that the detection and calibration of the effective window boundary line of the camera of the scanning translation pen are facilitated.
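Illustratively, steps S2-5 to S2-7 can be sketched as below; since the description does not pin down how the per-image brightness difference is computed, the sketch assumes one plausible reading, namely the spread between the brightest and darkest gray value of each sample image.

```python
import numpy as np

def pick_image_to_detect(gray_samples):
    # S2-5/S2-6: per-image brightness difference (assumed: max minus min)
    diffs = np.array([float(g.max()) - float(g.min()) for g in gray_samples])
    # remove the images with the smallest and largest difference, then
    # take the median of the remaining differences
    keep = np.argsort(diffs)[1:-1]
    median = np.median(diffs[keep])
    # S2-7: the sample whose difference is closest to the median becomes
    # the image to be detected
    best = int(keep[np.argmin(np.abs(diffs[keep] - median))])
    return gray_samples[best]
```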
Step S3: and acquiring coordinate data of boundary points in the image to be detected, and establishing a reference data table according to the coordinate data.
In the above steps, for the image to be detected after binarization processing, illustratively, an OpenCV software library may be used to extract an edge contour and coordinates of corresponding boundary points from the image to be detected, and establish a reference data table for storage, and then the reference data table is used as a comparison basis for later detection and determination of an effective window boundary line of a camera of a scanning translation pen. The process of extracting the contour by using the OpenCV software library belongs to the prior art, and is not described herein again.
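Illustratively, one way the reference data table could be built with OpenCV is sketched below; the per-row/per-column dictionary layout is an assumption chosen so that the left/right and upper/lower boundary points of a given pixel point can be looked up directly in the later comparison steps.

```python
import cv2
import numpy as np

def build_reference_table(binary):
    # outer edge contour of the binarized image (OpenCV 4 return signature)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    points = np.vstack([c.reshape(-1, 2) for c in contours])
    table = {"row": {}, "col": {}}
    for x, y in points:
        x, y = int(x), int(y)
        # left/right boundary abscissas for each row y
        lo, hi = table["row"].get(y, (x, x))
        table["row"][y] = (min(lo, x), max(hi, x))
        # upper/lower boundary ordinates for each column x
        lo, hi = table["col"].get(x, (y, y))
        table["col"][x] = (min(lo, y), max(hi, y))
    return table
```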
Step S4: starting from the position coordinates of the central point of the image to be detected, continuously select a plurality of pixel points along the Y-axis direction and the X-axis direction respectively according to the coordinates of the boundary points in the reference data table, assigning a width value from small to large to the pixel points selected on the Y axis, and a height value from small to large to the pixel points selected on the X axis.
Step S5: compare the pixel points assigned width values and the pixel points assigned height values with the boundary points in the reference data table in real time, and determine the coordinate data of the effective window boundary line corresponding to each pixel point.
In the above steps S4-S5, the central point position of the image to be detected (at one half of its length and width) is obtained first; then a plurality of pixel points are continuously selected along the Y axis and the X axis respectively according to the coordinates of the boundary points in the reference data table. The width value of each pixel point selected on the Y axis is increased from small to large to obtain its left and right frame values, and the height value of each pixel point selected on the X axis is increased from small to large to obtain its upper and lower frame values. Then, the pixel points assigned width values and the pixel points assigned height values are compared with the boundary points in the reference data table in real time, and the width or height value of a given pixel point is adjusted from small to large, so that the left and right frames of the pixel points selected on the Y axis approach, step by step, the left and right effective window boundary lines corresponding to those points, and the upper and lower frames of the pixel points selected on the X axis approach, step by step, the upper and lower effective window boundary lines corresponding to those points, thereby obtaining the coordinate data of the effective window boundary lines corresponding to the pixel points.
The step of starting from the position coordinates of the central point of the image to be detected, continuously selecting a plurality of pixel points along the Y-axis and X-axis directions respectively according to the coordinates of the boundary points in the reference data table, assigning a width value from small to large to the pixel points selected on the Y axis, and a height value from small to large to the pixel points selected on the X axis specifically includes the following steps:
step S4-1: starting from the central point position of the image to be detected, selecting N upward at equal intervals along the Y-axis direction according to the coordinates of the boundary points in the reference data table 1 Selecting N from each pixel point at equal intervals downwards 2 Each pixel point is endowed with a width value w from small to large n
Step S4-2: get x a -w n Per 2 as the left frame value, x, of the pixel a +w n The/2 is used as the right frame value of the pixel point to obtain the left frame value and the right frame value of the pixel point;
in the above steps, for example, the center position of the image to be detected is determined according to the length and the width of the image to be detected, for example, if the pixel of the image to be detected is 320 × 240, the center position coordinate of the image to be detected is (160,120), then from the center point (160,120), 35 pixels are selected at equal intervals in the Y-axis direction, and 30 pixels are selected at equal intervals in the downward direction, in this process, it is easy to know that the abscissa of the points is 160. Then, each selected pixel point is sequentially endowed with a width value w from small to large from top to bottom n And take x a -w n Per 2 as the left frame value, x, of the pixel a +w n And/2, obtaining the left frame value and the right frame value of the pixel point for subsequent comparison and use. For example, referring to fig. 4, for a certain selected point (160,180), if a width value of 200 is given, the left frame of the point is 160-200/2-60, and the right frame is 160+ 200/2-260; if the width value of 280 is given, the left frame of the point is 20, and the right frame is 300.
Step S4-3: starting from the central point position of the image to be detected, select M1 pixel points at equal intervals to the left along the X-axis direction and M2 pixel points at equal intervals to the right, according to the coordinates of the boundary points in the reference data table, and assign each pixel point a height value h_n from small to large.
Step S4-4: take y_a + h_n/2 as the upper frame value of the pixel point and y_a - h_n/2 as the lower frame value, to obtain the upper and lower frame values of the pixel point.
In the above steps, for example, 40 pixel points are selected at equal intervals to the left along the X-axis direction starting from the central point (160, 120), and 40 pixel points at equal intervals to the right; it is easy to see that the ordinate of all these points is 120. Each pixel point is then assigned, from left to right, a height value h_n from small to large, and y_a + h_n/2 is taken as the upper frame value of the pixel point and y_a - h_n/2 as the lower frame value, giving the upper and lower frame values of the pixel point for subsequent comparison. For example, referring to fig. 5, for the selected point (240, 120), if a height value of 160 is assigned, the upper frame of the point is 120 + 160/2 = 200 and the lower frame is 120 - 160/2 = 40; if a height value of 220 is assigned, the upper frame of the point is 230 and the lower frame is 10.
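Illustratively, the frame formulas of steps S4-2 and S4-4 can be checked against the worked numbers above with a few lines of Python; the values are the example coordinates from this description, not prescribed parameters.

```python
def lr_frames(xa, wn):
    # S4-2: left and right frame values of a point selected on the Y axis
    return xa - wn // 2, xa + wn // 2

def tb_frames(ya, hn):
    # S4-4: upper and lower frame values of a point selected on the X axis
    return ya + hn // 2, ya - hn // 2

print(lr_frames(160, 200))  # (60, 260)  -- point (160, 180), width 200
print(lr_frames(160, 280))  # (20, 300)  -- width 280 reaches the left boundary
print(tb_frames(120, 160))  # (200, 40)  -- point (240, 120), height 160
print(tb_frames(120, 220))  # (230, 10)  -- height 220 reaches the upper boundary
```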
The step of comparing the pixel points assigned width values and the pixel points assigned height values with the boundary points in the reference data table in real time to determine the coordinate data of the effective window boundary line corresponding to each pixel point specifically includes:
Step S5-1: for the pixel points selected along the Y-axis direction, compare the left and right frame values in real time with the abscissas of the left and right boundary points corresponding to the pixel point in the reference data table; when x_a - w_n/2 or x_a + w_n/2 equals the abscissa of a boundary point, determine the coordinate data of the left and right effective window boundary lines corresponding to the pixel point.
Illustratively, consider the point (160, 180) selected upward from the central point of the image, and assume the coordinates of its left boundary point are (20, 180) and those of its right boundary point are (320, 180). When the point (160, 180) is assigned a width value of 200, its left frame is 60 and its right frame is 260; compared with the abscissas of the left and right boundary points, both frames lie inside the boundaries, so the width value is increased further. When the point is assigned a width value of 280, its left frame is 20 and its right frame is 300, and the left frame coincides with the left boundary (as shown in fig. 6); the left and right frame values at this moment are therefore recorded as the coordinate data of the left and right effective window boundary lines corresponding to this point, providing a data basis for the subsequent calculation of the final effective window boundary line.
Step S5-2: for the pixel points selected along the X-axis direction, compare the upper and lower frame values in real time with the ordinates of the upper and lower boundary points corresponding to the pixel point in the reference data table; when y_a + h_n/2 or y_a - h_n/2 equals the ordinate of a boundary point, determine the coordinate data of the upper and lower effective window boundary lines corresponding to the pixel point.
Illustratively, consider the point (240, 120) selected to the right of the central point of the image, and assume the coordinates of its upper boundary point are (240, 230) and those of its lower boundary point are (240, 0). When the point (240, 120) is assigned a height value of 160, its upper frame is 200 and its lower frame is 40; compared with the ordinates of the upper and lower boundary points, both frames lie inside the boundaries, so the height value is increased further. When the point (240, 120) is assigned a height value of 220, its upper frame is 230 and its lower frame is 10, and the upper frame coincides with the upper boundary (as shown in fig. 7); the upper and lower frame values at this moment are therefore recorded as the coordinate data of the upper and lower effective window boundary lines corresponding to this point, providing a data basis for the subsequent calculation of the final effective window boundary line.
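Illustratively, steps S5-1 and S5-2 amount to growing the trial width (or height) of a point until a frame edge first reaches a boundary coordinate recorded in the reference data table; a sketch for one Y-axis point follows, where the growth step size is an illustrative assumption.

```python
def grow_width(xa, left_bound, right_bound, step=4, w_max=320):
    # S5-1: widen w_n from small to large until x_a - w_n/2 or
    # x_a + w_n/2 reaches a boundary abscissa
    wn = step
    while wn <= w_max:
        left, right = xa - wn // 2, xa + wn // 2
        if left <= left_bound or right >= right_bound:
            return left, right  # coordinate data of the boundary lines
        wn += step
    return None

# point (160, 180) with left boundary (20, 180) and right boundary (320, 180):
print(grow_width(160, 20, 320))  # -> (20, 300), reached at w_n = 280
```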
Step S6: based on the detected coordinate data of the effective window border line, coordinate values (x1, y1) and (x2, y2) of the final effective window border line are calculated and stored. Specifically, please refer to fig. 8-9, which includes the following steps:
step S6-1: calculating the average value of the coordinate data of the left and right effective window boundary lines of the pixel points selected along the Y-axis direction to obtain left and right effective window boundary line values x1 and x 2;
illustratively, the coordinate of the center point is (160,120), the abscissa of the left and right effective window boundary lines of the pixel point (160,180) selected along the Y-axis direction is 20, 300, the abscissa of the left and right effective window boundary lines of the point (160, 60) is 10, 310, the left effective window boundary line x1 of the camera is (20+10)/2 is 15, and the right effective window boundary line x2 is (300+310)/2 is 305, so as to obtain the left and right effective window boundary line values of 15, 305.
Step S6-2: calculate the average value of the coordinate data of the upper and lower effective window boundary lines of the pixel points selected along the X-axis direction to obtain the upper and lower effective window boundary line values y1 and y2.
Illustratively, with the central point at (160, 120), suppose the ordinates of the upper and lower effective window boundary lines of the pixel point (240, 120) selected along the X-axis direction are 230 and 0, and those of the point (80, 120) are 230 and 10; then the upper effective window boundary line of the camera is y1 = (230 + 230)/2 = 230 and the lower effective window boundary line is y2 = (10 + 0)/2 = 5, giving the upper and lower effective window boundary line values 230 and 5.
Step S6-3: the coordinate values (x1, y1) and (x2, y2) are stored as coordinate values of the effective window boundary line.
Referring to fig. 8, steps S6-1 and S6-2 have already yielded the left effective window boundary line x1 = 15, the right effective window boundary line x2 = 305, the upper effective window boundary line y1 = 230 and the lower effective window boundary line y2 = 5, so combining them gives the final effective window boundary line coordinate values (15, 230) and (305, 5). At this point, the image scanned by the camera and the effective window boundary line determined after calibration can be viewed on the display screen of the scanning translation pen, as shown in fig. 9. The confirmation button on the display screen is then pressed to store the effective window boundary line data, and the pen is finally powered off, completing the calibration process.
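Illustratively, the averaging of steps S6-1 to S6-3 reduces to four means over the recorded per-point boundary data; the sketch below reproduces the example values above.

```python
def final_window(lefts, rights, tops, bottoms):
    # S6-1: left and right effective window boundary line values
    x1 = sum(lefts) / len(lefts)
    x2 = sum(rights) / len(rights)
    # S6-2: upper and lower effective window boundary line values
    y1 = sum(tops) / len(tops)
    y2 = sum(bottoms) / len(bottoms)
    # S6-3: stored as the two corner coordinates of the effective window
    return (x1, y1), (x2, y2)

print(final_window([20, 10], [300, 310], [230, 230], [10, 0]))
# -> ((15.0, 230.0), (305.0, 5.0))
```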
Step S7: when the scanning translation pen is started again, read the coordinate values (x1, y1) and (x2, y2) directly, and calibrate the effective area range of the image captured by the camera according to the coordinate values (x1, y1) and (x2, y2).
The effective window boundary line data was obtained in step S6-3 above, completing the calibration process. Therefore, when the scanning translation pen is started again, the camera directly reads the effective window boundary line data (x1, y1) and (x2, y2) to delimit the effective area range of the image it captures.
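Illustratively, applying the stored window on a later boot is a simple crop. The sketch below assumes the mathematical coordinate convention used in this description (Y increasing upward) and converts to the top-down row indexing of an image array; if the firmware stores row indices directly, the conversion drops out.

```python
def crop_to_window(frame, x1, y1, x2, y2):
    # convert the upper boundary line y1 and the lower boundary line y2
    # (Y up) into top-down row indices of the image array
    height = frame.shape[0]
    top_row, bottom_row = height - y1, height - y2
    return frame[top_row:bottom_row, x1:x2]

# with a 320x240 frame and the calibrated values (15, 230) and (305, 5),
# crop_to_window(frame, 15, 230, 305, 5) keeps rows 10..234 and columns 15..304
```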
Based on the same inventive concept, please refer to fig. 10, the present invention further provides a calibration system for calibrating a camera of a scanning translation pen, which specifically includes:
the image acquisition module 11 is used for starting a calibration program preset in the scanning translation pen, so that the pen point of the scanning translation pen slides on the flat pure white calibration paper to acquire a plurality of continuous images;
the image acquisition module 12 to be detected is used for preprocessing all the continuous images to obtain a series of sample images, and selecting one sample image as an image to be detected;
a reference data table establishing module 13, configured to acquire coordinate data of boundary points in the image to be detected, and establish a reference data table according to the coordinate data;
the boundary detection module 14 is configured to continuously select a plurality of pixel points along the Y-axis and X-axis directions respectively from the position coordinates of the central point of the image to be detected according to the coordinates of the boundary points in the reference data table, assign a width value from small to large to the pixel points selected on the Y-axis, and assign a height value from small to large to the pixel points selected on the X-axis;
the comparison module 15 is used for comparing the pixel points assigned width values and the pixel points assigned height values with the boundary points in the reference data table in real time, and determining the coordinate data of the effective window boundary line corresponding to each pixel point;
the effective window borderline calculating module 16 is used for calculating and obtaining and storing coordinate values (x1, y1) and (x2, y2) of the final effective window borderline according to the coordinate data of the detected effective window borderline;
and the effective window boundary line reading module 17 is used for directly reading the coordinate values (x1, y1) and (x2, y2) when the scanning translation pen is started again, and calibrating the effective area range of the image captured by the camera according to the coordinate values (x1, y1) and (x2, y2).
Referring to fig. 11, fig. 11 is a schematic structural block diagram of an electronic device according to an embodiment of the present disclosure. The electronic device comprises a memory 101, a processor 102 and a communication interface 103, wherein the memory 101, the processor 102 and the communication interface 103 are electrically connected with each other directly or indirectly to realize the transmission or interaction of data. For example, the components may be electrically connected to each other via one or more communication buses or signal lines. The memory 101 may be used for storing software programs and modules, such as program instructions/modules corresponding to the electronic device 300 provided in the embodiments of the present application, and the processor 102 executes the software programs and modules stored in the memory 101, so as to execute various functional applications and data processing. The communication interface 103 may be used for communicating signaling or data with other node devices.
The Memory 101 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The processor 102 may be an integrated circuit chip having signal processing capability. The Processor 102 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
It will be appreciated that the configuration shown in fig. 11 is merely illustrative and that the electronic device may include more or fewer components than shown in fig. 11 or have a different configuration than shown in fig. 11. The components shown in fig. 11 may be implemented in hardware, software, or a combination thereof.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions may be stored in a computer-readable storage medium if they are implemented in the form of software functional modules and sold or used as separate products. Based on such understanding, the technical solution of the present application, or the portions thereof that substantially contribute to the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
To sum up, the embodiments of the present application provide a calibration method and a calibration system for the camera of a scanning translation pen. A calibration program preset in the pen is started, the collected images are preprocessed and an image to be detected is selected; the boundary point coordinates of the image to be detected are then captured; next, starting from the central point position of the image to be detected, a plurality of pixel points are continuously selected along the Y-axis and X-axis directions respectively, and by assigning the pixel points height or width values from small to large, the coordinate data of the effective window boundary line is detected, determined and stored, thereby completing the calibration process. The method can calibrate the effective window boundary line of the camera in a single pass after the scanning translation pen is produced, and the calibration process is simple and efficient.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made to the present application by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.
It will be evident to those skilled in the art that the application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (10)

1. A calibration method for calibrating a camera of a scanning translation pen is characterized by comprising the following steps:
starting a calibration program preset in the scanning translation pen, and causing the pen point of the scanning translation pen to slide on flat, pure-white calibration paper to acquire a plurality of continuous images;
preprocessing all continuous images to obtain a series of sample images, and selecting one sample image as an image to be detected;
acquiring coordinate data of boundary points in an image to be detected, and establishing a reference data table according to the coordinate data;
starting from the position coordinates of the central point of the image to be detected, continuously selecting a plurality of pixel points along the Y-axis direction and the X-axis direction respectively according to the coordinates of the boundary points in the reference data table, assigning a width value from small to large to the pixel points selected on the Y axis, and a height value from small to large to the pixel points selected on the X axis;
comparing the pixel points assigned width values and the pixel points assigned height values with the boundary points in the reference data table in real time, and determining the coordinate data of the effective window boundary line corresponding to each pixel point;
calculating and obtaining coordinate values (x1, y1) and (x2, y2) of the final effective window boundary line according to the detected coordinate data of the effective window boundary line, and storing the coordinate values;
when the scanning translation pen is started again, the coordinate values (x1, y1) and (x2, y2) are read directly, and the effective area range of the image captured by the camera is calibrated according to the coordinate values (x1, y1) and (x2, y2).
2. The method for calibration of camera calibration of a scanning translation pen according to claim 1, wherein said step of preprocessing all successive images to obtain a series of sample images comprises:
carrying out noise reduction processing on all continuous images;
carrying out graying processing on the noise-reduced image;
and carrying out binarization processing on the grayed image to obtain a series of sample images.
3. The method for camera calibration of a scan translation pen according to claim 2, wherein said steps further comprise:
and carrying out edge detection on the grayed image to obtain a series of sample images.
4. The method for calibration of camera calibration of a scanning translation pen according to claim 1, wherein said step of selecting one of the sample images as the image to be detected comprises:
acquiring the brightness value of each pixel point in each sample image;
calculating the brightness difference value of each image to obtain difference value data, and calculating the median value of the difference value data;
and taking the sample image with the brightness difference value closest to the median as the image to be detected.
5. The method for calibration of camera calibration of a scanning translation pen according to claim 1, wherein said step of continuously selecting a plurality of pixel points along the Y-axis and X-axis directions from the coordinates of the central point position of the image to be detected according to the coordinates of the boundary points in the reference data table, assigning a width value from small to large to the pixel points selected on the Y axis, and assigning a height value from small to large to the pixel points selected on the X axis specifically comprises:
from the central point (x_a, y_a) of the image to be detected, selecting N1 pixel points at equal intervals upward along the Y axis and N2 pixel points at equal intervals downward according to the coordinates of the boundary points in the reference data table, and assigning each pixel point a width value w_n from small to large;
taking x_a - w_n/2 as the left frame value of the pixel point and x_a + w_n/2 as the right frame value, to obtain the left and right frame values of the pixel point;
starting from the central point position of the image to be detected, selecting M1 pixel points at equal intervals to the left along the X axis and M2 pixel points at equal intervals to the right according to the coordinates of the boundary points in the reference data table, and assigning each pixel point a height value h_n from small to large;
taking y_a + h_n/2 as the upper frame value of the pixel point and y_a - h_n/2 as the lower frame value, to obtain the upper and lower frame values of the pixel point.
6. The method for calibration of camera calibration of a scanning translation pen according to claim 5, wherein the step of comparing the pixel points assigned width values and the pixel points assigned height values with the boundary points in the reference data table in real time to determine the coordinate data of the effective window boundary line corresponding to each pixel point specifically comprises:
for the pixel points selected along the Y-axis direction, comparing the left and right frame values in real time with the abscissas of the left and right boundary points corresponding to the pixel point in the reference data table; when x_a - w_n/2 or x_a + w_n/2 equals the abscissa of a boundary point, determining the coordinate data of the left and right effective window boundary lines corresponding to the pixel point;
for the pixel points selected along the X-axis direction, comparing the upper and lower frame values in real time with the ordinates of the upper and lower boundary points corresponding to the pixel point in the reference data table; when y_a + h_n/2 or y_a - h_n/2 equals the ordinate of a boundary point, determining the coordinate data of the upper and lower effective window boundary lines corresponding to the pixel point.
7. The method for calibrating camera calibration of a scanning translation pen according to claim 1, wherein the step of calculating and storing the final coordinate values (x1, y1) and (x2, y2) of the effective window boundary line based on the detected coordinate data of the effective window boundary line comprises:
calculating the average value of the coordinate data of the left and right effective window boundary lines of the pixel points selected along the Y-axis direction to obtain the left and right effective window boundary line values x1 and x2;
calculating the average value of the coordinate data of the upper and lower effective window boundary lines of the pixel points selected along the X-axis direction to obtain the upper and lower effective window boundary line values y1 and y2;
the coordinate values (x1, y1) and (x2, y2) are stored as coordinate values of the effective window boundary line.
8. A camera calibration system for a scanning translation pen, comprising:
the image acquisition module is used for starting a calibration program preset in the scanning translation pen to enable the pen point of the scanning translation pen to slide on the flat pure white calibration paper so as to acquire a plurality of continuous images;
the image acquisition module to be detected is used for preprocessing all the continuous images to obtain a series of sample images, and selecting one sample image as an image to be detected;
the reference data table establishing module is used for acquiring coordinate data of boundary points in the image to be detected and establishing a reference data table according to the coordinate data;
the boundary detection module is used for continuously selecting a plurality of pixel points along the Y-axis direction and the X-axis direction respectively, starting from the position coordinates of the central point of the image to be detected and according to the coordinates of the boundary points in the reference data table, assigning a width value from small to large to the pixel points selected on the Y axis, and a height value from small to large to the pixel points selected on the X axis;
the comparison module is used for comparing the pixel points assigned width values and the pixel points assigned height values with the boundary points in the reference data table in real time, and determining the coordinate data of the effective window boundary line corresponding to each pixel point;
the effective window borderline calculation module is used for calculating and obtaining and storing coordinate values (x1, y1) and (x2, y2) of the final effective window borderline according to the coordinate data of the effective window borderline obtained by detection;
and the effective window boundary line reading module is used for directly reading the coordinate values (x1, y1) and (x2, y2) when the scanning translation pen is started again, and calibrating the effective area range of the image captured by the camera according to the coordinate values (x1, y1) and (x2, y2).
9. An electronic device, comprising:
a memory for storing one or more programs;
a processor;
the one or more programs, when executed by the processor, implement the method of any of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN202210411161.4A 2022-04-19 2022-04-19 Calibration method and system for camera calibration of scanning translation pen Pending CN114913527A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210411161.4A CN114913527A (en) 2022-04-19 2022-04-19 Calibration method and system for camera calibration of scanning translation pen

Publications (1)

Publication Number Publication Date
CN114913527A true CN114913527A (en) 2022-08-16

Family

ID=82764425

Country Status (1)

Country Link
CN (1) CN114913527A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination