CN108205395B - Method for accurately positioning center coordinates of calibration points - Google Patents

Method for accurately positioning center coordinates of calibration points Download PDF

Info

Publication number
CN108205395B
CN108205395B (application CN201810042165.3A)
Authority
CN
China
Prior art keywords
point
circle
center
gradient
points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810042165.3A
Other languages
Chinese (zh)
Other versions
CN108205395A (en)
Inventor
汪俊锋
邓宏平
高祥
邢川生
任维蒙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Huishi Jintong Technology Co ltd
Original Assignee
Anhui Huishi Jintong Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Huishi Jintong Technology Co ltd filed Critical Anhui Huishi Jintong Technology Co ltd
Priority to CN201810042165.3A priority Critical patent/CN108205395B/en
Publication of CN108205395A publication Critical patent/CN108205395A/en
Application granted granted Critical
Publication of CN108205395B publication Critical patent/CN108205395B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/0418Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for accurately positioning the center coordinates of a calibration point, which comprises the following steps: drawing a calibration point and acquiring a grayscale image of the calibration point; performing a thresholding operation and connected-domain detection on the grayscale image to obtain the calibration point region; calculating the center coordinates of the calibration point connected domain and the circumscribed rectangle of the connected domain, and enlarging the circumscribed rectangle; extracting the contour of the calibration point connected domain from the corresponding area of the grayscale image according to the size and center point coordinates of the circumscribed rectangle; performing circle fitting to obtain an initial central point of the contour; emitting rays to the periphery with the central point as the center and calculating gradient amplitudes; traversing all the star rays, obtaining the maximum-gradient point on each star ray, recording it as a key point, and completing circle fitting through the key points to obtain a circle; and rechecking the obtained circle to verify the positioning result. The invention solves the center coordinates of the positioning area with high precision and better satisfies the positioning-accuracy requirement.

Description

Method for accurately positioning center coordinates of calibration points
Technical Field
The invention relates to the technical field of projection interaction, in particular to a method for accurately positioning center coordinates of a calibration point.
Background
The projection interaction system is a convenient mode of man-machine interaction. It realizes multi-point touch and touch interaction under a large-size projection picture, for example multi-player interactive games and multi-player interactive operation in early-childhood teaching. Calibrating the projection area against the display screen area and establishing the mapping relation between them is a crucial process of a projection interaction system, and the accuracy with which the calibration points are positioned directly affects the calibration quality.
This patent proposes a method (the star ray method) that obtains sub-pixel coordinates of the calibration points, thus providing high-quality mapping point coordinates for the calibration process.
Current coordinate calibration mainly obtains a positioning area and then takes the center of gravity of that area as its center. Such an approach is not well suited to the accurate positioning of areas of various shapes.
In this patent, the center coordinates of the positioning area are solved with high precision by combining the gradient amplitudes of the positioning area with the distances from the boundary points of the positioning area to its center coordinates, thereby better satisfying the positioning-accuracy requirement.
Disclosure of Invention
The invention aims to provide a method for accurately positioning the center coordinates of a calibration point, which solves the center coordinates of a positioning area with high precision and better satisfies the positioning-accuracy requirement.
In order to achieve the purpose, the invention adopts the following technical scheme:
a method for accurately positioning the center coordinates of a calibration point comprises the following steps:
(1) drawing a calibration point on a computer screen;
(2) collecting a computer projection picture and acquiring a grayscale image of the calibration point;
(3) carrying out a thresholding operation on the grayscale image;
(4) detecting connected domains in the thresholded image to obtain the calibration point regions;
(5) analyzing each connected domain, calculating to obtain the center coordinates of the calibration point connected domain and the circumscribed rectangle of the connected domain, and enlarging the size of the circumscribed rectangle;
(6) extracting the outline corresponding to the connected domain of the calibration point of the corresponding area in the gray level image according to the size and the center point coordinate of the expanded circumscribed rectangle;
(7) performing circle fitting on the obtained contour region to obtain an initial central point of the contour;
(8) taking the central point as the center, emitting rays to the periphery, and calculating the gradient amplitude;
(9) traversing all the star rays to obtain a maximum gradient value point on each star ray, and recording the maximum gradient value point as a key point;
(10) randomly selecting three key points, and finishing circle fitting through the key points to obtain a circle;
(11) rechecking the obtained circle and verifying the positioning result.
Further, emitting rays to the periphery with the central point as the center and calculating the gradient amplitudes specifically comprises the following steps:
(81) emitting star rays to the periphery with the central point as the center, the angle interval between adjacent star rays being 1 degree;
(82) traversing all the star rays and, starting from the central point, performing pixel acquisition once every other pixel along the star ray in the current direction;
(83) extracting the two pixels before and after the current pixel and calculating the gradient value;
(84) cyclically computing all gradient amplitudes on each star ray.
Further, traversing all the star rays to obtain a maximum gradient value point on each star ray, and recording the maximum gradient value point as a key point, specifically comprising the following steps:
(91) traversing gradient values of all pixel points on each star ray;
(92) finding the position of the pixel point with the maximum gradient value;
(93) setting a gradient threshold; if the maximum gradient value on a star ray is smaller than the gradient threshold, no key point exists in that direction and the search continues on the next star ray; if the maximum gradient value is larger than the gradient threshold, recording the maximum gradient point as a key point;
(94) and traversing all the star rays to obtain key points on each star ray.
Further, randomly selecting three key points and completing circle fitting through the key points to obtain a circle specifically includes the following steps:
(101) acquiring a key point set, and randomly selecting three key points from the set each time;
(102) calculating, from the three key points by the perpendicular bisector method, the corresponding circle center position and radius to obtain a circle O_i;
(103) counting the remaining key points that were not randomly drawn and that lie on the circle O_i, and recording their ratio as B_i;
(104) circularly executing steps (101) to (103);
(105) among all the cycles, finding the circle with the highest B_i value, recording it as circle P and its center as c_p;
(106) if the maximum value of B_i is less than 0.9, performing circle fitting again: re-acquiring the circle center position and radius using the peripheral points corresponding to the circle P.
Further, re-acquiring the circle center position and the radius using the peripheral points corresponding to the circle P specifically includes the following steps:
(A) denoting the set of peripheral points corresponding to the circle P as I = {(x_i, y_i)};
(B) finding candidate coordinate points (X_k, Y_k) whose distance from the circle center c_p does not exceed 5 pixels;
(C) for each candidate point (X_k, Y_k), computing the sum of its distances to all coordinate points in the set I, recorded as d_k, where j is the number of points in the set I:
d_k = Σ_{i=1}^{j} √((X_k − x_i)² + (Y_k − y_i)²)
(D) finding the minimum d_k; the corresponding coordinate point (X_k, Y_k) is the circle center position, and the radius of the circle is r = d_k / j.
Further, rechecking the acquired circle and verifying the positioning result specifically includes the following steps:
(111) traversing the circumference, and solving and recording gradient values of all coordinate points on the circumference;
(112) sorting the gradient values in the order from small to large;
(113) acquiring a gradient value at the 10% position and a gradient value at the 90% position;
(114) calculating the absolute value of the difference between the two gradient values; if the difference is less than or equal to the threshold, the circular positioning is successful; if the difference is greater than the threshold, the positioning has failed and the calibration point cannot be used for the calibration of the projection interaction.
According to the technical scheme, the method for accurately positioning the center coordinates of a calibration point disclosed by the invention solves the center coordinates of the positioning area with high precision by combining the gradient amplitudes of the positioning area with the distances from the boundary points of the positioning area to its center coordinates, thereby better satisfying the positioning-accuracy requirement.
Drawings
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is a flow chart of initial positioning and image extraction of the present invention;
FIG. 3 is a flow chart of the circle fitting of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings:
as shown in fig. 1-3, a method for accurately positioning coordinates of a center of a calibration point includes the following steps:
s1: drawing a calibration point on a computer screen;
s2: projecting a picture on a computer screen onto a wall of a projection area by using a projector, and controlling a camera to acquire the picture of the projection area by the computer through an instruction so as to acquire a gray image G with a calibration point;
s3: carrying out thresholding operation on the gray level image;
the threshold value of this embodiment is an empirical threshold value, and the thresholding principle is as follows: traversing each pixel point in the gray image, if the gray value corresponding to the point (x, y) is less than the threshold value, resetting the value of the pixel of the point to 0, namely marking the point as black, otherwise marking the point as white;
s4: performing connected-domain detection on the thresholded image to obtain the calibration point regions, namely classifying adjacent white pixel points into the same connected domain;
s5: analyzing each connected domain, calculating to obtain the center coordinates of the calibration point connected domain and the circumscribed rectangle of the connected domain, and enlarging the size of the circumscribed rectangle;
the center point coordinate calculation formula of the connected domain may be as shown in formulas (1) - (3). In formula (1), i and j represent the positions of the abscissa and the ordinate, respectively, in the connected component, and f (i, j) represents the pixel value corresponding to the point (i, j) in the connected component. Wherein formula (2) represents the abscissa position of the center point, and formula (3) represents the ordinate position of the center point.
Figure BDA0001549200850000041
Figure BDA0001549200850000042
Figure BDA0001549200850000051
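For steps S3-S5, the following is a minimal sketch using OpenCV and NumPy. The threshold of 128 and the image file name are illustrative assumptions, not values taken from the patent, and the centroid here is computed from the binary mask, i.e. formulas (1)-(3) with f(i, j) taken from the thresholded image.

```python
import cv2
import numpy as np

# Assumed input: the captured grayscale image of the projection (file name is illustrative).
gray = cv2.imread("projection_capture.png", cv2.IMREAD_GRAYSCALE)
threshold = 128  # empirical value; the patent does not specify a number

# S3: thresholding - pixels below the threshold become black (0), the rest white (255).
_, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)

# S4: connected-domain detection - adjacent white pixels are grouped under one label.
num_labels, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)

# S5: centroid of each connected domain via image moments (formulas (1)-(3)),
# together with its circumscribed rectangle from the stats array.
for label in range(1, num_labels):          # label 0 is the background
    mask = (labels == label).astype(np.uint8)
    m = cv2.moments(mask, binaryImage=True)
    cx = m["m10"] / m["m00"]                # formula (2): abscissa of the center point
    cy = m["m01"] / m["m00"]                # formula (3): ordinate of the center point
    x, y, w, h, area = stats[label]
    print(f"domain {label}: center=({cx:.1f}, {cy:.1f}), bbox=({x}, {y}, {w}, {h}), area={area}")
```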
S6: extracting the outline corresponding to the connected domain of the calibration point of the corresponding area in the gray level image according to the size and the center point coordinate of the expanded circumscribed rectangle;
The circumscribed rectangle of the connected domain is expanded so that the expanded rectangle is enlarged by 10 pixels relative to the previous rectangle (ensuring that the center point of the circumscribed rectangle is unchanged). The corresponding region of the grayscale image G is then extracted according to the size and center point coordinates of the re-expanded circumscribed rectangle and denoted as P. Enlarging the circumscribed rectangle of the connected domain captures more useful information and lays the foundation for accurately finding the center of the connected domain.
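A sketch of the expansion and extraction in S6, assuming the rectangle grows by 10 pixels on every side (the patent only states an enlargement of 10 pixels with the center unchanged); `gray` and `stats` are the arrays from the previous sketch.

```python
def extract_region_p(gray, stats, label, margin=10):
    """Grow the circumscribed rectangle by `margin` pixels per side (center unchanged)
    and cut the corresponding region P out of the grayscale image G."""
    x, y, w, h, _ = stats[label]
    x0, y0 = max(x - margin, 0), max(y - margin, 0)
    x1 = min(x + w + margin, gray.shape[1])
    y1 = min(y + h + margin, gray.shape[0])
    return gray[y0:y1, x0:x1]   # region P used for the subsequent circle fitting
```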
S7: performing circle fitting on the obtained contour region to obtain an initial circle center of the contour, and recording the initial circle center as c, wherein the method specifically comprises the following steps:
the equation for a circle in a cartesian coordinate system is shown in equation (4) where (a, b) denotes the center of the circle and r denotes the radius, and equations (5) and (6) can be derived from equation (4).
(x − a)² + (y − b)² = r²    (4)
a = x − r·cosθ    (5)
b = y − r·sinθ    (6)
Therefore, in the three-dimensional parameter space formed by a, b and r, one point defines one circle, while a single point in Cartesian coordinates corresponds to a curve in the (a, b, r) space. Assuming that n points lie on the same circle in Cartesian coordinates, they correspond to n curves in the (a, b, r) space, and these n curves intersect at one point. Therefore, by accumulating the number of curve intersections at each point of the (a, b, r) space, any point whose accumulated count exceeds a certain threshold is taken to define a circle.
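The initial fit of S7 can be sketched with OpenCV's Hough circle transform, which implements the (a, b, r) accumulation described above. Every parameter value below (dp, minDist, param1, param2 and the radius bounds) is an assumption chosen for illustration rather than a value from the patent.

```python
import cv2

def initial_center(region_p):
    """Rough Hough-based circle fit on region P; returns the initial center c and radius,
    or None if no accumulator peak is found."""
    circles = cv2.HoughCircles(
        region_p, cv2.HOUGH_GRADIENT, dp=1, minDist=region_p.shape[0],
        param1=100, param2=20, minRadius=3, maxRadius=region_p.shape[0] // 2)
    if circles is None:
        return None
    cx, cy, r = circles[0][0]          # strongest peak in the (a, b, r) accumulator
    return float(cx), float(cy), float(r)
```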
S8: taking the central point as the center, emitting rays to the periphery, and calculating the gradient amplitude;
Star rays are emitted to the periphery with c as the center; the length of each star ray just covers the whole region P. The gradient amplitudes are calculated as follows:
s81: with c as the center, emit rays to the periphery and traverse all the star rays, the angle interval between adjacent star rays being 1 degree;
s82: starting from c, perform pixel acquisition once every other pixel along the star ray in the current direction;
s83: extract the two pixels before and after the current pixel and calculate the gradient value;
s84: after all the gradient amplitudes of the current star ray have been calculated, repeat steps S81-S83 for the next star ray until all the star rays have been processed.
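A sketch of S8 under the stated parameters (1° angular spacing, one sample every other pixel, gradient from the samples before and after the current one); the sampling by coordinate truncation and the output layout are implementation assumptions.

```python
import numpy as np

def ray_gradients(region_p, center, step=2, angle_step_deg=1):
    """Sample gray values along star rays from `center` and return, per ray,
    the angle, the samples and the gradient magnitudes between neighbouring samples."""
    h, w = region_p.shape
    cx, cy = center
    max_len = int(np.hypot(h, w))            # long enough for the ray to cover region P
    rays = []
    for angle in range(0, 360, angle_step_deg):
        theta = np.deg2rad(angle)
        dx, dy = np.cos(theta), np.sin(theta)
        samples = []
        for k in range(0, max_len, step):    # pixel acquisition once every other pixel
            x, y = cx + k * dx, cy + k * dy
            if not (0 <= x < w and 0 <= y < h):
                break
            samples.append(int(region_p[int(y), int(x)]))
        # gradient at sample k: difference of the samples before and after it
        grads = [abs(samples[k + 1] - samples[k - 1]) for k in range(1, len(samples) - 1)]
        rays.append((angle, samples, grads))
    return rays
```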
S9: traversing all the star rays to obtain a maximum gradient value point on each star ray, and recording the maximum gradient value point as a key point;
in computer vision, for a certain point of a circumferential point, the gradient value of the point is relatively large with respect to the gradient value of the point inside the circle. Therefore, the detection of the point on the circumferential boundary can be realized by utilizing the characteristic, which is as follows:
s91: for each star ray, traversing the gradient values of all pixel points on the star ray (namely, in the direction of the star ray, the difference between two adjacent pixel values, and subtracting the previous pixel value from the next pixel value).
S92: finding the position of the pixel point with the maximum gradient value;
s93: a gradient threshold is set, for example, to 50 (empirical value).
S94: if the maximum gradient value on the star ray is smaller than the gradient threshold, no key point is considered to exist in that direction (the gradient value on the circumference cannot be too small) and the search continues on the next star ray; if the maximum gradient value is larger than the gradient threshold, the maximum gradient point is recorded as a key point and taken as the boundary point of the circumference in the direction of that star ray;
s95: and traversing all the star rays to obtain the key points on each star ray.
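A sketch of S9 using the per-ray gradients produced in S8 (the (angle, samples, grads) tuple layout matches the sketch above and is an assumption); a ray contributes a key point only when its largest gradient reaches the empirical threshold of 50.

```python
import numpy as np

def key_points(rays, center, step=2, grad_threshold=50):
    """Pick the maximum-gradient sample on each star ray as a circle boundary key point."""
    cx, cy = center
    points = []
    for angle, samples, grads in rays:
        if not grads or max(grads) < grad_threshold:
            continue                              # gradient too weak: no key point on this ray
        k = int(np.argmax(grads)) + 1             # +1 because grads[0] belongs to sample index 1
        r = k * step                              # distance of that sample from the center
        theta = np.deg2rad(angle)
        points.append((cx + r * np.cos(theta), cy + r * np.sin(theta)))
    return points
```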
S10: randomly selecting three key points, and finishing circle fitting through the key points to obtain a circle;
since noise points are inevitably present among the key points, there is a case where the key point detection is erroneous. These noisy and erroneous points can interfere with the circle fit, so i need to reduce the effect of noisy point interference.
The specific method comprises the following steps:
(1) acquiring a key point set, and randomly selecting three key points from the set each time;
(2) calculating, from the three key points by the perpendicular bisector method, the corresponding circle center position and radius to obtain a circle O_i: assuming three coordinate points A, B and C, connect A with B and B with C to obtain the line segments L_AB and L_BC; find the perpendicular bisector T_1 of L_AB and the perpendicular bisector T_2 of L_BC; the intersection of T_1 and T_2 is the circle center, and the distance from coordinate point A to this center is the radius;
(3) counting the remaining key points that were not randomly drawn and that lie on the circle O_i, and recording their ratio as B_i;
(4) circularly executing steps (1) to (3) 1000 times (1 ≤ i ≤ 1000);
(5) among these 1000 cycles, finding the circle with the highest B_i value, recording it as circle P and its center as c_p;
(6) if the maximum value of B_i is less than 0.9 (an empirical value), the fitting accuracy is considered relatively poor and circle fitting needs to be performed again;
(7) re-acquiring a more accurate circle center position and radius using the peripheral points corresponding to the circle P (namely, the key points whose distance to the circumference of the circle P is less than 3 pixels), as detailed in the sub-steps below.
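Before the sub-steps of item (7) are detailed, here is a sketch of items (1)-(7) above with the stated parameters (1000 draws, a 3-pixel band around the circumference, the 0.9 ratio check). The closed-form three-point circle below is algebraically equivalent to the perpendicular-bisector construction, and counting inliers over all key points rather than only the undrawn ones is a simplifying assumption.

```python
import math
import random

def circle_from_three_points(p1, p2, p3):
    """Center and radius of the circle through three points (perpendicular-bisector method)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if abs(d) < 1e-9:
        return None                       # collinear points define no circle
    s1, s2, s3 = x1**2 + y1**2, x2**2 + y2**2, x3**2 + y3**2
    cx = (s1 * (y2 - y3) + s2 * (y3 - y1) + s3 * (y1 - y2)) / d
    cy = (s1 * (x3 - x2) + s2 * (x1 - x3) + s3 * (x2 - x1)) / d
    return (cx, cy), math.hypot(x1 - cx, y1 - cy)

def fit_circle_from_key_points(points, iterations=1000, band=3.0, min_ratio=0.9):
    """Random three-point fits; the circle P with the highest ratio B_i wins."""
    best = None                           # (B_i, center c_p, radius, peripheral points)
    for _ in range(iterations):
        fit = circle_from_three_points(*random.sample(points, 3))
        if fit is None:
            continue
        (cx, cy), r = fit
        peripheral = [p for p in points
                      if abs(math.hypot(p[0] - cx, p[1] - cy) - r) < band]
        ratio = len(peripheral) / len(points)
        if best is None or ratio > best[0]:
            best = (ratio, (cx, cy), r, peripheral)
    needs_refinement = best is not None and best[0] < min_ratio   # item (6) check
    return best, needs_refinement
```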
The specific method comprises the following steps:
(71) the set of peripheral points corresponding to the circle P is denoted as I = {(x_i, y_i)};
(72) candidate coordinate points (X_k, Y_k) are found whose distance from the circle center c_p does not exceed 5 pixels;
(73) for each candidate point (X_k, Y_k), the sum of its distances to all coordinate points in the set I is computed and recorded as d_k, where j is the number of points in the set I:
d_k = Σ_{i=1}^{j} √((X_k − x_i)² + (Y_k − y_i)²)
(74) the minimum d_k is found; the corresponding coordinate point (X_k, Y_k) is the circle center, and the radius of the circle is r = d_k / j.
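A sketch of sub-steps (71)-(74), searching integer candidate centers within 5 pixels of c_p (the integer search grid is an assumption) and taking the radius as the mean distance d_k / j, which is how the reconstructed radius formula above is read here.

```python
import math

def refine_center(peripheral_points, c_p, search_radius=5):
    """Re-acquire the circle center and radius from the peripheral points of circle P."""
    cpx, cpy = c_p
    j = len(peripheral_points)
    best = None                                           # (d_k, (X_k, Y_k))
    for xk in range(int(cpx) - search_radius, int(cpx) + search_radius + 1):
        for yk in range(int(cpy) - search_radius, int(cpy) + search_radius + 1):
            if math.hypot(xk - cpx, yk - cpy) > search_radius:
                continue                                  # keep only candidates within 5 pixels of c_p
            d_k = sum(math.hypot(xk - x, yk - y) for x, y in peripheral_points)
            if best is None or d_k < best[0]:
                best = (d_k, (xk, yk))
    d_k, center = best
    return center, d_k / j                                # radius = mean distance to the boundary points
```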
S11: rechecking the circle obtained in the preceding steps and judging whether it needs to be filtered out.
After the final positioning is completed, verification of the circular positioning result is also required. The specific method comprises the following steps:
traversing the circumference, and solving and recording gradient values of all coordinate points on the circumference; sequencing the gradient values in a sequence from small to large; acquiring a gradient value at the 10 th% position and a gradient value at the 90 th% position in the gradient value sequencing sequence, wherein the two values are the same; the absolute value of the difference between the two gradient values is calculated. Since it is unlikely that the gradient maxima and minima will differ significantly on the same circle, a circular location will be considered successful if the difference is less than or equal to a threshold value (e.g., 20). If the value is larger than the threshold value, the positioning is failed, and the calibration point cannot be used for calibrating the projection interaction.
The above-mentioned embodiments are merely illustrative of the preferred embodiments of the present invention, and do not limit the scope of the present invention, and various modifications and improvements of the technical solution of the present invention by those skilled in the art should fall within the protection scope defined by the claims of the present invention without departing from the spirit of the present invention.

Claims (5)

1. A method for accurately positioning the center coordinates of a calibration point is characterized by comprising the following steps:
(1) drawing a calibration point on a computer screen;
(2) collecting a computer projection picture, and acquiring a gray image of a calibration point;
(3) carrying out thresholding operation on the gray level image;
(4) detecting a connected domain of the image after the threshold value to obtain a calibration point region;
(5) analyzing each connected domain, calculating to obtain the center coordinates of the calibration point connected domain and the circumscribed rectangle of the connected domain, and enlarging the size of the circumscribed rectangle;
(6) extracting the outline corresponding to the connected domain of the calibration point of the corresponding area in the gray level image according to the size and the center point coordinate of the expanded circumscribed rectangle;
(7) performing circle fitting on the obtained contour region to obtain an initial central point of the contour;
(8) taking the central point as the center, emitting rays to the periphery, and calculating the gradient amplitude;
(9) traversing all the rays to obtain a gradient amplitude maximum point on each ray, and recording the gradient amplitude maximum point as a key point;
(10) randomly selecting three key points, and finishing circle fitting through the key points to obtain a circle;
the method comprises the following steps of randomly selecting three key points, completing circle fitting through the key points, and obtaining a circle, wherein the method specifically comprises the following steps:
(101) acquiring a key point set, and randomly selecting three key points from the set each time;
(102) calculating, from the three key points by the perpendicular bisector method, the corresponding circle center position and radius to obtain a circle O_i;
(103) counting the remaining key points that were not randomly drawn and that lie on the circle O_i, and recording their ratio as B_i;
(104) circularly executing steps (101) to (103);
(105) among all the cycles, finding the circle with the highest B_i value, recording it as circle P and its center as c_p;
(106) if the maximum value of B_i is less than 0.9, performing circle fitting again: re-acquiring the circle center position and radius using the peripheral points corresponding to the circle P;
(11) rechecking the obtained circle and verifying the positioning result.
2. The method for accurately positioning the center coordinates of a calibration point according to claim 1, wherein in step (8), emitting rays to the periphery with the central point as the center and calculating the gradient amplitudes comprises the following steps:
(81) taking a central point as a center, emitting rays to the periphery, wherein the angle interval between adjacent rays is 1 degree;
(82) traversing all rays, starting from a central point, and performing pixel acquisition once every other pixel along the rays in the current direction;
(83) extracting two pixels before and after the current pixel, and calculating a gradient amplitude;
(84) all gradient magnitudes on each ray are computed cyclically.
3. The method for accurately positioning the center coordinates of a calibration point according to claim 1, wherein in step (9), traversing all the rays to obtain the point with the maximum gradient amplitude on each ray and recording it as a key point specifically comprises the following steps:
(91) traversing the gradient amplitudes of all pixel points on each ray;
(92) finding the position of the pixel point with the maximum gradient amplitude;
(93) setting a gradient threshold; if the maximum gradient amplitude on a ray is smaller than the gradient threshold, no key point exists in that direction and the search continues on the next ray; if the maximum gradient amplitude is larger than the gradient threshold, recording the maximum gradient point as a key point;
(94) and traversing all the rays to obtain key points on each ray.
4. The method for accurately positioning the center coordinates of a calibration point according to claim 1, wherein in step (106), re-acquiring the circle center position and the radius using the peripheral points corresponding to the circle P specifically includes the following steps:
(A) denoting the set of peripheral points corresponding to the circle P as I = {(x_i, y_i)};
(B) finding candidate coordinate points (X_k, Y_k) whose distance from the circle center c_p does not exceed 5 pixels;
(C) for each candidate point (X_k, Y_k), computing the sum of its distances to all coordinate points in the set I, recorded as d_k, where j is the number of points in the set I:
d_k = Σ_{i=1}^{j} √((X_k − x_i)² + (Y_k − y_i)²)
(D) finding the minimum d_k; the corresponding coordinate point (X_k, Y_k) is the circle center position, and the radius of the circle is r = d_k / j.
5. The method for accurately positioning the center coordinates of a calibration point according to claim 1, wherein in step (11), rechecking the acquired circle and verifying the positioning result specifically includes the following steps:
(111) traversing the circumference, and solving and recording the gradient amplitudes of all coordinate points on the circumference;
(112) sorting the gradient amplitudes in a sequence from small to large;
(113) acquiring the gradient amplitude of the 10% position and the gradient amplitude of the 90% position;
(114) calculating the absolute value of the difference between the two gradient amplitudes, wherein if the difference is less than or equal to the threshold, the circular positioning is successful, and if the difference is greater than the threshold, the positioning has failed and the calibration point cannot be used for calibrating the projection interaction.
CN201810042165.3A 2018-01-16 2018-01-16 Method for accurately positioning center coordinates of calibration points Active CN108205395B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810042165.3A CN108205395B (en) 2018-01-16 2018-01-16 Method for accurately positioning center coordinates of calibration points

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810042165.3A CN108205395B (en) 2018-01-16 2018-01-16 Method for accurately positioning center coordinates of calibration points

Publications (2)

Publication Number Publication Date
CN108205395A CN108205395A (en) 2018-06-26
CN108205395B true CN108205395B (en) 2021-03-23

Family

ID=62605603

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810042165.3A Active CN108205395B (en) 2018-01-16 2018-01-16 Method for accurately positioning center coordinates of calibration points

Country Status (1)

Country Link
CN (1) CN108205395B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109014725B (en) * 2018-08-28 2021-03-23 昆山华恒焊接股份有限公司 Method and device for positioning pipe hole of workpiece and computer storage medium
CN109859198B (en) * 2019-02-01 2020-10-30 佛山市南海区广工大数控装备协同创新研究院 Large-breadth PCB multi-region accurate positioning method
CN110412257B (en) * 2019-07-22 2022-05-03 深圳市预防宝科技有限公司 Test paper block positioning method combining manual calibration and star ray algorithm
CN110647890B (en) * 2019-08-28 2022-05-27 惠州市德赛西威智能交通技术研究院有限公司 High-performance image feature extraction and matching method, system and storage medium
CN110610524B (en) * 2019-08-30 2022-06-17 广东奥普特科技股份有限公司 Camera calibration point coordinate calculation method
CN111339982A (en) * 2020-03-05 2020-06-26 西北工业大学 Multi-stage pupil center positioning technology implementation method based on features
CN111627058B (en) * 2020-04-17 2024-07-23 联想(北京)有限公司 Method, equipment and storage medium for positioning visual center point
CN112033313B (en) * 2020-10-10 2022-01-11 成都瑞拓科技有限责任公司 Eccentricity detection method for wet blasting bead
CN113920324B (en) * 2021-12-13 2022-04-01 广州思德医疗科技有限公司 Image recognition method and device, electronic equipment and storage medium
CN115082552B (en) * 2022-07-25 2022-12-27 荣耀终端有限公司 Marking hole positioning method and device, assembly equipment and storage medium
CN117080142B (en) * 2023-10-11 2024-02-06 迈为技术(珠海)有限公司 Positioning method for center point of alignment mark and wafer bonding method

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6859555B1 (en) * 2000-09-19 2005-02-22 Siemens Corporate Research, Inc. Fast dominant circle detection through horizontal and vertical scanning
CN102298467A (en) * 2010-06-24 2011-12-28 北京威亚视讯科技有限公司 Automatic calibration method and system for display screen
JP2012059171A (en) * 2010-09-13 2012-03-22 Seiko Epson Corp Optical detection system, electronic apparatus and program
CN104794704A (en) * 2015-03-27 2015-07-22 华为技术有限公司 Calibration template and template detection method, device and terminal
CN105469084A (en) * 2015-11-20 2016-04-06 中国科学院苏州生物医学工程技术研究所 Rapid extraction method and system for target central point
CN106204544A (en) * 2016-06-29 2016-12-07 南京中观软件技术有限公司 A kind of automatically extract index point position and the method and system of profile in image
CN106340044A (en) * 2015-07-09 2017-01-18 上海振华重工电气有限公司 Camera external parameter automatic calibration method and calibration device
CN106372593A (en) * 2016-08-30 2017-02-01 上海交通大学 Optic disc area position method based on blood vessel convergence
CN106384355A (en) * 2016-09-21 2017-02-08 安徽慧视金瞳科技有限公司 Automatic calibration method applied to projection interactive system
CN106669139A (en) * 2016-12-03 2017-05-17 西安科锐盛创新科技有限公司 Auxiliary selecting method for electronic-sport-game players
CN106780615A (en) * 2016-11-23 2017-05-31 安徽慧视金瞳科技有限公司 A kind of Projection surveying method based on intensive sampling

Also Published As

Publication number Publication date
CN108205395A (en) 2018-06-26

Similar Documents

Publication Publication Date Title
CN108205395B (en) Method for accurately positioning center coordinates of calibration points
CN111243032B (en) Full-automatic detection method for checkerboard corner points
CN106092086B (en) A kind of quick, high robust robot indoor orientation method based on panoramic vision
CN105654507B (en) A kind of vehicle overall dimension measurement method based on the tracking of image behavioral characteristics
WO2018133130A1 (en) 3d marker model construction and real-time tracking using monocular camera
CN105261022B (en) PCB board matching method and device based on outer contour
CN109215016B (en) Identification and positioning method for coding mark
CN103136525B (en) High-precision positioning method for special-shaped extended target by utilizing generalized Hough transformation
CN107203742B (en) Gesture recognition method and device based on significant feature point extraction
CN108986129B (en) Calibration plate detection method
CN111612841A (en) Target positioning method and device, mobile robot and readable storage medium
CN107192716A (en) A kind of workpiece, defect quick determination method based on contour feature
CN108805823B (en) Commodity image correction method, system, equipment and storage medium
CN111222507B (en) Automatic identification method for digital meter reading and computer readable storage medium
CN111833405B (en) Calibration and identification method and device based on machine vision
CN110415296B (en) Method for positioning rectangular electric device under shadow illumination
CN106815830B (en) Image defect detection method
CN110555879A (en) Space positioning method, device, system and computer readable medium thereof
CN112767497A (en) High-robustness calibration device based on circular calibration plate and positioning method
CN113129397B (en) Decoding method of parallelogram coding mark based on graphic geometric relation
CN104766331A (en) Imaging processing method and electronic device
CN113313725A (en) Bung hole identification method and system for energetic material medicine barrel
CN105718929B (en) The quick round object localization method of high-precision and system under round-the-clock circumstances not known
CN111709954A (en) Calibration method of go robot vision system
CN115880303A (en) Sub-pixel precision positioning detection method and system for PCB circular hole

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 230000 Yafu Park, Juchao Economic Development Zone, Chaohu City, Hefei City, Anhui Province

Applicant after: ANHUI HUISHI JINTONG TECHNOLOGY Co.,Ltd.

Address before: 102, room 602, C District, Hefei National University, Mount Huangshan Road, 230000 Hefei Road, Anhui, China

Applicant before: ANHUI HUISHI JINTONG TECHNOLOGY Co.,Ltd.

GR01 Patent grant