CN109993799B - Ultraviolet camera calibration method and calibration device


Info

Publication number: CN109993799B (application CN201910177334.9A)
Authority: CN (China)
Prior art keywords: circle, ultraviolet, camera, ultraviolet lamp, image
Legal status: Active
Application number: CN201910177334.9A
Other languages: Chinese (zh)
Other versions: CN109993799A (en)
Inventor
陈锦龙
陈俊全
杜江
安成
张丹
王兴国
吴珂
范翔
黄晓旭
朱椤方
张旭
陈恩黔
赵翔宇
贺先强
郑凯文
Current Assignee: Guizhou Power Grid Co Ltd
Original Assignee: Guizhou Power Grid Co Ltd
Application filed by Guizhou Power Grid Co Ltd
Priority to CN201910177334.9A
Publication of CN109993799A
Application granted
Publication of CN109993799B
Status: Active
Anticipated expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Photometry And Measurement Of Optical Pulse Characteristics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses an ultraviolet camera calibration method comprising the following steps. Step 1: establish a world coordinate system with the plane of the calibration plate in the ultraviolet camera calibration device as the XoY plane and the direction directly in front of the camera as the Z axis, and determine the coordinate value of each ultraviolet lamp bead on the calibration plate in the world coordinate system. Step 2: acquire the calibration plate image shot by the ultraviolet camera. Step 3: eliminate part of the noise. Step 4: perform threshold segmentation on the denoised image. Step 5: perform circle fitting on the result of the threshold segmentation. Step 6: perform non-maximum suppression on the result of the circle fitting. Step 7: solve in sequence the correspondence between all ultraviolet lamp beads on the calibration plate and their sub-pixel-precision image feature points. The method effectively solves the problem that a conventional calibration plate cannot image effectively under an ultraviolet camera. In addition, the invention also provides an ultraviolet camera calibration device.

Description

Ultraviolet camera calibration method and calibration device
Technical Field
The invention relates to the field of machine vision processing, in particular to an ultraviolet camera calibration method and an ultraviolet camera calibration device.
Background
In the prior art, a camera projects an object in the three-dimensional world onto the image plane according to a specific imaging geometric model; this process is called imaging. The geometric model can be described by a set of parameters, the camera parameters, which must be determined by experiment and calculation.
For a generic camera, the imaging geometry model is represented by the following formula, where $z_c$ is a projective scale factor:

$$z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = M_1 M_2 \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} = M \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

In the above formula, $[X_w\ Y_w\ Z_w]$ represents a calibration object point in the world coordinate system, and $[u\ v]$ represents the corresponding position of that point in the image. The elements of the matrix $M_1$ depend only on the internal structure of the camera and are called the camera intrinsic parameters; the matrix $M_2$ is completely determined by the orientation of the camera relative to the world coordinate system and contains the camera extrinsic parameters; the matrix $M$ is called the camera projection matrix. The process of determining the unknown intrinsic and extrinsic parameters is camera calibration.
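As a quick numeric illustration of the projection model above, the sketch below applies a hypothetical 3×4 projection matrix $M$ (the focal length, principal point, and distance are illustrative values, not taken from the patent) to a world point on the Z = 0 calibration plane:

```python
import numpy as np

# Hypothetical projection matrix M = M1 * M2 for a camera with focal
# length 800 px, principal point (320, 240), viewing the board from
# 1000 mm away along the Z axis (illustrative values only).
M = np.array([[800.0,   0.0, 320.0, 320000.0],
              [  0.0, 800.0, 240.0, 240000.0],
              [  0.0,   0.0,   1.0,   1000.0]])

P_w = np.array([10.0, -5.0, 0.0, 1.0])   # lamp bead on the Z = 0 board, mm

uvw = M @ P_w                            # homogeneous image coordinates
u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]  # perspective division
print(u, v)  # → 328.0 236.0
```

The bead lands 8 px right of and 4 px above the principal point, consistent with a point 10 mm right and 5 mm below the optical axis at 1 m.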
For a general camera, the calibration object used is a checkerboard calibration plate, and each intersection of the black and white squares on the plate represents a calibration object point. By shooting the checkerboard image with the camera and detecting the black-white intersections in the image, the correspondence between calibration object points and image feature points is obtained, from which all camera intrinsic and extrinsic parameters are calculated.
However, for the ultraviolet camera, which detects ultraviolet targets in the solar-blind band of ultraviolet light, the conventional checkerboard calibration plate cannot image at all. An ultraviolet image has low resolution and high noise, responds only to ultraviolet targets, is very close to a binary image, and cannot provide target details. Existing feature point detection methods have high requirements on image detail and definition, so when detecting feature points in an ultraviolet image, the detected positions often deviate considerably from the actual feature points, degrading calibration accuracy. In addition, different ultraviolet cameras may have different angles of view, and the ultraviolet camera calibration must be able to accommodate these variations.
Therefore, it is necessary to provide an effective ultraviolet camera calibration method and an ultraviolet camera calibration apparatus to solve the above problems.
Disclosure of Invention
In view of this, the present invention aims to provide an ultraviolet camera calibration method, which effectively solves the problem that the conventional calibration plate cannot effectively image under an ultraviolet camera. The invention also aims to provide a calibration device for the ultraviolet camera.
One of the purposes of the invention is realized by the following technical scheme:
an ultraviolet camera calibration method, comprising the steps of:
Step 1: establish a world coordinate system with the plane of the calibration plate in the ultraviolet camera calibration device as the XoY plane and the direction directly in front of the camera as the Z axis, and determine the coordinate value of each ultraviolet lamp bead on the calibration plate in the world coordinate system;
Step 2: acquire the calibration plate image shot by the ultraviolet camera;
Step 3: eliminate part of the noise;
Step 4: perform threshold segmentation on the denoised image;
Step 5: perform circle fitting on the result of the threshold segmentation;
Step 6: perform non-maximum suppression on the result of the circle fitting;
Step 7: solve in sequence the correspondence between all ultraviolet lamp beads on the calibration plate and their sub-pixel-precision image feature points.
In particular, said step 6 comprises the following sub-steps:
Step 6.1: assign a score to each circle obtained by fitting, computed as the average brightness of all pixels inside the circle; this brightness is the score of the circle.
Step 6.2: group the circle fitting results. Traverse all fitting results and cluster by the distance between circle centers: if the distance between the current circle's center and the center of some existing group is less than a threshold R, assign the current circle to that group; otherwise start a new group. Finally, if the number of groups in the clustering result is less than the number of ultraviolet lamp beads on the calibration plate, the lamp-bead regions obtained by segmentation in the acquired image are insufficient, and the image must be acquired again.
Step 6.3: perform non-maximum suppression within each circle-fitting group, where one group represents all circles extracted from the local region of one ultraviolet lamp bead. The scores $m_i$ together with the circle-center positions $(x_i, y_i)$ form a scoring surface. The extremum of this surface is determined by the centroid method: each center $(x_i, y_i)$ is treated as a particle with its score $m_i$ as its mass, and the centroid $(\bar{x}, \bar{y})$, taken as the estimate of the extremum, is

$$\bar{x} = \frac{\sum_i m_i x_i}{\sum_i m_i}, \qquad \bar{y} = \frac{\sum_i m_i y_i}{\sum_i m_i}$$

Step 6.4: the centroid is the sub-pixel-accurate position of the feature point corresponding to the ultraviolet lamp bead.
In particular, in step 5, the center of each fitted circle represents the position in the image of the feature point corresponding to one ultraviolet lamp bead.
In particular, in step 7, the following equations are established from the obtained correspondence between each ultraviolet lamp bead and its sub-pixel-precision image feature point:

$$X_{wi} m_{11} + Y_{wi} m_{12} + Z_{wi} m_{13} + m_{14} - u_i X_{wi} m_{31} - u_i Y_{wi} m_{32} - u_i Z_{wi} m_{33} = u_i m_{34}$$

$$X_{wi} m_{21} + Y_{wi} m_{22} + Z_{wi} m_{23} + m_{24} - v_i X_{wi} m_{31} - v_i Y_{wi} m_{32} - v_i Z_{wi} m_{33} = v_i m_{34}$$

where $m_{ij}$ denotes the corresponding element of the matrix $M$. Listing these two equations for each feature point forms a system of equations; solving the system yields all elements of $M$, from which all camera intrinsic and extrinsic parameters are calculated.
The second purpose of the invention is realized by the following technical scheme:
an ultraviolet camera calibration device, comprising:
the ultraviolet lamp bead calibration plate is used for fixing the ultraviolet lamp beads according to an array and controlling the brightness of the ultraviolet lamp beads through a current control device;
the ultraviolet camera calibration box body is used for installing an ultraviolet lamp bead calibration plate, and the calibration plate is arranged opposite to the ultraviolet camera;
the sliding rail set is used for controlling the movement of the camera;
and the camera carrying platform is connected with the sliding rail set and used for fixing the ultraviolet camera.
In particular, the slide rail set comprises three sets of slide rails which are respectively used for controlling the camera to move up and down, horizontally and forwards and backwards.
The beneficial effects of the invention are:
(1) The ultraviolet camera calibration method provided by the invention uses an ultraviolet lamp-bead array as the calibration object, effectively solving the problem that a conventional calibration plate cannot image under an ultraviolet camera. For the imaging of the lamp beads, salt noise is filtered first and the feature points are then solved by ellipse fitting: small-area salt noise is filtered out directly, and even large-area salt noise cannot be fitted into a valid ellipse, so the method effectively suppresses the dominant noise in ultraviolet images;
(2) The method applies non-maximum suppression to the result of the ellipse fitting, which on one hand effectively eliminates the problem of irregular imaging edges of the ultraviolet lamp beads, and on the other hand extracts image feature points with sub-pixel precision, which is very beneficial to improving the calibration accuracy of the ultraviolet camera;
(3) According to the ultraviolet camera calibration box provided by the invention, on one hand, operators are prevented from being exposed to ultraviolet irradiation; on the other hand, through the three groups of sliding rails, the ultraviolet camera can be operated to move in three dimensions, and the ultraviolet camera can adapt to ultraviolet cameras with different field angles and even different lens mounting positions.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof.
Drawings
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described in detail with reference to the accompanying drawings, in which:
FIG. 1 is a flow chart of ultraviolet camera calibration;
FIG. 2 is a structural diagram of an ultraviolet lamp bead calibration plate;
FIG. 3 is a structural diagram of the ultraviolet camera calibration box.
Detailed Description
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. It should be understood that the preferred embodiments are illustrative of the invention only and are not limiting upon the scope of the invention.
The design concept of the invention is as follows: actively luminous ultraviolet lamp beads replace the reflective calibration object used in conventional camera calibration, solving the problem that a calibration object is difficult to image under an ultraviolet camera. The calibration object shot by the ultraviolet camera is the set of lamp beads; because the ultraviolet image has low resolution and each bead has high brightness, each bead images as a light spot with irregular edges. The center of the spot is the feature point position, so circle fitting must be performed on the spot to extract it, but the irregular edges may cause several circles to be fitted. The invention therefore designs a non-maximum suppression procedure that computes the sub-pixel-accurate feature point position from the multiple circles, improving the calibration precision.
In order to implement the inventive concept, the present invention first proposes an ultraviolet camera calibration apparatus, as shown in fig. 3, including:
(1) The ultraviolet lamp bead calibration plate 1 is used for fixing ultraviolet lamp beads according to an array and controlling the brightness of the ultraviolet lamp beads through a current control device; as shown in fig. 2;
(2) The ultraviolet camera calibration box body 2 is used for installing an ultraviolet lamp bead calibration plate, and the calibration plate is arranged opposite to the ultraviolet camera 5;
(3) The sliding rail group is used for controlling the movement of the camera; in this embodiment, the slide rail set includes three sets of slide rails for controlling the camera to move up and down, horizontally, and back and forth, respectively, i.e., the up-down guide rail 31, the horizontal guide rail 32, and the front-back guide rail 33 in fig. 3;
(4) And the camera carrying platform 4 is connected with the sliding rail set and used for fixing the ultraviolet camera.
Referring to FIG. 1, the process of the invention can be carried out as follows:
First, build an ultraviolet camera calibration box according to the ultraviolet camera calibration device described above; in this calibration box the ultraviolet lamp-bead array is set to 5 × 8 (5 rows and 8 columns, 40 lamp beads in total). Fix the ultraviolet camera on the camera carrier stage, adjust the horizontal and vertical slide rails so that the camera center is roughly aligned with the center of the lamp-bead array, and adjust the front-back slide rail so that the camera can capture all the lamp beads. Adjust the brightness of the lamp beads appropriately so that they form light spots of a certain size that do not overlap on the ultraviolet image. The calibration method comprises the following steps:
step 1: constructing a world coordinate system by taking the plane of the ultraviolet lamp bead array as an XoY plane, taking the right front of the camera as a Z axis and taking the center of the ultraviolet lamp bead array as an original point, and obtaining the world coordinate system coordinate (taking millimeters as a unit) of each ultraviolet lamp bead according to the world coordinate system coordinate;
and 2, step: acquiring the calibration plate image (ultraviolet lamp bead array) shot by an ultraviolet camera;
and step 3: noise reduction, in this embodiment, a median filter with a size of 7 × 7 is specifically used to perform median filtering on the ultraviolet image, so as to eliminate partial salt noise;
and 4, step 4: the image after noise reduction is subjected to threshold segmentation, an OSTU algorithm is specifically used to calculate a suitable threshold T in the embodiment, and then the threshold is used to perform threshold segmentation on the calibration board image, and the highlight area is the foreground. The OSTU algorithm is an adaptive threshold segmentation method, and the idea is to divide an image into a background part and a target part, and the inter-class variance between the background and the target is the largest. Let the image be I (x, y), the segmentation threshold of foreground and target be T, and the proportion of background pixel points to all pixel points be w 0 With an average gray level of mu 0 The ratio of the target pixel point to all the pixel points is w 1 Average gray of μ 1 And the integral average gray level of the image is mu, and the inter-class variance g is calculated by using a formula (1):
g=w 00 -μ) 2 +w 11 -μ) 2 (1)
traversing and calculating to obtain a segmentation threshold T which enables the inter-class variance to be maximum;
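Formula (1) translates directly into code. The sketch below scans all candidate thresholds on a toy pixel list (the values are illustrative, and a real implementation would work on a 256-bin histogram for speed):

```python
# OTSU: choose the threshold T maximizing the between-class variance
# g = w0*(mu0 - mu)**2 + w1*(mu1 - mu)**2, formula (1).
def otsu_threshold(pixels):
    n = len(pixels)
    mu = sum(pixels) / n                    # overall mean gray level
    best_t, best_g = 0, -1.0
    for t in range(256):
        bg = [p for p in pixels if p < t]   # background class
        fg = [p for p in pixels if p >= t]  # target (bright) class
        if not bg or not fg:
            continue
        w0, w1 = len(bg) / n, len(fg) / n
        mu0, mu1 = sum(bg) / len(bg), sum(fg) / len(fg)
        g = w0 * (mu0 - mu) ** 2 + w1 * (mu1 - mu) ** 2
        if g > best_g:
            best_g, best_t = g, t
    return best_t

pixels = [10] * 50 + [240] * 10   # dark background plus bright lamp spots
T = otsu_threshold(pixels)
print(T)  # some value in (10, 240] separating the two classes
```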
and 5: performing circle fitting on the result of the threshold segmentation, recording all fitting results, and giving a score to each circle, wherein the score value is the average value of the brightness of all pixels in the circle; traversing the fitting result, clustering by using the distance between the circle centers, and if the clustering result is less than 40 groups, indicating that the ultraviolet lamp bead regions obtained by segmentation in the acquired image are insufficient (cannot correspond to the ultraviolet lamp beads one by one), and at the moment, failing to calibrate. The horizontal, vertical and front and back slides should be readjusted by returning to the previous step and the calibration plate image is reacquired. If clustering is successful, each group represents a fitting result of an ultraviolet lamp bead on the calibration plate image;
and 6: and carrying out non-maximum suppression on the fitting result of each ultraviolet lamp bead. For each group, the location of the extremum is determined using a centroid method. Each circle center position (x) in the group i ,y i ) As particles, score m i The mass center is obtained by taking the mass center as the mass of the particle and the mass center as the estimation of an extreme value
Figure GDA0002076556260000051
Comprises the following steps: />
Figure GDA0002076556260000052
The centroid is the position of the corresponding feature point of the ultraviolet lamp bead represented by the group in the ultraviolet image. And calculating the positions of the characteristic points corresponding to all the ultraviolet lamp beads.
And 7: and calculating all internal parameters and external parameters of the ultraviolet camera according to the corresponding relation between the ultraviolet lamp beads and the image feature points of the sub-pixel precision. The specific calculation process is as follows:
for each ultraviolet lamp bead and the image characteristic points thereof, two equations can be written:
X wi m 11 +Y wi m 12 +Z wi m 13 +m 14 -u i X wi m 31 -u i Y wi m 32 -u i Z wi m 33 =u i m 34
X wi m 21 +Y wi m 22 +Z wi m 23 +m 24 -v i X wi m 31 -v i Y wi m 32 -v i Z wi m 33 =v i m 34
for all 40 ultraviolet lamp beads, 80 equations can be listed, and the equation is written in a matrix form as follows:
Figure GDA0002076556260000061
the simplification is in matrix form:
Km=U
the solution of the above equation set is calculated using the least squares method as:
m=(K T K) -1 K T U
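The construction of $K$ and the least-squares solve can be sketched as below. `numpy.linalg.lstsq` stands in for the explicit normal-equations formula, and $m_{34}$ is fixed to 1 as a scale convention, an assumption made here because the projection matrix is only defined up to scale; non-coplanar synthetic points are used so the sketch is well-posed:

```python
import numpy as np

# Build the 2n x 11 DLT system K m = U and solve it by least squares;
# the 11-vector m holds m11..m33, and m34 is fixed to 1.
def solve_projection(world_pts, image_pts):
    K, U = [], []
    for (Xw, Yw, Zw), (u, v) in zip(world_pts, image_pts):
        K.append([Xw, Yw, Zw, 1, 0, 0, 0, 0, -u * Xw, -u * Yw, -u * Zw])
        K.append([0, 0, 0, 0, Xw, Yw, Zw, 1, -v * Xw, -v * Yw, -v * Zw])
        U += [u, v]
    m, *_ = np.linalg.lstsq(np.asarray(K, float), np.asarray(U, float),
                            rcond=None)
    return np.append(m, 1.0).reshape(3, 4)   # full 3x4 projection matrix

# Round-trip check against a known (hypothetical) matrix.
M_true = np.array([[800., 0., 320., 10.],
                   [0., 800., 240., 20.],
                   [0.,   0.,   1.,  1.]])
pts = [(0., 0., 0.), (1., 0., 0.), (0., 1., 0.), (0., 0., 1.),
       (1., 1., 1.), (2., -1., 3.), (-1., 2., 1.), (1., 2., 2.)]
uv = []
for p in pts:
    w = M_true @ np.array([*p, 1.0])
    uv.append((w[0] / w[2], w[1] / w[2]))  # project and divide
M_est = solve_projection(pts, uv)
```

With noise-free correspondences the system is consistent, so the least-squares solution recovers `M_true` exactly up to floating-point error.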
Thus the camera projection matrix $M$ is obtained; the camera intrinsic and extrinsic parameters are then calculated by decomposing $M$. Writing $M$ row-wise as

$$M = \begin{bmatrix} \mathbf{m}_1^{\mathsf T} & m_{14} \\ \mathbf{m}_2^{\mathsf T} & m_{24} \\ \mathbf{m}_3^{\mathsf T} & m_{34} \end{bmatrix}$$

and comparing it with the pinhole decomposition

$$M = \begin{bmatrix} \alpha_x & 0 & u_0 & 0 \\ 0 & \alpha_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & \mathbf{t} \\ \mathbf{0}^{\mathsf T} & 1 \end{bmatrix} = \begin{bmatrix} \alpha_x \mathbf{r}_1^{\mathsf T} + u_0 \mathbf{r}_3^{\mathsf T} & \alpha_x t_x + u_0 t_z \\ \alpha_y \mathbf{r}_2^{\mathsf T} + v_0 \mathbf{r}_3^{\mathsf T} & \alpha_y t_y + v_0 t_z \\ \mathbf{r}_3^{\mathsf T} & t_z \end{bmatrix}$$

by comparing matrix elements at the same positions, all the intrinsic and extrinsic parameters can be calculated as:

$$u_0 = m_{34}^2\, \mathbf{m}_1^{\mathsf T} \mathbf{m}_3 \qquad v_0 = m_{34}^2\, \mathbf{m}_2^{\mathsf T} \mathbf{m}_3$$

$$\alpha_x = m_{34}^2\, \|\mathbf{m}_1 \times \mathbf{m}_3\| \qquad \alpha_y = m_{34}^2\, \|\mathbf{m}_2 \times \mathbf{m}_3\|$$

$$\mathbf{r}_1 = \frac{m_{34}}{\alpha_x}(\mathbf{m}_1 - u_0 \mathbf{m}_3) \qquad \mathbf{r}_2 = \frac{m_{34}}{\alpha_y}(\mathbf{m}_2 - v_0 \mathbf{m}_3) \qquad \mathbf{r}_3 = m_{34} \mathbf{m}_3$$

$$t_x = \frac{m_{34}(m_{14} - u_0)}{\alpha_x} \qquad t_y = \frac{m_{34}(m_{24} - v_0)}{\alpha_y} \qquad t_z = m_{34}$$

where $m_{34}$ is fixed by the constraint $\|\mathbf{r}_3\| = \|m_{34}\mathbf{m}_3\| = 1$, i.e. $m_{34} = 1/\|\mathbf{m}_3\|$.
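A sketch of this decomposition under the zero-skew pinhole model is given below. It uses an equivalent normalized form: scaling $M$ so that the first three entries of its third row have unit norm absorbs the $m_{34}$ factors, which is an assumption about conventions rather than the patent's literal formulas (it also assumes the scale sign already gives $t_z > 0$):

```python
import numpy as np

# Decompose a 3x4 projection matrix M = K [R | t] into intrinsics
# (ax, ay, u0, v0), rotation R, and translation t (zero-skew model).
def decompose(M):
    M = M / np.linalg.norm(M[2, :3])       # enforce ||r3|| = 1
    m1, m2, m3 = M[0, :3], M[1, :3], M[2, :3]
    m14, m24, m34 = M[0, 3], M[1, 3], M[2, 3]
    u0, v0 = m1 @ m3, m2 @ m3              # principal point
    ax = np.linalg.norm(np.cross(m1, m3))  # focal length in x (pixels)
    ay = np.linalg.norm(np.cross(m2, m3))  # focal length in y (pixels)
    r1 = (m1 - u0 * m3) / ax
    r2 = (m2 - v0 * m3) / ay
    t = np.array([(m14 - u0 * m34) / ax, (m24 - v0 * m34) / ay, m34])
    return (ax, ay, u0, v0), np.vstack([r1, r2, m3]), t

# Round-trip check with a hypothetical M = K [I | t], t = (1, 2, 3).
M_true = np.array([[800., 0., 320., 1760.],
                   [0., 800., 240., 2320.],
                   [0.,   0.,   1.,    3.]])
(ax, ay, u0, v0), R, t = decompose(M_true)
print(ax, ay, u0, v0)  # → 800.0 800.0 320.0 240.0
```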
The method applies non-maximum suppression to the result of the ellipse fitting, which on one hand effectively eliminates the problem of irregular imaging edges of the ultraviolet lamp beads, and on the other hand extracts image feature points with sub-pixel precision, which is very beneficial to improving the calibration accuracy of the ultraviolet camera.
Finally, although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that various changes and modifications may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (3)

1. An ultraviolet camera calibration method, characterized in that it comprises the following steps:
Step 1: establish a world coordinate system with the plane of the calibration plate in the ultraviolet camera calibration device as the XoY plane and the direction directly in front of the camera as the Z axis, and determine the coordinate value of each ultraviolet lamp bead on the calibration plate in the world coordinate system;
Step 2: acquire the calibration plate image shot by the ultraviolet camera;
Step 3: eliminate part of the noise;
Step 4: perform threshold segmentation on the denoised image;
Step 5: perform circle fitting on the result of the threshold segmentation;
Step 6: perform non-maximum suppression on the result of the circle fitting, said step 6 comprising the following sub-steps:
Step 6.1: assign a score to each circle obtained by fitting, computed as the average brightness of all pixels inside the circle, this brightness being the score of the circle;
Step 6.2: group the circle fitting results: traverse all fitting results and cluster by the distance between circle centers; if the distance between the current circle's center and the center of an existing group is less than a threshold R, assign the current circle to that group, otherwise start a new group; finally, if the number of groups in the clustering result is less than the number of ultraviolet lamp beads on the calibration plate, the lamp-bead regions obtained by segmentation in the acquired image are insufficient and the image must be acquired again;
Step 6.3: perform non-maximum suppression within each circle-fitting group, one group representing all circles extracted from the local region of one ultraviolet lamp bead; the scores $m_i$ together with the circle-center positions $(x_i, y_i)$ form a scoring surface whose extremum is determined by the centroid method: each center $(x_i, y_i)$ is treated as a particle with its score $m_i$ as its mass, and the centroid $(\bar{x}, \bar{y})$, taken as the estimate of the extremum, is

$$\bar{x} = \frac{\sum_i m_i x_i}{\sum_i m_i}, \qquad \bar{y} = \frac{\sum_i m_i y_i}{\sum_i m_i}$$

Step 6.4: obtain from the centroid the sub-pixel-accurate position of the feature point corresponding to the ultraviolet lamp bead;
Step 7: solve in sequence the correspondence between all ultraviolet lamp beads on the calibration plate and their sub-pixel-precision image feature points.
2. The ultraviolet camera calibration method according to claim 1, characterized in that: in step 5, the center of each fitted circle represents the position in the image of the feature point corresponding to one ultraviolet lamp bead.
3. The ultraviolet camera calibration method according to claim 1, characterized in that: in step 7, the following equations are established from the obtained correspondence between each ultraviolet lamp bead and its sub-pixel-precision image feature point:

$$X_{wi} m_{11} + Y_{wi} m_{12} + Z_{wi} m_{13} + m_{14} - u_i X_{wi} m_{31} - u_i Y_{wi} m_{32} - u_i Z_{wi} m_{33} = u_i m_{34}$$

$$X_{wi} m_{21} + Y_{wi} m_{22} + Z_{wi} m_{23} + m_{24} - v_i X_{wi} m_{31} - v_i Y_{wi} m_{32} - v_i Z_{wi} m_{33} = v_i m_{34}$$

where $m_{ij}$ denotes the corresponding element of the matrix $M$; listing these two equations for each feature point forms a system of equations, solving which yields all elements of $M$, from which all camera intrinsic and extrinsic parameters are calculated.
CN201910177334.9A 2019-03-08 2019-03-08 Ultraviolet camera calibration method and calibration device Active CN109993799B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910177334.9A CN109993799B (en) 2019-03-08 2019-03-08 Ultraviolet camera calibration method and calibration device

Publications (2)

Publication Number Publication Date
CN109993799A CN109993799A (en) 2019-07-09
CN109993799B true CN109993799B (en) 2023-03-24

Family
ID=67130464

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855626A (en) * 2012-08-09 2013-01-02 深圳先进技术研究院 Methods and devices for light source direction calibration and human information three-dimensional collection
CN108447095A (en) * 2018-01-31 2018-08-24 潍坊歌尔电子有限公司 A kind of fisheye camera scaling method and device

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202329646U (en) * 2011-11-15 2012-07-11 中国科学院西安光学精密机械研究所 Dim target ultraviolet star magnitude calibration system
CN103530880B (en) * 2013-10-16 2016-04-06 大连理工大学 Based on the camera marking method of projection Gaussian network pattern
KR101528332B1 (en) * 2014-01-09 2015-06-15 한국과학기술연구원 Apparatus and method for extreme ultra-violet spectrometer calibration
TWI526992B (en) * 2015-01-21 2016-03-21 國立清華大學 Method for optimizing occlusion in augmented reality based on depth camera
CN106340044B (en) * 2015-07-09 2019-03-29 上海振华重工电气有限公司 Join automatic calibration method and caliberating device outside video camera
CN105675633B (en) * 2016-03-10 2018-08-17 中国工程物理研究院激光聚变研究中心 A kind of caliberating device of X-ray framing camera
WO2018076154A1 (en) * 2016-10-25 2018-05-03 成都通甲优博科技有限责任公司 Spatial positioning calibration of fisheye camera-based panoramic video generating method
CN107516325A (en) * 2017-08-22 2017-12-26 上海理工大学 Center of circle detection method based on sub-pixel edge
CN108765489B (en) * 2018-05-29 2022-04-29 中国人民解放军63920部队 Pose calculation method, system, medium and equipment based on combined target


Also Published As

Publication number Publication date
CN109993799A (en) 2019-07-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant