CN110956624B - Image definition evaluation method for three-dimensional object - Google Patents

Image definition evaluation method for three-dimensional object

Info

Publication number
CN110956624B
CN110956624B
Authority
CN
China
Prior art keywords
image
focusing ring
images
definition
focusing
Prior art date
Legal status
Active
Application number
CN201911212471.8A
Other languages
Chinese (zh)
Other versions
CN110956624A (en)
Inventor
谭治英
赵宝来
费仲文
徐孝彬
赵鸣晖
沙刘力
袁文强
Current Assignee
Changzhou Campus of Hohai University
Original Assignee
Changzhou Campus of Hohai University
Priority date
Filing date
Publication date
Application filed by Changzhou Campus of Hohai University
Priority to CN201911212471.8A
Publication of CN110956624A
Application granted
Publication of CN110956624B
Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details
    • G06T2207/20032 Median filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30168 Image quality inspection
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to an image definition evaluation method for a three-dimensional object, which comprises the following steps: starting the steering engine so that the focusing ring rotates, and triggering the camera to acquire a group of images at a fixed time interval; preprocessing each image, the preprocessing comprising median filtering to remove noise and binarization by threshold segmentation; performing Canny edge detection on the preprocessed images; extracting the gray values within a fixed range along the normal direction of each edge of interest and taking them as a point set; performing curve fitting on the point set by the least square method; outputting the exponent coefficient D of each region of interest and taking the average value as the definition evaluation index of the image; and comparing the indexes of all the images to obtain the clearest image and rotating the focusing ring to the corresponding position. The invention uses a new evaluation function to evaluate the regions of interest of an image locally, so that accurate evaluation can be achieved for the specific requirements of a specific object.

Description

Image definition evaluation method for three-dimensional object
Technical Field
The invention relates to the fields of computer vision and image processing, and in particular to an image definition evaluation method for a three-dimensional object.
Background
With the application and development of machine vision, acquiring high-definition images is of great significance in the field of industrial inspection. At present, an experienced machine vision engineer typically spends a long time adjusting the lens focusing ring to obtain a clear image; because of human factors the focusing results vary, and the imaging system does not reach its optimal working state, so research on automatic focusing technology is of great significance. An important part of automatic focusing is the evaluation of image definition, and the quality of the image definition evaluation function directly affects the automatic focusing accuracy. The image definition evaluation functions proposed at home and abroad at present operate on the whole image. For a three-dimensional object, however, different planes have different definition, and different planes have different definition requirements depending on the specific application, so a global evaluation of the whole image cannot achieve the goal. The invention solves these problems by evaluating the regions of interest locally.
Disclosure of Invention
In order to solve the above problems, the invention provides an image definition evaluation method for a three-dimensional object that uses an automatic focusing device. The automatic focusing device comprises a steering engine, a camera, a focusing ring and a workbench; the focusing ring is mounted on the camera and is driven by the steering engine for focusing, the workbench is located below the focusing ring, and the object to be measured is placed on the workbench. The method is characterized by comprising the following steps:
(1) Start the steering engine so that the focusing ring rotates, and trigger the camera to acquire a group of images at a fixed time interval, so that the definition of the images changes from blurred to clear to blurred; record the scale value corresponding to each photo. The fixed time interval is set manually on the industrial personal computer.
(2) Preprocess each image, the preprocessing comprising median filtering to remove noise and binarization of the image by threshold segmentation.
(3) Perform Canny edge detection on the preprocessed image and extract the image edges.
(4) Extract the gray values within a fixed range along the normal direction of each edge of interest, the fixed range extending the same number of pixels from the edge to both sides, and take the obtained gray values as a point set.
(5) Perform curve fitting on the point set by the least square method; the obtained exponent coefficient D is used as the index for evaluating the definition of the image.
(6) Output the exponent coefficient D of each region of interest and take the average value as the definition evaluation index of the image.
(7) Compare the indexes of all the images to obtain the clearest image, and rotate the focusing ring to the corresponding position.
The curve fitting method in step (5) is as follows:
The fitted curve is a five-parameter function y = f(x; A, B, C, D, E), where y represents the gray value at position x, A, B, C, D, E are the parameters to be determined, and D is the exponent coefficient that governs the steepness of the gray-level transition across the edge. Let the point set be {(x_1, y_1), (x_2, y_2), …, (x_n, y_n)}; the objective function is
L = ∑_{i=1}^{n} [y_i − f(x_i; A, B, C, D, E)]²
The parameters A, B, C, D, E are obtained by least-squares nonlinear fitting so that L is minimized. The present invention fits the curve by calling the nlinfit function in MATLAB.
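As an illustration of the fitting in step (5), the sketch below fits one gray-value profile and reads off the exponent coefficient D. It is only a minimal Python/SciPy analogue of the nlinfit call mentioned above, not the patent's implementation: the exact five-parameter function is not reproduced in this text, so a generalized logistic form in which D controls the steepness is assumed, and the initial guesses and the use of scipy.optimize.curve_fit are illustrative choices.

    import numpy as np
    from scipy.optimize import curve_fit

    def edge_model(x, A, B, C, D, E):
        # Assumed five-parameter, logistic-type edge profile; the patent only states
        # that A..E are fitted and that D is the exponent coefficient governing the
        # steepness of the gray-level transition across the edge.
        return A + B / (1.0 + np.exp(-D * (x - C))) ** E

    def fit_exponent_coefficient(profile):
        """Least-squares fit of one gray-value profile; returns |D| as the definition index."""
        x = np.arange(len(profile), dtype=float)
        y = np.asarray(profile, dtype=float)
        p0 = [y.min(), np.ptp(y), x.mean(), 0.1, 1.0]      # rough initial guesses
        popt, _ = curve_fit(edge_model, x, y, p0=p0, maxfev=10000)
        return abs(popt[3])                                 # larger |D| means a steeper edge, i.e. a sharper image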
The advantage of the invention is that a specific area can be evaluated locally, so that the definition can be evaluated accurately according to specific requirements.
Drawings
Fig. 1 shows the automatic focusing device according to the present invention.
Fig. 2 is a flow chart of the operation of an embodiment of the present invention.
Fig. 3 shows how the fitted curve changes with different exponent coefficients D.
Fig. 4 shows two images of the workpiece with different definition.
Fig. 5 shows a preprocessed image.
Fig. 6 shows an image after edge detection.
Fig. 7 shows the arrays of gray values within the fixed range along the normals of the four edges of interest in the left image of Fig. 4, together with the fitted curves.
Fig. 8 shows the arrays of gray values within the fixed range along the normals of the four edges of interest in the right image of Fig. 4, together with the fitted curves.
Fig. 9 plots the definition evaluation indexes of the group of images.
Detailed Description
Examples:
As shown in Fig. 1, an automatic focusing device is designed. The automatic focusing device comprises a steering engine 1, a camera 2, a focusing ring 3 and a workbench 4; the focusing ring 3 is mounted on the camera 2 and is driven by the steering engine 1 for focusing, the workbench 4 is located below the focusing ring, and the object 5 to be measured is placed on the workbench 4. The steering engine 1 is started so that the focusing ring 3 rotates and a group of images is acquired; the acquired images are evaluated to find the image with the highest definition, the steering engine 1 then rotates the focusing ring 3 to the corresponding position, and automatic focusing is finally achieved.
The invention will now be described in more detail by way of example with reference to the accompanying drawings.
In this embodiment, 23 images with different degrees of blur are acquired in total; two of them are shown in Fig. 4. The specific operation flow of the invention is as follows:
step one: the industrial personal computer controls the steering engine to drive the focusing ring to rotate and triggers the camera to collect a group of images, so that the definition of the images is changed from blurring to clear to blurring, and the corresponding scale value of each photo is recorded.
Step two: each image is preprocessed; median filtering removes noise and threshold segmentation binarizes the image, as shown in Fig. 5.
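A minimal OpenCV sketch of this preprocessing step is shown below; the 5-pixel median aperture and the use of Otsu's method for threshold selection are assumed choices, since the patent does not state them.

    import cv2

    def preprocess(gray):
        denoised = cv2.medianBlur(gray, 5)                  # median filtering to remove noise
        _, binary = cv2.threshold(denoised, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # threshold segmentation / binarization
        return denoised, binary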
Step three: Canny edge detection is carried out on the preprocessed image and the image edges are extracted, as shown in Fig. 6.
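The edge extraction can be sketched as follows; the 50/150 hysteresis thresholds are assumed values, not taken from the patent.

    import cv2

    def detect_edges(binary):
        return cv2.Canny(binary, 50, 150)   # Canny edge detection on the preprocessed image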
Step four: the gray values within a fixed range along the normal direction of each edge of interest are extracted. As shown in Fig. 3, the gray values are ordered from left to right or from top to bottom along the normal direction to form an array, which is then fitted by the least square method. Here 40 pixels on each side of the edge are selected, 80 pixels in total, and the gray values of these 80 pixels are taken as the point set, as shown in Figs. 7 and 8.
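One way to realise this sampling is sketched below: the local Sobel gradient direction is taken as the edge normal, and 40 pixels on each side of the edge point are read off, giving the 80 gray values of the point set. Estimating the normal from Sobel gradients and the nearest-pixel rounding are assumptions; the patent only fixes the range of 40 pixels per side.

    import numpy as np
    import cv2

    def sample_edge_normal(gray, edge_point, half_len=40):
        """Gray values within +/- half_len pixels of edge_point along the edge normal."""
        gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
        gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
        r, c = edge_point
        norm = np.hypot(gx[r, c], gy[r, c]) or 1.0           # gradient magnitude (avoid division by zero)
        nx, ny = gx[r, c] / norm, gy[r, c] / norm            # unit vector normal to the edge
        offsets = np.arange(-half_len, half_len)             # 80 sample positions across the edge
        rows = np.clip(np.round(r + offsets * ny).astype(int), 0, gray.shape[0] - 1)
        cols = np.clip(np.round(c + offsets * nx).astype(int), 0, gray.shape[1] - 1)
        return gray[rows, cols].astype(float)                # ordered point set along the normal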
Step five: curve fitting is performed on the point set using the five-parameter function described above. The larger the value of D, the larger the slope of the fitted curve, the larger the gray-level change, and the clearer the image. As shown in Figs. 7 and 8, the gray values within the edge range of the blurred image on the left of Fig. 4 change gently and the slope of the fitted curve is small, while the gray values within the edge range of the clear image on the right of Fig. 4 change steeply and the slope of the fitted curve is large.
Step six: the D values obtained from the four regions of interest of each image are averaged as the evaluation index of that image; the indexes of the 23 images are output and plotted, as shown in Fig. 9. The curve shows that the method has strong unimodality and high sensitivity. The focusing ring is then rotated to the position that gives the clearest image. When the workpiece is moved to an arbitrary position, the parameters obtained by the algorithm remain unchanged, so the algorithm is highly stable.
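The final selection can be sketched as follows: the fitted D values of the four regions of interest of every image are averaged, the image with the largest mean is taken as the clearest, and the scale value recorded with that image is the position to which the steering engine drives the focusing ring. The function name pick_sharpest and the list-of-lists input layout are illustrative choices.

    import numpy as np

    def pick_sharpest(d_values_per_image, scale_values):
        """d_values_per_image: for every acquired image, the D coefficients of its four
        regions of interest; scale_values: the focusing-ring scale recorded with each photo."""
        scores = [float(np.mean(ds)) for ds in d_values_per_image]   # per-image definition index
        best = int(np.argmax(scores))                                # image with the largest mean D
        return best, scale_values[best], scores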

Claims (1)

1. An image definition evaluation method for a three-dimensional object using an automatic focusing device, the automatic focusing device comprising a steering engine, a camera, a focusing ring and a workbench, wherein the focusing ring is mounted on the camera and is driven by the steering engine for focusing, the workbench is located below the focusing ring, and an object to be measured is placed on the workbench, characterized in that the method comprises the following steps:
(1) Start the steering engine so that the focusing ring rotates, and trigger the camera to acquire a group of images at a fixed time interval, so that the definition of the images changes from blurred to clear to blurred, and record the scale value corresponding to each photo; the fixed time interval is set manually on the industrial personal computer;
(2) Preprocess each image, the preprocessing comprising median filtering to remove noise and binarization of the image by threshold segmentation;
(3) Perform Canny edge detection on the preprocessed image and extract the image edges;
(4) Extract the gray values within a fixed range along the normal direction of each edge of interest, the fixed range extending the same number of pixels from the edge to both sides, and take the obtained gray values as a point set;
(5) Perform curve fitting on the point set by the least square method, the curve fitting method being as follows:
the fitted curve is a five-parameter function y = f(x; A, B, C, D, E), where y represents the gray value at position x and A, B, C, D, E are the parameters to be determined; let the point set be {(x_1, y_1), (x_2, y_2), …, (x_n, y_n)}; the objective function is
L = ∑_{i=1}^{n} [y_i − f(x_i; A, B, C, D, E)]²;
the parameters A, B, C, D, E are obtained by least-squares nonlinear fitting so that L is minimized;
the obtained exponent coefficient D is used as the index for evaluating the definition of the image;
(6) Output the exponent coefficient D of each region of interest and take the average value as the definition evaluation index of the image;
(7) Compare the indexes of all the images to obtain the clearest image, and rotate the focusing ring to the corresponding position.
CN201911212471.8A 2019-12-02 2019-12-02 Image definition evaluation method for three-dimensional object Active CN110956624B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911212471.8A CN110956624B (en) 2019-12-02 2019-12-02 Image definition evaluation method for three-dimensional object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911212471.8A CN110956624B (en) 2019-12-02 2019-12-02 Image definition evaluation method for three-dimensional object

Publications (2)

Publication Number Publication Date
CN110956624A CN110956624A (en) 2020-04-03
CN110956624B (en) 2023-09-01

Family

ID=69979238

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911212471.8A Active CN110956624B (en) 2019-12-02 2019-12-02 Image definition evaluation method for three-dimensional object

Country Status (1)

Country Link
CN (1) CN110956624B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112285876A (en) * 2020-11-04 2021-01-29 邱妙娜 Camera automatic focusing method based on image processing and bubble detection
CN113639630A (en) * 2021-04-01 2021-11-12 浙江大学台州研究院 Dimension measuring instrument system based on multi-template matching and automatic focusing functions
CN113810616B (en) * 2021-09-27 2023-05-09 季华实验室 Aperture focal length adjustment method, system, electronic device and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102788682B (en) * 2012-07-25 2015-02-04 宁波大学 Method for detecting parfocality of continuous zoom stereo microscope
CN106127775A (en) * 2016-06-28 2016-11-16 乐视控股(北京)有限公司 Measurement for Digital Image Definition and device

Also Published As

Publication number Publication date
CN110956624A (en) 2020-04-03

Similar Documents

Publication Publication Date Title
CN110956624B (en) Image definition evaluation method for three-dimensional object
Li et al. Aod-net: All-in-one dehazing network
CN108921176B (en) Pointer instrument positioning and identifying method based on machine vision
CN109507192B (en) Magnetic core surface defect detection method based on machine vision
Pertuz et al. Generation of all-in-focus images by noise-robust selective fusion of limited depth-of-field images
CN109785245B (en) Light spot image trimming method
CN109489566B (en) Lithium battery diaphragm material slitting width detection method, detection system and device
CN111083365B (en) Method and device for rapidly detecting optimal focal plane position
CN105139391B (en) A kind of haze weather traffic image edge detection method
CN104075965B (en) A kind of micro-image grain graininess measuring method based on watershed segmentation
CN116758071B (en) Intelligent detection method for carbon electrode dirt under visual assistance
CN110517213B (en) Laplacian pyramid-based real-time depth of field extension method for microscope
WO2017120796A1 (en) Pavement distress detection method and apparatus, and electronic device
CN108510544B (en) Light strip positioning method based on feature clustering
CN111242888A (en) Image processing method and system based on machine vision
CN111679418B (en) Microscope automatic focusing method and system based on laser image and computer equipment
CN107170002B (en) Automatic image focusing method and device
CN115661110A (en) Method for identifying and positioning transparent workpiece
CN106534661A (en) Automatic focus algorithm accumulated based on strongest edge gradient Laplasse operator
CN113538545B (en) Monocular depth estimation method based on electro-hydraulic adjustable-focus lens and corresponding camera and storage medium
CN116894775B (en) Bolt image preprocessing method based on camera motion model recovery and super-resolution
CN112330667B (en) Morphology-based laser stripe center line extraction method
CN114529715B (en) Image identification method and system based on edge extraction
CN115861220A (en) Cold-rolled strip steel surface defect detection method and system based on improved SSD algorithm
CN114972084A (en) Image focusing accuracy evaluation method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant