CN112489058A - Efficient and accurate stripe direction estimation method - Google Patents

Efficient and accurate stripe direction estimation method

Info

Publication number
CN112489058A
CN112489058A
Authority
CN
China
Prior art keywords
stripe
efficient
accurate
estimation method
marking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011410592.6A
Other languages
Chinese (zh)
Inventor
齐泽荣
付树军
胡明征
刘彦明
李长隆
徐象锋
王汉远
张晶
张晓旭
孙瑜阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Chengshi Electronic Technology Co ltd
Original Assignee
Shandong Chengshi Electronic Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Chengshi Electronic Technology Co ltd filed Critical Shandong Chengshi Electronic Technology Co ltd
Priority to CN202011410592.6A priority Critical patent/CN112489058A/en
Publication of CN112489058A publication Critical patent/CN112489058A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/40Analysis of texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/40Analysis of texture
    • G06T7/41Analysis of texture based on statistical description of texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/255Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Probability & Statistics with Applications (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses an efficient and accurate stripe direction estimation method comprising the following steps: P1, exposure-shooting the stripe surface to obtain a real-time stripe image; P2, intercepting fragments of the real-time image to form stripe units; P3, overlap-matching the units, finding units with the same shape, and removing edge patterns; P4, strongly exposing the matched units, distinguishing the stripe area from the background area, and marking the stripe boundary line; P5, marking the inflection points of the stripe boundary lines to form marking points; and P6, judging the stripe direction from the distribution of the marking points to form the estimation result. The method avoids interference from adjacent stripes and background patterns and improves stripe identification accuracy. Combined with the boundary-line inflection-point marking method, it identifies the stripe shape and distribution state, which helps determine the stripe region and further judge its two directions. It can therefore greatly improve identification efficiency and estimation accuracy, improving the overall effect and facilitating adoption.

Description

Efficient and accurate stripe direction estimation method
Technical Field
The invention relates to the technical field of stripe direction estimation, and in particular to an efficient and accurate stripe direction estimation method.
Background
In current daily life and production, some graphics are configured with stripe patterns whose direction must be identified and estimated, so that stripe positioning and laying directions can be determined and disorder avoided.
However, the stripe direction is mostly captured and identified directly with a camera, and since ordered stripes are usually continuous, dense patterns, they easily blend with the background pattern during shooting, which affects identification efficiency and accuracy; a new method therefore needs to be provided.
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provides an efficient and accurate stripe direction estimation method.
In order to achieve the purpose, the invention adopts the following technical scheme:
an efficient and accurate stripe direction estimation method comprises the following steps:
p1, carrying out exposure shooting on the surface of the stripe to obtain a real-time stripe image;
p2, carrying out fragment interception on the real-time image to form a stripe unit;
p3, overlapping and matching the units, finding out the units with the same shape, and removing edge patterns;
p4, carrying out strong exposure on the same unit, distinguishing a stripe area from a background area, and marking a stripe boundary line;
p5, marking inflection points of the stripe boundary lines to form marking points;
and P6, judging the stripe direction according to the distribution state of the mark points to form an estimation result.
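The patent gives no reference implementation, but steps P4 to P6 can be sketched on a grayscale image held as a NumPy array. The global mean threshold, the neighbour-difference boundary test, and the spread-based direction vote below are illustrative assumptions, not operations prescribed by the claims:

```python
import numpy as np

def threshold_stripes(img):
    """P4: separate the stripe area from the background with a global threshold."""
    return img > img.mean()

def boundary_mask(mask):
    """P4/P5: mark pixels where the stripe/background label changes
    between horizontal or vertical neighbours (the stripe boundary line)."""
    b = np.zeros_like(mask)
    b[:, 1:] |= mask[:, 1:] != mask[:, :-1]
    b[1:, :] |= mask[1:, :] != mask[:-1, :]
    return b

def estimate_direction(boundary):
    """P6: boundary points spread out along the direction the stripes extend,
    so compare their spread along rows versus columns."""
    ys, xs = np.nonzero(boundary)
    return "vertical" if ys.std() > xs.std() else "horizontal"

# Synthetic vertical stripes: columns alternate bright/dark every 16 px.
img = np.tile((np.arange(64) // 16 % 2).astype(float), (64, 1))
print(estimate_direction(boundary_mask(threshold_stripes(img))))  # vertical
```

Transposing the synthetic image flips the verdict to "horizontal", which is a cheap sanity check on the spread comparison.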
Preferably, the exposure shooting in step P1 uses two exposure beams, which form an interference area on the stripe substrate.
Preferably, the fragment interception in step P2 uses a picture format, and a primary picture set and a verification picture set are constructed.
Preferably, the primary picture set comprises noisy fringe patterns of size 512 × 512 and density 500 ppi, where ppi denotes pixels per inch, and the verification picture set comprises noise-free fringe patterns of size 512 × 512 and density 1000 ppi.
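Print density (ppi) has no meaning for an in-memory array, so the sketch below only reproduces the 512 × 512 sizes. The sinusoidal profile, the noise level, and the function name `fringe_pattern` are assumptions made for illustration:

```python
import numpy as np

def fringe_pattern(size=512, period=16, noise_sigma=0.0, seed=0):
    """Synthetic sinusoidal fringe pattern in [0, 1]; noise_sigma > 0 yields
    the noisy primary-set version, 0 the noise-free verification version."""
    x = np.arange(size)
    img = np.tile(0.5 + 0.5 * np.sin(2 * np.pi * x / period), (size, 1))
    if noise_sigma:
        img = img + np.random.default_rng(seed).normal(0, noise_sigma, img.shape)
    return np.clip(img, 0.0, 1.0)

primary = [fringe_pattern(noise_sigma=0.05, seed=i) for i in range(4)]  # noisy set
verification = [fringe_pattern() for _ in range(4)]                     # noise-free set
print(primary[0].shape, verification[0].shape)  # (512, 512) (512, 512)
```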
Preferably, the overlap matching in step P3 is stack matching of stripe images, and the region that maximizes the overlap area is selected as the stripe region.
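One plausible reading of the stack matching in step P3 is to binarise two stripe units, stack them, and score the fraction of agreeing pixels; the candidate with the maximal score is the matching (same-shape) unit. This scoring rule is an assumption, not taken from the patent text:

```python
import numpy as np

def overlap_score(a, b):
    """Stack two equally sized stripe units and score the fraction of
    pixels whose binarised (stripe/background) labels agree."""
    return ((a > a.mean()) == (b > b.mean())).mean()

def best_match(unit, candidates):
    """P3: keep the candidate whose stacked overlap with the unit is maximal."""
    scores = [overlap_score(unit, c) for c in candidates]
    i = int(np.argmax(scores))
    return i, scores[i]

stripe = np.tile((np.arange(32) // 8 % 2).astype(float), (32, 1))
shifted = np.roll(stripe, 4, axis=1)               # same stripes, misaligned
noise = np.random.default_rng(0).random((32, 32))  # unrelated pattern
idx, score = best_match(stripe, [noise, shifted, stripe.copy()])
print(idx, score)  # 2 1.0
```

An identical copy scores exactly 1.0, a misaligned copy of the same periodic pattern scores 0.5, and random noise stays near 0.5, so the exact duplicate wins.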
Preferably, the stripe boundary line marked in step P4 is the position of the light/dark boundary after strong exposure, from which a boundary graph is drawn.
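Marking the light/dark boundary after strong exposure can be sketched as scanning each row of the binarised unit for its first dark-to-light transition; the resulting column per row traces the boundary line. The per-row scan is an illustrative choice, not the patent's prescribed procedure:

```python
import numpy as np

def boundary_polyline(mask):
    """P4: per row, record the column of the first dark-to-light transition;
    the (row, column) sequence traces the marked stripe boundary line."""
    cols = []
    for row in mask.astype(int):
        t = np.nonzero(np.diff(row) == 1)[0]
        cols.append(int(t[0]) + 1 if t.size else -1)  # -1: no boundary in row
    return np.array(cols)

# Slanted stripe whose left edge moves one column right on every row.
mask = np.zeros((8, 16), dtype=bool)
for r in range(8):
    mask[r, 4 + r:10 + r] = True
print(boundary_polyline(mask).tolist())  # [4, 5, 6, 7, 8, 9, 10, 11]
```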
Preferably, the boundary-line inflection points in step P5 include convex points and concave points.
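Convex and concave points of a boundary line sampled as one y-value per column can be separated by the sign of the discrete second difference, which is one hedged way to realise the P5 marking:

```python
import numpy as np

def mark_inflection_points(y):
    """P5: classify boundary-line sample points as convex (bulge) or concave
    (dip) by the sign of the discrete second difference."""
    d2 = np.diff(y, 2)                   # second difference, valid at 1..n-2
    convex = np.nonzero(d2 < 0)[0] + 1   # curve bends downward: convex point
    concave = np.nonzero(d2 > 0)[0] + 1  # curve bends upward: concave point
    return convex, concave

# Zig-zag boundary line: peaks at odd indices, troughs at even ones.
y = np.array([0.0, 2.0, 0.0, 2.0, 0.0, 2.0, 0.0])
convex, concave = mark_inflection_points(y)
print(convex.tolist(), concave.tolist())  # [1, 3, 5] [2, 4]
```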
Preferably, the method for determining the stripe direction in the step P6 includes the following steps:
s1, accurately marking the boundary inflection points and recording their number and corresponding positions;
s2, identifying the dispersion state of the inflection points and determining the stripe center position at the overall intersection of the center connecting lines;
s3, identifying at least two areas where the inflection points are most densely distributed, connecting these areas through the center position, and determining the transverse and longitudinal center lines;
and S4, determining the distance from each densest area to the center line and estimating the stripe direction.
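Steps S1 to S4 can be approximated by binning the marker points into a coarse grid, taking the densest cells, and reading the stripe extension direction off the line joining their centres. The grid size, the extent, and the angle convention are assumptions not fixed by the text:

```python
import numpy as np

def densest_cells(points, grid=4, extent=64, k=2):
    """S3: bin marker points into a grid x grid lattice and return the
    centres of the k cells holding the most points."""
    cell = extent / grid
    counts = np.zeros((grid, grid), dtype=int)
    for x, y in points:
        counts[min(int(y // cell), grid - 1), min(int(x // cell), grid - 1)] += 1
    order = np.argsort(counts.ravel())[::-1][:k]
    return [(((i % grid) + 0.5) * cell, ((i // grid) + 0.5) * cell) for i in order]

def stripe_angle(points):
    """S4: the line joining the two densest marker areas approximates the
    extending direction of the stripes (angle in degrees, modulo 180)."""
    (x1, y1), (x2, y2) = densest_cells(points)
    return np.degrees(np.arctan2(y2 - y1, x2 - x1)) % 180

# Markers clustered at both ends of a horizontal stripe (left end denser).
rng = np.random.default_rng(1)
left = rng.normal((8.0, 24.0), 1.0, (60, 2))
right = rng.normal((56.0, 24.0), 1.0, (40, 2))
print(stripe_angle(np.vstack([left, right])))  # 0.0
```

Both clusters sit at the same height, so the line joining the two densest cells is horizontal and the estimated extension angle is 0 degrees.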
Preferably, the stripe directions estimated in step S4 are of two kinds: the two symmetric dense areas farthest apart give the extending direction of the stripes, and the two densest areas closest together give the tiling direction of the stripes.
With the efficient and accurate stripe direction estimation method provided by the invention, shooting with two-path beam exposure effectively improves image clarity. Combining fragment interception with strong-exposure irradiation avoids interference from adjacent stripes and background patterns and improves stripe identification accuracy. Meanwhile, the boundary-line inflection-point marking method identifies the stripe shape and distribution state, which helps determine the stripe region and further judge its two directions. The method can therefore greatly improve identification efficiency and estimation accuracy, improving the overall effect and facilitating adoption.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
An efficient and accurate stripe direction estimation method comprises the following steps:
p1, carrying out exposure shooting on the surface of the stripe to obtain a real-time stripe image;
p2, carrying out fragment interception on the real-time image to form a stripe unit;
p3, overlapping and matching the units, finding out the units with the same shape, and removing edge patterns;
p4, carrying out strong exposure on the same unit, distinguishing a stripe area from a background area, and marking a stripe boundary line;
p5, marking inflection points of the stripe boundary lines to form marking points;
and P6, judging the stripe direction according to the distribution state of the mark points to form an estimation result.
Preferably, the exposure shooting in step P1 uses two exposure beams, which form an interference area on the stripe substrate.
Preferably, the fragment interception in step P2 uses a picture format, and a primary picture set and a verification picture set are constructed.
Preferably, the primary picture set comprises noisy fringe patterns of size 512 × 512 and density 500 ppi, where ppi denotes pixels per inch, and the verification picture set comprises noise-free fringe patterns of size 512 × 512 and density 1000 ppi.
Preferably, the overlap matching in step P3 is stack matching of stripe images, and the region that maximizes the overlap area is selected as the stripe region.
Preferably, the stripe boundary line marked in step P4 is the position of the light/dark boundary after strong exposure, from which a boundary graph is drawn.
Preferably, the boundary-line inflection points in step P5 include convex points and concave points.
Preferably, the method for determining the stripe direction in the step P6 includes the steps of:
s1, accurately marking the boundary inflection points and recording their number and corresponding positions;
s2, identifying the dispersion state of the inflection points and determining the stripe center position at the overall intersection of the center connecting lines;
s3, identifying at least two areas where the inflection points are most densely distributed, connecting these areas through the center position, and determining the transverse and longitudinal center lines;
and S4, determining the distance from each densest area to the center line and estimating the stripe direction.
Preferably, the stripe directions estimated in step S4 are of two kinds: the two symmetric dense areas farthest apart give the extending direction of the stripes, and the two densest areas closest together give the tiling direction of the stripes.
With the efficient and accurate stripe direction estimation method provided by the invention, shooting with two-path beam exposure effectively improves image clarity. Combining fragment interception with strong-exposure irradiation avoids interference from adjacent stripes and background patterns and improves stripe identification accuracy. Meanwhile, the boundary-line inflection-point marking method identifies the stripe shape and distribution state, which helps determine the stripe region and further judge its two directions. The method can therefore greatly improve identification efficiency and estimation accuracy, improving the overall effect and facilitating adoption.

Claims (9)

1. An efficient and accurate stripe direction estimation method, characterized in that the estimation method comprises the following steps:
p1, carrying out exposure shooting on the surface of the stripe to obtain a real-time stripe image;
p2, carrying out fragment interception on the real-time image to form a stripe unit;
p3, overlapping and matching the units, finding out the units with the same shape, and removing edge patterns;
p4, carrying out strong exposure on the same unit, distinguishing a stripe area from a background area, and marking a stripe boundary line;
p5, marking inflection points of the stripe boundary lines to form marking points;
and P6, judging the stripe direction according to the distribution state of the mark points to form an estimation result.
2. The efficient and accurate stripe direction estimation method according to claim 1, characterized in that: the exposure shooting in step P1 uses two exposure beams, which form an interference area on the stripe substrate.
3. The efficient and accurate stripe direction estimation method according to claim 1, characterized in that: the fragment interception in step P2 uses a picture format, and a primary picture set and a verification picture set are constructed.
4. The efficient and accurate stripe direction estimation method according to claim 3, characterized in that: the primary picture set comprises noisy fringe patterns of size 512 × 512 and density 500 ppi, where ppi denotes pixels per inch, and the verification picture set comprises noise-free fringe patterns of size 512 × 512 and density 1000 ppi.
5. The efficient and accurate stripe direction estimation method according to claim 1, characterized in that: the overlap matching in step P3 is stack matching of stripe images, and the region that maximizes the overlap area is selected as the stripe region.
6. The efficient and accurate stripe direction estimation method according to claim 1, characterized in that: the stripe boundary line marked in step P4 is the position of the light/dark boundary after strong exposure, from which a boundary graph is drawn.
7. The efficient and accurate stripe direction estimation method according to claim 1, characterized in that: the boundary-line inflection points in step P5 include convex points and concave points.
8. The efficient and accurate stripe direction estimation method according to claim 1, characterized in that: the method for judging the stripe direction in step P6 comprises the following steps:
s1, accurately marking the boundary inflection points and recording their number and corresponding positions;
s2, identifying the dispersion state of the inflection points and determining the stripe center position at the overall intersection of the center connecting lines;
s3, identifying at least two areas where the inflection points are most densely distributed, connecting these areas through the center position, and determining the transverse and longitudinal center lines;
and S4, determining the distance from each densest area to the center line and estimating the stripe direction.
9. The efficient and accurate stripe direction estimation method according to claim 8, characterized in that: the stripe directions estimated in step S4 are of two kinds: the two symmetric dense areas farthest apart give the extending direction of the stripes, and the two densest areas closest together give the tiling direction of the stripes.
CN202011410592.6A 2020-12-03 2020-12-03 Efficient and accurate stripe direction estimation method Pending CN112489058A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011410592.6A CN112489058A (en) 2020-12-03 2020-12-03 Efficient and accurate stripe direction estimation method

Publications (1)

Publication Number Publication Date
CN112489058A (en) 2021-03-12

Family

ID=74938125

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011410592.6A Pending CN112489058A (en) 2020-12-03 2020-12-03 Efficient and accurate stripe direction estimation method

Country Status (1)

Country Link
CN (1) CN112489058A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000132685A (en) * 1998-10-23 2000-05-12 Matsushita Electric Works Ltd Method for inspecting external appearance
US20020126295A1 (en) * 2000-11-20 2002-09-12 Gilbert Dudkiewicz Automatic installation and process for taking measurements and acquiring shapes
WO2015189174A2 (en) * 2014-06-10 2015-12-17 Carl Zeiss Meditec, Inc. Improved frequency-domain interferometric based imaging systems and methods
CN106091978A (en) * 2016-06-01 2016-11-09 西安工程大学 The joining method of interference fringe image in inclined in type measurements by laser interferometry
CN107917676A (en) * 2017-10-24 2018-04-17 南京理工大学 A kind of interferometric method based on stripe pattern spectrum analysis
CN109889696A (en) * 2019-03-18 2019-06-14 上海顺久电子科技有限公司 Antinoise for automatic geometric correction shoots image-recognizing method and system
CN110298811A (en) * 2018-03-21 2019-10-01 北京大学 Preprocess method, device, terminal and the computer readable storage medium of image
CN111402149A (en) * 2020-03-06 2020-07-10 四川大学 Fringe pattern restoration method based on convolutional neural network denoising regularization

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨鹏程; 刘洋; 朱新栋; 胥光申; 肖渊: "Identification method of speckle noise in interference fringe images based on the object image" (基于物体像的干涉条纹图像中散斑噪声的识别方法), 应用光学 (Journal of Applied Optics), no. 02, 15 March 2017 (2017-03-15) *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination