CN114494407B - Image processing method for distance measurement - Google Patents

Image processing method for distance measurement

Info

Publication number
CN114494407B
CN114494407B CN202210387883.0A
Authority
CN
China
Prior art keywords
voltage
voltage value
measured
voltage values
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210387883.0A
Other languages
Chinese (zh)
Other versions
CN114494407A (en)
Inventor
代红林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Yike Automation Co ltd
Original Assignee
Elco Tianjin Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Elco Tianjin Electronics Co Ltd filed Critical Elco Tianjin Electronics Co Ltd
Priority to CN202210387883.0A priority Critical patent/CN114494407B/en
Publication of CN114494407A publication Critical patent/CN114494407A/en
Application granted granted Critical
Publication of CN114494407B publication Critical patent/CN114494407B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/60 - Analysis of geometric attributes
    • G06T7/66 - Analysis of geometric attributes of image moments or centre of gravity
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 - Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/06 - Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness for measuring thickness, e.g. of sheet material
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/60 - Analysis of geometric attributes

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application provides an image processing method for distance measurement, used to obtain the thickness of a measured object from a ranging image, the measured object being made of a transparent material. The method includes: emitting a laser beam toward the measured object using a laser sensor; acquiring the pixel positions and corresponding voltage values of the 2 light spots formed by the beam reflected from the measured object, and separating the voltage values of the two spots into two voltage value sets based on a first set voltage threshold; obtaining the centroids of the pixel position sets corresponding to the two voltage value sets using a gray centroid algorithm, and obtaining the distances L1 and L2 between the laser emitter and the front and rear surfaces of the measured object from the centroids and the triangulation principle; and obtaining the thickness of the measured object from L1 and L2. The application makes full use of multiple reflections to obtain the thickness of the measured object and can reduce the thickness measurement cost.

Description

Image processing method for distance measurement
Technical Field
The application relates to the technical field of optical measurement, in particular to an image processing method for distance measurement.
Background
In real life, the thickness of many products needs to be measured to ensure that product quality is qualified. A commonly used thickness measurement method is the spectral confocal method: a laser emitter emits a laser beam toward the measured object, a dispersive lens splits the beam into a plurality of beams irradiating the upper surface of the measured object, and the distance between the measured object and the laser emitter must be such that the convergence point of the beam of a given wavelength falls on the measured object; the beam of that wavelength is reflected back, and the thickness of the measured object is obtained from the wavelength of the reflected beam. The thickness accuracy obtained by the spectral confocal method is high, but the corresponding equipment makes the measurement cost high.
Disclosure of Invention
In view of the above technical problems, an embodiment of the present application provides an image processing method for ranging, so as to solve at least one of the above technical problems.
The technical scheme adopted by the application is as follows:
An embodiment of the present application provides an image processing method for distance measurement, used to obtain the thickness of a measured object based on a ranging image, the measured object being made of a transparent material and including a front surface and a rear surface arranged in parallel. The method includes the following steps:
S1, emitting a laser beam to the front surface of the object to be measured using a laser emitter, the reflectivity of the object to be measured being set such that the laser beam forms 2 reflections via the front and rear surfaces of the object to be measured, respectively;
S2, acquiring the first light spot and the second light spot formed after the 2 reflections, i.e. 2 light spots, using a linear array image sensor;
S3, acquiring the position and voltage value of each pixel in the two light spots on the linear array image sensor, forming a pixel position set S = (S1, S2, S3, ..., Sm) and a corresponding voltage value set U = (U1, U2, U3, ..., Um), where Si is the position of the ith pixel, i takes values from 1 to m, m is the number of pixels in the 2 light spots, and Ui is the voltage value corresponding to Si;
S4, traversing U and acquiring, based on a first set voltage threshold K1, a first voltage value set U1 = (U1^1, U2^1, ..., Uk1^1) and a second voltage value set U2 = (U1^2, U2^2, ..., Uk2^2), where Uj^1 is the jth voltage value in U1, Uj^1 >= K1, j takes values from 1 to k1, and k1 is the number of voltage values in U1; Ut^2 is the tth voltage value in U2, Ut^2 >= K1, t takes values from 1 to k2, and k2 is the number of voltage values in U2;
S5, based on U1^1 and Uk1^1 in U1, acquiring from U the p1 preceding voltage values located before U1^1 that are all greater than or equal to K2 and adding them before U1^1, and acquiring the p1 following voltage values located after Uk1^1 that are all greater than or equal to K2 and adding them after Uk1^1, forming a first target voltage value set; K2 is a second set voltage threshold, K2 < K1;
S6, based on U1^2 and Uk2^2 in U2, acquiring from U the p2 preceding voltage values located before U1^2 that are all greater than or equal to K2 and adding them before U1^2, and acquiring the p2 following voltage values located after Uk2^2 that are all greater than or equal to K2 and adding them after Uk2^2, forming a second target voltage value set; wherein at least p3 voltage values are spaced between the last voltage value in the first target voltage value set and the first voltage value in the second target voltage value set;
S7, obtaining corresponding first and second pixel position sets based on the first and second target voltage value sets, and calculating corresponding first and second centroid positions based on the obtained first and second pixel position sets;
S8, acquiring a first distance L1 and a second distance L2 between the laser emitter and the front and rear surfaces of the measured object based on the calculated first centroid position, the second centroid position and the triangulation principle;
S9, obtaining |L2 - L1| as the thickness of the object to be measured.
In the data processing method for the ranging image provided by the embodiment of the application, the voltage value set U corresponding to the laser spots on the linear array image sensor is first divided, using a first set voltage threshold, into 2 voltage value sets whose data are disconnected. Then, using a second set voltage threshold, several voltage values adjacent to the values at the two ends of the first and second voltage value sets are acquired from U, so that the voltages in the first and second voltage value sets are as complete as possible. Next, the distances between the laser emitter and the front and rear surfaces of the object to be measured are acquired based on the two completed voltage value sets and the triangulation principle. Finally, the thickness of the object to be measured is obtained from the two acquired distances.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required in the description of the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flowchart of an image processing method for ranging according to an embodiment of the present disclosure;
fig. 2 is a schematic measurement diagram according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. The described embodiments are obviously only a part of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
Fig. 1 is a flowchart illustrating an image processing method for ranging according to an embodiment of the present disclosure. Fig. 2 is a schematic measurement diagram according to an embodiment of the present application.
As shown in fig. 2, the data processing method for the ranging image is used to obtain the thickness of a measured object 1 based on a ranging image; the measured object is made of a transparent material and includes a front surface and a rear surface arranged in parallel. As shown in fig. 1, the method includes the following steps:
s1, the laser emitter 2 is used to emit a laser beam to the object to be measured.
In the embodiment of the present application, the laser emitter may be an existing structure, for example including a laser emitting tube for emitting laser light and a lens through which the laser light is focused or collimated before being transmitted to the object to be measured. The laser beam emitted by the laser emitter toward the measured object is perpendicular to the measuring surface.
In the embodiment of the application, the reflectivity of the measured object meets the following condition: the laser beam can form 2 reflections via the front surface and the rear surface of the measured object, respectively.
In an exemplary embodiment, the object to be measured is a single layer of glass, and the front and rear surfaces of the object to be measured are not in contact with any support during the thickness measurement. In this embodiment, the front and rear surfaces of the object to be measured have different reflectivities, so that 2 reflections can be formed via the front and rear surfaces when the laser emitter emits a laser beam toward the glass. For example, in one embodiment the reflectivity of the front surface is greater than that of the rear surface, while in another exemplary embodiment the reflectivity of the front surface may be less than that of the rear surface.
In another exemplary embodiment, the object to be measured is a single layer of glass placed on a supporting carrier during the thickness measurement. In this embodiment, the reflectivities of the front and rear surfaces of the single-layer glass may be the same; the rear surface of the object to be measured rests on the supporting carrier, and the transparency of the object to be measured is greater than that of the supporting carrier. The supporting carrier may have some transparency or none, so that the laser beam incident through the rear surface of the glass strikes the supporting carrier and is reflected back at the rear surface of the glass, forming the second reflection.
S2, the linear array image sensor 3 is used to acquire the first light spot and the second light spot formed after the 2 reflections, i.e. 2 light spots.
In the embodiment of the present application, the linear array image sensor may be an existing structure, for example a linear-array CMOS image sensor. The light beams reflected back by the object to be measured form laser spots on the linear array image sensor, as shown in fig. 2; each spot has a convergence center. The 2 laser spots are distributed at positions on the linear array image sensor corresponding to the distances between the front and rear surfaces of the object to be measured and the laser emitter, for example arranged from the left side to the right side of the sensor in order of distance from near to far, or from the right side to the left side. In the embodiment of the present application, as shown in fig. 2, the spots are arranged from the left side to the right side in order of distance from near to far; that is, the first light spot, formed by the beam reflected by the front surface of the object to be measured, is located on the left side of the linear array image sensor, and the second light spot, formed by the beam reflected by the rear surface, is located to the right of the first spot. The 2 laser spots may be continuous, i.e. the two spots have an intersection area, or independent, i.e. there is no intersection area between them. Whether the 2 spots are continuous is determined by the thickness of the measured object: if the thickness is larger, the 2 spots formed are discontinuous; if the thickness is smaller, the 2 spots have an intersection area.
S3, the position and voltage value of each pixel in the 2 light spots on the linear array image sensor are acquired, forming a pixel position set S = (S1, S2, S3, ..., Sm) and a corresponding voltage value set U = (U1, U2, U3, ..., Um), where i takes values from 1 to m, m is the number of pixels in the 2 light spots, and Ui is the voltage value corresponding to Si.
In this embodiment of the application, the linear array image sensor may obtain the position and voltage value of each pixel point on the linear array according to the acquired light intensity of the spots and send them to a signal processing device, such as a processor. The signal processing device may store the position of each pixel point, i.e. the number of each pixel point, and the corresponding voltage value in an indexed manner; that is, the voltage value set U corresponds to the pixel position set S = (S1, S2, S3, ..., Sm), where Si is the position of the ith pixel and S1 < S2 < S3 < ... < Sm. The position of each pixel is determined by the line size of the linear array image sensor.
S4, U is traversed and, based on the first set voltage threshold K1, a first voltage value set U1 = (U1^1, U2^1, ..., Uk1^1) and a second voltage value set U2 = (U1^2, U2^2, ..., Uk2^2) are acquired, where Uj^1 is the jth voltage value in U1, Uj^1 >= K1, j takes values from 1 to k1, and k1 is the number of voltage values in U1; Ut^2 is the tth voltage value in U2, Ut^2 >= K1, t takes values from 1 to k2, and k2 is the number of voltage values in U2.
In the embodiment of the present application, K1 is set so that a voltage value set can be obtained for each of the formed spots, namely a voltage value set with noise removed. In a specific example, a voltage-pixel plot may be drawn based on S and U, with the voltage value on the ordinate and the pixel value on the abscissa, yielding a curve of the 2 light spots that includes peak and valley regions. To find the actually required curve segments, the application sets K1 so that there is a disconnected area between the curves of the respective spots, thereby obtaining an effective curve segment for each spot.
In particular, in one embodiment, K1 = (Σ(s=1..a) Us^p1 + 2·Σ(w=1..b) Uw^v + Σ(x=1..c) Ux^p2) / 4, where Us^p1 ∈ Up1 = (U1^p1, U2^p1, ..., Ua^p1), s takes values from 1 to a, and a is the number of voltages in Up1; Up1 is the first peak voltage subset in the voltage fitting curve obtained based on U and S; Uw^v ∈ Uv = (U1^v, U2^v, ..., Ub^v), w takes values from 1 to b, b is the number of voltages in Uv, and Uv is the valley voltage subset in the voltage fitting curve; Ux^p2 ∈ Up2 = (U1^p2, U2^p2, ..., Uc^p2), x takes values from 1 to c, c is the number of voltages in Up2, and Up2 is the second peak voltage subset in the voltage fitting curve.
In the embodiment of the present application, the abscissa of the voltage fitting curve is the pixel position of each pixel point, and the ordinate is the corresponding voltage value.
Those skilled in the art will appreciate that the first peak voltage subset, the second peak voltage subset and the valley voltage subset may be obtained using known techniques, such as binning. Since the measurement may fluctuate, causing fluctuations in the peak and valley regions, calculating K1 based on the peak voltage subsets and the valley voltage subset enables the obtained K1 to better separate the two spot images.
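By way of illustration only (the patent gives no reference code), the K1 formula above can be transcribed literally as follows; the function name and inputs are hypothetical, and the peak and valley subsets are assumed to be already extracted from the fitted voltage curve:

```python
def compute_k1(peak1, valley, peak2):
    """First set voltage threshold K1, following the formula in the text
    literally: K1 = (sum(Up1) + 2 * sum(Uv) + sum(Up2)) / 4, where Up1 and
    Up2 are the two peak voltage subsets of the fitted curve and Uv is the
    valley (trough) voltage subset."""
    return (sum(peak1) + 2 * sum(valley) + sum(peak2)) / 4

# With single-element subsets this reduces to a weighted average
# of peak and valley voltages: (3 + 2*1 + 3) / 4 = 2.0
k1 = compute_k1([3.0], [1.0], [3.0])
```

With larger subsets the sums grow with the subset sizes a, b and c; the patent states the formula over sums, so this sketch reproduces it without normalizing by those counts.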
When acquiring each voltage value set, each voltage value in U may be compared with K1; since the curve corresponding to the voltage values in U has peak and valley regions, 2 voltage value sets whose values are greater than or equal to K1 can be obtained.
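The comparison just described can be sketched as follows (illustrative only; all names are hypothetical). Each contiguous run of values at or above K1 becomes one voltage value set, and the valley between the two spot peaks produces the disconnection:

```python
def split_by_threshold(u, k1):
    """Split the voltage sequence u into contiguous runs of values >= k1.
    For a two-spot curve whose valley lies below k1, this yields the first
    and second voltage value sets U1 and U2, each as (index, voltage) pairs
    so the corresponding pixel positions can be recovered."""
    groups, current = [], []
    for i, v in enumerate(u):
        if v >= k1:
            current.append((i, v))
        elif current:
            groups.append(current)
            current = []
    if current:
        groups.append(current)
    return groups

# Example: two peaks separated by a valley below the threshold K1 = 0.5
u = [0.1, 0.5, 0.9, 0.8, 0.3, 0.2, 0.4, 0.7, 1.0, 0.6, 0.2]
groups = split_by_threshold(u, 0.5)  # two runs: indices 1..3 and 7..9
```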
S5, based on U1^1 and Uk1^1 in U1, the p1 preceding voltage values located before U1^1 that are all greater than or equal to K2 are acquired from U and added before U1^1, and the p1 following voltage values located after Uk1^1 that are all greater than or equal to K2 are acquired and added after Uk1^1, forming a first target voltage value set; K2 is the second set voltage threshold, K2 < K1.
S6, based on U1^2 and Uk2^2 in U2, the p2 preceding voltage values located before U1^2 that are all greater than or equal to K2 are acquired from U and added before U1^2, and the p2 following voltage values located after Uk2^2 that are all greater than or equal to K2 are acquired and added after Uk2^2, forming a second target voltage value set; wherein at least p3 voltage values are spaced between the last voltage value in the first target voltage value set and the first voltage value in the second target voltage value set. The specific value of p3 can be set based on actual needs; in an exemplary embodiment, p3 is greater than or equal to 10.
In the embodiment of the present application, since K1 serves to find the peak area segments of the 2 light spots, its value is set relatively large, which may leave the data of each spot incomplete and in turn make the calculation result inaccurate. Therefore, to make the data of each spot as complete as possible, the embodiment of the present application uses the second set threshold K2, which completes the spot data while still avoiding interference from noise light.
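One possible reading of steps S5 and S6 is sketched below (names hypothetical). The patent does not fully specify whether the p neighbors are taken only when all of them are >= K2 or individually; this sketch admits each neighbor individually, which is the more permissive reading:

```python
def extend_group(u, group_idx, p, k2):
    """Extend a spot's run of pixel indices with up to p neighbors on each
    side whose voltage is still >= k2 (the second, lower threshold K2),
    recovering spot data that the stricter threshold K1 cut off."""
    lo, hi = group_idx[0], group_idx[-1]
    left = [i for i in range(max(0, lo - p), lo) if u[i] >= k2]
    right = [i for i in range(hi + 1, min(len(u), hi + 1 + p)) if u[i] >= k2]
    return left + group_idx + right

# Indices 2..4 survived K1; one neighbor on each side still clears K2 = 0.3
u = [0.2, 0.4, 0.6, 0.9, 0.6, 0.4, 0.2]
extended = extend_group(u, [2, 3, 4], p=1, k2=0.3)  # -> [1, 2, 3, 4, 5]
```

The check against the minimum gap of p3 values between the two extended sets would be applied after both extensions, before the centroid step.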
In an exemplary embodiment of the present application, K2 satisfies the following conditions:
K2 > U0max, and K2 is set such that there are at least p3 voltage values between the last voltage value in the first target voltage value set and the first voltage value in the second target voltage value set; where U0max is the maximum voltage value in the voltage value set acquired on the linear array image sensor with the laser emitter turned off. With the laser emitter turned off, a spot image may still form on the linear array image sensor due to ambient light and noise such as dark current in the sensor, so voltage values may exist; setting K2 greater than U0max avoids the interference caused by this noise light.
Those skilled in the art will appreciate that S5 and S6 may be performed simultaneously, sequentially, or in reverse order.
S7, corresponding first and second pixel position sets are obtained based on the first and second target voltage value sets, and corresponding first and second centroid positions are calculated based on the obtained first and second pixel position sets.
Those skilled in the art know that obtaining the corresponding pixel values based on a target voltage value set is prior art. For example, a weighted centroid algorithm may be used to enhance the pixels in a partial region at the center of the light spot, making the calculated centroid more accurate; besides the weighted centroid algorithm, a gray centroid algorithm, a threshold centroid algorithm or other prior art may also be used to calculate the centroid position of the light spot.
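As a minimal sketch of the gray centroid variant mentioned above (illustrative only; names hypothetical), each pixel position is weighted by its voltage value, which is proportional to the light intensity at that pixel:

```python
def gray_centroid(positions, voltages):
    """Gray (intensity-weighted) centroid of a spot: the weighted mean of
    the pixel positions, with each position weighted by its voltage."""
    total = sum(voltages)
    return sum(p * v for p, v in zip(positions, voltages)) / total

# A symmetric spot centers exactly on its middle pixel
c = gray_centroid([1, 2, 3], [1.0, 2.0, 1.0])  # -> 2.0
```

A weighted centroid algorithm would differ only in applying an extra weighting function to the voltages (e.g. emphasizing the central region) before the same weighted mean.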
S8, acquiring a first distance L1 and a second distance L2 between the laser transmitter and the front surface and the rear surface of the measured object based on the calculated first centroid position, the second centroid position and the triangulation principle;
It is known to those skilled in the art how to calculate the distances between the laser emitter and the front and rear surfaces of the object to be measured based on the first centroid position, the second centroid position and the triangulation principle.
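For orientation only, a simplified triangulation sketch under an idealized geometry is shown below; the geometry (baseline between emitter and receiving lens, lens focal length, spot offset on the sensor) and all numeric values are assumptions, not the patent's calibration:

```python
def triangulate(baseline_mm, focal_mm, offset_mm):
    """Idealized laser triangulation: with the laser axis perpendicular to
    the baseline and the sensor parallel to it, the object distance is
    d = baseline * focal / offset, where offset is the spot's displacement
    on the sensor (here taken from a spot centroid)."""
    return baseline_mm * focal_mm / offset_mm

# Hypothetical centroid-derived offsets for the two spots (S8), then S9
l1 = triangulate(20.0, 8.0, 1.6)    # front-surface spot, approx. 100.0 mm
l2 = triangulate(20.0, 8.0, 1.525)  # rear-surface spot
thickness = abs(l2 - l1)
```

Real sensors use a calibrated (often Scheimpflug) geometry rather than this ideal one, but the step from two centroids to two distances and then to |L2 - L1| is the same.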
S9, obtaining | L2-L1 | as the thickness of the object to be measured.
In the embodiment of the present application, steps S1 to S9 may be executed in a processor, which is an existing structure, such as an MCU.
In this embodiment of the application, for a measured object producing multilayer reflections, the second light spot would conventionally be treated as background noise. When calculating the thickness, the method obtains the thickness of the measured object from the light spot formed by the beam reflected by the front surface and the light spot formed by the beam reflected by the rear surface, effectively making use of this background noise; compared with prior-art thickness measurement methods, this can save thickness measurement cost. Moreover, processing the voltage values corresponding to the light spots with the first and second set thresholds improves the thickness measurement accuracy while eliminating noise as much as possible.
In another embodiment, the thickness measuring method provided by the embodiment of the present application may also be used to measure the thicknesses of the layers of a multilayer body, as long as the reflectivity of each layer is set so that the corresponding spot image can be acquired. The manner of processing the formed spot images and obtaining the thickness of each layer from the processed data may be the same as described above: the thickness of a layer is the difference between the two distances calculated by triangulation from the two gray centroids determined by the adjacent spots reflected by the layer's two adjacent reflecting surfaces.
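The multilayer extension above amounts to differencing adjacent surface distances; a one-line sketch (hypothetical name, distances assumed already triangulated per spot centroid):

```python
def layer_thicknesses(distances):
    """Given triangulated distances to successive reflecting surfaces
    (one per spot centroid, ordered front to back), return the thickness
    of each layer as the difference between adjacent surface distances."""
    return [abs(b - a) for a, b in zip(distances, distances[1:])]

# Three surfaces -> two layers
thicknesses = layer_thicknesses([100.0, 104.0, 110.5])  # -> [4.0, 6.5]
```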
Although some specific embodiments of the present application have been described in detail by way of illustration, it should be understood by those skilled in the art that the above illustration is only for purposes of illustration and is not intended to limit the scope of the present application. Those skilled in the art will also appreciate that various modifications might be made to the embodiments without departing from the scope and spirit of the present application. The scope of the present application is defined by the appended claims.

Claims (5)

1. An image processing method for ranging, for obtaining a thickness of an object to be measured based on a ranging image, the object to be measured being made of a transparent material and including a front surface and a rear surface which are arranged in parallel, the method comprising the steps of:
S1, emitting a laser beam to the front surface of the object to be measured using a laser emitter, the reflectivity of the object to be measured being set such that the laser beam forms 2 reflections via the front and rear surfaces of the object to be measured, respectively;
S2, acquiring the first light spot and the second light spot formed after the 2 reflections, i.e. 2 light spots, using a linear array image sensor; the 2 light spots are distributed at positions on the linear array image sensor corresponding to the distances between the front and rear surfaces of the object to be measured and the laser emitter, arranged in order of distance from near to far either from the left side to the right side of the linear array image sensor or from the right side to the left side;
S3, acquiring the position and voltage value of each pixel in the two light spots on the linear array image sensor, storing the position of each pixel, i.e. the number of each pixel, and the corresponding voltage value in an indexed manner, and forming a pixel position set S = (S1, S2, S3, ..., Sm) and a corresponding voltage value set U = (U1, U2, U3, ..., Um), where Si is the position of the ith pixel, S1 < S2 < S3 < ... < Sm, i takes values from 1 to m, m is the number of pixels in the 2 light spots, and Ui is the voltage value corresponding to Si;
S4, traversing U and acquiring, based on a first set voltage threshold K1, a first voltage value set U1 = (U1^1, U2^1, ..., Uk1^1) and a second voltage value set U2 = (U1^2, U2^2, ..., Uk2^2), where Uj^1 is the jth voltage value in U1, Uj^1 >= K1, j takes values from 1 to k1, and k1 is the number of voltage values in U1; Ut^2 is the tth voltage value in U2, Ut^2 >= K1, t takes values from 1 to k2, and k2 is the number of voltage values in U2;
S5, based on U1^1 and Uk1^1 in U1, acquiring from U the p1 preceding voltage values located before U1^1 that are all greater than or equal to K2 and adding them before U1^1, and acquiring the p1 following voltage values located after Uk1^1 that are all greater than or equal to K2 and adding them after Uk1^1, forming a first target voltage value set; K2 is a second set voltage threshold, K2 < K1;
S6, based on U1^2 and Uk2^2 in U2, acquiring from U the p2 preceding voltage values located before U1^2 that are all greater than or equal to K2 and adding them before U1^2, and acquiring the p2 following voltage values located after Uk2^2 that are all greater than or equal to K2 and adding them after Uk2^2, forming a second target voltage value set; wherein at least p3 voltage values are spaced between the last voltage value in the first target voltage value set and the first voltage value in the second target voltage value set;
S7, obtaining corresponding first and second pixel position sets based on the first and second target voltage value sets, and calculating corresponding first and second centroid positions based on the obtained first and second pixel position sets;
S8, acquiring a first distance L1 and a second distance L2 between the laser emitter and the front and rear surfaces of the measured object based on the calculated first centroid position, the second centroid position and the triangulation principle;
S9, obtaining |L2 - L1| as the thickness of the object to be measured.
2. The method according to claim 1, wherein K1 = (Σ(s=1..a) Us^p1 + 2·Σ(w=1..b) Uw^v + Σ(x=1..c) Ux^p2) / 4, where Us^p1 ∈ Up1 = (U1^p1, U2^p1, ..., Ua^p1), s takes values from 1 to a, and a is the number of voltages in Up1; Up1 is the first peak voltage subset in the voltage fitting curve obtained based on U and S; Uw^v ∈ Uv = (U1^v, U2^v, ..., Ub^v), w takes values from 1 to b, b is the number of voltages in Uv, and Uv is the valley voltage subset in the voltage fitting curve; Ux^p2 ∈ Up2 = (U1^p2, U2^p2, ..., Uc^p2), x takes values from 1 to c, c is the number of voltages in Up2, and Up2 is the second peak voltage subset in the voltage fitting curve;
the abscissa of the voltage fitting curve is the pixel position of each pixel point, and the ordinate is the corresponding voltage value.
3. The method of claim 1, wherein K2 > U0max, and K2 is set such that there are at least p3 voltage values between the last voltage value in the first target voltage value set and the first voltage value in the second target voltage value set; where U0max is the maximum voltage value in the voltage value set acquired on the linear array image sensor with the laser emitter turned off.
4. The method of claim 1, wherein the front and back surfaces of the object are of different reflectivity.
5. The method according to claim 1, characterized in that the object to be measured is placed on a support carrier, the transparency of which is smaller than the transparency of the object to be measured.
CN202210387883.0A 2022-04-14 2022-04-14 Image processing method for distance measurement Active CN114494407B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210387883.0A CN114494407B (en) 2022-04-14 2022-04-14 Image processing method for distance measurement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210387883.0A CN114494407B (en) 2022-04-14 2022-04-14 Image processing method for distance measurement

Publications (2)

Publication Number Publication Date
CN114494407A CN114494407A (en) 2022-05-13
CN114494407B true CN114494407B (en) 2022-07-22

Family

ID=81488647

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210387883.0A Active CN114494407B (en) 2022-04-14 2022-04-14 Image processing method for distance measurement

Country Status (1)

Country Link
CN (1) CN114494407B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116068568B (en) * 2023-04-07 2023-07-28 天津宜科自动化股份有限公司 Data processing system for obtaining object distance
CN116203574B (en) * 2023-05-04 2023-07-28 天津宜科自动化股份有限公司 Data processing system for detecting object distance
CN116953716A (en) * 2023-07-10 2023-10-27 天津宜科自动化股份有限公司 Distance measuring device based on image sensor capable of dynamically adjusting pixel area output

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
CN101644600B (en) * 2008-12-25 2011-11-16 长春理工大学 Embedded type laser beam quality measuring device
US9293197B2 (en) * 2011-08-15 2016-03-22 Lockheed Martin Corporation Reconfigurable phase change material masks for electro-optical compressive sensing
CN110470231B (en) * 2019-08-07 2020-11-20 上海交通大学 Transparent object thickness laser measurement method and system
JP2021060900A (en) * 2019-10-09 2021-04-15 ソニーセミコンダクタソリューションズ株式会社 Face authentication system and electronic device
CN110864635A (en) * 2019-10-30 2020-03-06 宁波兰羚钢铁实业有限公司 Online thickness detection system and method for slitting machine
CN111024225B (en) * 2019-12-02 2021-12-31 西北核技术研究院 Absolute measurement method for power distribution curve in laser far-field barrel
CN111812661A (en) * 2020-06-22 2020-10-23 深圳奥锐达科技有限公司 Distance measuring method and system
CN112346075B (en) * 2020-10-01 2023-04-11 奥比中光科技集团股份有限公司 Collector and light spot position tracking method
CN113176579A (en) * 2021-03-01 2021-07-27 奥比中光科技集团股份有限公司 Light spot position self-adaptive searching method, time flight ranging system and ranging method
CN113552725A (en) * 2021-07-20 2021-10-26 中国工程物理研究院激光聚变研究中心 Laser beam coaxial co-wave surface control system and method
CN114136976B (en) * 2021-11-08 2024-04-26 中国工程物理研究院激光聚变研究中心 Polarization coaxial illumination laser shearing speckle interferometry system and measurement method thereof

Also Published As

Publication number Publication date
CN114494407A (en) 2022-05-13

Similar Documents

Publication Publication Date Title
CN114494407B (en) Image processing method for distance measurement
US20210181317A1 (en) Time-of-flight-based distance measurement system and method
KR102452393B1 (en) Detector for optically determining a position of at least one object
US11821987B2 (en) Multiple resolution, simultaneous localization and mapping based on 3-D LIDAR measurements
CN114460594B (en) Image denoising method based on triangular distance measurement
US20200363341A1 (en) Image detection scanning method for object surface defects and image detection scanning system thereof
CN111830530B (en) Distance measuring method, system and computer readable storage medium
US20180003993A1 (en) Detector for an optical detection of at least one object
CN110687541A (en) Distance measuring system and method
CN108845332B (en) Depth information measuring method and device based on TOF module
CN110780312B (en) Adjustable distance measuring system and method
CN111965658B (en) Distance measurement system, method and computer readable storage medium
JP2019523423A (en) Apparatus and method for determining secondary image angle and / or viewing angle
WO2021214123A1 (en) Illumination pattern for object depth measurment
CN114746772A (en) Filtering measurement data of an active optical sensor system
CN115330794B (en) LED backlight foreign matter defect detection method based on computer vision
CN108459328A (en) A kind of detection device with uniform receiving optics
CN116381708A (en) High-precision laser triangular ranging system
CN111189840A (en) Paper defect detection method with near-field uniform illumination
CN107193428B (en) Optical touch screen, touch positioning method thereof and optical distortion calibration method
CN102809351B (en) Wall thickness detecting device and wall thickness detecting method for transparent and semitransparent glass bottles
CN115407349A (en) Image capture auxiliary multi-line laser ranging module
CN112017244A (en) High-precision planar object positioning method and device
EA028167B1 (en) Method of determining distance to an object, its height and width
CN210036592U (en) Measuring device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: No.12, Saida 4th branch road, economic development zone, Xiqing District, Tianjin

Patentee after: Tianjin Yike Automation Co.,Ltd.

Address before: No.12, Saida 4th branch road, economic development zone, Xiqing District, Tianjin

Patentee before: ELCO (TIANJIN) ELECTRONICS Co.,Ltd.

CP02 Change in the address of a patent holder

Address after: No. 12 Saida Fourth Branch Road, Xiqing Economic and Technological Development Zone, Xiqing District, Tianjin, 300385

Patentee after: Tianjin Yike Automation Co.,Ltd.

Address before: No.12, Saida 4th branch road, economic development zone, Xiqing District, Tianjin

Patentee before: Tianjin Yike Automation Co.,Ltd.
