CN111288967B - Remote high-precision displacement detection method based on machine vision - Google Patents

Remote high-precision displacement detection method based on machine vision

Info

Publication number
CN111288967B
CN111288967B (application CN202010061063.3A)
Authority
CN
China
Prior art keywords
point
measured
displacement
reference point
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010061063.3A
Other languages
Chinese (zh)
Other versions
CN111288967A (en)
Inventor
孟小华
唐英杰
胡辉
林兴立
胡荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Hannan Engineering Technology Co ltd
Original Assignee
Guangzhou Hannan Engineering Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Hannan Engineering Technology Co ltd
Priority to CN202010061063.3A
Publication of CN111288967A
Application granted
Publication of CN111288967B
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; photographic surveying
    • G01C 11/04 Interpretation of pictures
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The invention discloses a remote high-precision displacement detection method based on machine vision, which comprises the following steps: a camera and a reference point light source, both controlled by an embedded computer, are arranged at fixed positions, and a point light source to be measured is arranged in the monitoring target area; images of the monitoring target area are acquired at a set time interval, and the point-light-source coordinates in images taken at different moments are compared to solve the displacement of the point to be measured relative to the reference point; the reference point serves as the reference for the displacement direction and scale of the point to be measured, against which the pixel coordinates of the point to be measured are calibrated and corrected; the position changes of the point to be measured and the reference point at different moments are compared to obtain the pixel displacement of the point to be measured, which is then converted into an actual displacement to obtain an accurate value of the actual displacement of the point to be measured over that period of time. A single camera photographs both a reference point at a fixed position and the point to be measured on the monitored target in one image, which solves the problems that existing displacement monitoring means are economically expensive and cannot achieve remote accurate measurement.

Description

Remote high-precision displacement detection method based on machine vision
Technical Field
The invention relates to the technical field of displacement detection, in particular to a remote high-precision displacement detection method based on machine vision.
Background
With the development of image acquisition equipment and image processing technology, vision measurement technology is now widely applied in fields such as medicine, the military, and industrial and agricultural production. In the detection of the shape and dimensions of precision parts its accuracy reaches the micron level, but vision measurement requires abundant target pixel information and generally depends on high-performance image acquisition equipment, so its accuracy is not ideal for remote measurement. The centering algorithm is commonly applied in astronomical image data processing: the accurate position of a star in an image is calculated from the distribution and statistical characteristics of its pixel information, and the result can reach the sub-pixel level. In the field of computer vision, this algorithm can likewise be used to improve detection precision.
In the engineering field, remote measurement mainly relies on the total station and the RTK measuring instrument. The total station can achieve remote accurate measurement with millimeter-level precision, but it requires manual operation and therefore does not fit well with the trend toward automated, real-time monitoring. The RTK instrument can operate automatically, but its measurement accuracy is only at the centimeter level. Moreover, both devices are quite expensive, making it difficult to meet monitoring tasks that must be deployed over a large area.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a remote high-precision displacement detection method based on machine vision, in which a camera photographs a reference point at a fixed position together with the point to be measured on the monitored target, and the measurement precision is improved by combining the reference-object principle with image processing technology, thereby solving the problems that existing displacement monitoring means are economically expensive and cannot achieve remote precise measurement.
In order to achieve the purpose, the invention adopts the following technical scheme:
a remote high-precision displacement detection method based on machine vision comprises the following steps:
a. An embedded-computer-controlled camera and a reference point light source are respectively arranged at fixed positions, and a point light source to be measured is arranged in the monitoring area;
b. The embedded computer controls the camera to acquire images of the monitoring target area at a set time interval, calls an image processing program to solve high-precision sub-pixel coordinates of each point light source in the images, and compares the point-light-source coordinates in images taken at different moments to solve the displacement of the point to be measured relative to the reference point;
c. Using the principle that the position of the reference point is fixed, the image processing program takes the reference point as the reference for the displacement direction and scale of the point to be measured, calibrates and corrects the pixel coordinates of the point to be measured, and compares the positions of the point to be measured and the reference point at different moments to obtain the pixel displacement of the point to be measured;
d. The image processing program converts the pixel displacement into an actual displacement through a proportionality coefficient to obtain an accurate value of the actual displacement of the point to be measured over that period of time.
Preferably, the camera and the reference point are installed in fixed positions; since the reference point does not move, the position change of the point to be measured relative to the reference point is compared to realize automatic calibration and correction of the measurement.
Preferably, two points to be measured with a known initial spacing are installed on the measured target, and their actual distance is divided by their pixel distance to obtain the scaling coefficient α for converting between pixel distance and actual distance.
Preferably, the image acquisition device is a fixed-position long-focus, high-resolution camera; it acquires images of the area to be measured at different time points, the pixel distances by which the point to be measured moves relative to the reference point at different moments are compared, and the pixel displacement is converted into an actual displacement value using the proportionality coefficient α.
Preferably, the positions of the reference point and the point to be measured in the image are solved by a centering algorithm from the statistical and distribution characteristics of their image data, yielding position coordinates of the point to be measured and the reference point at the sub-pixel level.
Preferably, processing the captured image data with the image processing program includes the following steps:
S1. Install the reference point in a fixed-position area, and install two points to be measured with a known spacing on the target to be measured;
S2. Install the camera at a fixed position with its lens facing the target to be measured, and adjust the field of view and focal length so that the reference point and all points to be measured on the target are captured in the image; image data are then captured at regular intervals and sent to the image processing program. Several images of the area to be measured are taken at the same moment, and the calculation results are averaged to reduce random error;
S3. Process the data captured in step S2 with an image processing algorithm.
Preferably, the processing flow of step S3 is as follows:
S3.1. Preprocess the image: remove interference factors and improve image quality;
S3.2. Roughly calculate the coordinates and connected domain of each light source with an identification algorithm, compute an accurate value of the central coordinate of each measuring point with a centering algorithm, establish a coordinate system with the reference point as the reference, and determine the coordinates of the points to be measured;
S3.3. Compute the light source coordinates in the several images taken at the same time point and take their average as the final result;
S3.4. Calculate the pixel offset of each point to be measured between the current time point and the previous time point, and convert it with the proportionality coefficient to obtain the actual offset direction and magnitude for each point to be measured.
It should be further explained that commonly used centering algorithms include the Gaussian fitting method, the modified moment method, and the median method. In this scheme the median method is used to perform the centering measurement on the measured light source: a cubic interpolation function is fitted to the accumulated gray-value curve, and the center position of the target light source is determined from it. The computed center coordinate is a floating-point number accurate to six decimal places; the algorithm effectively raises the resolution of the image, i.e., it greatly improves the measurement precision at the software level.
The invention has the advantages that vision measurement technology is used together with the reference-object principle and image processing algorithms, the differences of imaging at different distances are eliminated, the performance bottleneck of the image acquisition equipment is compensated by software means, the measurement precision is greatly improved, and precise measurement of the displacement of a long-distance target is realized.
Drawings
FIG. 1 is a schematic view of installation positions of a camera, a reference point and a point to be measured;
FIG. 2 is a schematic diagram of pixel deviation of a point to be measured with respect to a reference point at two different times, where a dotted line is the point to be measured at time T1, a solid line is the point to be measured at time T2, and a solid circle is the reference point;
FIG. 3 is a flow chart of processing the image data.
Detailed Description
The present invention will be further described with reference to the accompanying drawings, and it should be noted that the following examples are provided to illustrate the detailed embodiments and specific operations based on the technical solutions of the present invention, but the scope of the present invention is not limited to the examples.
The invention relates to a remote high-precision displacement detection method based on machine vision, which comprises the following steps:
a. An embedded-computer-controlled camera and a reference point light source are respectively arranged at fixed positions, and a point light source to be measured is arranged in the monitoring area;
b. The embedded computer controls the camera to acquire images of the monitoring target area at a set time interval, calls an image processing program to solve high-precision sub-pixel coordinates of each point light source in the images, and compares the point-light-source coordinates in images taken at different moments to solve the displacement of the point to be measured relative to the reference point;
c. Using the principle that the position of the reference point is fixed, the image processing program takes the reference point as the reference for the displacement direction and scale of the point to be measured, calibrates and corrects the pixel coordinates of the point to be measured, and compares the positions of the point to be measured and the reference point at different moments to obtain the pixel displacement of the point to be measured;
d. The image processing program converts the pixel displacement into an actual displacement through a proportionality coefficient to obtain an accurate value of the actual displacement of the point to be measured over that period of time.
Preferably, the camera and the reference point are installed in fixed positions; since the reference point does not move, the position change of the point to be measured relative to the reference point is compared to realize automatic calibration and correction of the measurement.
Preferably, two points to be measured with a known initial spacing are installed on the measured target, and their actual distance is divided by their pixel distance to obtain the scaling coefficient α for converting between pixel distance and actual distance.
Preferably, the image acquisition device is a fixed-position long-focus, high-resolution camera; it acquires images of the area to be measured at different time points, the pixel distances by which the point to be measured moves relative to the reference point at different moments are compared, and the pixel displacement is converted into an actual displacement value using the proportionality coefficient α.
Preferably, the positions of the reference point and the point to be measured in the image are solved by a centering algorithm from the statistical and distribution characteristics of their image data, yielding position coordinates of the point to be measured and the reference point at the sub-pixel level.
Preferably, processing the captured image data with the image processing program includes the following steps:
S1. Install the reference point in a fixed-position area, and install two points to be measured with a known spacing on the target to be measured;
S2. Install the camera at a fixed position with its lens facing the target to be measured, and adjust the field of view and focal length so that the reference point and all points to be measured on the target are captured in the image; image data are then captured at regular intervals and sent to the image processing program. Several images of the area to be measured are taken at the same moment, and the calculation results are averaged to reduce random error;
S3. Process the data captured in step S2 with an image processing algorithm.
Preferably, the processing flow of step S3 is as follows:
S3.1. Preprocess the image: remove interference factors and improve image quality;
S3.2. Roughly calculate the coordinates and connected domain of each light source with an identification algorithm, compute an accurate value of the central coordinate of each measuring point with a centering algorithm, establish a coordinate system with the reference point as the reference, and determine the coordinates of the points to be measured;
S3.3. Compute the light source coordinates in the several images taken at the same time point and take their average as the final result;
S3.4. Calculate the pixel offset of each point to be measured between the current time point and the previous time point, and convert it with the proportionality coefficient to obtain the actual offset direction and magnitude for each point to be measured (a sketch combining steps S3.2 to S3.4 is given below).
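As an illustration of how steps S3.2 to S3.4 fit together, the following Python sketch computes the relative pixel offsets and converts them with the proportionality coefficient. It assumes the point centers have already been extracted by the identification and centering algorithms; all function and variable names are hypothetical, and this is a sketch rather than the patented implementation.

```python
import numpy as np

def displacement_from_frames(ref_prev, pts_prev, ref_now, pts_now, alpha):
    """Sketch of steps S3.2-S3.4: ref_* are (x, y) reference-point centers,
    pts_* map each measured-point name to its (x, y) center, and alpha converts
    pixel distance to actual distance."""
    results = {}
    for name, p_now in pts_now.items():
        # Express each measured point relative to the reference point so that
        # motion common to the whole frame (e.g. slight camera drift) cancels out
        rel_prev = np.asarray(pts_prev[name]) - np.asarray(ref_prev)
        rel_now = np.asarray(p_now) - np.asarray(ref_now)
        d_pix = rel_now - rel_prev                              # pixel offset vector
        results[name] = (d_pix * alpha,                         # actual offset vector
                         float(np.linalg.norm(d_pix)) * alpha)  # actual offset magnitude
    return results
```

The averaging of step S3.3 would be applied to the extracted centers of the same-instant images before calling such a function.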
Example 1
Fig. 1 shows the overall structure of the system according to this embodiment, including: camera 1, reference point 2, and point to be measured 3.
After the target to be monitored is determined, two points to be measured with a spacing L are installed on the monitored target; the distance L is used by the image processing program to calculate the conversion coefficient α between actual distance and pixel distance under the current measuring environment.
Specifically, in order to calculate the conversion coefficient α between actual distance and pixel distance, the group of measured point light sources in the image is centered during the first calculation to obtain their pixel distance L', from which α is calculated. The calculation formula is:
α = L / L'
L: actual distance in the initial state; L': pixel distance between the measured points.
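A minimal sketch of this first-frame computation, assuming the sub-pixel centers of the two measured points have already been obtained; the coordinates and the 500 mm spacing below are illustrative values, not data from the patent.

```python
import numpy as np

def scale_factor(p1, p2, actual_distance):
    """alpha = L / L': the known actual distance between the two measured points
    divided by their pixel distance in the first image."""
    l_pixels = float(np.linalg.norm(np.asarray(p1) - np.asarray(p2)))
    return actual_distance / l_pixels

# Example: two points roughly 812.4 px apart that are physically 500 mm apart
alpha = scale_factor((1023.412, 655.208), (1835.786, 655.301), 500.0)  # mm per pixel
```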
Furthermore, after all the measuring points of the monitoring target are installed, a fixed position should be selected for the camera and the reference point. The shape or color of the reference point differs from that of the point to be measured so that the algorithm can identify it automatically. The camera lens faces the monitoring area, and the field of view and focal length of the camera are adjusted so that all light sources in the monitoring area are captured clearly.
Furthermore, after all the equipment is installed, the system is started to monitor the area to be detected.
Example 2
According to the scheme, before formal measurement the proportionality coefficient α between pixel offset and actual offset in the current monitoring environment needs to be calculated; in subsequent calculations it is used to convert the pixel offset of the point to be measured in the image into an actual displacement distance. The working steps are as follows:
S1: Two measuring points are installed on the measured target, and the initial distance between them is fixed at L.
S2: Centering calculation is performed on the two measuring points, and the pixel distance L' between them is calculated.
S3: When the algorithm processes the data captured for the first time, the conversion ratio α between the actual distance on the measured target and the pixel distance in the image is calculated automatically. The calculation formula is:
α = L / L'
L: actual distance; L': pixel distance.
As shown in fig. 3, the image processing program processes the image data, and the working steps are as follows:
s101: and (4) a pretreatment stage. The image is preprocessed by utilizing a noise reduction technology and an image enhancement technology in the image processing, the interference of natural factors is eliminated, the local characteristics of the point light source are more obvious, and the data corresponding to the RGB channel is extracted according to the color of the detected light source to avoid the interference of other colors.
S102: and (5) initially positioning the light source. In order to improve the calculation efficiency of a program, when point light source coordinate calculation is carried out, the scheme carries out coarse positioning on an image according to the characteristics of the point light source, firstly carries out binarization processing, then carries out corrosion expansion operation, and only selects an N-N area near the point light source as a data source of a centering algorithm.
S103: and (5) centering calculation. And traversing all point light source data areas, and performing centering calculation. The centering algorithm with better effect at present comprises the following steps: gaussian fitting method, correction moment method and median method. The system selects a median method to carry out centering measurement on the measured light source, and fits a gray value accumulation image by using cubic interpolation, thereby determining the center coordinate of the light source.
S104: and (5) converting the global coordinate. In order to improve the stability of measurement, the system adopts reference points to mark points to be measured. S102 processes local data, resulting in local coordinates, so that the local coordinates need to be converted into global coordinates again for subsequent offset calculation.
S105: and (5) calibrating and correcting. In order to eliminate the interference of lens jitter, the relative coordinates between the point to be measured and each reference point are recorded by utilizing the characteristic that the position of the light source of the reference point is fixed, so that the offset direction and the offset of the point to be measured are calibrated. And secondly, judging whether the lens shakes according to whether the coordinate change of the reference point in the image exceeds a threshold value. The coordinates of the reference point in the front image and the back image are deviated, and the front deviation amount and the back deviation amount exceed the threshold value, which shows that the current image has serious jitter and poor image quality and should be discarded. Therefore, the same measurement system takes multiple pictures, and a group of images with the least interference are selected as effective data.
S106: an offset is calculated. And the offset direction and the offset of the point to be measured in the time interval can be obtained by comparing the coordinates of the point to be measured at the front moment and the rear moment. And calculating the offset pixel value of the center of the pixel of the point to be measured and the center of the reference point light source on the X-axis and the Y-axis. When the lens does not shake, and the data is normal, taking the X direction as an example, the pixel displacement distance calculation formula is as follows:
Δx=d0-d2=(x1'-x0)-(x1-x0) (1)
s107: and (4) converting the offset. After the pixel displacement delta d is calculated, the actual displacement is calculated according to a proportion conversion formula:
D=Δx*α (2)
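A worked numeric sketch of equations (1) and (2) in the X direction; every coordinate and the scale factor below are made-up illustrative values.

```python
x0 = 512.314        # reference-point x coordinate (fixed)
x1 = 930.772        # measured-point x coordinate at time T1
x1_prime = 934.118  # measured-point x coordinate at time T2
alpha = 0.615       # proportionality coefficient, actual units per pixel (assumed)

delta_x = (x1_prime - x0) - (x1 - x0)   # equation (1): relative pixel displacement
d_actual = delta_x * alpha              # equation (2): actual displacement
```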
the invention uses the vision measurement technology, utilizes the reference object principle, combines various image processing algorithms, eliminates the difference of imaging at different distances, makes up the performance bottleneck of image acquisition equipment by a software means, greatly improves the measurement precision and realizes the precise measurement of the displacement of a long-distance target.
Various corresponding changes and modifications can be made by those skilled in the art based on the above technical solutions and concepts, and all such changes and modifications should be included in the protection scope of the present invention.

Claims (7)

1. A remote high-precision displacement detection method based on machine vision is characterized by comprising the following steps:
a. An embedded-computer-controlled camera and a reference point light source are respectively arranged at fixed positions, and a point light source to be measured is arranged in the monitoring area;
b. The embedded computer controls the camera to acquire images of the monitoring target area at a set time interval, calls an image processing program to solve high-precision sub-pixel coordinates of each point light source in the images, and compares the point-light-source coordinates in images taken at different moments to solve the displacement of the point to be measured relative to the reference point;
c. Using the principle that the position of the reference point is fixed, the image processing program takes the reference point as the reference for the displacement direction and scale of the point to be measured, calibrates and corrects the pixel coordinates of the point to be measured, and compares the positions of the point to be measured and the reference point at different moments to obtain the pixel displacement of the point to be measured;
d. The image processing program converts the pixel displacement into an actual displacement through a proportionality coefficient to obtain an accurate value of the actual displacement of the point to be measured over that period of time.
2. The remote high-precision displacement detection method based on machine vision as claimed in claim 1, wherein the camera and the reference point are arranged at fixed positions, and, since the position of the reference point is fixed, the position change of the point to be measured relative to the reference point is compared to realize automatic calibration and correction of the measurement.
3. The remote high-precision displacement detection method based on machine vision according to claim 1, wherein two points to be measured with a known initial spacing are set on the object to be measured, and the actual distance between the two points is divided by their pixel distance to obtain the scaling coefficient α for converting between actual distance and pixel distance.
4. The remote high-precision displacement detection method based on machine vision according to claim 3, wherein the image acquisition device is a fixed-position long-focus, high-resolution camera that acquires images of the monitoring target area at different time points; the pixel distances by which the point to be measured moves relative to the reference point at different moments are compared, and the pixel displacement is converted into an actual displacement value using the proportionality coefficient α.
5. The remote high-precision displacement detection method based on machine vision according to claim 4, wherein the positions of the reference point and the point to be measured in the image are solved by a centering algorithm from the statistical and distribution characteristics of the image data, yielding position coordinates of the point to be measured and the reference point at the sub-pixel level.
6. The machine-vision-based remote high-precision displacement detection method according to claim 1, wherein processing the captured image data with the image processing program comprises:
S1. installing the reference point in a fixed-position area, and installing two points to be measured with a known spacing on the target to be measured;
S2. installing the camera at a fixed position with its lens facing the target to be measured, and adjusting the field of view and focal length so that the reference point and all points to be measured on the target are captured in the image; capturing image data at regular intervals and sending them to the image processing program; capturing several images of the area to be measured at the same moment and averaging the calculation results to reduce random error;
S3. processing the data captured in step S2 with an image processing algorithm.
7. The method for detecting remote high-precision displacement based on machine vision according to claim 6, wherein the processing flow of step S3 is as follows:
S3.1. preprocessing the image: removing interference factors and improving image quality;
S3.2. roughly calculating the coordinates and connected domain of each light source with an identification algorithm, computing an accurate value of the central coordinate of each measuring point with a centering algorithm, establishing a coordinate system with the reference point as the reference, and determining the coordinates of the points to be measured;
S3.3. computing the light source coordinates in the several images taken at the same time point and taking their average as the final result;
S3.4. calculating the pixel offset of each point to be measured between the current time point and the previous time point, and converting it with the proportionality coefficient to obtain the actual offset direction and magnitude for each point to be measured.
CN202010061063.3A 2020-01-19 2020-01-19 Remote high-precision displacement detection method based on machine vision Active CN111288967B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010061063.3A CN111288967B (en) 2020-01-19 2020-01-19 Remote high-precision displacement detection method based on machine vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010061063.3A CN111288967B (en) 2020-01-19 2020-01-19 Remote high-precision displacement detection method based on machine vision

Publications (2)

Publication Number Publication Date
CN111288967A CN111288967A (en) 2020-06-16
CN111288967B (en) 2020-10-27

Family

ID=71029969

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010061063.3A Active CN111288967B (en) 2020-01-19 2020-01-19 Remote high-precision displacement detection method based on machine vision

Country Status (1)

Country Link
CN (1) CN111288967B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113091699A (en) * 2021-03-31 2021-07-09 中煤科工集团重庆研究院有限公司 Micro-displacement amplification method based on video image
CN113240747B (en) * 2021-04-21 2022-08-26 浙江大学 Outdoor structure vibration displacement automatic monitoring method based on computer vision
CN113405530B (en) * 2021-07-02 2023-02-28 菲特(天津)检测技术有限公司 Visual measurement system, method, equipment, production line and terminal for deviation of stamping process sheet material
CN113554667B (en) * 2021-07-27 2023-12-12 上海海瞩智能科技有限公司 Three-dimensional displacement detection method and device based on image recognition
CN114295058B (en) * 2021-11-29 2023-01-17 清华大学 Method for measuring whole-face dynamic displacement of building structure
CN114593715A (en) * 2022-03-08 2022-06-07 广州翰南工程技术有限公司 Remote high-precision displacement measurement method based on image processing
CN114719770B (en) * 2022-04-02 2024-04-02 基康仪器股份有限公司 Deformation monitoring method and device based on image recognition and space positioning technology
CN114972519B (en) * 2022-08-02 2022-10-21 石家庄铁道大学 Rail vehicle accurate positioning method based on infrared reflection and image processing
CN115143887B (en) * 2022-09-05 2022-11-15 常州市建筑科学研究院集团股份有限公司 Method for correcting measurement result of visual monitoring equipment and visual monitoring system
CN115314700B (en) * 2022-10-13 2023-01-24 潍坊歌尔电子有限公司 Position detection method for control device, positioning system, and readable storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SU746186A1 (en) * 1978-03-02 1980-07-07 Новосибирский Институт Инженеров Геодезии,Аэрофотосъемки И Картографии Method of photogrammetric determination of non-deforming object motion parameters
DE10028713B4 (en) * 2000-06-08 2010-11-25 Art+Com Ag Visualization device and method
CN102175228B (en) * 2011-01-27 2012-09-05 北京播思软件技术有限公司 Distance measurement method based on mobile terminal
KR101734372B1 (en) * 2013-05-10 2017-05-24 모젼스랩(주) Distance measurement method using single camera module
CN103411589B (en) * 2013-07-29 2016-01-13 南京航空航天大学 A kind of 3-D view matching navigation method based on four-dimensional real number matrix
JP5906272B2 (en) * 2014-03-28 2016-04-20 富士重工業株式会社 Stereo image processing apparatus for vehicle

Also Published As

Publication number Publication date
CN111288967A (en) 2020-06-16

Similar Documents

Publication Publication Date Title
CN111288967B (en) Remote high-precision displacement detection method based on machine vision
CN106092059B (en) A kind of works Horizontal Displacement Monitoring Method based on multi-point fitting
CN106197292B (en) A kind of building displacement monitoring method
CN112541953B (en) Vehicle detection method based on radar signal and video synchronous coordinate mapping
CN102778196A (en) Image size measuring method based on laser calibration
CN105865349B (en) A kind of building displacement monitoring method
CN110223355B (en) Feature mark point matching method based on dual epipolar constraint
CN109685744B (en) Scanning galvanometer precision correction method
CN110533649B (en) Unmanned aerial vehicle general structure crack identification and detection device and method
CN111047586B (en) Pixel equivalent measuring method based on machine vision
CN112802123A (en) Binocular linear array camera static calibration method based on stripe virtual target
CN112665523B (en) Combined measurement method for complex profile
CN105717502B (en) A kind of high-rate laser range unit based on line array CCD
CN105371759A (en) Measuring method of multi-lens image measuring instrument
Wang et al. Distance measurement using single non-metric CCD camera
CN114509018A (en) Full-field real-time bridge deflection measurement method
CN109242911A (en) One kind being based on subregional binocular camera fundamental matrix calculation method
CN210154538U (en) Metal structure deformation measuring device based on machine vision
CN109493382B (en) Fixed star high-precision position extraction method based on intra-pixel response
CN210154537U (en) Metal structure deformation measuring device based on digital photography
CN114511498A (en) Propeller vibration measuring method based on monocular vision
CN108917632B (en) High-efficiency high-precision digital image correlation displacement post-processing method
CN110887488A (en) Unmanned rolling machine positioning method
CN110738706A (en) quick robot vision positioning method based on track conjecture
CN111121637A (en) Grating displacement detection method based on pixel coding

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant