CN111882618A - Left and right view feature point matching processing method, terminal and system in binocular ranging - Google Patents
Left and right view feature point matching processing method, terminal and system in binocular ranging

Info
- Publication number
- CN111882618A
- Authority
- CN
- China
- Prior art keywords
- matching
- binocular
- right view
- view
- feature points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
The application relates to a left and right view feature point matching processing method, terminal, and system in binocular ranging, and belongs to the technical field of binocular ranging. The application includes: extracting feature points from the left and right views of a target object and performing preliminary feature point matching by using a Euclidean distance algorithm; screening the preliminarily matched feature points of the left and right views by using a first preset formula; and obtaining a matching degree value interval range from the screened feature points of the left and right views, and performing strong matching extraction on the selected matching point pairs according to the minimum value of the interval range and a preset empirical constant. The application helps to improve the matching precision of the left and right view feature points in binocular ranging.
Description
Technical Field
The application belongs to the technical field of binocular ranging, and particularly relates to a left and right view feature point matching processing method, terminal, and system in binocular ranging.
Background
Stacking safety accidents occur frequently in hazardous chemical warehouses, so strengthening the supervision of how hazardous chemicals are stacked and placed deserves increasing attention. In the related art, vision-based distance-violation monitoring is performed on binocular surveillance video information, forming an intelligent and automated means of supervision.
Referring to fig. 1, fig. 1 is a schematic diagram of a conventional binocular distance measuring method. In fig. 1, f is the focal length of the cameras; O_R and O_T are the optical centers of the two cameras; B is the baseline, namely the distance between the optical centers of the two cameras; Z is the distance between the measured object P and the cameras; and P′ and P″ are the binocular imaging points. Based on the binocular distance measurement schematic diagram shown in fig. 1, the following distance measurement formula holds:
(|O_R − O_T|) / (|P′ − P″|) = Z / f (1),
wherein |O_R − O_T| is the baseline distance B; the baseline distance B and the focal length f can be obtained through binocular calibration, and |P′ − P″| is the pixel difference of the matching points of the left and right pictures.
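For illustration only (not part of the patent; the baseline, focal length, and pixel difference below are assumed example values), formula (1) rearranges to Z = B · f / |P′ − P″|, which the following Python sketch evaluates:

```python
# Illustrative sketch of formula (1), rearranged as Z = B * f / |P' - P''|.
# All numeric values are assumed examples, not measurements from the patent.

def depth_from_disparity(baseline_mm: float, focal_px: float, pixel_diff_px: float) -> float:
    """Distance Z of the measured object implied by formula (1)."""
    if pixel_diff_px <= 0:
        raise ValueError("pixel difference must be positive")
    return baseline_mm * focal_px / pixel_diff_px

# B = 120 mm, f = 1000 px, |P' - P''| = 25 px  ->  Z = 4800 mm (4.8 m)
print(depth_from_disparity(120.0, 1000.0, 25.0))
```

A poor matching point pair that shifts |P′ − P″| by even a few pixels changes Z substantially, which is why the matching accuracy addressed below matters.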
It can be seen from formula (1) that the requirements on the accuracy of the matching points, the calibration, and the like are high; the quality of the matching point information directly influences the value of the pixel coordinate difference |P′ − P″|. Referring to fig. 2, fig. 2 shows the left and right view matching point information obtained by the conventional SURF matching algorithm in a night vision environment. As shown in fig. 2, although the conventional SURF matching algorithm can detect matching information, it generates a large amount of matching point information: the numbers of matching points in the left and right views differ, a large amount of incorrect matching information is included, and the matching information is excessive. This results in low accuracy of the feature point matching information and adversely affects the subsequent ranging results.
Disclosure of Invention
To overcome the problems in the related art at least to some extent, the present application provides a left and right view feature point matching processing method, terminal, and system in binocular ranging, which improve the matching accuracy of the left and right view feature points in binocular ranging.
In a first aspect,
the application provides a left and right view feature point matching processing method in binocular ranging, which comprises the following steps:
extracting feature points according to left and right views of the target object, and performing preliminary feature point matching by using a Euclidean distance algorithm;
using a first predetermined formula:
screening the preliminarily matched feature points of the left and right views, wherein Match is the matching point pixel information, M is the maximum number of feature points of the left view, N is the maximum number of feature points of the right view, Det(H_i) is the feature point decision value of the i-th feature point of the left view, and Det(H_j) is the feature point decision value of the j-th feature point of the right view;
and obtaining a matching degree value interval range according to the screened feature points of the left and right views, and performing strong matching extraction on the selected matching point pairs according to the minimum value of the interval range and a preset empirical constant.
Further, the calculating a matching degree value interval range according to the screened feature points includes:
obtaining the matching degree value interval range;
wherein MatchMin is the minimum value of the matching degree value interval range, and MatchMax is the maximum value of the matching degree value interval range.
Further, the performing strong matching extraction on the selected matching point pairs according to the minimum value of the matching degree value interval range and a preset empirical constant includes:
using a third predetermined formula: Norm = [Match < a · min(Match)], performing strong matching extraction on the selected matching point pairs;
wherein a is an empirical constant.
Further, the empirical constant takes a value of 2.
Further, the extracting feature points according to the left and right views of the target object and performing preliminary feature point matching by using a Euclidean distance algorithm includes:
creating a Hessian matrix according to the left view and the right view;
filtering the Hessian matrix by using a preset filter to obtain characteristic points of the left view and the right view;
and carrying out primary matching on the feature points of the left view and the right view by utilizing a Euclidean distance algorithm.
Further, the filter includes: a Gaussian filter, or a box filter.
Further, the target object includes: a hazardous chemical stack.
In a second aspect of the present invention,
the application provides binocular range finding handles terminal includes:
one or more memories having executable programs stored thereon;
one or more processors configured to execute the executable program in the memory to implement the steps of any of the methods described above.
In a third aspect,
the application provides binocular range finding system includes:
the binocular ranging processing terminal is as described above.
Further, the binocular ranging system further includes:
and the binocular camera is connected with the binocular ranging processing terminal so that the binocular ranging processing terminal can acquire left and right views of the target object through the binocular camera.
By adopting the above technical solutions, the present application has at least the following beneficial effects:
through the method and the device, the traditional SURF matching algorithm is improved, and a strong characteristic value matching mechanism is added, so that the matching precision of the left view characteristic point and the right view characteristic point in binocular ranging is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a conventional binocular ranging;
FIG. 2 is a left and right view matching point information diagram under a night vision environment obtained by a conventional SURF matching algorithm;
fig. 3 is a flowchart illustrating a left and right view feature point matching processing method in binocular ranging according to an exemplary embodiment;
FIG. 4 is a left and right view matching point information diagram in a night vision environment obtained based on the left and right view feature point matching processing method in binocular ranging according to the present application;
fig. 5 is a block diagram configuration diagram of a binocular ranging processing terminal according to an exemplary embodiment;
fig. 6 is a block diagram configuration diagram illustrating a binocular ranging system according to an exemplary embodiment.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the technical solutions of the present application are described in detail below. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments herein without creative effort shall fall within the protection scope of the present application.
In a first aspect of the present application, referring to fig. 3, fig. 3 is a flowchart illustrating a left and right view feature point matching processing method in binocular ranging according to an exemplary embodiment. As shown in fig. 3, the method includes the following steps:
and step S11, extracting feature points according to the left and right views of the target object, and performing preliminary feature point matching by using an Euclidean distance algorithm.
Specifically, the left and right views of the target object can be acquired through a binocular camera. For example, in a night vision environment, a binocular rig formed by combining two HIKVISION DS-IPC-B12-I 1080p monocular cameras is used as the experimental device, and in a daytime environment, an RER-1MP2CAM002-V90 binocular camera with 1920×1080 resolution is used.
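For illustration only (the device index and side-by-side frame layout are assumptions, not details from the patent), such a binocular camera can be read and split into left and right views roughly as follows:

```python
import cv2

# Assumption for illustration: the binocular camera appears as a single
# device whose frames contain the left and right views side by side.
cap = cv2.VideoCapture(0)  # device index 0 is an assumed placeholder
ok, frame = cap.read()
if ok:
    h, w = frame.shape[:2]
    left_view = frame[:, : w // 2]   # left half of the frame -> left view
    right_view = frame[:, w // 2 :]  # right half of the frame -> right view
cap.release()
```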
As for the target object, a specific application takes the hazardous chemicals stacked in a hazardous chemical warehouse as an example: because hazardous chemicals are flammable, explosive, highly corrosive, and so on, improper handling during storage can easily lead to major accidents and threaten human life and property.
Step S11 corresponds to the conventional SURF algorithm, which determines the matching degree by calculating the Euclidean distance between the feature vectors of two feature points: the shorter the Euclidean distance, the better the matching degree of the two feature points.
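As a minimal sketch of this conventional step (an illustrative OpenCV version, not the patent's implementation; it assumes an opencv-contrib-python build, where SURF is provided by cv2.xfeatures2d, and the file names and Hessian threshold are placeholders):

```python
import cv2

# Load the left and right views (file names are assumed placeholders).
left = cv2.imread("left_view.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_view.png", cv2.IMREAD_GRAYSCALE)

# SURF is available in opencv-contrib builds via cv2.xfeatures2d.
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)

# Extract feature points and descriptors from both views.
kp_left, des_left = surf.detectAndCompute(left, None)
kp_right, des_right = surf.detectAndCompute(right, None)

# Preliminary matching by Euclidean (L2) distance: the shorter the
# distance, the better the match, as described above.
matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.match(des_left, des_right)
```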
For step S11, the following specific method can be used:
Step S101: create a Hessian matrix from the left and right views of the target object, as shown in formula (2):

H(I(x, y)) = | ∂²I/∂x²   ∂²I/∂x∂y |
             | ∂²I/∂x∂y  ∂²I/∂y²  |   (2)

wherein ∂²I/∂x² is the second derivative in the x direction between picture pixels, ∂²I/∂y² is the second derivative in the y direction between picture pixels, ∂²I/∂x∂y is the derivative in the y direction taken after one derivative in the x direction between picture pixels, H(I(x, y)) is the created Hessian matrix, and I(x, y) is the function being differentiated.
Step S102: filter the Hessian matrix by using a preset filter to obtain the feature points of the left and right views.
Specifically, Gaussian filtering may be performed on the Hessian, as shown in formula (3):

L(x, y, σ) = G(σ) * I(x, y) (3)

In formula (3), σ is the Gaussian filter scale parameter, (x, y) is a pixel position, L(x, y, σ) represents the Gaussian scale space of the image, obtained by convolving the image with Gaussians of different scales, and G(σ) is the Gaussian filter function. L_xx is the second derivative calculation, as shown in formula (4):
L_xx = L(x+1, y) + L(x−1, y) − 2L(x, y) (4)
L_xy and L_yy are calculated in a similar manner, wherein L_x is the first derivative calculation, as shown in formula (5):
L_x = L(x+1, y) − L(x, y) (5)
The determinant feature point decision value is then solved, as shown in formula (6):
Det(H) = L_xx · L_yy − (L_xy)² (6)
The Gaussian kernel follows a normal distribution, with coefficients that become smaller and smaller from the center point outward. To improve the operation speed, the SURF algorithm uses a box filter instead of the Gaussian filter, and L_xy is multiplied by a weighting factor of 0.9 to balance the error caused by the box filter approximation, as shown in formula (7):
Det(H) = L_xx · L_yy − (0.9 · L_xy)² (7)
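To make formulas (4) through (7) concrete, the following numpy sketch (for illustration only; border pixels wrap around here, which a real implementation would handle explicitly) evaluates the decision value Det(H) at every pixel of a filtered image L:

```python
import numpy as np

def decision_values(L: np.ndarray) -> np.ndarray:
    """Per-pixel Det(H) following formulas (4)-(7); L is the filtered image."""
    # Formula (4): L_xx = L(x+1, y) + L(x-1, y) - 2*L(x, y); L_yy analogously in y.
    Lxx = np.roll(L, -1, axis=1) + np.roll(L, 1, axis=1) - 2.0 * L
    Lyy = np.roll(L, -1, axis=0) + np.roll(L, 1, axis=0) - 2.0 * L
    # Formula (5) in x, then differenced again in y, gives the mixed term L_xy.
    Lx = np.roll(L, -1, axis=1) - L
    Lxy = np.roll(Lx, -1, axis=0) - Lx
    # Formula (7): weight L_xy by 0.9 to offset the box-filter approximation.
    return Lxx * Lyy - (0.9 * Lxy) ** 2
```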
Step S103: perform preliminary matching on the feature points of the left and right views using a Euclidean distance algorithm.
Specifically, matching degree values are calculated one by one as the Euclidean distances between the feature point decision values of the left and right views, thereby achieving preliminary matching.
Step S12: using a first preset formula:
screen the preliminarily matched feature points of the left and right views, where Match is the matching point pixel information, M is the maximum number of feature points of the left view, N is the maximum number of feature points of the right view, Det(H_i) is the feature point decision value of the i-th feature point of the left view, and Det(H_j) is the feature point decision value of the j-th feature point of the right view.
Step S13: obtain a matching degree value interval range according to the screened feature points of the left and right views, and perform strong matching extraction on the selected matching point pairs according to the minimum value of the interval range and a preset empirical constant.
Further, in step S13, calculating the matching degree value interval range according to the screened feature points includes:
using a second predetermined formula:
obtaining the matching degree value interval range;
wherein MatchMin is the minimum value of the matching degree value interval range, and MatchMax is the maximum value of the matching degree value interval range.
Further, in step S13, performing strong matching extraction on the selected matching point pairs according to the minimum value of the matching degree value interval range and a preset empirical constant includes:
using a third predetermined formula:
Norm = [Match < a · min(Match)] (10),
carrying out strong matching extraction on the selected matching point pairs;
wherein a is an empirical constant.
Further, the empirical constant takes a value of 2.
Specifically, according to formula (10) and based on empirical reference, a match whose value is greater than or equal to 2 times the minimum matching value is regarded as a mismatch or a weak match and is discarded, thereby achieving mismatch denoising and strong matching extraction, that is, reducing the amount of matching information while improving the matching accuracy. Referring to fig. 4, fig. 4 shows the left and right view matching point information obtained in a night vision environment by the left and right view feature point matching processing method in binocular ranging of the present application. Comparing fig. 4 with fig. 2, the matching point information obtained by the present method is more refined and the numbers of left and right view feature points are substantially consistent, so the purpose of improving the feature point matching accuracy can be achieved.
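Continuing the illustrative OpenCV variables from the earlier sketch (again for illustration, with the match distance standing in for the matching degree value), the interval range and the strong matching extraction of formula (10) with a = 2 reduce to a few lines:

```python
# Matching degree value interval range over the preliminary matches.
distances = [m.distance for m in matches]
match_min = min(distances)   # MatchMin of the interval range
match_max = max(distances)   # MatchMax of the interval range

# Formula (10) with empirical constant a = 2: keep only strong matches whose
# value is below a * min(Match); the rest are mismatches or weak matches.
a = 2.0
strong_matches = [m for m in matches if m.distance < a * match_min]
```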
In conclusion, through the above steps, the strong feature point matching algorithm of the present application improves upon the conventional SURF matching algorithm, thereby achieving the purpose of improving the feature point matching accuracy.
In a second aspect of the present application, referring to fig. 5, fig. 5 is a block diagram illustrating a binocular ranging processing terminal according to an exemplary embodiment. As shown in fig. 5, the binocular ranging processing terminal 10 includes:
one or more memories 1001 on which executable programs are stored;
one or more processors 1002 for executing the executable programs in the memory 1001 to implement the steps of any of the methods described above.
With regard to the binocular ranging processing terminal 10 in the above embodiment, the specific manner in which the processor 1002 executes the program in the memory 1001 has been described in detail in the embodiments related to the method and will not be elaborated here.
In a third aspect of the present application, referring to fig. 6, fig. 6 is a block diagram of a binocular ranging system according to an exemplary embodiment. As shown in fig. 6, the binocular ranging system 1 includes:
the binocular ranging processing terminal 10 as described above.
Further, the binocular ranging system further includes:
and the binocular camera 20 is connected with the binocular ranging processing terminal 10, so that the binocular ranging processing terminal 10 acquires left and right views of a target object through the binocular camera 20. With regard to the binocular ranging system 1 in the above embodiment, the specific operation thereof has been described in detail in the embodiment related to the method, and will not be explained in detail here.
With regard to the readable storage medium in the above embodiments, the specific manner in which the stored program executes its operations has been described in detail in the embodiments related to the method and will not be elaborated here.
It should be understood that the same or similar parts of the above embodiments may be referred to mutually, and for content not described in detail in one embodiment, reference may be made to the same or similar descriptions in other embodiments.
It should be noted that, in the description of the present application, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In addition, in the description of the present application, the meaning of "plurality" means at least two unless otherwise specified.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. Moreover, the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of the order shown or discussed, including in a substantially concurrent manner or in the reverse order, depending on the functionality involved, as would be understood by those skilled in the art of the embodiments of the present application.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having appropriate combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), and the like.
It will be understood by those skilled in the art that all or part of the steps of the above method embodiments may be implemented by hardware instructed by a program; the program may be stored in a computer-readable storage medium and, when executed, performs one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.
Claims (10)
1. A left and right view feature point matching processing method in binocular ranging, characterized by comprising the following steps:
extracting feature points according to left and right views of the target object, and performing preliminary feature point matching by using a Euclidean distance algorithm;
using a first predetermined formula:
screening the preliminarily matched feature points of the left and right views, wherein Match is the matching point pixel information, M is the maximum number of feature points of the left view, N is the maximum number of feature points of the right view, Det(H_i) is the feature point decision value of the i-th feature point of the left view, and Det(H_j) is the feature point decision value of the j-th feature point of the right view;
and obtaining a matching degree value interval range according to the screened feature points of the left and right views, and performing strong matching extraction on the selected matching point pairs according to the minimum value of the interval range and a preset empirical constant.
2. The method according to claim 1, wherein the calculating a matching degree value interval range according to the screened feature points comprises:
obtaining the matching degree value interval range;
wherein MatchMin is the minimum value of the matching degree value interval range, and MatchMax is the maximum value of the matching degree value interval range.
3. The method according to claim 1, wherein the performing strong matching extraction on the selected matching point pair according to the minimum value of the matching degree value interval range and a preset empirical constant comprises:
using a third predetermined formula: Norm = [Match < a · min(Match)], performing strong matching extraction on the selected matching point pairs;
wherein a is an empirical constant.
4. The method of claim 3, wherein the empirical constant takes on a value of 2.
5. The method according to any one of claims 1 to 4, wherein the extracting feature points according to the left and right views of the target object and performing preliminary feature point matching by using a Euclidean distance algorithm comprises:
creating a Hessian matrix according to the left view and the right view;
filtering the Hessian matrix by using a preset filter to obtain characteristic points of the left view and the right view;
and carrying out primary matching on the feature points of the left view and the right view by utilizing a Euclidean distance algorithm.
6. The method of claim 5, wherein the filter comprises: a Gaussian filter, or a box filter.
7. The method of claim 1, wherein the target object comprises: a hazardous chemical stack.
8. A binocular ranging processing terminal, characterized by comprising:
one or more memories having executable programs stored thereon;
one or more processors configured to execute the executable program in the memory to implement the steps of the method of any one of claims 1-7.
9. A binocular ranging system, characterized by comprising:
the binocular range processing terminal of claim 8.
10. The binocular ranging system of claim 9, further comprising:
and the binocular camera is connected with the binocular ranging processing terminal so that the binocular ranging processing terminal can acquire left and right views of the target object through the binocular camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010594989.9A CN111882618B (en) | 2020-06-28 | 2020-06-28 | Left-right view characteristic point matching processing method, terminal and system in binocular ranging |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010594989.9A CN111882618B (en) | 2020-06-28 | 2020-06-28 | Left-right view characteristic point matching processing method, terminal and system in binocular ranging |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111882618A true CN111882618A (en) | 2020-11-03 |
CN111882618B CN111882618B (en) | 2024-01-26 |
Family
ID=73157058
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010594989.9A Active CN111882618B (en) | 2020-06-28 | 2020-06-28 | Left-right view characteristic point matching processing method, terminal and system in binocular ranging |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111882618B (en) |
- 2020-06-28: CN application CN202010594989.9A granted as patent CN111882618B (status: Active)
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110134221A1 (en) * | 2009-12-07 | 2011-06-09 | Samsung Electronics Co., Ltd. | Object recognition system using left and right images and method |
CN103260043A (en) * | 2013-04-28 | 2013-08-21 | 清华大学 | Binocular stereo image matching method and system based on learning |
KR101705330B1 (en) * | 2015-09-30 | 2017-02-10 | 인하대학교 산학협력단 | Keypoints Selection method to Find the Viewing Angle of Objects in a Stereo Camera Image |
CN105894574A (en) * | 2016-03-30 | 2016-08-24 | 清华大学深圳研究生院 | Binocular three-dimensional reconstruction method |
US20180357788A1 (en) * | 2016-08-11 | 2018-12-13 | Changzhou Campus of Hohai University | UAV Inspection Method for Power Line Based on Human Visual System |
CN107705333A (en) * | 2017-09-21 | 2018-02-16 | 歌尔股份有限公司 | Space-location method and device based on binocular camera |
CN108388854A (en) * | 2018-02-11 | 2018-08-10 | 重庆邮电大学 | A kind of localization method based on improvement FAST-SURF algorithms |
CN109631829A (en) * | 2018-12-17 | 2019-04-16 | 南京理工大学 | A kind of binocular distance measuring method of adaptive Rapid matching |
CN109816051A (en) * | 2019-02-25 | 2019-05-28 | 北京石油化工学院 | A kind of harmful influence cargo characteristic point matching method and system |
CN109919247A (en) * | 2019-03-18 | 2019-06-21 | 北京石油化工学院 | Characteristic point matching method, system and equipment in harmful influence stacking binocular ranging |
CN110070610A (en) * | 2019-04-17 | 2019-07-30 | 精伦电子股份有限公司 | The characteristic point matching method and device of characteristic point matching method, three-dimensionalreconstruction process |
CN111062990A (en) * | 2019-12-13 | 2020-04-24 | 哈尔滨工程大学 | Binocular vision positioning method for underwater robot target grabbing |
Non-Patent Citations (4)
Title |
---|
YINCEN ZHAO et al.: "Aircraft relative attitude measurement based on binocular vision", Optical Sensing and Imaging Technology and Applications, pages 1-9 *
LIU Xuejun et al.: "Research on an Improved SURF Matching Algorithm for Binocular Ranging of Hazardous Chemical Stacks", Computers and Applied Chemistry, vol. 36, no. 4, pages 345-349 *
WANG Yongfeng et al.: "Research on Binocular Vision Feature Point Localization Based on the SURF Algorithm", Software Engineering, vol. 22, no. 3, pages 5-8 *
WENG Xiuling; WANG Yunfeng; YU Xiaoliang: "A Method for Eliminating and Correcting Mismatched Points in Stereo Image Matching", Software, no. 10, pages 28-32 *
Also Published As
Publication number | Publication date |
---|---|
CN111882618B (en) | 2024-01-26 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||