CN111882618B - Left-right view characteristic point matching processing method, terminal and system in binocular ranging - Google Patents
- Publication number: CN111882618B (application CN202010594989.9A)
- Authority
- CN
- China
- Prior art keywords
- matching
- binocular
- right view
- view
- feature points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
Abstract
The application relates to a left-right view feature point matching processing method, terminal and system in binocular ranging, and belongs to the technical field of binocular ranging. The method comprises the following steps: extracting feature points from the left and right views of a target object, and performing preliminary feature point matching with a Euclidean distance algorithm; screening the preliminarily matched feature points of the left and right views with a first preset formula; and obtaining a matching degree value interval range from the screened feature points of the left and right views, and performing strong matching extraction on the selected matching point pairs according to the minimum value of the interval range and a preset empirical constant. The method improves the matching precision of the left and right view feature points in binocular ranging.
Description
Technical Field
The application belongs to the technical field of binocular distance measurement, and particularly relates to a left-right view characteristic point matching processing method, a terminal and a system in binocular distance measurement.
Background
Stacking safety accidents occur frequently in dangerous chemical warehouses, so supervision of the stacking and placement of dangerous chemicals is becoming more and more important. In the related art, visual monitoring of distance violations is performed from binocular surveillance video information, forming an intelligent, automated monitoring means.
Referring to fig. 1, fig. 1 is a schematic diagram of conventional binocular distance measurement. In fig. 1, $f$ is the focal length of the cameras; $O_{l}$ and $O_{r}$ are the optical centers of the two cameras; $B$ is the baseline, i.e. the distance between the two optical centers; $Z$ is the distance between the measured object $P$ and the cameras; and $P_{l}$ and $P_{r}$ are the binocular imaging points of $P$. Based on the binocular ranging schematic shown in fig. 1, the following ranging formula holds:

$$Z = \frac{f \cdot B}{x_{l} - x_{r}} = \frac{f \cdot B}{d} \tag{1}$$

wherein the baseline distance $B$ and the focal length $f$ can both be obtained by binocular calibration, and $d = x_{l} - x_{r}$ is the pixel difference between the matched points of the left and right pictures.
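As a concrete illustration of formula (1), the following minimal Python sketch computes $Z$ from $f$, $B$ and $d$; the numeric values are illustrative placeholders, not calibration results:

```python
def binocular_distance(f_pixels: float, baseline_m: float, disparity_px: float) -> float:
    """Distance Z of the measured object P from the camera, per formula (1): Z = f*B/d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_pixels * baseline_m / disparity_px

# Illustrative values: f = 1000 px, B = 0.12 m, d = 24 px  ->  Z = 5.0 m
print(binocular_distance(1000.0, 0.12, 24.0))
```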
According to formula (1), the accuracy requirements on the matching points, the calibration, and the like are high. In particular, the quality of the matching point information directly affects the value of the pixel coordinate difference $d$. Referring to fig. 2, fig. 2 shows the left-right view matching point information obtained in a night vision environment by a conventional SURF matching algorithm. As shown in fig. 2, although the conventional SURF matching algorithm can detect matching information, it generates a large amount of matching point information, the numbers of matching points in the left and right views differ, and much of the information is incorrect. This excess of matching information means that the accuracy of the feature point matching information is low, which adversely affects subsequent ranging results.
Disclosure of Invention
In order to overcome the problems in the related art at least to a certain extent, the application provides a left-right view feature point matching processing method, terminal and system in binocular ranging, which help improve the matching precision of the left and right view feature points in binocular ranging.
In a first aspect of the present invention,
the application provides a left-right view characteristic point matching processing method in binocular ranging, which comprises the following steps:
extracting feature points according to the left and right views of the target object, and performing preliminary feature point matching by using an Euclidean distance algorithm;
using a first preset formula:

$$\mathrm{Match}(i,j) = \sqrt{\left(H_{l}^{i} - H_{r}^{j}\right)^{2}}\,, \qquad 1 \le i \le N_{l},\ 1 \le j \le N_{r}$$

screening the feature points preliminarily matched between the left view and the right view, wherein $\mathrm{Match}(i,j)$ is the matching point pixel information, $N_{l}$ is the maximum number of feature points of the left view, $N_{r}$ is the maximum number of feature points of the right view, $H_{l}^{i}$ is the feature point decision value of the $i$-th feature point of the left view, and $H_{r}^{j}$ is the feature point decision value of the $j$-th feature point of the right view;
and obtaining a matching degree value interval range from the screened feature points of the left and right views, and performing strong matching extraction on the selected matching point pairs according to the minimum value of the matching degree value interval range and a preset empirical constant.
Further, the calculating of the matching degree value interval range from the screened feature points includes:

using a second preset formula:

$$[\mathrm{MatchMin},\ \mathrm{MatchMax}] = \left[\min_{i,j} \mathrm{Match}(i,j),\ \max_{i,j} \mathrm{Match}(i,j)\right]$$

to obtain the matching degree value interval range, wherein $\mathrm{MatchMin}$ is the minimum value of the matching degree value interval range and $\mathrm{MatchMax}$ is the maximum value of the matching degree value interval range.
Further, the performing of strong matching extraction on the selected matching point pairs according to the minimum value of the matching degree value interval range and a preset empirical constant includes:

based on an empirical reference, regarding any matching point pair whose matching degree value is greater than or equal to twice $\mathrm{MatchMin}$ as an error match or weak match, thereby realizing error-match denoising and strong matching extraction.
Further, the extracting the feature points according to the left and right views of the target object, and performing preliminary feature point matching by using a euclidean distance algorithm, includes:
creating a Hessian matrix according to the left view and the right view;
filtering the Hessian matrix by using a preset filter to obtain characteristic points of the left and right views;
and carrying out preliminary matching on the characteristic points of the left view and the right view by using an Euclidean distance algorithm.
Further, the filter includes: a Gaussian filter or a box filter.
Further, the target object includes: a stack of dangerous chemicals.
In a second aspect of the present invention,
the application provides a binocular range processing terminal, include:
one or more memories having executable programs stored thereon;
one or more processors configured to execute the executable program in the memory to implement the steps of any of the methods described above.
In a third aspect of the present invention,
the application provides a binocular ranging system comprising:
the binocular ranging process terminal as described above.
Further, the binocular ranging system further comprises:
and the binocular camera is connected with the binocular distance processing terminal, so that the binocular distance processing terminal obtains left and right views of the target object through the binocular camera.
The technical scheme adopted by the application has at least the following beneficial effects:

the traditional SURF matching algorithm is improved, and a strong feature value matching mechanism is added, so that the matching precision of the left and right view feature points in binocular ranging is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained from these drawings by a person skilled in the art without inventive effort.
FIG. 1 is a schematic diagram of a conventional binocular range finding;
FIG. 2 is a left-right view matching point information diagram of a night vision environment obtained by a conventional SURF matching algorithm;
FIG. 3 is a flowchart illustrating a method of left and right view feature point matching processing in binocular ranging according to an exemplary embodiment;
fig. 4 is a left-right view matching point information diagram in a night vision environment based on a left-right view characteristic point matching processing method in binocular ranging of the present application;
FIG. 5 is a block diagram of an exemplary embodiment of a binocular range processing terminal;
FIG. 6 is a block diagram of an exemplary embodiment of a binocular ranging system.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application are described in detail below. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort shall fall within the protection scope of the present application.
In a first aspect of the present application, please refer to fig. 3, fig. 3 is a flowchart illustrating a method for matching left and right view feature points in binocular ranging according to an exemplary embodiment, as shown in fig. 3, the method for matching left and right view feature points in binocular ranging includes the following steps:
and S11, extracting characteristic points according to the left and right views of the target object, and performing preliminary characteristic point matching by using a Euclidean distance algorithm.
Specifically, the left and right views of the target object can be obtained through a binocular camera. For example, a binocular rig assembled from two HIKVISION DS-IPC-B12-I monocular 1080p cameras is used as the experimental equipment in a night vision environment, and an RER-1MP2CAM002-V90 binocular camera with 1920×1080 resolution is used as the experimental equipment in a daytime environment.
As for the target object, a specific application takes stacked dangerous chemicals in a dangerous chemical warehouse as an example: dangerous chemicals are flammable, explosive, highly corrosive, and so on, and improper handling during storage easily leads to major accidents that threaten human life and property.
Step S11 corresponds to a step of the traditional SURF algorithm, in which the matching degree is determined by calculating the Euclidean distance between the feature vectors of two feature points; the shorter the Euclidean distance, the better the matching degree of the two feature points.
For step S11, this may be achieved by:
step S101, a Hessian matrix is created according to left and right views of a target object, and the formula is shown as (2):
(2)
wherein,is a picture imageSecondary conduction in the x-direction between elements->For the second conduction in y direction between picture pixels, < >>Deriving y-direction after once deriving x-direction among picture pixels, and performing +.>For the created Hessian matrix, +.>For a derivative function.
Step S102, filtering the Hessian matrix by using a preset filter to obtain the feature points of the left and right views.
Specifically, the Hessian may be Gaussian filtered, as shown in formula (3):

$$L_{xx}(x,\sigma) = \frac{\partial^{2} G(x,\sigma)}{\partial x^{2}} \otimes I(x) \tag{3}$$

In formula (3), $G(x,\sigma)$ is the Gaussian filter function value, $x$ is the pixel position, $L_{xx}(x,\sigma)$ represents the Gaussian scale space of the image, obtained by convolving the image $I$ with Gaussians of different scales, and $g$ is the Gaussian filter function. $\partial^{2} g(\sigma)/\partial x^{2}$ is the second derivative calculation mode, as shown in formula (4):

$$\frac{\partial^{2} g(\sigma)}{\partial x^{2}} = \frac{x^{2} - \sigma^{2}}{\sigma^{4}}\, g(\sigma) \tag{4}$$

$L_{xy}$ and $L_{yy}$ are obtained similarly, wherein $\partial g(\sigma)/\partial x$ is the first derivative calculation mode, as shown in formula (5):

$$\frac{\partial g(\sigma)}{\partial x} = -\frac{x}{\sigma^{2}}\, g(\sigma) \tag{5}$$
Then the determinant feature point decision value is solved, as shown in formula (6):

$$\det(H) = L_{xx} L_{yy} - L_{xy}^{2} \tag{6}$$
the Gaussian kernel follows the forward distribution, the coefficients are smaller and smaller from the center point to the outside, and in order to improve the operation speed, the SURF algorithm uses a box filter to replace the Gaussian filter, so thatThe weighting factor 0.9 is multiplied in order to balance the error balance due to the approximation using the box filter, as shown in the following equation (7):
(7)
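The decision values of formulas (2)-(7) can be sketched as follows. This is a non-authoritative illustration that uses true Gaussian second-derivative responses via SciPy in place of the box-filter responses $D_{xx}$, $D_{yy}$, $D_{xy}$ that SURF actually uses; `sigma` is an assumed scale parameter:

```python
import numpy as np
from scipy import ndimage

def hessian_decision_values(image: np.ndarray, sigma: float = 1.2) -> np.ndarray:
    """Approximate feature point decision values det(H) per formulas (6)-(7)."""
    img = image.astype(np.float64)
    # Gaussian scale-space second derivatives (SURF approximates these with box filters)
    l_xx = ndimage.gaussian_filter(img, sigma, order=(0, 2))  # second derivative in x
    l_yy = ndimage.gaussian_filter(img, sigma, order=(2, 0))  # second derivative in y
    l_xy = ndimage.gaussian_filter(img, sigma, order=(1, 1))  # mixed xy derivative
    # Weighting factor 0.9 balances the box-filter approximation error, per formula (7)
    return l_xx * l_yy - (0.9 * l_xy) ** 2
```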
and step S103, performing preliminary matching on the characteristic points of the left view and the right view by using a Euclidean distance algorithm.
Specifically, the matching degree values are calculated one by one as the Euclidean distance between the feature point decision values of the left and right views, thereby realizing preliminary matching.
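Step S11 as a whole can be sketched with OpenCV as follows, assuming an OpenCV build that ships the non-free xfeatures2d module (e.g. an opencv-contrib-python build with SURF enabled); the image file names and the Hessian threshold are assumptions:

```python
import cv2

img_l = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)   # assumed file names
img_r = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
kp_l, des_l = surf.detectAndCompute(img_l, None)
kp_r, des_r = surf.detectAndCompute(img_r, None)

# Preliminary matching by Euclidean (L2) distance between descriptors:
# the shorter the distance, the better the preliminary match.
bf = cv2.BFMatcher(cv2.NORM_L2)
matches = bf.match(des_l, des_r)
```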
Step S12, using a first preset formula:

$$\mathrm{Match}(i,j) = \sqrt{\left(H_{l}^{i} - H_{r}^{j}\right)^{2}}\,, \qquad 1 \le i \le N_{l},\ 1 \le j \le N_{r} \tag{8}$$

the feature points preliminarily matched between the left view and the right view are screened, wherein $\mathrm{Match}(i,j)$ is the matching point pixel information, $N_{l}$ is the maximum number of feature points of the left view, $N_{r}$ is the maximum number of feature points of the right view, $H_{l}^{i}$ is the feature point decision value of the $i$-th feature point of the left view, and $H_{r}^{j}$ is the feature point decision value of the $j$-th feature point of the right view.
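A minimal NumPy sketch of one reading of formula (8) is given below; `H_l` and `H_r` are assumed to be 1-D arrays of the decision values of the $N_{l}$ left and $N_{r}$ right feature points, and the acceptance threshold is an assumption not fixed by the text:

```python
import numpy as np

def screen_pairs(H_l: np.ndarray, H_r: np.ndarray, threshold: float) -> np.ndarray:
    """Pairwise matching degree values Match(i, j) = |H_l^i - H_r^j|, screened by a threshold."""
    match = np.sqrt((H_l[:, None] - H_r[None, :]) ** 2)  # formula (8), all (i, j) pairs
    return np.argwhere(match <= threshold)               # surviving index pairs (i, j)
```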
Step S13, obtaining a matching degree value interval range from the screened feature points of the left and right views, and performing strong matching extraction on the selected matching point pairs according to the minimum value of the matching degree value interval range and a preset empirical constant.
Further, in step S13, the calculating of the matching degree value interval range from the screened feature points includes:

using a second preset formula:

$$[\mathrm{MatchMin},\ \mathrm{MatchMax}] = \left[\min_{i,j} \mathrm{Match}(i,j),\ \max_{i,j} \mathrm{Match}(i,j)\right] \tag{9}$$

to obtain the matching degree value interval range, wherein $\mathrm{MatchMin}$ is the minimum value of the matching degree value interval range and $\mathrm{MatchMax}$ is the maximum value of the matching degree value interval range.
Further, in step S13, the performing of strong matching extraction on the selected matching point pairs according to the minimum value of the matching degree value interval range and a preset empirical constant includes:

using a third preset formula:

$$\mathrm{Match}(i,j) \in \left[\mathrm{MatchMin},\ k \cdot \mathrm{MatchMin}\right) \tag{10}$$

to perform strong matching extraction on the selected matching point pairs, wherein $k$ is an empirical constant. Further, the empirical constant takes the value 2.
Specifically, through formula (10), any matching pair whose matching degree value is greater than or equal to twice $\mathrm{MatchMin}$ is, based on an empirical reference, regarded as an error match or a weak match; discarding such pairs realizes error-match denoising and strong matching extraction, i.e., the amount of matching information is reduced and the matching precision is improved. Referring to fig. 4, fig. 4 shows the left-right view matching point information in a night vision environment obtained by the left-right view feature point matching processing method in binocular ranging of the present application. Comparing fig. 4 with fig. 2, the feature points obtained by the method of the present application are more refined and the numbers of left and right view feature points are basically consistent, so the purpose of improving feature point matching precision is achieved.
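Continuing the OpenCV sketch from step S11, formulas (9) and (10) reduce to the following filter over the preliminary match distances; `matches` is the list produced there and `K = 2` is the preset empirical constant:

```python
distances = [m.distance for m in matches]
match_min, match_max = min(distances), max(distances)  # formula (9): [MatchMin, MatchMax]

K = 2  # preset empirical constant
# Formula (10): pairs at or above K * MatchMin are error/weak matches and are discarded.
strong_matches = [m for m in matches if m.distance < K * match_min]
```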
In summary, through the above steps, the strong feature point matching algorithm of the present application can be implemented, improving on the traditional SURF matching algorithm and achieving the purpose of improving feature point matching precision.
In a third aspect of the present application, referring to fig. 5, fig. 5 is a block diagram of a binocular ranging processing terminal according to an exemplary embodiment. As shown in fig. 5, the binocular ranging processing terminal 10 includes:
one or more memories 1001 on which executable programs are stored;
one or more processors 1002 configured to execute the executable programs in the memory 1001 to implement the steps of any of the methods described above.

The specific manner in which the processor 1002 executes the programs in the memory 1001 of the binocular ranging processing terminal 10 has been described in detail in the embodiments relating to the method, and will not be repeated here.
In a fourth aspect of the present application, referring to fig. 6, fig. 6 is a block diagram of a binocular ranging system according to an exemplary embodiment. As shown in fig. 6, the binocular ranging system 1 includes:
the dual vision distance processing terminal 10 as described above.
Further, the binocular ranging system further comprises:
and a binocular camera 20 connected to the binocular distance processing terminal 10 such that the binocular distance processing terminal 10 acquires left and right views of a target object through the binocular camera 20. The detailed operation of the binocular distance measuring system 1 in the above embodiment has been described in detail in the embodiments related to the method, and will not be described in detail herein.
It is to be understood that the same or similar parts in the above embodiments may be referred to each other, and that in some embodiments, the same or similar parts in other embodiments may be referred to.
It should be noted that in the description of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Furthermore, in the description of the present application, unless otherwise indicated, the meaning of "plurality", "multiple" means at least two.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, segment, or portion of code including one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application includes alternative implementations in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, depending on the functionality involved, as would be understood by those skilled in the art to which the embodiments of the present application pertain.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGA), field programmable gate arrays (FPGA), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and where the program, when executed, includes one or a combination of the steps of the method embodiments.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the application, and that variations, modifications, alternatives, and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the application.
Claims (6)
1. A left-right view feature point matching processing method in binocular ranging, characterized by comprising the following steps:
extracting feature points according to the left and right views of the target object, and performing preliminary feature point matching by using an Euclidean distance algorithm;
using a first preset formula:
$$\mathrm{Match}(i,j) = \sqrt{\left(H_{l}^{i} - H_{r}^{j}\right)^{2}}\,, \qquad 1 \le i \le N_{l},\ 1 \le j \le N_{r}$$

screening the feature points preliminarily matched between the left view and the right view, wherein $\mathrm{Match}(i,j)$ is the matching point pixel information, $N_{l}$ is the maximum number of feature points of the left view, $N_{r}$ is the maximum number of feature points of the right view, $H_{l}^{i}$ is the feature point decision value of the $i$-th feature point of the left view, and $H_{r}^{j}$ is the feature point decision value of the $j$-th feature point of the right view;
obtaining a matching degree value interval range from the screened feature points of the left view and the right view, and performing strong matching extraction on the selected matching point pairs according to the minimum value of the matching degree value interval range and a preset empirical constant;
calculating a matching degree value interval range according to the screened characteristic points, wherein the calculating comprises the following steps:
using a second preset formula:

$$[\mathrm{MatchMin},\ \mathrm{MatchMax}] = \left[\min_{i,j} \mathrm{Match}(i,j),\ \max_{i,j} \mathrm{Match}(i,j)\right]$$

to obtain the matching degree value interval range, wherein $\mathrm{MatchMin}$ is the minimum value of the matching degree value interval range and $\mathrm{MatchMax}$ is the maximum value of the matching degree value interval range;
and the performing of strong matching extraction on the selected matching point pairs according to the minimum value of the matching degree value interval range and a preset empirical constant comprises:

based on an empirical reference, regarding any matching point pair whose matching degree value is greater than or equal to twice MatchMin as an error match or weak match, thereby realizing error-match denoising and strong matching extraction;
the extracting of the characteristic points according to the left and right views of the target object and the preliminary characteristic point matching by using the Euclidean distance algorithm comprise:
creating a Hessian matrix according to the left view and the right view;
filtering the Hessian matrix by using a preset filter to obtain characteristic points of the left and right views;
and carrying out preliminary matching on the characteristic points of the left view and the right view by using an Euclidean distance algorithm.
2. The method of claim 1, wherein the filter comprises: gaussian filter, or box filter.
3. The method of claim 1, wherein the target object comprises: a stack of dangerous chemicals.
4. A binocular ranging processing terminal, characterized by comprising:
one or more memories having executable programs stored thereon;
one or more processors configured to execute the executable program in the memory to implement the steps of the method of any one of claims 1-3.
5. A binocular ranging system, characterized by comprising:

the binocular ranging processing terminal of claim 4.
6. The binocular ranging system of claim 5, further comprising:
and the binocular camera is connected with the binocular distance processing terminal, so that the binocular distance processing terminal obtains left and right views of the target object through the binocular camera.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010594989.9A | 2020-06-28 | 2020-06-28 | Left-right view characteristic point matching processing method, terminal and system in binocular ranging (granted as CN111882618B) |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN111882618A | 2020-11-03 |
| CN111882618B | 2024-01-26 |
Family

ID=73157058

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202010594989.9A (Active) | Left-right view characteristic point matching processing method, terminal and system in binocular ranging | 2020-06-28 | 2020-06-28 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN111882618B (en) |
Citations (10)
Publication number | Priority date | Publication date | Title |
---|---|---|---|
CN103260043A (en) * | 2013-04-28 | 2013-08-21 | 清华大学 | Binocular stereo image matching method and system based on learning |
CN105894574A (en) * | 2016-03-30 | 2016-08-24 | 清华大学深圳研究生院 | Binocular three-dimensional reconstruction method |
KR101705330B1 (en) * | 2015-09-30 | 2017-02-10 | 인하대학교 산학협력단 | Keypoints Selection method to Find the Viewing Angle of Objects in a Stereo Camera Image |
CN107705333A (en) * | 2017-09-21 | 2018-02-16 | 歌尔股份有限公司 | Space-location method and device based on binocular camera |
CN108388854A (en) * | 2018-02-11 | 2018-08-10 | 重庆邮电大学 | A kind of localization method based on improvement FAST-SURF algorithms |
CN109631829A (en) * | 2018-12-17 | 2019-04-16 | 南京理工大学 | A kind of binocular distance measuring method of adaptive Rapid matching |
CN109816051A (en) * | 2019-02-25 | 2019-05-28 | 北京石油化工学院 | A kind of harmful influence cargo characteristic point matching method and system |
CN109919247A (en) * | 2019-03-18 | 2019-06-21 | 北京石油化工学院 | Characteristic point matching method, system and equipment in harmful influence stacking binocular ranging |
CN110070610A (en) * | 2019-04-17 | 2019-07-30 | 精伦电子股份有限公司 | The characteristic point matching method and device of characteristic point matching method, three-dimensionalreconstruction process |
CN111062990A (en) * | 2019-12-13 | 2020-04-24 | 哈尔滨工程大学 | Binocular vision positioning method for underwater robot target grabbing |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20110064197A (en) * | 2009-12-07 | 2011-06-15 | 삼성전자주식회사 | Object recognition system and method the same |
CN106356757B (en) * | 2016-08-11 | 2018-03-20 | 河海大学常州校区 | A kind of power circuit unmanned plane method for inspecting based on human-eye visual characteristic |
- 2020-06-28: Application filed as CN202010594989.9A; patent granted as CN111882618B (Active)
Non-Patent Citations (4)

Title |
---|
Aircraft relative attitude measurement based on binocular vision; Yincen Zhao et al.; Optical Sensing and Imaging Technology and Applications; pp. 1-9 *
A mismatched-point elimination and correction method for image stereo matching; 翁秀玲, 王云峰, 余小亮; Software (10); pp. 28-32 *
Research on an improved SURF matching algorithm for binocular ranging of dangerous chemical stacks; 刘学君 et al.; Computers and Applied Chemistry; Vol. 36, No. 4; pp. 345-349 *
Research on binocular vision feature point localization based on the SURF algorithm; 王永锋 et al.; Software Engineering; Vol. 22, No. 3; pp. 5-8 *
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |