CN102607510A - Three-dimensional distance measuring method based on sparse representation - Google Patents
- Publication number: CN102607510A (application CN201210009372.1)
- Authority: CN (China)
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Abstract
The invention discloses a three-dimensional (stereoscopic) ranging method based on sparse representation, comprising the following steps: capturing images of a target object in the same scene from a long distance with a binocular camera system, where the depth variation of the target is negligible relative to the shooting distance; performing long-distance calibration of the binocular cameras, acquiring images, and applying epipolar rectification; detecting the target in the left and right images and enhancing its features; projecting each image to obtain two one-dimensional signals and computing the integer-pixel disparity; expanding one signal into an over-complete dictionary of atoms in the frequency domain using digital fractional delay units; sparsely decomposing the phase-frequency signal of the other signal and computing the sub-pixel disparity from the decomposition coefficients; and combining the integer-pixel and sub-pixel disparities to compute the depth of the target object by the principle of stereoscopic vision. The method achieves long-distance target ranging with high accuracy by computing the disparity at sub-pixel precision; because the over-complete dictionary of atoms is obtained by expansion, it is more targeted, the computation is simplified, and execution is fast.
Description
Technical field
The present invention relates to stereoscopic ranging methods, and in particular to a stereoscopic ranging method based on sparse representation.
Background technology
With the continuous development of digital imaging technology, how to decompose and represent image information efficiently has become an important step in image processing. Commonly used signal decompositions choose a complete set of orthogonal bases and project the signal onto them. The classical Fourier decomposition and its variant the DCT, the short-time Fourier transform, the wavelet transform, and so on all have important applications in image processing. Since orthogonal decompositions have certain limitations, many scholars have in recent years devoted themselves to the study of non-orthogonal decompositions. Sparse decomposition, a brand-new research direction, was gradually formed by Mallat and Zhang on the basis of wavelet analysis after they proposed the idea of decomposing signals over over-complete dictionaries.
The sparse representation of a signal differs from traditional orthogonal decomposition in that it allows a more flexible, concise, and adaptive expression of the signal. By decomposing the signal over an over-complete dictionary of atoms, the basis used to represent the signal can be chosen flexibly and adaptively according to the time-frequency characteristics of the signal itself, finally yielding a very concise expression of the signal, i.e. its sparse representation. The process of obtaining the sparse representation of a signal is called sparse decomposition.
For different application domains, a series of sparse representation theories have been proposed, such as Wavelet, Ridgelet, Curvelet, Brushlet, Wedgelet, Beamlet, Contourlet, Bandlet, Gabor, etc. These theories are all based on atoms of canonical form and describe the original signal with a few large coefficients, so that the signal can be approximated by a small number of larger sparse coefficients. However, they lack specificity, which brings great computational complexity and large time consumption.
Summary of the invention
The objective of the present invention is to overcome the deficiencies of the prior art by providing a stereoscopic ranging method based on sparse representation.
The steps of the stereoscopic ranging method based on sparse representation are as follows:
1) Image preprocessing
Calibrate the optical axes of the two cameras to be parallel and at equal height, using Zhang Zhengyou's calibration method to obtain the intrinsic and extrinsic camera parameters: baseline, focal length, radial distortion parameters, tangential distortion parameters, the image coordinates of the optical center, the rotation matrix, and the translation matrix. Apply epipolar rectification to the images, then perform target detection on the rectified images to extract the targets of interest.
2) Target feature enhancement
Based on the detection result, apply histogram equalization to the target region; perform edge detection on the equalized region to extract the high-frequency features of that part of the image; fuse the detection result and the edge detection result with weighting, so that both the low- and high-frequency information of the target is retained while image noise caused by illumination and camera characteristics is filtered out.
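As an illustration of step 2), the enhancement chain (histogram equalization, edge extraction, weighted fusion) can be sketched in NumPy. The gradient-magnitude edge detector and the weight `w` are assumptions for this sketch; the text does not fix a particular edge operator or fusion weight:

```python
import numpy as np

def equalize_hist(img):
    """Histogram equalization of an 8-bit grayscale region."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())   # normalize CDF to [0, 1]
    return (cdf[img] * 255.0).astype(np.uint8)

def edge_magnitude(img):
    """High-frequency features: gradient magnitude by central differences."""
    gy, gx = np.gradient(img.astype(np.float64))
    return np.hypot(gx, gy)

def enhance_target(region, w=0.7):
    """Weighted fusion of the equalized region (low frequencies)
    and its edge map (high frequencies)."""
    eq = equalize_hist(region).astype(np.float64)
    ed = edge_magnitude(eq)
    ed = 255.0 * ed / (ed.max() + 1e-12)                # rescale edges to [0, 255]
    return w * eq + (1.0 - w) * ed
```

A larger `w` favors the equalized intensities; a smaller `w` favors the edge map.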
3) Integer-pixel disparity computation
For the fused images above, sum the pixel gray values of the left and right views along the vertical direction to obtain two spatial-domain one-dimensional signals sig1 and sig2. Apply an operation similar to cross-correlation to the two signals and, by global matching, find the best Δn that maximizes E[Δn] = ∑ sig2[n+Δn] × sig1[n]; this Δn is the integer-pixel disparity Dis.
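Step 3) amounts to a vertical projection followed by an exhaustive correlation search. A minimal NumPy sketch, with the search range `max_shift` as an assumed parameter:

```python
import numpy as np

def vertical_projection(img):
    """Collapse a 2-D view into a 1-D signal: sum gray values in each column."""
    return img.sum(axis=0).astype(np.float64)

def integer_disparity(sig1, sig2, max_shift=32):
    """Find the Δn maximizing E[Δn] = Σ sig2[n+Δn]·sig1[n] (integer-pixel Dis)."""
    best_dn, best_e = 0, -np.inf
    n = np.arange(len(sig1))
    for dn in range(-max_shift, max_shift + 1):
        idx = n + dn
        valid = (idx >= 0) & (idx < len(sig2))
        e = np.dot(sig2[idx[valid]], sig1[valid])
        if e > best_e:
            best_e, best_dn = e, dn
    return best_dn
```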
4) Building the over-complete dictionary of atoms
According to the integer-pixel disparity Dis, shift the one-dimensional signal sig1 to get sig1'[n] = sig1[n−Δn]. In the ideal case, sig1' and the one-dimensional signal sig2 now differ only by a sub-pixel disparity, i.e. sig1'[n] = sig1'(nT) = sig2(nT−Δt), |Δt| < T, where T is the signal sampling interval. Apply the FFT to sig1' and sig2 respectively, transforming the spatial-domain one-dimensional signals into frequency-domain magnitude and phase signals.
Let SIG1 = FFT(sig1') and SIG2 = FFT(sig2); then
SIG1(e^jω) = e^(−jωΔt/T) × SIG2(e^jω), |ω| < π, |Δt| < T.
From this formula, |SIG1(e^jω)| = |SIG2(e^jω)|, and the phase spectra differ only by the linear term −ωΔt/T, |Δt| < T.
Let the phase-frequency characteristics of the left and right one-dimensional signals be φsig1 and φsig2, respectively.
Pass the spatial-domain one-dimensional signal sig2 through a bank of digital fractional delay units whose frequency responses are
H_d(jω) = e^(−jωiΔt/T), |ω| < π, N_t × Δt = T, −N_t/2 ≤ i < N_t/2,
obtaining a series of responses. The phase-frequency signals after the FFT are taken as the atoms of the over-complete dictionary:
LibSig_i = φ(SIG2(e^jω)) − ω(i − N_t/2)/N_t, |ω| < π, N_t × Δt = T, 0 ≤ i < N_t
= φ(SIG2(e^jω)) − π × n × (i − N_t/2)/(N × N_t), 0 ≤ n < N,
where N_t × Δt = T, 0 ≤ i < N_t, N_t is the number of atoms, N is the signal length, N_t >> N, and the values of LibSig_i are confined to 0 ~ 2π.
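Assuming discrete frequencies ω_n = 2πn/N, the dictionary construction of step 4) can be sketched as follows: each atom is the phase spectrum of sig2 minus the linear phase of a fractional delay of (i − N_t/2)/N_t samples, wrapped into [0, 2π) as the text specifies:

```python
import numpy as np

def phase_atoms(sig2, n_atoms):
    """Over-complete dictionary of phase atoms obtained by digital
    fractional delays of sig2 on the grid (i - n_atoms/2)/n_atoms."""
    N = len(sig2)
    base_phase = np.angle(np.fft.fft(sig2))
    omega = 2.0 * np.pi * np.fft.fftfreq(N)     # discrete frequencies in [-π, π)
    atoms = np.empty((n_atoms, N))
    for i in range(n_atoms):
        frac = (i - n_atoms / 2) / n_atoms      # fractional delay, in units of T
        atoms[i] = np.mod(base_phase - omega * frac, 2.0 * np.pi)
    return atoms, omega
```

The atom at i = N_t/2 has zero extra delay and is simply the wrapped phase spectrum of sig2 itself.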
5) Sparse decomposition of the signal
The matching pursuit algorithm is used to realize the sparse decomposition. Combine the atoms from step 4) into the set D = {LibSig_i, 0 ≤ i < N_t}, the over-complete dictionary of atoms generated by expanding φsig2, satisfying N_t >> N. For the signal φsig1, matching pursuit first selects the best-matching atom LibSig_i0 from the over-complete dictionary, i.e. the atom whose inner product with φsig1 has the largest magnitude. The signal φsig1 can then be decomposed into the following form:
φsig1 = <φsig1, LibSig_i0> LibSig_i0 + R^1 φsig1
Continuing this decomposition iteratively up to order n gives
φsig1 = ∑_{k=0}^{n−1} <R^k φsig1, LibSig_ik> LibSig_ik + R^n φsig1,
where i_k denotes the index of the atom chosen at the k-th iteration and R^0 φsig1 = φsig1. When the energy of the approximation residual R^n φsig1 falls below a certain value, i.e. ||R^n φsig1||² < ε, the iteration stops and the decomposition is complete.
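The matching pursuit loop of step 5) is generic over any dictionary of row atoms. In this sketch the atoms are ℓ2-normalized so inner products are comparable, and a `max_iter` cap is added alongside the residual-energy stopping rule ||R^n φsig1||² < ε from the text:

```python
import numpy as np

def matching_pursuit(signal, dictionary, max_iter=10, eps=1e-6):
    """Greedy sparse decomposition: repeatedly pick the atom with the
    largest |<residual, atom>|, subtract its projection, and stop when
    the residual energy drops below eps."""
    atoms = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    residual = signal.astype(np.float64)
    indices, coeffs = [], []
    for _ in range(max_iter):
        inner = atoms @ residual                 # <residual, atom_i> for all i
        k = int(np.argmax(np.abs(inner)))
        indices.append(k)
        coeffs.append(inner[k])
        residual = residual - inner[k] * atoms[k]
        if residual @ residual < eps:            # ||R^n||^2 < ε, decomposition done
            break
    return indices, coeffs, residual
```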
6) Sub-pixel and total disparity computation and depth calculation
From the sparse representation of the signal φsig1 over D = {LibSig_i, 0 ≤ i < N_t}, the sub-pixel disparity subDis between φsig1 and φsig2 is further computed from the decomposition coefficients. The total disparity is
totalDis = Dis − subDis
Finally, combining the binocular stereo vision principle with the formula z = f × (1 + B/D), where B is the baseline length, D is the disparity, and f is the focal length, the depth of the target object is obtained.
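A numeric check of the final depth formula z = f × (1 + B/D): the disparity must be expressed in the same metric units as f and B, so a pixel-pitch conversion is included here as an assumption (the text does not specify the unit handling):

```python
def depth_from_disparity(f, B, total_dis, pixel_size):
    """Depth by the formula z = f * (1 + B/D) from the text, with the
    disparity total_dis (in pixels) converted to meters via the pixel size."""
    D = total_dis * pixel_size   # disparity on the sensor, in meters
    return f * (1.0 + B / D)

# Example values (assumed): f = 50 mm, baseline 0.3 m,
# 10-pixel total disparity, 10 µm pixel pitch.
z = depth_from_disparity(f=0.05, B=0.3, total_dis=10.0, pixel_size=1e-5)
```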
Compared with the prior art, the present invention has the following beneficial effects:
1) The present invention converts the two-dimensional images into one-dimensional signals, obtains the disparity of the left and right views by sparse decomposition of the phase-frequency signals, and finally obtains the depth of the target object according to the principle of stereoscopic vision. The sub-pixel matching precision makes long-distance target ranging possible, while the amount of computation is reduced and execution is fast.
2) Traditional image sparse decompositions based on atoms of canonical form, such as the Gabor dictionary, curvelets, ridgelets, wavelets and their variants, are mostly computationally complex and time-consuming. The present invention directly uses the principle of the digital fractional delay unit to resample one one-dimensional signal into an over-complete dictionary of atoms, which is more targeted, effectively reduces the residual, simplifies the sparse decomposition, and further improves execution speed.
Description of drawings
Fig. 1 is the flow chart of the stereoscopic ranging method based on sparse representation;
Fig. 2(a) is the original image captured by the left camera;
Fig. 2(b) is the original image captured by the right camera;
Fig. 3(a) is the left view after epipolar rectification;
Fig. 3(b) is the right view after epipolar rectification;
Fig. 4(a) is the target detection result for the left view to be ranged;
Fig. 4(b) is the target detection result for the right view to be ranged;
Fig. 5(a) is the fused image after feature enhancement of the left-view target;
Fig. 5(b) is the fused image after feature enhancement of the right-view target;
Fig. 6 shows a run of the program implementing the present invention.
Embodiment
Below, the present invention is described further in conjunction with the accompanying drawings and a practical implementation example.
Embodiment
In this embodiment, images of the same scene are acquired with a binocular camera system at a resolution of 1024*768; the characters on a distant flat board in the images are taken as the target of interest, and the actual distance to this target is recorded.
As shown in Fig. 2, color images are collected by the left and right cameras, (a) being the left view and (b) the right view. The cameras are first calibrated at long distance; then, by the epipolar rectification step of the method, the left and right views are rectified into two grayscale images on the same plane and at the same height, as shown in Fig. 3. The targets of interest are extracted by the target detection algorithm; in this embodiment a group of characters on a distant flat board serves as the target of interest, and the detection result is shown in Fig. 4. The above is the image preprocessing; what follows is the specific operation of the stereoscopic ranging method.
The target regions of the two images in Fig. 4 are each feature-enhanced, including histogram equalization, edge detection, and result fusion, producing the fused images shown in Fig. 5. The fused images of Fig. 5 are projected in the vertical direction, i.e. all pixel gray values with the same abscissa are added to form a one-dimensional signal. Because the shot is taken at long range, the target region occupies a small area of the image, so most of the one-dimensional signal is 0 and only the part containing the target carries signal. A non-zero segment of the signal is cut out, slightly extended, and used to compute the integer-pixel disparity of the two views. According to the formula E[Δn] = ∑ sig2[n+Δn] × sig1[n], the integer-pixel disparity Δn (i.e. Dis) is obtained when E[Δn] is maximal. After the integer-pixel disparity has been found, the left signal is shifted to give sig1'[n] = sig1[n−Δn]; the signal sig2[n] is passed through different fractional delay systems to obtain a series of new signals, which are FFT-transformed, and their phase-frequency signals are taken as atoms to build the over-complete dictionary D = {LibSig_i, 0 ≤ i < N_t}. Meanwhile sig1' is FFT-transformed to obtain its phase-frequency characteristic φsig1, which is sparsely decomposed over the over-complete dictionary D by the MP algorithm. The sub-pixel disparity subDis is then obtained from the sparse decomposition coefficients.
Finally, the integer-pixel disparity Dis and the sub-pixel disparity subDis are subtracted to obtain the total disparity of the left and right views at sub-pixel precision. According to the binocular stereo vision formula z = f × (1 + B/D), where B is the baseline length, D is the disparity, and f is the focal length, the required depth of the target object is obtained and the ranging result is output. A program realizing this method is shown in Fig. 6.
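The whole pipeline can be exercised on a synthetic 1-D signal. In this toy run (not the embodiment's data) a signal is delayed by 12.3 samples in the frequency domain; cross-correlation recovers the integer part, and a scan over fractional-delay atoms d_i = (i − N_t/2)/N_t recovers the sub-pixel part. N_t = 100, the test signal, and the sign convention totalDis = Dis + subDis are choices made for this sketch:

```python
import numpy as np

N, Nt = 256, 100                                # signal length, number of atoms
n = np.arange(N)
omega = 2.0 * np.pi * np.fft.fftfreq(N)         # discrete frequencies in [-π, π)

# Synthetic "projection" signal and a copy delayed by 12.3 samples via the FFT.
sig1 = sum(np.cos(2.0 * np.pi * k * n / N) / k for k in range(1, 16))
sig2 = np.fft.ifft(np.fft.fft(sig1) * np.exp(-1j * omega * 12.3)).real

# Step 3: integer-pixel disparity by circular cross-correlation.
xcorr = np.fft.ifft(np.fft.fft(sig2) * np.conj(np.fft.fft(sig1))).real
Dis = int(np.argmax(xcorr))                     # integer part of the delay

# Steps 4-6: shift sig1 by Dis, then pick the fractional-delay atom whose
# linear phase best explains the remaining spectral mismatch.
P1 = np.fft.fft(np.roll(sig1, Dis))
P2 = np.fft.fft(sig2)
errs = [np.sum(np.abs(P2 - P1 * np.exp(-1j * omega * (i - Nt // 2) / Nt)) ** 2)
        for i in range(Nt)]
subDis = (int(np.argmin(errs)) - Nt // 2) / Nt  # fractional part of the delay
totalDis = Dis + subDis                         # recovers ≈ 12.3
```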
Claims (1)
1. A stereoscopic ranging method based on sparse representation, characterized in that its steps are as follows:
1) Image preprocessing
calibrate the optical axes of the two cameras to be parallel and at equal height, using Zhang Zhengyou's calibration method to obtain the intrinsic and extrinsic camera parameters: baseline, focal length, radial distortion parameters, tangential distortion parameters, the image coordinates of the optical center, the rotation matrix, and the translation matrix; apply epipolar rectification to the images, then perform target detection on the rectified images to extract the targets of interest;
2) Target feature enhancement
based on the detection result, apply histogram equalization to the target region; perform edge detection on the equalized region to extract the high-frequency features of that part of the image; fuse the detection result and the edge detection result with weighting, so that both the low- and high-frequency information of the target is retained while image noise caused by illumination and camera characteristics is filtered out;
3) Integer-pixel disparity computation
for the fused images above, sum the pixel gray values of the left and right views along the vertical direction to obtain two spatial-domain one-dimensional signals sig1 and sig2; apply an operation similar to cross-correlation to the two signals and, by global matching, find the best Δn that maximizes E[Δn] = ∑ sig2[n+Δn] × sig1[n]; this Δn is the integer-pixel disparity Dis;
4) Building the over-complete dictionary of atoms
according to the integer-pixel disparity Dis, shift the one-dimensional signal sig1 to get sig1'[n] = sig1[n−Δn]; in the ideal case, sig1' and the one-dimensional signal sig2 now differ only by a sub-pixel disparity, i.e. sig1'[n] = sig1'(nT) = sig2(nT−Δt), |Δt| < T, where T is the signal sampling interval; apply the FFT to sig1' and sig2 respectively, transforming the spatial-domain one-dimensional signals into frequency-domain magnitude and phase signals;
let SIG1 = FFT(sig1') and SIG2 = FFT(sig2); then
SIG1(e^jω) = e^(−jωΔt/T) × SIG2(e^jω), |ω| < π, |Δt| < T,
from which |SIG1(e^jω)| = |SIG2(e^jω)|, and the phase spectra differ only by the linear term −ωΔt/T, |Δt| < T; let the phase-frequency characteristics of the left and right one-dimensional signals be φsig1 and φsig2, respectively;
pass the spatial-domain one-dimensional signal sig2 through a bank of digital fractional delay units with frequency responses
H_d(jω) = e^(−jωiΔt/T), |ω| < π, N_t × Δt = T, −N_t/2 ≤ i < N_t/2,
obtaining a series of responses, whose phase-frequency signals after the FFT are taken as the atoms of the over-complete dictionary:
LibSig_i = φ(SIG2(e^jω)) − ω(i − N_t/2)/N_t, |ω| < π, N_t × Δt = T, 0 ≤ i < N_t
= φ(SIG2(e^jω)) − π × n × (i − N_t/2)/(N × N_t), 0 ≤ n < N,
where N_t × Δt = T, 0 ≤ i < N_t, N_t is the number of atoms, N is the signal length, N_t >> N, and the values of LibSig_i are confined to 0 ~ 2π;
5) Sparse decomposition of the signal
the matching pursuit algorithm is used to realize the sparse decomposition: combine the atoms from step 4) into the set D = {LibSig_i, 0 ≤ i < N_t}, the over-complete dictionary of atoms generated by expanding φsig2, satisfying N_t >> N; for the signal φsig1, matching pursuit first selects the best-matching atom LibSig_i0 from the over-complete dictionary, i.e. the atom whose inner product with φsig1 has the largest magnitude; the signal φsig1 can then be decomposed into the following form:
φsig1 = <φsig1, LibSig_i0> LibSig_i0 + R^1 φsig1;
continuing this decomposition iteratively up to order n gives
φsig1 = ∑_{k=0}^{n−1} <R^k φsig1, LibSig_ik> LibSig_ik + R^n φsig1,
where i_k denotes the index of the atom chosen at the k-th iteration and R^0 φsig1 = φsig1; when the energy of the approximation residual R^n φsig1 falls below a certain value, i.e. ||R^n φsig1||² < ε, the iteration stops and the decomposition is complete;
6) Sub-pixel and total disparity computation and depth calculation
from the sparse representation of the signal φsig1 over D = {LibSig_i, 0 ≤ i < N_t}, the sub-pixel disparity subDis between φsig1 and φsig2 is further computed from the decomposition coefficients; the total disparity is
totalDis = Dis − subDis;
finally, combining the binocular stereo vision principle with the formula z = f × (1 + B/D), where B is the baseline length, D is the disparity, and f is the focal length, the depth of the target object is obtained.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210009372.1A CN102607510B (en) | 2012-01-12 | 2012-01-12 | Three-dimensional distance measuring method based on sparse representation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102607510A true CN102607510A (en) | 2012-07-25 |
CN102607510B CN102607510B (en) | 2014-01-29 |
Family
ID=46525115
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210009372.1A Active CN102607510B (en) | 2012-01-12 | 2012-01-12 | Three-dimensional distance measuring method based on sparse representation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102607510B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003028635A (en) * | 2001-07-16 | 2003-01-29 | Honda Motor Co Ltd | Image range finder |
CN1741618A (en) * | 2004-09-03 | 2006-03-01 | 北京航空航天大学 | A fast sub-picture element movement estimating method |
CN101813467A (en) * | 2010-04-23 | 2010-08-25 | 哈尔滨工程大学 | Blade running elevation measurement device and method based on binocular stereovision technology |
JP2011064639A (en) * | 2009-09-18 | 2011-03-31 | Panasonic Corp | Distance measuring device and distance measuring method |
CN102036094A (en) * | 2010-12-30 | 2011-04-27 | 浙江大学 | Stereo matching method based on digital score delay technology |
Non-Patent Citations (3)
Title |
---|
史培培, 练秋生, 尚倩: "Image inpainting algorithm based on three-layer sparse representation", Computer Engineering (《计算机工程》) * |
宋昌江, 吴冈, 何艳: "Research on a sub-pixel binocular vision matching algorithm based on phase coding technology", Techniques of Automation and Applications (《自动化技术与应用》) * |
王一叶: "Stereo matching algorithm based on sparse representation and detection and tracking of infrared targets", China Master's Theses Full-text Database, Information Science and Technology Series (《中国优秀硕士学位论文全文数据库 信息科技辑》) * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109076164A (en) * | 2016-04-18 | 2018-12-21 | 月光产业股份有限公司 | Focus pulling is carried out by means of the range information from auxiliary camera system |
CN109076164B (en) * | 2016-04-18 | 2020-10-27 | 月光产业股份有限公司 | Method, apparatus and computer-readable storage medium for switching focus |
CN112347973A (en) * | 2020-11-19 | 2021-02-09 | 武汉光庭信息技术股份有限公司 | Front vehicle state estimation method and system based on binocular high-speed camera |
CN114650405A (en) * | 2022-03-21 | 2022-06-21 | 嘉兴智瞳科技有限公司 | Optimal fusion image parallax imaging method and device for three-dimensional video |
Also Published As
Publication number | Publication date |
---|---|
CN102607510B (en) | 2014-01-29 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |