CN109813251A - Method, apparatus and system for three-dimensional measurement - Google Patents


Info

Publication number
CN109813251A
CN109813251A
Authority
CN
China
Prior art keywords
image
camera
characteristic point
abscissa
feature point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711164746.6A
Other languages
Chinese (zh)
Other versions
CN109813251B (en)
Inventor
黄正宇
周炜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yilian Technology Co., Ltd.
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201711164746.6A
Priority to PCT/CN2018/114016 (published as WO2019100933A1)
Publication of CN109813251A
Application granted
Publication of CN109813251B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

This application discloses a three-dimensional measurement method, together with an apparatus and a system for implementing it. The three-dimensional measurement method includes: extracting feature points from the images of three cameras; screening matched feature-point groups based on the following fixed disparity-ratio relation: for the same object point, the first disparity d1 generated between the first and second images along a first direction and the second disparity d2 generated between the second and third images along a second direction satisfy d1:d2 = D1:D2, where D1 is the offset of the optical center of the first camera relative to that of the second camera along the first direction and D2 is the offset of the optical center of the second camera relative to that of the third camera along the second direction; and computing the three-dimensional coordinates of the object points corresponding to the matched feature-point groups. The three-dimensional measurement method according to the present invention can help reduce the computational load of the feature-point matching process, and can also help improve the accuracy and the matching rate of feature-point matching.

Description

Method, apparatus and system for three-dimensional measurement
Technical field
The present invention relates generally to methods, apparatuses and systems for three-dimensional measurement, and in particular to three-dimensional measurement methods, devices and systems based on computer vision technology.
Background art
Three-dimensional measurement based on computer vision, and the three-dimensional reconstruction realized on its basis, are widely applied in industry, security, traffic and entertainment. Industrial robots need three-dimensional spatial information to perceive the real world and make decisions. Adding three-dimensional scene information to security monitoring can improve the accuracy of target recognition. Autonomous vehicles, unmanned aerial vehicles and the like need to perceive the positions of surrounding objects in real time. Building restoration, cultural-relic restoration and the like require three-dimensional reconstruction of the buildings or relics, in particular three-dimensional reconstruction based on high-density true-color point clouds. In addition, the imagery formed by three-dimensional reconstruction is widely used in the film, animation and game industries, and three-dimensional characters formed at least partly by three-dimensional reconstruction are also widely used in the VR and AR industries.
Three-dimensional reconstruction based on binocular cameras has been developed for decades, but recovering a highly accurate true-color three-dimensional point cloud is by no means easy. Specifically, binocular three-dimensional reconstruction recovers the three-dimensional coordinate information of an object by computing the positional deviation between the image points in the two images that correspond to the same object point; its core is to select feature points in the images and to find/screen the feature-point groups on different images that may correspond to the same object point (i.e., feature-point matching). Methods currently used for screening matched feature-point groups include, for example:
1) computing the sum of squared differences (SSD) of pixel grayscale between candidate corresponding points in different images, and finding the minimum;
2) computing the zero-mean sum of squared differences (ZSSD) of pixel grayscale between candidate corresponding points in different images, and finding the minimum;
3) computing the sum of absolute differences (SAD) of pixel grayscale between candidate corresponding points in different images, and finding the minimum;
4) computing the zero-mean sum of absolute differences (ZSAD) of pixel grayscale between candidate corresponding points in different images, and finding the minimum;
5) computing the normalized cross-correlation (NCC) between the neighborhoods of candidate corresponding points in different images, and finding the maximum; and
6) computing the zero-mean normalized cross-correlation (ZNCC) between the neighborhoods of candidate corresponding points in different images, and finding the maximum.
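The six measures listed above can be sketched as follows. This is a minimal illustration only, assuming grayscale patches are given as floating-point NumPy arrays of equal shape; the function names are ours, not the patent's.

```python
import numpy as np

def ssd(a, b):
    """Sum of squared pixel-grayscale differences (best match: minimum)."""
    return float(np.sum((a - b) ** 2))

def zssd(a, b):
    """Zero-mean SSD: each patch's mean is subtracted first (minimum)."""
    return ssd(a - a.mean(), b - b.mean())

def sad(a, b):
    """Sum of absolute pixel-grayscale differences (minimum)."""
    return float(np.sum(np.abs(a - b)))

def zsad(a, b):
    """Zero-mean SAD (minimum)."""
    return sad(a - a.mean(), b - b.mean())

def ncc(a, b):
    """Normalized cross-correlation of two neighborhoods (best match: maximum).
    Assumes neither patch is all zeros."""
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def zncc(a, b):
    """Zero-mean NCC (maximum). Assumes non-constant patches."""
    return ncc(a - a.mean(), b - b.mean())
```

For the difference-based measures (SSD, ZSSD, SAD, ZSAD) the best candidate minimizes the score; for the correlation-based measures (NCC, ZNCC) it maximizes the score. Note that ZSSD/ZSAD/ZNCC are insensitive to a uniform brightness offset between the two patches.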
A common characteristic of the above methods of screening matched feature-point groups is that they search for the feature point on one image that is most similar to the feature point to be matched on another image. One existing problem is that, when processing images of complex scenes, the matching error rate is often very high. For example, in an image of a periodic structure, two image points that do not belong to the same object or to the same period may well have the maximum neighborhood normalized cross-correlation or the minimum pixel difference, while the truly corresponding combination of points may sit at the position with the second-largest cross-correlation or the second-smallest pixel difference. If no other dimension is available for checking whether a match is valid, such matching errors will persist and propagate. Another problem of the above existing methods of screening matched feature-point groups is their very large computational load, which causes problems such as excessively long processing times and/or computing equipment that is difficult to miniaturize and costly.
Summary of the invention
The object of the present invention is to provide a three-dimensional measurement method, together with an apparatus and a system for implementing it, so as to at least partially solve the above problems in the prior art.
According to one aspect of the present invention, a three-dimensional measurement method based on a fixed disparity-ratio relation is provided. The method includes: receiving a first image, a second image and a third image from a first camera, a second camera and a third camera respectively, the first, second and third cameras having the same focal length and mutually parallel optical axes, and the optical centers of the first, second and third cameras being arranged on the same plane perpendicular to the optical axes; extracting feature points in the first, second and third images respectively; matching the feature points in the first, second and third images, the matching including screening matched feature-point groups based on the following fixed disparity-ratio relation: for the same object point, the first disparity d1 generated between the first and second images along a first direction and the second disparity d2 generated between the second and third images along a second direction satisfy d1:d2 = D1:D2, where D1 is the offset of the optical center of the first camera relative to that of the second camera along the first direction and D2 is the offset of the optical center of the second camera relative to that of the third camera along the second direction, the first direction being a direction parallel to the plane and not perpendicular to the line connecting the optical centers of the first and second cameras, and the second direction being a direction parallel to the plane and not perpendicular to the line connecting the optical centers of the second and third cameras; and computing the three-dimensional coordinates of the object points corresponding to the matched feature-point groups.
According to another aspect of the present invention, a three-dimensional measurement apparatus is provided, comprising: a processor; and a memory storing program instructions which, when executed by the processor, cause the processor to perform the following operations: receiving a first image, a second image and a third image; extracting feature points in the first, second and third images respectively; matching the feature points in the first, second and third images, the matching including screening matched feature-point groups based on the following coordinate relation: the difference between the abscissa of a feature point in the first image and the abscissa of a feature point in the second image is in a predetermined proportional relation to the difference between the abscissa of the feature point in the second image and the abscissa of a feature point in the third image, and the feature points in the first and third images have the same ordinate; and computing the three-dimensional coordinates of the object points corresponding to the matched feature-point groups.
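As an illustration only, the coordinate relation described in this aspect can be expressed as a simple predicate over three candidate image points. The tolerance parameter and the (x, y) tuple layout are our assumptions, not part of the claim.

```python
def coordinate_match(p1, p2, p3, ratio, tol=1.0):
    """Check the screening relation: the abscissa difference (x1 - x2)
    stands in the predetermined `ratio` to (x2 - x3), and the feature
    points of the first and third images share an ordinate.
    p1, p2, p3 are (x, y) image coordinates; tol absorbs pixel noise."""
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    same_row = abs(y1 - y3) <= tol
    proportional = abs((x1 - x2) - ratio * (x2 - x3)) <= tol
    return same_row and proportional
```

Candidate triples failing this cheap predicate can be rejected without any similarity computation.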
According to a further aspect of the present invention, a three-dimensional measurement device is provided, to be used in cooperation with a camera array to carry out three-dimensional measurement. The camera array includes at least a first camera, a second camera and a third camera, the first, second and third cameras having the same focal length and mutually parallel optical axes, and the optical centers of the first, second and third cameras being arranged on the same plane perpendicular to the optical axes. The three-dimensional measurement device includes a processing unit which receives the first, second and third images from the first, second and third cameras respectively and is configured for the following processing: extracting feature points in the first, second and third images respectively; matching the feature points in the first, second and third images, the matching including screening matched feature-point groups based on the following fixed disparity-ratio relation: for the same object point, the first disparity d1 generated between the first and second images along a first direction and the second disparity d2 generated between the second and third images along a second direction satisfy d1:d2 = D1:D2, where D1 is the offset of the optical center of the first camera relative to that of the second camera along the first direction and D2 is the offset of the optical center of the second camera relative to that of the third camera along the second direction, the first direction being a direction parallel to the plane and not perpendicular to the line connecting the optical centers of the first and second cameras, and the second direction being a direction parallel to the plane and not perpendicular to the line connecting the optical centers of the second and third cameras; and computing the three-dimensional coordinates of the object points corresponding to the matched feature-point groups.
According to yet another aspect of the present invention, a three-dimensional measurement system based on the fixed disparity-ratio relation is provided. The system includes a camera array and any of the three-dimensional measurement devices described above, the camera array including at least a first camera, a second camera and a third camera having the same focal length and mutually parallel optical axes, with the optical centers of the first, second and third cameras arranged on the same plane perpendicular to the optical axes.
Brief description of the drawings
Other features, objects and advantages of the present invention will become more apparent from the following detailed description of non-restrictive embodiments, read in the light of the accompanying drawings:
Figs. 1A, 1B and 1C show the relationship between binocular disparity and the relative positions of the camera optical centers;
Fig. 2 shows the fixed disparity-ratio relation of cameras placed on the same plane perpendicular to the optical axes;
Fig. 3 shows a schematic block diagram of an example of the three-dimensional measurement system of the present invention;
Fig. 4 shows a schematic general flowchart of the three-dimensional measurement method of the present invention;
Fig. 5 shows an example of a camera array usable with the three-dimensional measurement method according to the first embodiment of the present invention;
Figs. 6A, 6B and 6C show the fixed disparity-ratio relations between the cameras shown in Fig. 5;
Fig. 7 schematically shows an example of the images acquired by the camera array shown in Fig. 5 and illustrates the coordinate differences of the image points;
Fig. 8 shows a schematic flowchart of the three-dimensional measurement method according to the first embodiment of the present invention;
Fig. 9 shows an example of the feature-point-group screening processing usable in the three-dimensional measurement method according to the first embodiment of the present invention;
Fig. 10 schematically compares similarity-based feature-point matching with the coordinate-relation-based feature-point matching of the three-dimensional measurement method according to the first embodiment of the present invention;
Fig. 11 shows an example flowchart of the three-dimensional measurement method according to the first embodiment of the present invention;
Fig. 12 shows an example of a camera array usable with the three-dimensional measurement method according to the second embodiment of the present invention, together with its fixed disparity-ratio relation;
Fig. 13 schematically shows an example of the images acquired by the camera array shown in Fig. 12 and illustrates the coordinate differences of the image points;
Fig. 14 shows an example of the feature-point-group screening processing usable in the three-dimensional measurement method according to the second embodiment of the present invention;
Fig. 15 shows an example of a camera array usable with the three-dimensional measurement method according to the third embodiment of the present invention, together with its fixed disparity-ratio relation;
Fig. 16 schematically shows an example of the images acquired by the camera array shown in Fig. 12 and illustrates the coordinate differences of the image points;
Fig. 17 shows an example of the feature-point-group screening processing usable in the three-dimensional measurement method according to the third embodiment of the present invention;
Fig. 18 shows an example flowchart of the three-dimensional measurement method according to the third embodiment of the present invention; and
Fig. 19 shows examples of camera-array arrangements usable in the three-dimensional measurement system according to embodiments of the present invention.
Detailed description of embodiments
The present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the related invention and do not limit it. It should also be noted that, for convenience of description, only the parts relevant to the invention are shown in the drawings.
In this application, unless otherwise indicated, "left" and "right", "up" and "down" merely denote relative positions; "left"/"right" and "up"/"down" denote relative positions along mutually orthogonal directions, and these positions are not limited to people's usual understanding of left, right, up and down. In addition, unless otherwise indicated, "vertical" and "horizontal" denote mutually perpendicular directions, the former not being limited to the plumb direction and the latter not being limited to the horizontal direction.
It should be noted that, in the absence of conflict, the embodiments of the present application and the features therein may be combined with one another. The present invention is described in detail below with reference to the drawings and in conjunction with the embodiments.
The relationship between binocular disparity and the spacing of the camera optical centers is first introduced with reference to Figs. 1A, 1B and 1C. In the figures, Ol and Or denote the optical centers (the optical centers of the lenses) of the left and right cameras respectively, and Il and Ir denote the image planes of the left and right cameras (hereinafter the left image plane and the right image plane). The image plane of a camera is determined by the position of the photosensitive surface of the image sensor it contains (for example a CCD or CMOS sensor), conventionally located one focal length f from the optical center. A camera normally forms an inverted real image of an object, the image plane lying on the opposite side of the optical center from the object; for convenience of illustration and analysis, however, the image planes are drawn here at the symmetric positions on the same side as the object, as shown in the figures. It should be understood that this does not change the disparity relations discussed in this application.
The two cameras used for binocular vision have the same focal length. The optical center Or of the right camera has an offset D relative to the optical center Ol of the left camera, and the corresponding optical axes Zl and Zr are parallel to each other.
In Figs. 1A, 1B and 1C, the optical center Ol of the left camera is taken as the origin of the camera coordinate system, and the optical-axis direction of the left and right cameras is taken as the Z direction. The optical centers of the left and right cameras lie in the same plane perpendicular to the optical axes (i.e., the XY plane). In Figs. 1A and 1B, the direction of the line connecting Ol and Or is taken as the X direction of the camera coordinate system, and the direction perpendicular to both this line and the optical axes is the Y direction. It should be understood that the camera coordinate system can also be set in other ways, for example with the optical center Or of the right camera as the origin; a different camera coordinate system does not affect the fixed disparity-ratio relation discussed below.
In this application, the image plane of each camera has an image coordinate system set in the same manner. For example, in the illustrated examples of this application, the point where the image plane intersects the camera's optical axis (the image point of an object point lying on the optical axis) is the origin of the image coordinate system, and the x axis and y axis of the image coordinate system are respectively parallel to the X axis and Y axis of the camera coordinate system. It should be understood that the image coordinate system can also be set in other ways, for example with a corner of the photosensitive surface of the image sensor as the origin; a different image coordinate system does not affect the fixed disparity-ratio relation discussed below.
Fig. 1A shows the relationship between binocular disparity along the direction of the optical-center line and the optical-center spacing; it shows the projection of the whole imaging system onto the XZ plane. The same object point P [X, Y, Z] in space is imaged by the left and right cameras as image points Pl and Pr respectively. As illustrated more clearly in Fig. 1B, by the principle of similar triangles, for object point P the left and right cameras generate in their images, along the direction of the optical-center line (the x direction), a disparity d = xl - xr with d/D = f/Z, where xl and xr are the x-axis coordinates of the image points Pl and Pr in the left image plane Il and the right image plane Ir respectively. Those skilled in the art will appreciate that the optical-center offset D may be positive or negative according to the definition of the coordinate system.
Fig. 1B shows the binocular disparity along the direction perpendicular to the optical-center line; it schematically shows the projection of the whole imaging system onto the YZ plane. As shown in Fig. 1B, object point P generates no disparity in the left and right images along the direction perpendicular to the optical-center line (the y direction), i.e., d = yl - yr = 0.
Next, the relation between the disparity along an arbitrary direction in the image plane and the optical-center spacing D is examined. For convenience of discussion, the direction of the disparity under examination is taken as the X direction in Fig. 1C; the optical-center line of the left and right cameras is then not aligned with the X direction; in other words, the direction of the disparity under examination can be any direction relative to the optical-center line. One may posit a middle camera whose optical center Om lies, together with Ol and Or, in the same plane perpendicular to the optical axes, aligned with the left camera in the X direction and with the right camera in the Y direction, as shown in Fig. 1C. From the discussion above in connection with Figs. 1A, 1B and 1C, the disparity between the left camera and the middle camera in the X direction is dlm = xl - xm = Dx × f/Z, where xm is the x-axis coordinate, in the image plane of the middle camera, of the image point formed by the middle camera for the object point, and Dx is the offset of the right camera relative to the left camera in the X direction, i.e., Dx = Xr - Xl. Since the middle camera is aligned with the right camera in the Y direction, the two have no disparity in the X direction, i.e., dmr = xm - xr = 0. Combining the two, the disparity generated by the left and right cameras in the X direction is dx = xl - xr = dlm + dmr = Dx × f/Z. Similarly, the disparity generated by the left and right cameras in the Y direction is dy = yl - yr = Dy × f/Z, where Dy is the offset of the right camera relative to the left camera in the Y direction, i.e., Dy = Yr - Yl.
Fig. 2 extends the situation shown in Fig. 1C to three cameras arranged on the same plane perpendicular to the optical axes, where the first camera C1, the second camera C2 and the third camera C3 have the same focal length f and mutually parallel optical axes (not shown), and the respective optical centers O1, O2 and O3 lie in the same plane perpendicular to the optical axes. From the analysis above with reference to Fig. 1C, for the same object point, the first disparity generated between the images obtained by the first and second cameras along a first direction A is d1 = D1 × f/Z, where D1 is the offset of the optical center O2 of the second camera relative to the optical center O1 of the first camera along the first direction A; and the second disparity generated between the images obtained by the second and third cameras along a second direction B is d2 = D2 × f/Z, where D2 is the offset of the optical center O3 of the third camera relative to the optical center O2 of the second camera along the second direction B. Here the first direction A is a direction parallel to the plane of the camera optical centers and not perpendicular to the line connecting the optical centers of the first and second cameras, and the second direction B is a direction parallel to that plane and not perpendicular to the line connecting the optical centers of the second and third cameras. Although Fig. 2 shows the first direction differing from the second direction, the first direction may also be identical to the second direction. The inventors have found that, for three cameras arranged on the same plane perpendicular to the optical axes, the following fixed disparity-ratio relation holds: the above first disparity d1 and second disparity d2 satisfy d1:d2 = D1:D2.
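The fixed disparity-ratio relation can be checked numerically with an idealized pinhole model. All numbers below are assumed for illustration; the layout places the first direction A along the x axis and the second direction B along the y axis.

```python
import numpy as np

f = 50.0  # shared focal length (arbitrary units; assumed value)
# optical centers of C1, C2, C3 on the plane Z = 0 (assumed layout)
O1 = np.array([0.0, 0.0])
O2 = np.array([30.0, 0.0])
O3 = np.array([30.0, 20.0])

def project(optical_center, P):
    """Pinhole projection of object point P = [X, Y, Z] into the image
    plane of a camera whose optical center lies at `optical_center`."""
    X, Y, Z = P
    return f * (np.array([X, Y]) - optical_center) / Z

P = np.array([100.0, -40.0, 500.0])  # an arbitrary object point
p1, p2, p3 = project(O1, P), project(O2, P), project(O3, P)

# First direction A: along O1O2 (x axis); second direction B: along O2O3 (y axis)
d1 = p1[0] - p2[0]   # disparity between the first and second images
d2 = p2[1] - p3[1]   # disparity between the second and third images
D1 = (O2 - O1)[0]    # optical-center offset along A
D2 = (O3 - O2)[1]    # optical-center offset along B
# d1/D1 and d2/D2 both equal f/Z, hence d1:d2 = D1:D2
```

Because both ratios reduce to f/Z for the same object point, the relation holds for any depth Z, which is exactly what makes it usable as a screening test independent of the scene.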
Based on the above finding, a three-dimensional measurement system and a three-dimensional measurement method based on the fixed disparity-ratio relation are proposed according to the present invention. Fig. 3 shows a schematic block diagram of an example of the three-dimensional measurement system 10 according to the present invention. Fig. 4 shows a schematic general flowchart of the three-dimensional measurement method 100 of the present invention.
As shown in Fig. 3, the three-dimensional measurement system 10 includes a camera array CA, which includes at least a first camera C1, a second camera C2 and a third camera C3; they have the same focal length and mutually parallel optical axes, and their respective optical centers are arranged on the same plane perpendicular to the optical axes. Preferably, the cameras have the same aperture, ISO, shutter time, image sensor, etc. More preferably, the cameras are of exactly the same model.
The three-dimensional measurement system 10 includes a processing unit 11, which receives the images from the camera array, including the first, second and third images from the first camera C1, the second camera C2 and the third camera C3, and processes these images to realize the three-dimensional measurement. Of course, the camera array CA may include additional cameras, and the processing unit 11 may receive and process the images from these additional cameras as well.
The three-dimensional measurement system 10 may include a control unit 12 operable to control the first, second and third cameras to acquire images synchronously. When reconstructing a dynamic scene in three dimensions, for example the traffic on a road or the momentary expression of a person, the consequence of unsynchronized shooting is a very high error rate in the matching results. The control unit 12 can also control the cameras to zoom by equal factors; for example, for a distant scene, the three cameras need to be adjusted identically to a longer focal length. The control unit 12 may be connected to the first camera C1, the second camera C2 and the third camera C3 in a wired or wireless manner to realize the above control. The control unit 12 may communicate with the processing unit 11 and receive information from the processing unit 11 in order to generate the control signals for controlling the cameras to acquire images synchronously, or it may work independently to realize the above control.
The three-dimensional measurement method 100 of the present invention, based on the fixed disparity-ratio relation, is realized on a camera array CA such as that of the three-dimensional measurement system 10. The three-dimensional measurement method 100 includes:
S110: receiving the first, second and third images from the first, second and third cameras respectively;
S120: extracting feature points in the first, second and third images;
S130: matching the feature points in the first, second and third images, the matching including screening matched feature-point groups based on the following fixed disparity-ratio relation: for the same object point, the first disparity d1 generated between the first and second images along a first direction and the second disparity d2 generated between the second and third images along a second direction satisfy d1:d2 = D1:D2, where D1 is the offset of the optical center of the first camera relative to that of the second camera along the first direction and D2 is the offset of the optical center of the second camera relative to that of the third camera along the second direction, the first direction being parallel to the plane and not perpendicular to the line connecting the optical centers of the first and second cameras, and the second direction being parallel to the plane and not perpendicular to the line connecting the optical centers of the second and third cameras; and
S140: computing the three-dimensional coordinates of the object points corresponding to the matched feature-point groups.
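Steps S130 and S140 can be sketched together as follows, under assumptions that are ours rather than the patent's: an L-shaped layout (C2 offset by D1 from C1 along x, C3 offset by D2 from C2 along y), idealized image coordinates with the principal point as origin, a brute-force triple loop, and a fixed tolerance.

```python
f = 50.0            # shared focal length (assumed)
D1, D2 = 30.0, 20.0  # assumed optical-center offsets: C2 = C1 + (D1, 0), C3 = C2 + (0, D2)

def screen_and_triangulate(pts1, pts2, pts3, tol=1e-6):
    """Sketch of S130 + S140: keep triples (p1, p2, p3) of feature points
    whose disparities satisfy d1:d2 = D1:D2 (plus the zero-disparity
    constraints of this L-shaped layout), then recover [X, Y, Z] in
    camera-1 coordinates. Each pts list holds (x, y) image coordinates."""
    matches = []
    for x1, y1 in pts1:
        for x2, y2 in pts2:
            for x3, y3 in pts3:
                d1 = x1 - x2            # disparity along the C1-C2 baseline
                d2 = y2 - y3            # disparity along the C2-C3 baseline
                # C1/C2 share the ordinate, C2/C3 share the abscissa:
                if abs(y1 - y2) > tol or abs(x2 - x3) > tol:
                    continue
                if d1 <= 0 or abs(d1 * D2 - d2 * D1) > tol:
                    continue            # fixed ratio d1:d2 = D1:D2 violated
                Z = D1 * f / d1         # from d1 = D1 * f / Z
                matches.append((x1 * Z / f, y1 * Z / f, Z))
    return matches
```

The brute-force loop is only for clarity; the coordinate constraints allow the candidate sets to be pre-sorted and pruned so that far fewer triples are examined in practice.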
The processing unit 11 of the three-dimensional measurement system 10 is configured to perform the above processing S110-S140 of the three-dimensional measurement method 100. The processing unit 11 may be realized by a processor and a memory storing program instructions, the program instructions, when executed by the processor, causing the processor to perform the operations of S110-S140. The processing unit 11 may constitute a three-dimensional measurement apparatus 20 according to the present invention.
In processing S110, the images may be received directly from the first, second and third cameras, or may be received via other units or devices.
In processing S120, the feature points in each image are found. For example, points where the grayscale change or grayscale gradient is large may be taken as feature points; or SIFT feature points may be found using the SIFT algorithm; or the corners of the image may be found as feature points using a corner-detection algorithm such as the Harris algorithm. It should be understood that the present invention is not limited to any specific method of extracting feature points.
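As one illustration of corner detection, a simplified Harris response might look like the sketch below. It uses finite-difference gradients and a box window instead of a Gaussian, and omits non-maximum suppression; these simplifications are ours.

```python
import numpy as np

def harris_response(img, k=0.04):
    """Per-pixel Harris corner response R = det(M) - k * trace(M)^2,
    where M is the structure tensor box-summed over a 3x3 window.
    Assumes `img` is a grayscale floating-point array."""
    Ix = np.zeros_like(img)
    Iy = np.zeros_like(img)
    Ix[:, 1:-1] = (img[:, 2:] - img[:, :-2]) / 2.0   # central differences
    Iy[1:-1, :] = (img[2:, :] - img[:-2, :]) / 2.0

    def box3(a):
        # sum over the 3x3 neighborhood of each pixel (wrap-around at borders)
        s = np.zeros_like(a)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                s += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
        return s

    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    det = Sxx * Syy - Sxy ** 2
    trace = Sxx + Syy
    return det - k * trace ** 2
```

On this response map, corners give large positive R, edges give negative R, and flat regions give R near zero, so feature points can be selected by thresholding the local maxima of R.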
Processing S130 matches the feature points in the images, and includes screening matched feature point groups based on the above fixed disparity-ratio relationship. The screening of matched feature point groups based on the fixed disparity-ratio relationship in processing S130 is described in more detail below in connection with different embodiments.
Processing S130 may further include screening matched feature point groups in other ways. In other words, in some embodiments, processing S130 may realize feature point matching solely through the processing based on the fixed disparity-ratio relationship. In other embodiments, processing S130 may combine the processing based on the fixed disparity-ratio relationship with other matching/screening approaches to realize feature point matching. Such other approaches include, for example, screening matched feature point groups by applying similarity calculations to pixels or neighborhood pixel groups. Here, the similarity calculation includes at least one of: the sum of squared gray-level differences, the zero-mean sum of squared gray-level differences, the sum of absolute gray-level differences, the zero-mean sum of absolute gray-level differences, the normalized cross-correlation between neighborhoods, and the zero-mean normalized cross-correlation.
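Two of the similarity measures listed above can be sketched as follows; the window handling and function names are illustrative, and NumPy is assumed:

```python
import numpy as np

def ssd(a, b):
    """Sum of squared gray-level differences (lower = more similar)."""
    return float(np.sum((a.astype(float) - b.astype(float)) ** 2))

def zncc(a, b):
    """Zero-mean normalized cross-correlation (1.0 = identical up to gain/offset)."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b)))

w = np.arange(25.0).reshape(5, 5)            # a 5x5 neighborhood window
assert ssd(w, w) == 0.0
assert abs(zncc(w, 2 * w + 3) - 1.0) < 1e-9  # ZNCC ignores gain and offset
```

Note that each such comparison costs one multiplication per pixel pair of the window, which is the "intensive multiplication" the patent contrasts with the coordinate-based screen.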
For example, in some embodiments, processing S130 further includes applying similarity calculations on pixels or neighborhood pixel groups to the matched feature point groups already screened by the fixed disparity-ratio relationship, so as to screen them further. In some examples, the similarity calculation may be applied only to two or more feature point groups that contain the same feature point, i.e., only when the matching result is not unique.
In other embodiments, in processing S130, matched feature point groups are first screened, for example, by applying similarity calculations to feature points or their neighborhood pixel groups; then, for the matched feature point groups obtained by the similarity screening, it is judged whether each group satisfies the fixed disparity-ratio relationship, so as to further screen the matched feature point groups.
Each matched feature point group obtained through processing S130 includes one feature point from each of the first image, the second image and the third image. In processing S140, the depth (Z coordinate) of the corresponding object point may be calculated from any two feature points in a matched feature point group, and the X and Y coordinates of the object point are then calculated from similar-triangle relations. Calculating a depth value from two matched feature points and calculating the X, Y coordinates from the depth value are known techniques and are not described in detail here.
In some preferred embodiments, in processing S140, a depth value is calculated separately from each pair of feature points in a matched feature point group, and the average is taken as the depth value of the corresponding object point. Theoretically, the calculated depth values should be equal; in practice, under the influence of image noise, the three values differ slightly. Taking their average therefore reduces the influence of noise and improves the accuracy of the depth value. In some other embodiments, two pairs of feature points in the group may be chosen to calculate depth values, whose average is taken as the depth value of the object point.
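The pairwise-depth averaging described above can be sketched as follows for a collinear arrangement (as in Fig. 5), assuming Z = D × f/d per camera pair; all names and numbers are illustrative:

```python
def pairwise_depths(x1, x2, x3, D1, D2, f):
    """One depth estimate per camera pair; baselines are D1, D2 and D1 + D2."""
    return [D1 * f / (x1 - x2),
            D2 * f / (x2 - x3),
            (D1 + D2) * f / (x1 - x3)]

def averaged_depth(x1, x2, x3, D1, D2, f):
    """Average of the three pairwise estimates, reducing noise sensitivity."""
    zs = pairwise_depths(x1, x2, x3, D1, D2, f)
    return sum(zs) / len(zs)

# Noise-free image-plane coordinates of a point at Z = 2 m: xi = f*(X - ci)/Z.
f, D1, D2, Z, X = 0.008, 0.1, 0.1, 2.0, 0.3
x1, x2, x3 = (f * (X - c) / Z for c in (0.0, D1, D1 + D2))
assert all(abs(z - Z) < 1e-9 for z in pairwise_depths(x1, x2, x3, D1, D2, f))
assert abs(averaged_depth(x1, x2, x3, D1, D2, f) - Z) < 1e-9
```

With noisy coordinates, the three estimates scatter around the true depth and the average is typically closer to it than any single pair's estimate.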
For three-dimensional measurement aimed at three-dimensional reconstruction, processing S140 may also include calculating the color value, for example [R, G, B], of the object point corresponding to each matched feature point group. Together with the object point coordinates [X, Y, Z], the color value may form voxel information such as [X, Y, Z, R, G, B]. In three-dimensional reconstruction, the voxel information of all object points may be combined to form a true-color three-dimensional model of an object or scene.
The three-dimensional measurement method 100 and the three-dimensional measurement system 10, and in particular the processing of screening matched feature point groups based on the fixed disparity-ratio relationship, are described in more detail below with reference to different embodiments.
Fig. 5 to Fig. 11 illustrate a three-dimensional measurement system and a three-dimensional measurement method according to a first embodiment of the present invention.
Fig. 5 shows the camera array CA in the three-dimensional measurement system according to the first embodiment of the present invention, in which the first camera C1, the second camera C2 and the third camera C3 have the same focal length and mutually parallel optical axes, and the optical centers of the three cameras are arranged on the same straight line (X direction) perpendicular to the optical axes. In arrangement (a) shown in Fig. 5, D1 = D2; in arrangement (b), D1 ≠ D2.
Fig. 6A, Fig. 6B and Fig. 6C show the fixed disparity-ratio relationship among the three cameras shown in Fig. 5. As shown, the same object point P[X, Y, Z] in space is imaged through the first camera C1, the second camera C2 and the third camera C3 at point P1 in the first image plane I1, point P2 in the second image plane I2 and point P3 in the third image plane I3, respectively. Each image plane has an image-plane coordinate system set in the same manner as described above. In Fig. 6, the x-axis direction of each image plane corresponds to the direction of the straight line on which the camera optical centers lie, and the y-axis is perpendicular to the x-axis. As shown more clearly in Fig. 6C, in the x-axis direction, the first disparity generated between the first camera and the second camera is d1 = x1 - x2 = D1 × f/Z, and the second disparity generated between the second camera and the third camera is d2 = x2 - x3 = D2 × f/Z, satisfying d1:d2 = D1:D2.
Fig. 7 schematically shows an example of the first image IM1, the second image IM2 and the third image IM3 acquired by the camera array shown in Fig. 5, in which the abscissa axis (u axis) of each image corresponds to the direction of the straight line on which the camera optical centers lie. Ideally, as shown by the imaginary image IM′ obtained in Fig. 7 by superimposing IM1, IM2 and IM3, the corresponding image points P1[u1, v1], P2[u2, v2] and P3[u3, v3] of the same object point P[X, Y, Z] in IM1, IM2 and IM3 have the same ordinate, i.e., v1 = v2 = v3; the abscissa difference of P1[u1, v1] and P2[u2, v2] is s1 = u1 - u2 = d1/dp, and the abscissa difference of P2[u2, v2] and P3[u3, v3] is s2 = u2 - u3 = d2/dp, satisfying s1:s2 = d1:d2 = D1:D2, where dp is the side length of a pixel unit of the image sensor.
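The pixel-coordinate relations above can be checked numerically with a toy pinhole projection; the focal length, pixel pitch, baselines and object point below are hypothetical, and pixel coordinates are taken relative to the principal point:

```python
# Collinear array: C1 at x=0, C2 at x=D1, C3 at x=D1+D2, all looking along Z.
f, dp = 0.008, 4e-6          # focal length 8 mm, pixel pitch 4 um (assumed)
D1, D2 = 0.10, 0.05          # baselines in meters (assumed), so D1:D2 = 2:1
X, Y, Z = 0.3, 0.2, 2.0      # hypothetical object point P[X, Y, Z]

def project(cx):
    """Pinhole projection into integer pixel coordinates (u, v)."""
    u = round(f * (X - cx) / Z / dp)
    v = round(f * Y / Z / dp)
    return u, v

(u1, v1), (u2, v2), (u3, v3) = project(0.0), project(D1), project(D1 + D2)
assert v1 == v2 == v3                    # same ordinate in all three images
s1, s2 = u1 - u2, u2 - u3
assert s1 == 2 * s2                      # s1:s2 = D1:D2 = 2:1
```

With these numbers, s1 = 100 pixels and s2 = 50 pixels, matching s = d/dp for d = D × f/Z.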
Due to manufacturing errors, installation errors, and variations of the camera intrinsic and/or extrinsic parameters arising during use, the actual first, second and third images obtained directly from the first, second and third cameras usually deviate from the above ideal situation. The images can be brought close to the ideal situation by physically calibrating or adjusting the cameras, and/or by correcting the images with a computing program. Therefore, the three-dimensional measurement method according to an embodiment of the present invention may include correction processing on the first image, the second image and the third image, which makes the points corresponding to the optical axes of the first camera, the second camera and the third camera have the same abscissa and the same ordinate in the first image, the second image and the third image, and makes the abscissa direction of each of the three images correspond to the direction of the straight line, the directions of the abscissa and the ordinate being perpendicular to each other.
The correction processing is performed before processing S130 shown in Fig. 4, i.e., before the feature point matching; preferably, it is performed after the images are received and before the feature points are extracted, i.e., between processing S110 and S120 shown in Fig. 4. It should be noted, however, that the three-dimensional measurement method according to an embodiment of the present invention is not limited to including the above correction processing; for example, in some applications, correction may be achieved by physically adjusting the camera array, without correcting the images.
Correspondingly, as shown in Fig. 3, the three-dimensional measurement system 10 according to the present invention may further include a correction unit 13, which receives images from the camera array CA, generates a correction matrix based at least in part on those images, and supplies the correction matrix to the processing unit 11. When the correction matrix is applied by the processing unit 11 to the first image, the second image and the third image, the above correction processing in the three-dimensional measurement method is realized. As shown in Fig. 3, the three-dimensional measurement apparatus 20 may include the correction unit 13. The processing unit 11 and the correction unit 13 may be implemented on the same processor and memory or on different processors and memories.
Based on the camera array arranged as shown in Fig. 5, as shown in Fig. 8, in processing S230 of the three-dimensional measurement method 200 according to the first embodiment of the present invention, the screening of matched feature point groups based on the fixed disparity-ratio relationship in processing S130 of Fig. 4 is embodied as screening matched feature point groups based on the following coordinate relationship: the difference s1 = u1 - u2 between the abscissa of the feature point P1[u1, v1] in the first image and the abscissa of the feature point P2[u2, v2] in the second image, and the difference s2 = u2 - u3 between the abscissa of the feature point P2[u2, v2] in the second image and the abscissa of the feature point P3[u3, v3] in the third image, satisfy s1:s2 = D1:D2, and the feature points in the first image, the second image and the third image have the same ordinate, i.e., v1 = v2 = v3. The other processing of the three-dimensional measurement method 200 is identical to the corresponding processing of the three-dimensional measurement method 100 described above with reference to Fig. 4 and is not repeated here.
Fig. 9 shows one example, processing 300, of the processing of screening matched feature point groups based on the above coordinate relationship s1:s2 = D1:D2 and v1 = v2 = v3. As shown, processing 300 includes:
S310: selecting a first feature point and a second feature point from two of the first image, the second image and the third image, respectively, the ordinates of the first feature point and the second feature point lying within a preset range relative to a target ordinate;
S320: calculating the expected abscissa, in the remaining third one of the first image, the second image and the third image, of a third feature point matching the first feature point and the second feature point, such that the difference s1 = u1 - u2 and the difference s2 = u2 - u3 satisfy s1:s2 = D1:D2; and
S330: searching for the third feature point in said third image based on the expected position composed of the expected abscissa of the third feature point and the target ordinate.
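Under stated assumptions (equally spaced cameras, so the expected abscissa is ue = 2u2 - u3; feature points as integer (u, v) tuples; illustrative tolerance names), processing 300 for one target ordinate might be sketched as:

```python
def screen_groups(pts1, pts2, pts3, vt, eps=0, eps_u=1, eps_v=1):
    """Scan target ordinate vt and return matched groups (P1, P2, P3)."""
    row = lambda pts: [p for p in pts if abs(p[1] - vt) <= eps]
    r1, r2, r3 = row(pts1), row(pts2), row(pts3)
    groups = []
    for p2 in r2:                    # S310: first feature point, from IM2
        for p3 in r3:                # S310: second feature point, from IM3
            ue = 2 * p2[0] - p3[0]   # S320: expected abscissa in IM1 (D1 = D2)
            for p1 in r1:            # S330: search IM1 near [ue, vt]
                if abs(p1[0] - ue) <= eps_u and abs(p1[1] - vt) <= eps_v:
                    groups.append((p1, p2, p3))
    return groups

# A point with u1 = 300, u2 = 200, u3 = 100 on row v = 10 has s1 = s2 = 100.
assert screen_groups([(300, 10)], [(200, 10)], [(100, 10)], vt=10) == \
       [((300, 10), (200, 10), (100, 10))]
```

Only integer additions, subtractions and comparisons occur per candidate pair, illustrating the low computational load claimed for the coordinate-based screen.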
Processing 300 may be used, for example, to scan an image line by line in a certain order of ordinate (increasing or decreasing) to screen matched feature point groups. Processing 300 is introduced in more detail below in connection with Fig. 7, taking the case where the first camera, the second camera and the third camera are equally spaced (i.e., D1 = D2) as an example. It should be understood that the introduction below is merely exemplary and not restrictive.
In processing S310, among the J feature points P2[u2(j), v2(j)] (1 ≤ j ≤ J) in the second image IM2 that have the target ordinate vt, i.e., v2(j) = vt, one feature point P2[u2, v2] (the first feature point) is selected. The first image IM1 and the third image IM3 are then searched for feature points P1[u1(i), v1(i)] and P3[u3(k), v3(k)] having the target ordinate, where v1(i) = v3(k) = vt, yielding I feature points P1[u1(i), v1(i)] (1 ≤ i ≤ I) satisfying the condition in the first image IM1 and K feature points P3[u3(k), v3(k)] (1 ≤ k ≤ K) satisfying the condition in the third image IM3 (if I or K is zero, the search ends and a new first feature point is selected). Assuming I ≥ K, in order to reduce the number of searches, the search starts from the third image IM3, i.e., one feature point P3[u3, v3] (the second feature point) is selected in IM3 (if I < K, the second feature point is instead selected from the first image).
In the above description the first feature point is selected from the second image, but this is merely exemplary; the present invention is not limited as to which image the first feature point is selected from first. For example, in other examples, the numbers I, J, K of feature points satisfying the ordinate requirement in each image may first be compared, the first feature point selected in the image with the smallest number of such feature points, the second feature point selected in the image with the second smallest number, and the third feature point finally searched for in the image with the largest number of such feature points.
Considering image errors/noise, in processing S310 the feature points may be searched for and selected within an ordinate preset range of vt - ε to vt + ε, where ε is an integer greater than or equal to 0 and may be determined according to, for example, the installation and usage conditions of the camera array or the image quality; preferably, 0 ≤ ε ≤ 2.
Then, in processing S320, the expected abscissa ue of the feature point P1 (the third feature point) expected to match the feature points P2[u2, v2] and P3[u3, v3] in the first image IM1 is calculated using symmetry, such that the difference s1 = ue - u2 and the difference s2 = u2 - u3 satisfy s1:s2 = D1:D2 = 1:1, i.e., ue = 2u2 - u3. It can be seen that, in this case, the abscissas of the feature points have a symmetric coordinate relationship.
Next, in processing S330, from the expected abscissa ue of the expected feature point P1 and the above target ordinate vt, the expected position [ue, vt] of the feature point P1 is obtained, and based on this expected position the first image IM1 is searched to determine whether the feature point P1 exists. Again considering image errors/noise, in some examples an abscissa tolerance εu and an ordinate tolerance εv may be set, and the feature point P1 searched for in IM1 within the range [ue - εu ~ ue + εu, vt - εv ~ vt + εv] (see, for example, the range shown by the dotted line in image IM1 in Fig. 7). Preferably, 0 ≤ εu ≤ 3 and 0 ≤ εv ≤ 3. In other examples, only one of the abscissa tolerance and the ordinate tolerance may be set, which is not described in detail here.
If the expected feature point P1 is found in processing S330, the feature points P1, P2 and P3 form a matched feature point group. If the expected feature point P1 is not found, the selected first feature point (P2) and second feature point (P3) have no possibility of matching, the screening of this round ends, and the next second feature point may then be selected, after which the above processing S320 and S330 are repeated. After all second feature points have been traversed, a new first feature point is selected and the above processing is similarly repeated.
Processing 300 has been described above taking the straight-line, symmetric camera arrangement (D1:D2 = 1) as an example. For an asymmetric ratio D1:D2 = R:1, R ≠ 1, the operation of processing 300 is similar, the difference lying in the calculation of the expected abscissa of the third feature point: taking the specific situation discussed above as an example, ue = (1 + R) × u2 - R × u3; since image coordinates are all integers, the calculated value needs to be rounded to obtain the required expected abscissa, i.e., ue = round{(1 + R) × u2 - R × u3}, where "round" denotes the rounding operation.
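The expected-abscissa computation for a general baseline ratio follows directly from the relation (ue - u2):(u2 - u3) = D1:D2, with rounding to integer pixel coordinates; this small helper is an illustration under that relation, not the patent's notation:

```python
def expected_abscissa(u2, u3, D1, D2):
    """Expected u1 such that (u1 - u2):(u2 - u3) = D1:D2, rounded to pixels."""
    return round(u2 + (u2 - u3) * D1 / D2)

assert expected_abscissa(200, 100, 1, 1) == 300   # symmetric case: ue = 2*u2 - u3
assert expected_abscissa(200, 100, 2, 1) == 400   # D1:D2 = 2:1, i.e. R = 2
assert expected_abscissa(200, 100, 1, 2) == 250   # D1:D2 = 1:2
```

For D1:D2 = R:1 this reduces to ue = round{(1 + R) × u2 - R × u3}; rounding matters whenever the ratio is not an integer.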
Fig. 9 shows only one example of the processing of screening matched feature point groups based on the coordinate relationship. In the case where candidate matched feature point groups are obtained in other ways (such as by similarity calculations on pixels or neighborhood pixel groups), it is also possible to check whether the abscissas of the feature points in each candidate group satisfy s1:s2 = D1:D2 within a certain tolerance range, and to judge whether the ordinates of the feature points are within the allowed range, so as to screen out the feature point groups satisfying the above coordinate relationship s1:s2 = D1:D2 and v1 = v2 = v3.
In the three-dimensional measurement method 200 according to the first embodiment of the present invention, the processing based on the fixed disparity-ratio relationship among the three cameras is reduced to processing based on the coordinate relationship of corresponding feature points in the images. In matching feature points and screening matched feature point groups, compared with the traditional feature point matching method based on similarity calculations on pixels or neighborhood pixel groups, the former mainly performs addition/subtraction operations and a small amount of multiplication on coordinates (in the case of D1:D2 = 1, only addition/subtraction is needed), while the latter usually requires intensive multiplication, such as matrix convolution; the former therefore substantially reduces the computational load of the matching process. This reduction is of great significance for three-dimensional measurement and reconstruction based on high-definition images and for real-time three-dimensional measurement and reconstruction, making the latter feasible.
Fig. 10 schematically compares feature point matching based on similarity calculations with feature point matching based on the coordinate relationship s1:s2 = D1:D2 in the three-dimensional measurement method 200 according to the first embodiment of the present invention. Part (a) of Fig. 10 shows that, for a certain feature point Pl1 in one image obtained by, for example, a binocular camera, three candidate feature points Pr1, Pr2, Pr3 with very similar self and neighborhood attributes are found in the other image. According to the uniqueness requirement on feature point matching in three-dimensional reconstruction, such a non-unique matching result can only be discarded. Furthermore, due to image noise and the like, the incorrectly matched feature point Pr3 may show greater similarity to the feature point Pl1 than the correctly matched feature point Pr1; thus, feature point group screening based on similarity calculations on pixels or neighborhood pixel groups may produce a wrong matching result. Part (b) of Fig. 10 shows that, for images from the first, second and third cameras arranged as shown in Fig. 5, based on the coordinate relationship s1:s2 = D1:D2, the unique matched feature point group (P11, P21, P31) can be screened out, excluding the other two candidate feature points P32, P33 that might otherwise be matched according to attribute similarity.
It can be seen that feature point matching based on the coordinate relationship helps to exclude erroneous results obtained by similarity-based matching, improving matching accuracy. At the same time, since a unique matching result can be obtained with greater probability, matching failures caused by non-unique matching results are avoided; feature point matching based on the fixed disparity-ratio relationship/the above coordinate relationship therefore also helps to improve the matching rate, thereby helping to realize a high-density feature point cloud.
Fig. 11 is a flowchart showing an example of the three-dimensional measurement method according to the first embodiment of the present invention, three-dimensional measurement method 400. The three-dimensional measurement method 400 includes:
S410: receiving the first image, the second image and the third image from the first camera, the second camera and the third camera, respectively;
S420: performing correction processing on the first image, the second image and the third image which, as discussed above, makes the points corresponding to the optical axes of the first camera, the second camera and the third camera have the same abscissa and the same ordinate in the three images, and makes the abscissa direction of each of the three images correspond to the direction of the straight line;
S430: extracting feature points in the first image, the second image and the third image;
S440: matching the feature points in the first image, the second image and the third image, the matching including:
S441: screening matched feature point groups based on the following coordinate relationship: the difference s1 = u1 - u2 between the abscissa of the feature point P1[u1, v1] in the first image and the abscissa of the feature point P2[u2, v2] in the second image, and the difference s2 = u2 - u3 between the abscissa of the feature point P2[u2, v2] in the second image and the abscissa of the feature point P3[u3, v3] in the third image, satisfy s1:s2 = D1:D2, and the feature points in the first image, the second image and the third image have the same ordinate, i.e., v1 = v2 = v3; and
S442: for the matched feature point groups obtained by processing S441, performing further screening based on similarity calculations on pixels or neighborhood pixel groups; and
S450: calculating the three-dimensional coordinates of the object points corresponding to the matched feature point groups.
Processing S441 may be implemented as, for example, processing 300 shown in Fig. 9, but is not limited thereto.
Processing S420 may be correction processing realized by any correction method, including correction processing realized by applying a certain correction matrix to the images.
According to the present invention, the fixed disparity-ratio relationship may also be applied in the correction processing; for example, the correction matrix used for the correction processing may be generated based on the fixed disparity-ratio relationship. As an example, the processing of generating the correction matrix for the correction processing based on the fixed disparity-ratio relationship may include: extracting feature points in the first image, the second image and the third image respectively, for example extracting a sparse feature point lattice by the SIFT algorithm; matching the feature points in the first image, the second image and the third image to obtain multiple matched feature point groups, for example by the RANSAC algorithm; establishing an overdetermined system of equations from the coordinates of the feature points of the matched feature point groups in each image, according to the requirement that the feature points in each matched feature point group satisfy the fixed disparity-ratio relationship after the correction matrix is applied to each image; and solving the overdetermined system, for example by the least squares method, to obtain the correction matrix.
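The patent leaves the parametrization of the correction matrix open. As a minimal illustration of only the final step — solving an overdetermined linear system by least squares — suppose each row of a hypothetical matrix A encodes one disparity-ratio constraint on unknown correction parameters x, with target vector b; all names and numbers here are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
x_true = np.array([1.5, -0.5, 2.0])                 # hypothetical correction parameters
A = rng.normal(size=(20, 3))                        # 20 constraints, 3 unknowns: overdetermined
b = A @ x_true + rng.normal(scale=1e-3, size=20)    # constraints with small noise

# Least-squares solution of the overdetermined system A x ~= b.
x_est, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_est, x_true, atol=1e-2)
```

Using many matched feature point groups (rows) makes the estimate robust to the per-point noise, which is why an overdetermined system rather than a minimal one is solved.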
In the three-dimensional measurement method 400, matched feature point groups are first screened by processing S441, so that the number of feature points entering the subsequent similarity-based feature point matching processing S442 is greatly reduced, which can substantially reduce the computational load of the feature point matching process. In addition, the combined use of processing S441 and S442 also helps to improve the accuracy and rate of feature point matching, making it possible to obtain a higher-density feature point cloud.
Fig. 12 to Fig. 14 illustrate a three-dimensional measurement system and a three-dimensional measurement method according to a second embodiment of the present invention.
Fig. 12 shows the camera array CA in the three-dimensional measurement system according to the second embodiment of the present invention, in which the first camera C1, the second camera C2 and the third camera C3 have the same focal length and mutually parallel optical axes (Z direction); the optical centers O1, O2 of the first camera C1 and the second camera C2 are aligned along the X direction, and the optical centers O2, O3 of the second camera C2 and the third camera C3 are aligned along the Y direction. The offset of the optical center of the second camera relative to the optical center of the first camera in the X direction is D1, and the offset of the optical center of the third camera relative to the optical center of the second camera in the Y direction is D2.
Referring to the discussion above in connection with Fig. 1C, and as shown in Fig. 12, in the x-axis direction, the first disparity generated between the first camera and the second camera is d1 = x1 - x2 = D1 × f/Z; in the y-axis direction, the second disparity generated between the second camera and the third camera is d2 = y2 - y3 = D2 × f/Z, satisfying d1:d2 = D1:D2.
Fig. 13 schematically shows an example of the first image IM1, the second image IM2 and the third image IM3 acquired by the camera array shown in Fig. 12, in which the abscissa axis (u axis) of each image corresponds to the direction of the straight line on which the optical centers of the first camera and the second camera lie (x-axis direction), and the ordinate axis (v axis) corresponds to the direction of the straight line on which the optical centers of the second camera and the third camera lie (y direction). Ideally, as shown by the imaginary image IM′ obtained in Fig. 13 by superimposing IM1, IM2 and IM3, the corresponding image points P1[u1, v1] and P2[u2, v2] of the same object point P[X, Y, Z] in IM1 and IM2 have the same ordinate, i.e., v1 = v2, and the corresponding image points P2[u2, v2] and P3[u3, v3] in IM2 and IM3 have the same abscissa, i.e., u2 = u3; the abscissa difference of P1[u1, v1] and P2[u2, v2] is s3 = u1 - u2 = d1/dp, and the ordinate difference of P2[u2, v2] and P3[u3, v3] is s4 = v2 - v3 = d2/dp, satisfying s3:s4 = d1:d2 = D1:D2, where dp is the side length of a pixel unit of the image sensor.
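The relations of Fig. 13 can be checked numerically for the L-shaped array of Fig. 12 with a toy pinhole projection; the focal length, pixel pitch, baselines and object point below are hypothetical, and pixel coordinates are taken relative to the principal point:

```python
f, dp = 0.008, 4e-6          # focal length 8 mm, pixel pitch 4 um (assumed)
D1, D2 = 0.10, 0.05          # X baseline C1->C2 and Y baseline C2->C3 (assumed)
X, Y, Z = 0.3, 0.2, 2.0      # hypothetical object point P[X, Y, Z]

def project(cx, cy):
    """Pinhole projection into integer pixel coordinates (u, v)."""
    u = round(f * (X - cx) / Z / dp)
    v = round(f * (Y - cy) / Z / dp)
    return u, v

(u1, v1) = project(0.0, 0.0)     # C1
(u2, v2) = project(D1, 0.0)      # C2: shifted along X by D1
(u3, v3) = project(D1, D2)       # C3: shifted along Y by D2 from C2
assert v1 == v2 and u2 == u3     # the alignments shown in Fig. 13
s3, s4 = u1 - u2, v2 - v3
assert s3 * D2 == s4 * D1        # s3:s4 = D1:D2
```

With these numbers, s3 = 100 pixels and s4 = 50 pixels, so the horizontal and vertical disparities keep the ratio of the two baselines.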
The three-dimensional measurement method according to the present embodiment may include correction processing on the first image, the second image and the third image, which makes the points corresponding to the optical axes of the first camera, the second camera and the third camera have the same abscissa and the same ordinate in the three images, makes the abscissa direction of each of the three images correspond to the direction of the line connecting the optical centers of the first camera and the second camera, and makes the ordinate direction correspond to the direction of the line connecting the optical centers of the second camera and the third camera.
Based on the camera array arranged as shown in Fig. 12, in the three-dimensional measurement method according to the second embodiment of the present invention, the screening of matched feature point groups based on the fixed disparity-ratio relationship in processing S130 of Fig. 4 is embodied as screening matched feature point groups based on the following coordinate relationship: the difference s3 = u1 - u2 between the abscissa of the feature point P1[u1, v1] in the first image and the abscissa of the feature point P2[u2, v2] in the second image, and the difference s4 = v2 - v3 between the ordinate of the feature point P2[u2, v2] in the second image and the ordinate of the feature point P3[u3, v3] in the third image, satisfy s3:s4 = D1:D2, together with v1 = v2 and u2 = u3. The other processing of the three-dimensional measurement method according to the present embodiment may be the same as or similar to the corresponding processing of the three-dimensional measurement method 100 described above with reference to Fig. 4 and is not repeated here.
Fig. 14 shows one example, processing 500, of the processing of screening matched feature point groups based on the above coordinate relationship. As shown in Fig. 14, processing 500 includes:
S510: selecting a first feature point and a second feature point in the first image and the second image, the ordinates of the first feature point and the second feature point lying within a preset range relative to a target ordinate;
S520: calculating the difference s3 = u1 - u2 between the abscissa of the first feature point and the abscissa of the second feature point;
S530: calculating the difference s4 such that s3:s4 = D1:D2;
S540: calculating the expected ordinate, in the third image, of a third feature point matching the first feature point and the second feature point, such that the difference between the ordinate of the second feature point and the expected ordinate of the third feature point is the above calculated difference s4; and
S550: searching for the third feature point in the third image based on the expected position composed of the expected ordinate of the third feature point and the abscissa of the second feature point.
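Under stated assumptions (integer pixel coordinates, illustrative tolerance names), processing 500 (S510–S550) for one target ordinate might be sketched as:

```python
def screen_groups_L(pts1, pts2, pts3, D1, D2, vt, eps=0, eps_u=1, eps_v=1):
    """Return matched groups (P1, P2, P3) for the L-shaped array, one ordinate vt."""
    groups = []
    for p1 in [p for p in pts1 if abs(p[1] - vt) <= eps]:      # S510: first point
        for p2 in [p for p in pts2 if abs(p[1] - vt) <= eps]:  # S510: second point
            s3 = p1[0] - p2[0]                                 # S520
            s4 = s3 * D2 / D1                                  # S530: s3:s4 = D1:D2
            ve = p2[1] - s4                                    # S540: expected ordinate
            for p3 in pts3:                                    # S550: search near [u2, ve]
                if abs(p3[0] - p2[0]) <= eps_u and abs(p3[1] - ve) <= eps_v:
                    groups.append((p1, p2, p3))
    return groups

# With D1:D2 = 2:1, s3 = 100 implies s4 = 50, so the expected P3 row is v = 150.
assert screen_groups_L([(300, 200)], [(200, 200)], [(200, 150)],
                       D1=2, D2=1, vt=200) == \
       [((300, 200), (200, 200), (200, 150))]
```

As in processing 300, the per-candidate cost is a handful of scalar operations, regardless of the image content around the points.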
For the operation of selecting feature points within the preset range relative to the target coordinate in processing S510, reference may be made to the operation introduced above for processing S310 of processing 300. In addition, similarly to processing 300, a tolerance range relative to the expected position may be set in processing S550 of processing 500, which is not described in detail here.
It should be noted that, since the abscissa and the ordinate of an image in this application denote two mutually perpendicular directions, in the present embodiment the first camera and the third camera actually have equivalent positional relationships relative to the second camera. Therefore, in processing S510, "selecting a first feature point and a second feature point in the first image and the second image, the ordinates of the first feature point and the second feature point lying within a preset range relative to a target ordinate" is not limited to first choosing the first and second feature points in the two horizontally aligned images.
In some preferred embodiments, after a feature point is selected in the second image, the first image and the third image are searched for feature points having the same ordinate and the same abscissa as this feature point, respectively, and the next search order is determined based on the numbers of feature points found in the first image and the third image. For example, when the number of feature points with the same ordinate in the first image is greater than the number of feature points with the same abscissa in the third image, a feature point may next be selected from the third image, and the expected position of the matching feature point in the first image calculated and searched. Processing S510 is intended to cover such a case.
Processing 500 may be used to traverse the feature points in, for example, the second image in a certain order (such as row by row or column by column) and screen matching feature points in the first image and the third image, so as to screen matched feature point groups.
It is similar with method for three-dimensional measurement according to a first embodiment of the present invention from the point of view of technical effect angle, according to this hair The method for three-dimensional measurement of bright second embodiment is during Feature Points Matching, relative to based on the phase to pixel or neighborhood territory pixel group For the characteristic point matching method calculated like property, the calculation amount of matching process is substantially reduced, three-dimensional measurement has been helped to improve Spatial accuracy and real-time.Meanwhile it can be in conjunction with based on to pixel or neighborhood picture according to the method for three-dimensional measurement of the present embodiment The characteristic point matching method of the Similarity measures of plain group, in this case, based on the fixed Feature Points Matching (screening than relationship of parallax Matching characteristic point group) method can effectively exclude the Feature Points Matching based on the Similarity measures to pixel or neighborhood territory pixel group Erroneous matching as a result, improving matched accuracy;And it helps avoid losing due to the not unique caused matching of matching result It loses, to help to improve matching rate, obtains highdensity characteristic point cloud.
Figures 15 to 18 illustrate the three-dimensional measurement system and method according to a third embodiment of the present invention.
Figure 15 shows the camera array CA in the three-dimensional measurement system according to the third embodiment, in which the first camera C1, the second camera C2 and the third camera C3 have the same focal length and mutually parallel optical axes (the Z direction). The first camera C1, the second camera C2 and the third camera C3 are arranged in a triangle, their optical centers O1, O2, O3 lie in the same plane perpendicular to the optical axes, and the optical centers O1, O2 of the first camera C1 and the second camera C2 are aligned along the X direction. The offset of the optical center of the second camera relative to the optical center of the first camera in the X direction is D1, and the offset of the optical center of the third camera relative to the optical center of the second camera in the X direction is D2.
Referring to what is discussed above in connection with Fig. 1C, and as shown in Figure 15, in the X-axis direction the first disparity generated between the first camera and the second camera is d1 = x1 − x2 = D1 × f/Z, and the second disparity generated between the second camera and the third camera is d2 = x2 − x3 = D2 × f/Z, satisfying d1 : d2 = D1 : D2.
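As a numerical illustration of the fixed-ratio relation above (a minimal sketch under the pinhole assumptions of this embodiment; the function names and tolerance are ours, not from the disclosure), depth can be recovered from either disparity as Z = D × f/d, and a pair of candidate disparities can be checked against d1 : d2 = D1 : D2:

```python
def depth_from_disparity(d1, D1, focal_len):
    """Depth of an object point from the first disparity: d1 = D1 * f / Z."""
    return D1 * focal_len / d1

def disparities_consistent(d1, d2, D1, D2, tol=0.5):
    """Check the fixed disparity-ratio relation d1 : d2 = D1 : D2 within a tolerance."""
    expected_d2 = d1 * D2 / D1
    return abs(d2 - expected_d2) <= tol
```

A candidate pair that fails the check can be rejected without any pixel-similarity computation, which is the source of the computational savings described in the embodiments.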
Figure 16 schematically shows an example of the first image IM1, the second image IM2 and the third image IM3 acquired by the camera array shown in Figure 15, where the abscissa axis (u axis) of each image corresponds to the direction of the straight line through the optical centers of the first camera and the third camera (the X-axis direction). Ideally, as shown in Figure 16, the same object point P[X, Y, Z] is imaged in IM1, IM2, IM3 as image points P1[u1, v1], P2[u2, v2], P3[u3, v3] respectively, as shown in the imaginary image IM′, where the corresponding image points in IM1 and IM3 have the same ordinate, i.e. v1 = v3, the abscissa difference of image points P1[u1, v1] and P2[u2, v2] is s5 = u1 − u2 = d1/dp, and the abscissa difference of image points P2[u2, v2] and P3[u3, v3] is s6 = u2 − u3 = d2/dp, satisfying s5 : s6 = d1 : d2 = D1 : D2, where dp is the side length of a pixel unit of the image sensor.
The three-dimensional measurement method according to this embodiment may include a correction process applied to the first image, the second image and the third image, such that the points corresponding to the optical axes of the first camera, the second camera and the third camera in the respective images have the same abscissa and the same ordinate, and the abscissa directions of the first image, the second image and the third image all correspond to the direction of the line connecting the optical centers of the first camera and the third camera.
Based on a camera array with the arrangement shown in Figure 15, in the three-dimensional measurement method according to the third embodiment of the present invention, the process of screening matching feature point groups based on the fixed disparity ratio relationship in process S130 shown in Fig. 4 is embodied as screening matching feature point groups based on the following coordinate relationship: the difference s5 = u1 − u2 between the abscissa of feature point P1[u1, v1] in the first image and the abscissa of feature point P2[u2, v2] in the second image and the difference s6 = u2 − u3 between the abscissa of feature point P2[u2, v2] in the second image and the abscissa of feature point P3[u3, v3] in the third image satisfy s5 : s6 = D1 : D2, and the feature points in the first image and the third image have the same ordinate, i.e. v1 = v3. The other processes of the method according to this embodiment are identical to the corresponding processes of the three-dimensional measurement method 100 described above with reference to Fig. 4 and are not repeated here.
Figure 17 shows an example of the process of screening matching feature point groups based on the above coordinate relationship, process 600. As shown in Figure 17, process 600 includes:
S610: selecting a first feature point and a second feature point in the first image and the third image respectively, the ordinate of the second feature point lying within a predetermined range relative to the ordinate of the first feature point;
S620: calculating the expected abscissa of a third feature point in the second image that satisfies the following relationship with the first feature point and the second feature point: the difference s5 and the difference s6 satisfy s5 : s6 = D1 : D2; and
S630: searching for the third feature point in the second image based on the expected abscissa.
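Solving the two difference equations s5 = u1 − u2 and s6 = u2 − u3 under s5 : s6 = D1 : D2 gives the expected abscissa of S620 in closed form: u2 = (D2 × u1 + D1 × u3)/(D1 + D2). A minimal sketch of S620/S630 under this relation (helper names and the candidate representation are illustrative assumptions, not from the disclosure):

```python
def expected_abscissa(u1, u3, D1, D2):
    """S620: solve D2*(u1 - u2) = D1*(u2 - u3) for the expected abscissa u2."""
    return (D2 * u1 + D1 * u3) / (D1 + D2)

def search_third_point(candidates, u_expected, tol):
    """S630: return candidate (u, v) feature points of the second image whose
    abscissa lies within the tolerance range around the expected abscissa."""
    return [p for p in candidates if abs(p[0] - u_expected) <= tol]
```

With equal baselines (D1 = D2) the expected abscissa is simply the midpoint of u1 and u3, which matches the symmetric-arrangement case of the earlier embodiments.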
The operation of selecting a feature point within a predetermined range relative to a target coordinate in process S610 may refer to the operation described above for process S310 of process 300. In addition, similarly to process 300, a tolerance range may be set relative to the expected abscissa in process S630 (see the range indicated by the dashed lines on either side of feature point P2 in Figure 16); details are not repeated here.
It should be noted that, compared with process 300 shown in Fig. 9, process 600 does not constrain the ordinate range of the third feature point in the second image; therefore, when searching the second image for matching feature points based on the expected abscissa, the matching results obtained may be less unique than in the first and second embodiments. In view of this, it is preferable to combine feature point matching based on similarity calculations over pixels or neighborhood pixel groups in the three-dimensional measurement method according to the third embodiment of the present invention.
Figure 18 shows an example of the three-dimensional measurement method according to the third embodiment of the present invention, method 700. As shown, method 700 includes:
S710: receiving the first image, the second image and the third image from the first camera, the second camera and the third camera respectively;
S720: correcting the first image, the second image and the third image; as discussed above, the correction makes the points corresponding to the optical axes of the first, second and third cameras in the respective images have the same abscissa and the same ordinate, and makes the abscissa directions of the three images all correspond to the direction of the line connecting the optical centers of the first camera and the third camera;
S730: extracting feature points in the first image, the second image and the third image;
S740: matching the feature points in the first image, the second image and the third image, the matching including:
S741: screening matching feature point groups based on similarity calculations over pixels or neighborhood pixel groups; and
S742: screening matching feature point groups based on the following coordinate relationship: the difference s5 = u1 − u2 between the abscissa of feature point P1[u1, v1] in the first image and the abscissa of feature point P2[u2, v2] in the second image and the difference s6 = u2 − u3 between the abscissa of feature point P2[u2, v2] in the second image and the abscissa of feature point P3[u3, v3] in the third image satisfy s5 : s6 = D1 : D2, and the feature points in the first image and the third image have the same ordinate, i.e. v1 = v3;
S750: calculating the three-dimensional coordinates of the object points corresponding to the matching feature point groups.
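The ratio-based screening of S742 can be sketched as a filter over candidate point triples, for example those surviving the similarity screening of S741; the tolerances and names below are illustrative assumptions, not from the disclosure:

```python
def screen_by_ratio(candidate_triples, D1, D2, tol=1.0, tol_v=1.0):
    """S742: keep triples ((u1,v1),(u2,v2),(u3,v3)) whose abscissa differences
    satisfy s5 : s6 = D1 : D2 and whose first/third ordinates agree (v1 = v3),
    both within tolerances."""
    kept = []
    for (u1, v1), (u2, v2), (u3, v3) in candidate_triples:
        s5, s6 = u1 - u2, u2 - u3
        # cross-multiplied ratio test avoids dividing by a small s6
        if abs(v1 - v3) <= tol_v and abs(s5 * D2 - s6 * D1) <= tol * D1:
            kept.append(((u1, v1), (u2, v2), (u3, v3)))
    return kept
```

As the text notes, the same filter may instead run first, handing only the surviving candidates to the similarity-based screening.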
Process S720 may be a correction process implemented using any correction method, including correction realized by applying a certain correction matrix to the images.
Further, it should be understood that method 700 is not limited to using similarity calculations over pixels or neighborhood pixel groups in process S741; that process may also be replaced by other existing or later-developed processes for screening matching feature point groups / matching feature points.
Process S742 may be implemented as, for example, process 600 shown in Figure 17, but is not limited thereto.
By using process S742, the three-dimensional measurement method 700 according to the third embodiment of the present invention can help improve the accuracy and rate of feature point matching, helping to obtain a higher-density feature point cloud.
Although in the example shown in Figure 18 above, the process of screening matching feature point groups satisfying the coordinate relationship s5 : s6 = D1 : D2 is arranged after the process of screening matching feature point groups based on similarity calculations over pixels or neighborhood pixel groups, the order of the two processes may be exchanged in the three-dimensional measurement method according to the third embodiment, as discussed above in connection with process S130 of method 100 shown in Fig. 4: first screen matching feature point groups satisfying s5 : s6 = D1 : D2 to obtain candidate matching feature point groups, then apply similarity calculations to these candidates to further screen the matching feature point groups. Because the screening based on the coordinate relationship s5 : s6 = D1 : D2 greatly reduces the number of feature points entering the similarity-based screening, the method so implemented can effectively reduce the computational load of the matching process while, as described above, also improving the accuracy and rate of feature point matching, helping to obtain a higher-density feature point cloud.
Figure 19 shows more arrangements of camera arrays that can be used in the three-dimensional measurement system according to the present invention. As shown, a camera array may include three cameras arranged as an equilateral triangle, and may be further extended to three or more cameras arranged to form multiple equilateral triangles, such as the honeycomb arrangement shown in the lower right corner of Figure 19. In addition, a camera array including three cameras arranged as a right triangle can be further expanded into various other forms, such as a rectangle (including a square), a T shape, a cross, a diagonal line, and extension forms using these shapes as units.
The three-dimensional measurement method according to the present invention may be implemented based on the images of three or more cameras in a camera array. Although the three-dimensional measurement systems and methods of the first, second and third embodiments of the present invention are described separately above, those skilled in the art will appreciate that these embodiments, or features thereof, can be combined to form different technical solutions. For example, in some other embodiments in which the camera array includes first, second, third and fourth cameras, the method according to the present invention may perform the proposed feature point matching based on the fixed disparity ratio relationship separately on a group of images from the first, second and third cameras and on a group of images from the first, third and fourth cameras, and determine the final matching feature point groups in the first and third cameras by combining the matching results of the two groups of images, so as to calculate the spatial positions of the corresponding object points. Such technical solutions still employ the general inventive concept of the present invention, and the scope of this application is intended to cover them.
Next, the three-dimensional measurement system 10 according to the present invention is further described with reference back to Fig. 3. As shown in Fig. 3, in addition to being able to use different camera arrays CA, the system 10 may further include a projection unit 14. The projection unit 14 projects a projection pattern onto the shooting area of the camera array CA, and the pattern can be captured by the cameras in the array. The projection pattern can add more feature points to the shooting area; in some applications it can also make the feature point distribution more uniform, or make up for a lack of features in some regions.
The projection pattern may include points, lines, or combinations thereof. A point, when enlarged, may form a larger spot or a spot with a specific shape; a line, when enlarged, may form a wider ribbon pattern or a pattern with other shape features. Preferably, the projection pattern includes lines whose extension direction is not parallel to the optical center connection directions of at least two of the first camera, the second camera and the third camera, which helps provide more feature points usable in the matching process based on the fixed disparity ratio relationship of the three-dimensional measurement method according to the present invention.
The projection pattern may also be encoded by features such as color, intensity, shape and distribution. For an encoded projection pattern, feature points with identical codes in the images acquired by the cameras are necessarily matching points. This adds a matching dimension and improves the matching rate and matching accuracy.
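For an encoded pattern, matching by code reduces to a dictionary lookup. A minimal sketch, assuming each extracted feature point carries a decoded pattern code and that codes are unique within an image (the (u, v, code) tuple layout is an illustrative assumption):

```python
def match_by_code(points_img1, points_img2):
    """Pair feature points across two images by their decoded pattern code;
    with a unique coding, identical codes imply a match."""
    by_code = {code: (u, v) for (u, v, code) in points_img2}
    return [((u, v), by_code[code]) for (u, v, code) in points_img1 if code in by_code]
```

Pairs produced this way can still be validated against the fixed disparity ratio relationship, combining the two matching dimensions as the text describes.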
The projection unit 14 may be configured to include a light source and an optical element, such as a diffractive optical element or a grating, for forming the projection pattern based on the illumination light from the light source. The illumination light emitted by the light source includes light with wavelengths in the operating wavelength range of the first camera, the second camera and the third camera; it may be monochromatic or polychromatic, and may include visible light and/or non-visible light, such as infrared light. In some applications, the projection unit 14 may be configured with an adjustable projection direction, so as to selectively project the pattern onto different regions according to different shooting scenes. In addition, the projection unit 14 may be configured to realize time-sequenced single-pattern projection or time-sequenced multi-pattern projection, or to project different patterns according to different shooting scenes.
In some embodiments, the three-dimensional measurement system 10 may also include a sensor 15 for detecting at least part of the pattern features projected by the projection unit 14, so as to obtain additional information usable for three-dimensional measurement. For example, in some applications the camera array CA operates at visible and infrared wavelengths, the projection unit 14 projects the projection pattern at infrared wavelengths, and the sensor 15 is an infrared sensor or infrared camera. The information acquired by the camera array CA then includes both the image information formed by visible light and the projection pattern, providing more feature points for three-dimensional measurement based on binocular vision, while the projection pattern obtained by the sensor 15 can be used for measurement based on other three-dimensional measurement techniques, such as measurement based on structured light. In the three-dimensional measurement system 10 according to the present invention, the information obtained by the sensor 15 may, for example, be sent to the correction unit 13, which may use the information from the sensor 15 and the three-dimensional measurement results obtained by other techniques for calibration and/or correction of the camera array or the images it acquires.
The three-dimensional measurement system 10 according to the present invention may be implemented as an integrated system or as a distributed system. For example, the camera array CA of system 10 may be mounted in one device, while the processing unit 11 may be realized on an Internet server and thus physically separated from the camera array CA. The projection unit 14 and the sensor 15 may be installed together with the camera array CA or arranged independently; the situation is similar for the control unit 12. In addition, the correction unit 13 may be realized together with the processing unit 11 by the same processor and associated memory (in this case configured as part of the three-dimensional measurement apparatus 20 represented by the dashed box in Fig. 3), or may be implemented separately; for example, the correction unit 13 may be realized by a processor and associated memory integrated with the camera array CA.
Several application examples of the three-dimensional measurement system and measurement method according to the present invention are described below.
[application examples 1]
In this application example, the three-dimensional measurement system according to the present invention is embodied as a three-dimensional measurement apparatus based on a mobile phone and an external camera module.
The camera module includes three cameras of the same model. The three camera optical centers lie on a straight line, adjacent camera centers are equidistant, and the optical axes of the cameras are parallel to each other, face the same direction, and are all perpendicular to the line on which the optical centers lie.
The camera module is connected to the mobile phone via WiFi and/or a data cable. The phone can control the three cameras of the module to shoot synchronously (photos and/or video) and zoom by equal magnification.
The photos and/or video taken by the camera module are transferred to the phone via WiFi and/or the data cable. The phone corrects the image frames in the photos and/or video using a correction app. The correction process can use, for example, a chessboard with known grid size placed in front of the camera module. Correction based on auxiliary tools such as a chessboard is known in the art and is not detailed here; the correction method in this application example is also not limited to that specific method.
After correction is completed, the scene and object to be modeled are shot with the camera module. A projection module may further be integrated in or attached to the phone for projecting stripes onto the subject, the direction of the stripes being non-parallel to the line connecting the three camera centers. The processing unit integrated in the phone (composed of the phone's processing chip and storage unit) extracts the feature points and feature regions shared by the three camera photos. These feature points include, in addition to the feature points of the subject itself, the new feature points and feature regions formed on the object by the stripes projected by the projection module. Matching feature point groups that simultaneously have the image coordinate symmetry relationship and close attributes (close attributes being judged by similarity calculations over pixels and neighborhood pixel groups) are screened. Multiple depth values are calculated from each matching feature point group, and the average of these depth values is taken. Then the three-dimensional spatial coordinates of the object point corresponding to the feature point group are calculated and merged with its color information to form voxel information. Finally, the true-color point cloud of the whole object or scene, or a three-dimensional model reconstructed from that cloud, is displayed on the phone.
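The averaging of depth values described above can be sketched as follows, using the per-baseline depth relation Z = D × f/(s × dp) implied by the disparity formulas given earlier (the function name and the two-baseline layout are illustrative assumptions; in this application example the collinear, equally spaced cameras give D1 = D2):

```python
def point_depth(u1, u2, u3, D1, D2, f, dp):
    """Average the depths implied by the two abscissa differences
    s5 = u1 - u2 and s6 = u2 - u3, each via Z = D * f / (s * dp)."""
    z1 = D1 * f / ((u1 - u2) * dp)  # depth from the camera-1/camera-2 pair
    z2 = D2 * f / ((u2 - u3) * dp)  # depth from the camera-2/camera-3 pair
    return (z1 + z2) / 2.0
```

Averaging the two estimates slightly reduces the effect of pixel-quantization noise in either disparity.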
In a variation, the phone may transmit the photos and/or video received from the camera module to a cloud server. The server can quickly correct the photos, extract feature points, match corresponding feature point groups with the image coordinate symmetry relationship and close attributes, calculate the three-dimensional coordinates of the object points corresponding to the matching feature point groups, merge in the color information to form voxel information, and send the true-color point cloud result, or a three-dimensional model reconstructed from that cloud, back to the phone for display.
This application example can also be fused with TOF to solve the problem of featureless images (such as a large white wall) or occlusion. Fusing the two point clouds both guarantees that large targets can be sampled and preserves object surface details, forming a more complete, higher-resolution three-dimensional point cloud in spatial sampling. For example, on the phone side, a 3D sampling module fusing "front camera + TOF" can scan faces with pixel-level precision, whereas current TOF alone can only scan a rough surface profile; a 3D module fusing "rear camera + TOF" compensates for the limited projection distance of the TOF system. In addition, the three-dimensional point cloud formed by TOF, after spatial coordinate transformation and projective transformation, yields the initial disparities of the corresponding sampling points on the images, which can accelerate the image matching process.
The basic process of fusing the image three-dimensional measurement system with TOF is as follows: 1. the TOF generates a three-dimensional point cloud; 2. through the conversion between the TOF coordinate system and the reference camera coordinate system, the point cloud coordinates are converted to the three-dimensional coordinates of the corresponding points in the reference camera coordinate system; 3. according to the projection equation of the camera, the three-dimensional coordinates of the corresponding sampling points in the reference camera coordinate system are converted to image two-dimensional coordinates and disparities; 4. from the two-dimensional coordinates and disparities of the corresponding sampling points on the reference camera image, initial positions of the two-dimensional coordinates on the other camera images are found; 5. with the initial position as the center, a search neighborhood is set and the corresponding points are refined on the other camera images, with precision down to the pixel or sub-pixel level; 6. the precise disparity of the matching points is calculated from the refined matching results, and the three-dimensional coordinates of the matching points are derived from it; 7. the images are partitioned according to the initial matching results, the regions bracketed by adjacent corresponding matching points being corresponding regions to be matched; 8. the feature points to be matched on the multiple camera images are matched within the corresponding regions to be matched; 9. the fusion result of the TOF point cloud and the point cloud formed by image feature point matching is output in the reference camera coordinate system.
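Steps 2–4 of this pipeline — transforming a TOF sample into the reference camera frame, projecting it to pixel coordinates, and deriving its initial disparity — can be sketched with a pinhole model as follows (the rigid transform (R, t), focal length f, pixel pitch dp, principal point (cx, cy) and baseline D are assumed inputs; all names are illustrative, not from the disclosure):

```python
import numpy as np

def tof_point_to_image(p_tof, R, t, f, dp, cx, cy, D):
    """Transform a TOF point into the reference camera frame (step 2),
    project it to pixel coordinates (step 3), and compute the initial
    disparity d = D * f / Z in pixels for the neighborhood search (step 4)."""
    X, Y, Z = R @ np.asarray(p_tof) + t     # reference camera coordinates
    u = cx + f * X / (Z * dp)               # pinhole projection, in pixels
    v = cy + f * Y / (Z * dp)
    disparity_px = D * f / (Z * dp)         # seeds the search on other images
    return u, v, disparity_px
```

The returned disparity gives the initial position on the other camera images around which the refined (pixel- or sub-pixel-level) search of step 5 is centered.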
This application example can also be fused with structured light to solve the problem of featureless images (such as a large white wall). Structured light can form high-density features on the object surface and is flexible in use: it can simply add feature points to the object surface (allowing the patterns at different positions to repeat), with the three-dimensional point cloud generated by multi-camera matching, or it can generate coded pattern projections (guaranteeing that the pattern code, or neighborhood pattern code, of each point is unique), with the three-dimensional point cloud calculated by the triangulation principle between the camera and the projection device.
Furthermore, a combined "camera + TOF + structured light" scheme can be formed, producing an ultra-high-density, ultra-high-resolution three-dimensional point cloud usable in application scenes requiring highly realistic three-dimensional virtual objects, such as VR/AR games or films.
[application examples 2]
In this application example, the three-dimensional measurement system according to the present invention is embodied as a device for autonomous vehicle driving.
The camera module is mounted at the front of the vehicle and includes three cameras of the same model. The three camera optical centers lie on a straight line, adjacent camera centers are equidistant, and the optical axes of the cameras are parallel to each other, face the same direction, and are all perpendicular to the line on which the optical centers lie. The camera module is connected to the on-board computer by a data cable. The on-board computer can control the three cameras of the module to shoot synchronously (photos and/or video) and zoom by equal magnification.
The photos and/or video taken by the camera module are transferred to the on-board computer by the data cable. The on-board computer generates a correction matrix from multiple images (photos or video frames) shot at the same time. After correction, the on-board computer extracts the feature points and feature regions shared by the images shot by the three cameras, screens matching feature point groups that simultaneously have the image coordinate symmetry relationship and close attributes, calculates the three-dimensional coordinates of the object points corresponding to the matching feature point groups, merges in the color information, outputs a true-color three-dimensional point cloud, and pushes it to the decision system for autonomous driving.
This application example can also be fused with lidar to solve the problem of featureless images (such as a large white wall) or occlusion. Lidar is suitable for spatial sampling of large masses and can form a spatial point cloud even when the image is indistinguishable. It is, however, limited by its spatial sampling rate: a utility pole tens of meters away, for example, may be missed when its subtended angle is smaller than the angular resolution of the radar's spatial sampling, whereas the image system is more sensitive to the various features of objects (including edge features). Fusing the two point clouds both guarantees that large targets can be sampled and preserves the sampling of object surface details and small objects, forming a more complete, higher-resolution three-dimensional point cloud in spatial sampling. In addition, the three-dimensional point cloud formed by lidar, after spatial transformation, yields the initial disparities of the corresponding sampling points on the images, which can accelerate the image matching process.
The basic process of fusing the image three-dimensional measurement system with lidar is as follows: 1. the lidar generates a three-dimensional point cloud; 2. through the conversion between the lidar coordinate system and the reference camera coordinate system, the lidar point cloud coordinates are converted to the three-dimensional coordinates of the corresponding points in the reference camera coordinate system; 3. according to the projection equation of the camera, the three-dimensional coordinates of the corresponding sampling points in the reference camera coordinate system are converted to image two-dimensional coordinates and disparities; 4. from the two-dimensional coordinates and disparities of the corresponding sampling points on the reference camera image, initial positions of the two-dimensional coordinates on the other camera images are found; 5. with the initial position as the center, a search neighborhood is set and the corresponding points are refined on the other camera images, with precision down to the pixel or sub-pixel level; 6. the precise disparity of the matching points is calculated from the refined matching results, and the three-dimensional coordinates of the matching points are derived from it; 7. the images are partitioned according to the initial matching results, the regions bracketed by adjacent corresponding matching points being corresponding regions to be matched; 8. the feature points to be matched on the multiple camera images are matched within the corresponding regions to be matched; 9. the fusion result of the lidar point cloud and the point cloud formed by image feature point matching is output in the reference camera coordinate system.
[application examples 3]
In this application example, the three-dimensional measurement system according to the present invention is embodied as a device for aerial photography based on an unmanned aerial vehicle (UAV) and an external camera module.
The camera module is mounted on the airborne gimbal of the UAV and includes five cameras of the same model. The five cameras are arranged in a cross shape, with all optical centers in the same plane and adjacent camera centers in the vertical and horizontal directions equidistant; the optical axes of the cameras are parallel to each other, face the same direction, and are all perpendicular to the plane of the optical centers. The camera module is connected to the UAV's airborne computer by a data cable. The airborne computer can control the five cameras of the module to shoot synchronously (photos and/or video) and zoom by equal magnification.
The photos and/or video taken by the camera module are transferred to the airborne computer by the data cable. The airborne computer generates a correction matrix from multiple images (photos or video frames) shot at the same time, and the images are corrected with this matrix. The airborne computer can display in real time the feature points and feature regions shared by the images of the five cameras on the module, extract corresponding matching feature point groups with the image coordinate symmetry relationship and close attributes, calculate the three-dimensional coordinates of the object points corresponding to the matching feature point groups, merge in the color information, and output a true-color three-dimensional point cloud.
The above description covers only preferred embodiments of the application and explains the technical principles applied. Those skilled in the art should understand that the scope of the invention involved in this application is not limited to technical solutions formed by the specific combinations of the above technical features, and is also intended to cover other technical solutions formed by any combination of the above technical features or their equivalent features without departing from the inventive concept, for example technical solutions formed by mutually replacing the above features with (but not limited to) technical features of similar function disclosed herein.

Claims (50)

1. A method for three-dimensional measurement based on a fixed disparity-ratio relationship, comprising:
receiving a first image, a second image and a third image from a first camera, a second camera and a third camera respectively, the first camera, the second camera and the third camera having the same focal length and mutually parallel optical axes, and the optical centers of the first camera, the second camera and the third camera being arranged in the same plane perpendicular to the optical axes;
extracting feature points in the first image, the second image and the third image respectively;
matching the feature points in the first image, the second image and the third image, the matching including screening matched feature point groups based on the following fixed disparity-ratio relationship: for the same object point, a first disparity d1 generated between the first image and the second image in a first direction and a second disparity d2 generated between the second image and the third image in a second direction satisfy d1:d2 = D1:D2, where D1 is the offset in the first direction of the optical center of the first camera relative to the optical center of the second camera, and D2 is the offset in the second direction of the optical center of the second camera relative to the optical center of the third camera, the first direction being a direction parallel to the plane and not perpendicular to the line connecting the optical centers of the first camera and the second camera, and the second direction being a direction parallel to the plane and not perpendicular to the line connecting the optical centers of the second camera and the third camera; and
calculating the three-dimensional coordinates of the object points corresponding to the matched feature point groups.
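The screening rule of claim 1 can be exercised with a minimal pinhole sketch (the focal length, baselines and object point below are assumed values): a genuine triple of projections satisfies d1:d2 = D1:D2 at any depth, while a wrong third point violates it.

```python
# Assumed pinhole setup: shared focal length f (pixels), optical centers on
# the x-axis at 0, D1 and D1 + D2 (metres). Numbers are illustrative only.
f = 800.0
D1, D2 = 0.06, 0.09

def u(cx, point):
    """Abscissa of `point` in a camera whose optical center sits at x = cx."""
    return f * (point[0] - cx) / point[2]

def is_consistent(u1, u2, u3, tol=1e-6):
    """Claim-1 screen: keep the triple only if d1:d2 == D1:D2 (cross-multiplied)."""
    d1, d2 = u1 - u2, u2 - u3
    return abs(d1 * D2 - d2 * D1) <= tol

P = (0.5, 0.1, 4.0)                      # an object point 4 m away
u1, u2, u3 = u(0.0, P), u(D1, P), u(D1 + D2, P)
print(is_consistent(u1, u2, u3))         # True for a genuine correspondence
print(is_consistent(u1, u2, u3 + 2.0))   # False once the third point is off
```

Because the predicate is depth-independent, it filters candidate matches before any triangulation is attempted.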
2. The method for three-dimensional measurement according to claim 1, wherein the first direction is the direction of the line connecting the optical centers of the first camera and the second camera, and the second direction is the direction of the line connecting the optical centers of the second camera and the third camera.
3. The method for three-dimensional measurement according to claim 1, wherein the optical centers of the first camera, the second camera and the third camera are arranged in sequence on a straight line perpendicular to the optical axes, and the first direction and the second direction are both the direction of the straight line; and
the screening of matched feature point groups based on the fixed disparity-ratio relationship includes screening matched feature point groups based on the following coordinate relationship: the difference s1 between the abscissa of the feature point in the first image and the abscissa of the feature point in the second image and the difference s2 between the abscissa of the feature point in the second image and the abscissa of the feature point in the third image satisfy s1:s2 = D1:D2, and the feature points in the first image, the second image and the third image have the same ordinate, the abscissa direction corresponding to the direction of the straight line.
4. The method for three-dimensional measurement according to claim 3, further comprising: performing correction processing on the first image, the second image and the third image so that the points in the first image, the second image and the third image corresponding to the optical axes of the first camera, the second camera and the third camera have the same abscissa and the same ordinate, and the abscissa directions of the first image, the second image and the third image all correspond to the direction of the straight line.
5. The method for three-dimensional measurement according to claim 3, wherein the screening of matched feature point groups based on the coordinate relationship includes:
selecting a first feature point and a second feature point respectively in two of the first image, the second image and the third image, the ordinates of the first feature point and the second feature point lying within a preset range of a target ordinate;
calculating the expected abscissa of a third feature point, in the remaining one of the first image, the second image and the third image, that matches the first feature point and the second feature point, such that the difference s1 and the difference s2 satisfy s1:s2 = D1:D2; and
searching for the third feature point in the remaining image based on the expected position formed by the expected abscissa of the third feature point and the target ordinate.
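Under the collinear geometry of claims 3 to 5, the "expected abscissa" follows in closed form from s1:s2 = D1:D2. A sketch, assuming the first and second feature points come from the first and second images (baselines in arbitrary units, since only their ratio matters):

```python
def expected_x3(x1, x2, D1, D2):
    """Abscissa x3 such that (x1 - x2) : (x2 - x3) == D1 : D2."""
    s1 = x1 - x2
    s2 = s1 * D2 / D1
    return x2 - s2

# Baselines in a 2:3 ratio (arbitrary units, assumed):
print(expected_x3(100.0, 88.0, 2.0, 3.0))   # 70.0
# The claim-9 special case D1:D2 = 1:1 mirrors the point: x3 = 2*x2 - x1.
print(expected_x3(100.0, 88.0, 1.0, 1.0))   # 76.0
```

The search in the third image then only needs to examine a small neighborhood of this predicted position instead of a full epipolar line.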
6. The method for three-dimensional measurement according to claim 5, wherein the first feature point and the second feature point, whose ordinates are the target ordinate, are selected respectively in two of the first image, the second image and the third image.
7. The method for three-dimensional measurement according to claim 5, wherein the searching for the third feature point based on the expected position further includes:
setting a tolerance range, the tolerance range including at least one of an abscissa tolerance range and an ordinate tolerance range; and
searching for the third feature point, in the remaining one of the first image, the second image and the third image, within the region lying inside the tolerance range relative to the expected position.
8. The method for three-dimensional measurement according to claim 5, wherein the screening of matched feature point groups based on the coordinate relationship further includes:
counting, for each of the first image, the second image and the third image, the number of feature points whose ordinates lie within the preset range of the target ordinate; and
determining the remaining one of the first image, the second image and the third image to be the one of the first image, the second image and the third image that has the largest number of feature points with the given ordinate.
9. The method for three-dimensional measurement according to claim 5, wherein D1:D2 = 1:1, and the calculating of the expected abscissa of the third feature point includes: calculating the expected abscissa such that the sum of the abscissa of the feature point in the first image and the abscissa of the feature point in the third image is twice the abscissa of the feature point in the second image.
10. The method for three-dimensional measurement according to claim 2, wherein the optical centers of the first camera and the second camera are arranged on a first straight line perpendicular to the optical axes, the optical centers of the second camera and the third camera are arranged on a second straight line perpendicular to the optical axes and perpendicular to the first straight line, the first direction is the direction of the first straight line, and the second direction is the direction of the second straight line; and
the screening of matched feature point groups based on the fixed disparity-ratio relationship includes screening matched feature point groups based on the following coordinate relationship: the difference s3 between the abscissa of the feature point in the first image and the abscissa of the feature point in the second image and the difference s4 between the ordinate of the feature point in the second image and the ordinate of the feature point in the third image satisfy s3:s4 = D1:D2, the feature points of the first image and the second image have the same ordinate, the feature points of the second image and the third image have the same abscissa, the abscissa direction corresponds to the first direction, and the ordinate direction corresponds to the second direction.
11. The method for three-dimensional measurement according to claim 10, further comprising: performing correction processing on the first image, the second image and the third image, the correction processing making the points in the first image, the second image and the third image corresponding to the optical axes of the first camera, the second camera and the third camera have the same abscissa and the same ordinate, the abscissa directions of the first image, the second image and the third image correspond to the first direction, and the ordinate directions correspond to the second direction.
12. The method for three-dimensional measurement according to claim 10, wherein the screening of matched feature point groups based on the coordinate relationship includes:
selecting a first feature point and a second feature point respectively in the first image and the second image, the ordinates of the first feature point and the second feature point lying within a preset range of a target ordinate;
calculating the difference s3 between the abscissa of the first feature point and the abscissa of the second feature point;
calculating the difference s4 such that s3:s4 = D1:D2;
calculating the expected ordinate of a third feature point, in the third image, that matches the first feature point and the second feature point, such that the difference between the ordinate of the second feature point and the expected ordinate of the third feature point is the second difference s4 calculated above; and
searching for the third feature point in the third image based on the expected position formed by the expected ordinate of the third feature point and the abscissa of the second feature point.
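For the L-shaped geometry of claims 10 to 12, the horizontal disparity measured between the first two images fixes the vertical disparity expected in the third. A sketch with assumed values (baselines again in arbitrary units):

```python
def expected_position_in_image3(p1, p2, D1, D2):
    """p1, p2: (x, y) feature points in images 1 and 2. Returns the expected
    (x, y) of the matching point in image 3: same abscissa as image 2, with
    the ordinate shifted by s4 = s3 * D2 / D1 (the claim-12 relation)."""
    s3 = p1[0] - p2[0]            # horizontal disparity between images 1 and 2
    s4 = s3 * D2 / D1             # implied vertical disparity between 2 and 3
    return (p2[0], p2[1] - s4)

print(expected_position_in_image3((100.0, 40.0), (88.0, 40.0), 2.0, 3.0))
# (88.0, 22.0)
```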
13. The method for three-dimensional measurement according to claim 12, wherein the searching for the third feature point based on the expected position further includes:
setting a tolerance range; and
searching for the third feature point in the third image within the region lying inside the tolerance range relative to the expected position.
14. The method for three-dimensional measurement according to claim 2, wherein the first direction and the second direction are both the direction of the line connecting the optical centers of the first camera and the third camera.
15. The method for three-dimensional measurement according to claim 14, wherein the screening of matched feature point groups based on the fixed disparity-ratio relationship includes:
screening matched feature point groups based on the following coordinate relationship: the difference s5 between the abscissa of the feature point in the first image and the abscissa of the feature point in the second image and the difference s6 between the abscissa of the feature point in the second image and the abscissa of the feature point in the third image satisfy s5:s6 = D1:D2, the feature points in the first image and the third image have the same ordinate, and the abscissa direction corresponds to the direction of the line connecting the optical centers of the first camera and the third camera.
16. The method for three-dimensional measurement according to claim 15, further comprising: performing correction processing on the first image, the second image and the third image so that the points in the first image, the second image and the third image corresponding to the optical axes of the first camera, the second camera and the third camera have the same abscissa and the same ordinate, and the abscissa directions of the first image, the second image and the third image all correspond to the direction of the line connecting the optical centers of the first camera and the third camera.
17. The method for three-dimensional measurement according to claim 15, wherein the screening of matched feature point groups based on the coordinate relationship includes:
selecting a first feature point and a second feature point respectively in the first image and the third image, the ordinate of the second feature point lying within a predetermined range of the ordinate of the first feature point;
calculating the expected abscissa of a third feature point, in the second image, that satisfies the relationship s5:s6 = D1:D2 with the first feature point and the second feature point; and
searching for the third feature point in the second image based on the expected abscissa.
18. The method for three-dimensional measurement according to claim 1, wherein the matching of the feature points in the first image, the second image and the third image further includes: applying a similarity computation to pixels or neighborhood pixel groups of the matched feature point groups screened based on the fixed disparity-ratio relationship, so as to further screen the matched feature point groups.
19. The method for three-dimensional measurement according to claim 18, wherein the similarity computation is applied only to two or more feature point groups that contain the same feature point.
20. The method for three-dimensional measurement according to claim 1, wherein the matching of the feature points in the first image, the second image and the third image further includes: applying a similarity computation to the feature points or their neighborhood pixel groups to screen matched feature point groups; and
performing the screening of matched feature point groups based on the fixed disparity-ratio relationship on the matched feature point groups screened out by the similarity computation.
21. The method for three-dimensional measurement according to any one of claims 18 to 20, wherein the similarity computation includes at least one computation from the group consisting of: the sum of squared pixel gray-level differences, the sum of zero-mean squared pixel gray-level differences, the sum of absolute pixel gray-level differences, the sum of zero-mean absolute pixel gray-level differences, the normalized cross-correlation between neighborhoods, and the zero-mean normalized cross-correlation between neighborhoods.
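The measures enumerated in claim 21 are standard patch-similarity scores. A sketch over small grayscale patches (equal shapes assumed) illustrates why the zero-mean variants tolerate brightness offsets between cameras:

```python
import numpy as np

# Patch-similarity measures of claim 21 (SSD, SAD, their zero-mean forms,
# and zero-mean normalized cross-correlation), for equal-shaped patches.
def ssd(a, b):  return float(np.sum((a - b) ** 2))
def sad(a, b):  return float(np.sum(np.abs(a - b)))
def zssd(a, b): return ssd(a - a.mean(), b - b.mean())
def zsad(a, b): return sad(a - a.mean(), b - b.mean())
def zncc(a, b):
    a0, b0 = a - a.mean(), b - b.mean()
    return float(np.sum(a0 * b0) / (np.linalg.norm(a0) * np.linalg.norm(b0)))

a = np.array([[1.0, 2.0], [3.0, 4.0]])
b = a + 10.0                 # same structure, different brightness
print(ssd(a, b))             # 400.0: raw SSD is fooled by the offset
print(zssd(a, b))            # 0.0:   zero-mean SSD ignores it
print(zncc(a, b))            # ≈ 1.0: perfectly correlated
```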
22. The method for three-dimensional measurement according to claim 1, further comprising: generating a correction matrix based on the first image, the second image and the third image, the correction matrix being for application to images subsequently received from the first camera, the second camera and the third camera, the generating of the correction matrix including:
extracting feature points in the first image, the second image and the third image respectively;
matching the feature points in the first image, the second image and the third image to obtain a plurality of matched feature point groups;
using the coordinates of the feature points of the matched feature point groups in each image, establishing an overdetermined system of equations according to the fixed disparity-ratio relationship that the feature points of the matched feature point groups satisfy after the correction matrix is applied to each image; and
solving the overdetermined system of equations to obtain the correction matrix.
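Claim 22 reduces the calibration residue to a least-squares problem. The sketch below is a deliberately simplified stand-in, not the patent's correction model: it assumes each image only needs a single vertical offset so that matched points share a row, and recovers the offsets from an overdetermined linear system.

```python
import numpy as np

# Simplified stand-in for claim 22 (assumed model: one vertical offset c_i
# per image; the reference image has c_1 = 0).
rng = np.random.default_rng(0)
true_c = np.array([0.0, 1.5, -2.0])        # unknown per-image row offsets
rows = rng.uniform(0.0, 480.0, size=20)    # true common row of each group
y = rows[:, None] - true_c[None, :]        # observed ordinates, per image
y += rng.normal(0.0, 0.001, y.shape)       # small measurement noise

# Per matched group: (y1 + c1) - (y2 + c2) = 0 and (y2 + c2) - (y3 + c3) = 0.
# Fixing c1 = 0 gives an overdetermined system A @ [c2, c3] = b.
A, b = [], []
for y1, y2, y3 in y:
    A.append([-1.0, 0.0]); b.append(y2 - y1)
    A.append([1.0, -1.0]); b.append(y3 - y2)
c23, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
print(np.round(c23, 2))                    # ≈ [ 1.5 -2. ]
```

The patent's actual correction matrix would act on full image coordinates, but the solve-an-overdetermined-system step is structurally the same.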
23. The method for three-dimensional measurement according to claim 1, wherein the calculating of the three-dimensional coordinates of the object point corresponding to each matched feature point group includes: calculating a depth value from each of at least two pairs of feature points in one matched feature point group, and taking the average of the depth values as the depth value of the object point corresponding to that matched feature point group.
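The averaging of claim 23 can be sketched with the standard pinhole depth formula z = f·B/d per camera pair (all numeric values below are assumed):

```python
def depth_from_group(u1, u2, u3, f, D1, D2):
    """Average the per-pair triangulated depths, as in claim 23."""
    z12 = f * D1 / (u1 - u2)      # depth from the camera-1/camera-2 pair
    z23 = f * D2 / (u2 - u3)      # depth from the camera-2/camera-3 pair
    return (z12 + z23) / 2.0

# f = 800 px, baselines 0.06 m and 0.09 m, abscissas of one matched group:
print(depth_from_group(100.0, 88.0, 70.0, 800.0, 0.06, 0.09))  # 4.0
```

Averaging the two independent estimates reduces the effect of sub-pixel localization noise in any single pair.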
24. The method for three-dimensional measurement according to claim 1, further comprising: projecting a pattern onto the shooting area of the cameras with light of a wavelength within the operating wavelength range of the first camera, the second camera and the third camera.
25. The method for three-dimensional measurement according to claim 24, wherein the pattern includes stripes, and the direction of the stripes is not parallel to the line connecting the optical centers of at least two of the first camera, the second camera and the third camera.
26. A three-dimensional measurement device, comprising: a processor; and a memory storing program instructions, wherein the program instructions, when executed by the processor, cause the processor to perform the following operations:
receiving a first image, a second image and a third image;
extracting feature points in the first image, the second image and the third image respectively;
matching the feature points in the first image, the second image and the third image, the matching including screening matched feature point groups based on the following coordinate relationship: the difference between the abscissa of the feature point in the first image and the abscissa of the feature point in the second image is in a predetermined ratio to the difference between the abscissa of the feature point in the second image and the abscissa of the feature point in the third image, and the feature points in the first image and the third image have the same ordinate; and
calculating the three-dimensional coordinates of the object points corresponding to the matched feature point groups.
27. The three-dimensional measurement device according to claim 26, wherein the coordinate relationship further includes: the feature point in the second image has the same ordinate as the feature points in the first image and the third image.
28. The three-dimensional measurement device according to claim 27, wherein the program instructions, when executed, cause the processor to implement the operation of screening matched feature point groups based on the coordinate relationship through the following operations:
selecting a first feature point and a second feature point respectively in two of the first image, the second image and the third image, the ordinates of the first feature point and the second feature point lying within a preset range of a target ordinate;
calculating the expected abscissa of a third feature point, in the remaining one of the first image, the second image and the third image, that matches the first feature point and the second feature point, such that the difference between the abscissa of the feature point in the first image and the abscissa of the feature point in the second image is in the predetermined ratio to the difference between the abscissa of the feature point in the second image and the abscissa of the feature point in the third image; and
searching for the third feature point in the remaining image based on the expected position formed by the expected abscissa of the third feature point and the target ordinate.
29. The three-dimensional measurement device according to claim 28, wherein the first feature point and the second feature point, whose ordinates are the target ordinate, are selected respectively in two of the first image, the second image and the third image.
30. The three-dimensional measurement device according to claim 28, wherein the program instructions, when executed, cause the processor to implement the operation of searching for the third feature point based on the expected position through the following operations:
setting a tolerance range, the tolerance range including at least one of an abscissa tolerance range and an ordinate tolerance range; and
searching for the third feature point, in the remaining one of the first image, the second image and the third image, within the region lying inside the tolerance range relative to the expected position.
31. A three-dimensional measurement device for use with a camera array to perform three-dimensional measurement, the camera array including at least a first camera, a second camera and a third camera, the first camera, the second camera and the third camera having the same focal length and mutually parallel optical axes, and the optical centers of the first camera, the second camera and the third camera being arranged in the same plane perpendicular to the optical axes, the three-dimensional measurement device comprising:
a processing unit that receives a first image, a second image and a third image from the first camera, the second camera and the third camera respectively and is configured to perform the following processing:
extracting feature points in the first image, the second image and the third image respectively;
matching the feature points in the first image, the second image and the third image, the matching including screening matched feature point groups based on the following fixed disparity-ratio relationship: for the same object point, a first disparity d1 generated between the first image and the second image in a first direction and a second disparity d2 generated between the second image and the third image in a second direction satisfy d1:d2 = D1:D2, where D1 is the offset in the first direction of the optical center of the first camera relative to the optical center of the second camera, and D2 is the offset in the second direction of the optical center of the second camera relative to the optical center of the third camera, the first direction being a direction parallel to the plane and not perpendicular to the line connecting the optical centers of the first camera and the second camera, and the second direction being a direction parallel to the plane and not perpendicular to the line connecting the optical centers of the second camera and the third camera; and
calculating the three-dimensional coordinates of the object points corresponding to the matched feature point groups.
32. The three-dimensional measurement device according to claim 31, wherein the optical centers of the first camera, the second camera and the third camera are arranged in sequence on a straight line perpendicular to the optical axes, and the first direction and the second direction are both the direction of the straight line; and
the processing unit is configured to screen matched feature point groups based on the fixed disparity-ratio relationship through the following processing:
screening matched feature point groups based on the following coordinate relationship: the difference s1 between the abscissa of the feature point in the first image and the abscissa of the feature point in the second image and the difference s2 between the abscissa of the feature point in the second image and the abscissa of the feature point in the third image satisfy s1:s2 = D1:D2, and the feature points in the first image, the second image and the third image have the same ordinate, the abscissa direction corresponding to the direction of the straight line.
33. The three-dimensional measurement device according to claim 32, further comprising:
a correction unit that generates a correction matrix based on images from the first camera, the second camera and the third camera and supplies the correction matrix to the processing unit, the correction matrix, when applied by the processing unit to the first image, the second image and the third image, making the points in the first image, the second image and the third image corresponding to the optical axes of the first camera, the second camera and the third camera have the same abscissa and the same ordinate, and making the abscissa directions of the first image, the second image and the third image all correspond to the direction of the straight line.
34. The three-dimensional measurement device according to claim 32, wherein the processing unit is configured to screen matched feature point groups based on the coordinate relationship through the following processing:
selecting a first feature point and a second feature point respectively in two of the first image, the second image and the third image, the ordinates of the first feature point and the second feature point lying within a preset range of a target ordinate;
calculating the expected abscissa of a third feature point, in the remaining one of the first image, the second image and the third image, that matches the first feature point and the second feature point, such that the difference s1 and the difference s2 satisfy s1:s2 = D1:D2; and
searching for the third feature point in the remaining image based on the expected position formed by the expected abscissa of the third feature point and the target ordinate.
35. The three-dimensional measurement device according to claim 34, wherein the processing unit is configured to search for the third feature point based on the expected position through the following processing:
setting a tolerance range, the tolerance range including at least one of an abscissa tolerance range and an ordinate tolerance range; and
searching for the third feature point, in the remaining one of the first image, the second image and the third image, within the region lying inside the tolerance range relative to the expected position.
36. The three-dimensional measurement device according to claim 31, wherein the optical centers of the first camera and the second camera are arranged on a first straight line perpendicular to the optical axes, the optical centers of the second camera and the third camera are arranged on a second straight line perpendicular to the optical axes and perpendicular to the first straight line, the first direction is the direction of the first straight line, and the second direction is the direction of the second straight line; and
the processing unit is configured to screen matched feature point groups based on the fixed disparity-ratio relationship through the following processing:
screening matched feature point groups based on the following coordinate relationship: the difference s3 between the abscissa of the feature point in the first image and the abscissa of the feature point in the second image and the difference s4 between the ordinate of the feature point in the second image and the ordinate of the feature point in the third image satisfy s3:s4 = D1:D2, the feature points of the first image and the second image have the same ordinate, the feature points of the second image and the third image have the same abscissa, the abscissa direction corresponds to the first direction, and the ordinate direction corresponds to the second direction.
37. The three-dimensional measurement device according to claim 36, further comprising:
a correction unit that generates a correction matrix based on images from the first camera, the second camera and the third camera and supplies the correction matrix to the processing unit, the correction matrix, when applied by the processing unit to the first image, the second image and the third image, making the points in the first image, the second image and the third image corresponding to the optical axes of the first camera, the second camera and the third camera have the same abscissa and the same ordinate, and making the abscissa directions of the first image, the second image and the third image correspond to the first direction and the ordinate directions correspond to the second direction.
38. The three-dimensional measurement device according to claim 36, wherein the processing unit is configured to screen matched feature point groups based on the coordinate relationship through the following processing:
selecting a first feature point and a second feature point respectively in the first image and the second image, the ordinates of the first feature point and the second feature point lying within a preset range of a target ordinate;
calculating the difference s3 between the abscissa of the first feature point and the abscissa of the second feature point;
calculating the difference s4 such that s3:s4 = D1:D2;
calculating the expected ordinate of a third feature point, in the third image, that matches the first feature point and the second feature point, such that the difference between the ordinate of the second feature point and the expected ordinate of the third feature point is the second difference s4 calculated above; and
searching for the third feature point in the third image based on the expected position formed by the expected ordinate of the third feature point and the abscissa of the second feature point.
39. The three-dimensional measurement device according to claim 38, wherein the processing unit is configured to search for the third feature point based on the expected position through the following processing:
setting a tolerance range, the tolerance range including at least one of an abscissa tolerance range and an ordinate tolerance range; and
searching for the third feature point in the third image within the region lying inside the tolerance range relative to the expected position.
40. The three-dimensional measurement device according to claim 31, wherein the first direction and the second direction are both the direction of the line connecting the optical centers of the first camera and the third camera; and
the processing unit is configured to screen matched feature point groups based on the fixed disparity-ratio relationship through the following processing:
screening matched feature point groups based on the following coordinate relationship: the difference s5 between the abscissa of the feature point in the first image and the abscissa of the feature point in the second image and the difference s6 between the abscissa of the feature point in the second image and the abscissa of the feature point in the third image satisfy s5:s6 = D1:D2, the feature points in the first image and the third image have the same ordinate, and the abscissa direction corresponds to the direction of the line connecting the optical centers of the first camera and the third camera.
41. The three-dimensional measurement device according to claim 40, further comprising:
a correction unit that generates a correction matrix based on images from the first camera, the second camera and the third camera and supplies the correction matrix to the processing unit, the correction matrix, when applied by the processing unit to the first image, the second image and the third image, making the points in the first image, the second image and the third image corresponding to the optical axes of the first camera, the second camera and the third camera have the same abscissa and the same ordinate, and making the abscissa directions of the first image, the second image and the third image all correspond to the direction of the line connecting the optical centers of the first camera and the third camera.
42. The three-dimensional measurement device according to claim 36, wherein the processing unit is configured to screen matched feature point groups based on the coordinate relationship through the following processing:
selecting a first feature point and a second feature point respectively in the first image and the third image, the ordinate of the second feature point lying within a predetermined range of the ordinate of the first feature point;
calculating the expected abscissa of a third feature point, in the second image, that satisfies the relationship s5:s6 = D1:D2 with the first feature point and the second feature point; and
searching for the third feature point in the second image based on the expected abscissa.
43. The three-dimensional measurement device according to claim 31, wherein the processing unit is further configured to: apply a similarity computation to pixels or neighborhood pixel groups of the matched feature point groups screened based on the fixed disparity-ratio relationship, so as to further screen the matched feature point groups.
44. The three-dimensional measurement device according to claim 31, wherein the processing unit is further configured to: apply a similarity computation to the feature points or their neighborhood pixel groups to screen matched feature point groups; and perform the screening of matched feature point groups based on the fixed disparity-ratio relationship on the matched feature point groups screened out by the similarity computation.
45. The three-dimensional measurement apparatus of claim 33, 37 or 41, wherein the correction unit is configured to generate the correction matrix by:
extracting feature points in the first image, the second image and the third image respectively;
matching the feature points in the first image, the second image and the third image to obtain a plurality of matched feature point groups;
establishing an overdetermined system of equations using the coordinates of the feature points of the matched feature point groups in each image, according to the coordinate relationship that the feature points in a matched feature point group satisfy after the correction matrix is applied to each image; and
solving the overdetermined system of equations to obtain the correction matrix.
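The patent does not disclose the form of the correction matrix or the coordinate relationship used, but claim 45's overdetermined-system pattern can be illustrated in simplified one-dimensional form: fit a row correction y' = a*y + b by least squares so that corrected ordinates in one image match those in another, with one equation per matched feature point (the linear model and all names are assumptions):

```python
def fit_row_correction(y_src, y_ref):
    """Least-squares fit of (a, b) in a * y_src + b ~= y_ref: an
    overdetermined linear system with one equation per matched point,
    solved via the closed-form normal equations for two unknowns."""
    n = len(y_src)
    sx, sy = sum(y_src), sum(y_ref)
    sxx = sum(y * y for y in y_src)
    sxy = sum(u * v for u, v in zip(y_src, y_ref))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b
```

With ordinates y_src = [1, 2, 3, 4] and y_ref = [3, 5, 7, 9], the fit returns a = 2, b = 1; four equations over-determine the two unknowns, so with noisy real matches the residual is minimized rather than driven to zero.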
46. A three-dimensional measurement system based on a fixed disparity ratio relationship, comprising:
a camera array including at least a first camera, a second camera and a third camera, the first camera, the second camera and the third camera having the same focal length and mutually parallel optical axes, the optical centers of the first camera, the second camera and the third camera being arranged on a same plane perpendicular to the optical axes; and
the three-dimensional measurement apparatus of any one of claims 26-45.
47. The three-dimensional measurement system of claim 46, further comprising: a control unit communicatively connected with the camera array and configured to control the first camera, the second camera and the third camera to acquire images synchronously.
48. The three-dimensional measurement system of claim 47, wherein the control unit is further configured to control the first camera, the second camera and the third camera to zoom at equal magnification.
49. The three-dimensional measurement system of claim 48, further comprising: a projection unit including a light source and an optical element for forming a projection pattern from illumination light of the light source, the illumination light including light with a wavelength within the operating wavelength range of the first camera, the second camera and the third camera, the projection unit projecting the projection pattern onto the shooting region of the camera array.
50. The three-dimensional measurement system of claim 49, wherein the projection pattern includes stripes, and the direction of the stripes is not parallel to the line connecting the optical centers of at least two of the first camera, the second camera and the third camera.
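Claim 50's geometric condition, that the stripe direction be non-parallel to the optical-center connection line of at least two cameras, reduces to a 2D cross-product test. The helper below is a hypothetical illustration, not part of the claimed system:

```python
import math


def not_parallel(u, v, eps=1e-6):
    """True if 2D directions u and v are not parallel: the cross-product
    magnitude, normalized by the vector lengths, exceeds eps."""
    cross = u[0] * v[1] - u[1] * v[0]
    norm = math.hypot(u[0], u[1]) * math.hypot(v[0], v[1])
    return abs(cross) > eps * norm


def stripes_valid(stripe_dir, baselines, eps=1e-6):
    """Check claim 50's condition: the stripe direction is non-parallel
    to at least one optical-center connection line (baseline)."""
    return any(not_parallel(stripe_dir, b, eps) for b in baselines)
```

Intuitively, stripes parallel to a baseline give no disparity variation along the epipolar direction for that camera pair, so at least one pair must see the stripes at an angle.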
CN201711164746.6A 2017-11-21 2017-11-21 Method, device and system for three-dimensional measurement Active CN109813251B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201711164746.6A CN109813251B (en) 2017-11-21 2017-11-21 Method, device and system for three-dimensional measurement
PCT/CN2018/114016 WO2019100933A1 (en) 2017-11-21 2018-11-05 Method, device and system for three-dimensional measurement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711164746.6A CN109813251B (en) 2017-11-21 2017-11-21 Method, device and system for three-dimensional measurement

Publications (2)

Publication Number Publication Date
CN109813251A true CN109813251A (en) 2019-05-28
CN109813251B CN109813251B (en) 2021-10-01

Family

ID=66599669

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711164746.6A Active CN109813251B (en) 2017-11-21 2017-11-21 Method, device and system for three-dimensional measurement

Country Status (2)

Country Link
CN (1) CN109813251B (en)
WO (1) WO2019100933A1 (en)


Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110706334B (en) * 2019-09-26 2023-05-09 华南理工大学 Three-dimensional reconstruction method for industrial part based on three-dimensional vision
CN111028341B (en) * 2019-12-12 2020-08-04 天目爱视(北京)科技有限公司 Three-dimensional model generation method
CN113358020A (en) * 2020-03-05 2021-09-07 青岛海尔工业智能研究院有限公司 Machine vision detection system and method
CN111368745A (en) * 2020-03-06 2020-07-03 上海眼控科技股份有限公司 Frame number image generation method and device, computer equipment and storage medium
CN111612728B (en) * 2020-05-25 2023-07-25 北京交通大学 3D point cloud densification method and device based on binocular RGB image
CN112183436B (en) * 2020-10-12 2023-11-07 南京工程学院 Expressway visibility detection method based on pixel point eight-neighborhood gray scale comparison
CN112381874B (en) * 2020-11-04 2023-12-12 北京大华旺达科技有限公司 Calibration method and device based on machine vision
CN113310420B (en) * 2021-04-22 2023-04-07 中国工程物理研究院上海激光等离子体研究所 Method for measuring distance between two targets through image
CN113487679B (en) * 2021-06-29 2023-01-03 哈尔滨工程大学 Visual ranging signal processing method for automatic focusing system of laser marking machine
CN114087991A (en) * 2021-11-28 2022-02-25 中国船舶重工集团公司第七一三研究所 Underwater target measuring device and method based on line structured light
CN115082621B (en) * 2022-06-21 2023-01-31 中国科学院半导体研究所 Three-dimensional imaging method, device and system, electronic equipment and storage medium
CN116503570B (en) * 2023-06-29 2023-11-24 聚时科技(深圳)有限公司 Three-dimensional reconstruction method and related device for image
CN117611752B (en) * 2024-01-22 2024-04-02 卓世未来(成都)科技有限公司 Method and system for generating 3D model of digital person

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002032745A (en) * 2000-06-07 2002-01-31 Nec Corp Method for restoring three-dimensional scene structure and movement of camera directly from dot, line, and/or image intensity
CN103218799A (en) * 2012-01-18 2013-07-24 三星电子株式会社 Method and apparatus for camera tracking
CN103292710A (en) * 2013-05-27 2013-09-11 华南理工大学 Distance measuring method applying binocular visual parallax error distance-measuring principle
US20140168370A1 (en) * 2012-12-14 2014-06-19 Faro Technologies, Inc. Device for optically scanning and measuring an environment
CN105659592A (en) * 2014-09-22 2016-06-08 三星电子株式会社 Camera system for three-dimensional video
CN106813595A (en) * 2017-03-20 2017-06-09 北京清影机器视觉技术有限公司 Three-phase unit characteristic point matching method, measuring method and three-dimensional detection device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002063240A1 (en) * 2001-02-02 2002-08-15 Snap-On Technologies Inc Method and apparatus for mapping system calibration
US8090195B2 (en) * 2008-02-12 2012-01-03 Panasonic Corporation Compound eye imaging apparatus, distance measuring apparatus, disparity calculation method, and distance measuring method
JP5471355B2 (en) * 2009-11-24 2014-04-16 オムロン株式会社 3D visual sensor
CN104101293A (en) * 2013-04-07 2014-10-15 鸿富锦精密工业(深圳)有限公司 Measurement machine station coordinate system unification system and method
CN104897065A (en) * 2015-06-09 2015-09-09 河海大学 Measurement system for surface displacement field of shell structure


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110440712A (en) * 2019-08-26 2019-11-12 英特维科技(苏州)有限公司 Adaptive big depth of field 3-D scanning method and system
CN111457859A (en) * 2020-03-06 2020-07-28 深圳奥比中光科技有限公司 Alignment calibration method and system for 3D measuring device and computer readable storage medium
CN111457859B (en) * 2020-03-06 2022-12-09 奥比中光科技集团股份有限公司 Alignment calibration method and system for 3D measuring device and computer readable storage medium
CN112033352A (en) * 2020-09-01 2020-12-04 珠海市一微半导体有限公司 Robot with multiple cameras for ranging and visual ranging method
CN112129262A (en) * 2020-09-01 2020-12-25 珠海市一微半导体有限公司 Visual ranging method and visual navigation chip of multi-camera group
CN112033352B (en) * 2020-09-01 2023-11-07 珠海一微半导体股份有限公司 Multi-camera ranging robot and visual ranging method
CN113503830A (en) * 2021-07-05 2021-10-15 无锡维度投资管理合伙企业(有限合伙) Aspheric surface shape measuring method based on multiple cameras
CN113503830B (en) * 2021-07-05 2023-01-03 无锡维度投资管理合伙企业(有限合伙) Aspheric surface shape measuring method based on multiple cameras
CN115317747A (en) * 2022-07-28 2022-11-11 北京大学第三医院(北京大学第三临床医学院) Automatic trachea cannula navigation method and computer equipment
CN116524160A (en) * 2023-07-04 2023-08-01 应急管理部天津消防研究所 Product consistency auxiliary verification system and method based on AR identification
CN116524160B (en) * 2023-07-04 2023-09-01 应急管理部天津消防研究所 Product consistency auxiliary verification system and method based on AR identification

Also Published As

Publication number Publication date
WO2019100933A1 (en) 2019-05-31
CN109813251B (en) 2021-10-01

Similar Documents

Publication Publication Date Title
CN109813251A (en) Method, apparatus and system for three-dimensional measurement
CN110285793B (en) Intelligent vehicle track measuring method based on binocular stereo vision system
US11145077B2 (en) Device and method for obtaining depth information from a scene
EP3516626B1 (en) Device and method for obtaining distance information from views
EP3427227B1 (en) Methods and computer program products for calibrating stereo imaging systems by using a planar mirror
CN110517216B (en) SLAM fusion method and system based on multiple types of cameras
CN103649674B (en) Measuring equipment and messaging device
US11521311B1 (en) Collaborative disparity decomposition
KR100513055B1 (en) 3D scene model generation apparatus and method through the fusion of disparity map and depth map
CN110390719A Point cloud reconstruction apparatus based on time of flight
CN110514143A Fringe projection system calibration method based on a reflecting mirror
CN110009672A Method for improving ToF depth image processing, 3D imaging method and electronic device
CN106033614B Moving object detection method for a mobile camera under strong parallax
CN109859272A Auto-focusing binocular camera calibration method and device
CN110838164B (en) Monocular image three-dimensional reconstruction method, system and device based on object point depth
TW201004818A (en) The asynchronous photography automobile-detecting apparatus
WO2018032841A1 (en) Method, device and system for drawing three-dimensional image
CN106767526A Color multi-line 3D laser measurement method based on laser MEMS galvanometer projection
Krutikova et al. Creation of a depth map from stereo images of faces for 3D model reconstruction
CN110378995A Method for three-dimensional space modeling using projected features
Murray et al. Patchlets: Representing stereo vision data with surface elements
Neumann et al. A hierarchy of cameras for 3D photography
Ahmed et al. A system for 3D video acquisition and spatio-temporally coherent 3D animation reconstruction using multiple RGB-D cameras
Thomas et al. 3D image sequence acquisition for TV & film production
CN110827230A (en) Method and device for improving RGB image quality by TOF

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221109

Address after: 100094 Room 701-2, Floor 7, Building 1, Yard 1, No. 81, Beiqing Road, Haidian District, Beijing

Patentee after: Beijing Yilian Technology Co.,Ltd.

Address before: Room 1021, Mingde building, School of business, Renmin University of China, 59 Zhongguancun Street, Haidian District, Beijing 100872

Patentee before: Jiang Jing