WO2018006246A1 - Four-camera-group planar array feature point matching method and measurement method based thereon - Google Patents

Four-camera-group planar array feature point matching method and measurement method based thereon

Info

Publication number
WO2018006246A1
Authority
WO
WIPO (PCT)
Prior art keywords
matching
point
image plane
image
points
Prior art date
Application number
PCT/CN2016/088420
Other languages
English (en)
French (fr)
Inventor
曹亮
Original Assignee
曹亮
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 曹亮
Priority to US15/310,066 (US10107617B2)
Priority to JP2016566782A (JP6453908B2)
Priority to PCT/CN2016/088420 (WO2018006246A1)
Priority to EP16790253.5A (EP3285232B1)
Priority to KR1020167033144A (KR101926953B1)
Priority to CN201680000645.2A (CN107850419B)
Publication of WO2018006246A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/002: Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106: Processing image signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/189: Recording image signals; Reproducing recorded image signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20: Image signal generators
    • H04N 13/204: Image signal generators using stereoscopic image cameras
    • H04N 13/243: Image signal generators using stereoscopic image cameras using three or more 2D image sensors

Definitions

  • The present invention relates to the field of optical and electronic measurement technology, and in particular to a four-camera-group planar array feature point matching method and a measurement method based on it.
  • Existing three-dimensional stereo vision measurement generally adopts line-laser light-section measurement technology, or binocular measurement combined with structured-light illumination.
  • The main reason that line lasers or structured light are widely used in three-dimensional measurement is that the laser or structured-light indication defines the corresponding matching points in the images, which reduces matching ambiguity and achieves a definite, unique match.
  • Binocular matching alone cannot avoid the ambiguity of multi-point matching and therefore cannot meet the measurement requirements.
  • However, measurement can be performed only on the part of the object where the line laser or structured light is imaged, which limits the applicable scope of the technique, and the use of line lasers or structured light can adversely affect the object being measured, for example a person.
  • Current binocular matching also often attaches marker points to the surface of the object, likewise to improve matching accuracy.
  • Attaching marker points to the surface of the object requires manual treatment of, and intervention on, the object to be measured in advance.
  • An object of the embodiments of the present invention is to provide a four-camera-group planar array feature point matching method and a measurement method based on it, which reduce the complexity of the matching and measurement methods, simplify the spatial size calculation, and reduce system measurement error.
  • An embodiment of the present invention provides a four-camera-group planar array feature point matching method that includes the following steps:
  • A1. Take any one of the four image planes corresponding to the four cameras of the four-camera group as the base image plane and, for a feature point on the base image plane, find all matching points on the image plane laterally adjacent to the base image plane that match the feature point;
  • A5. From the feature point on the base image plane, all the sub-matching point groups found in step A3, and the matching points found in step A4, determine the unique matching point group on the four image planes corresponding to the same viewed point.
  • The embodiment of the present invention provides a first possible implementation manner of the foregoing first aspect, wherein, in step A1, all matching points that match the feature point on the base image plane are found, on the image plane laterally adjacent to the base image plane, according to matching condition 1). Matching condition 1) is that the imaging points of one viewed point on two laterally adjacent image planes satisfy: the imaging point on the left image plane and the corresponding imaging point on the right image plane lie on the same line parallel to the horizontal coordinate axis, and the horizontal offset of the imaging point on the left image plane relative to the coordinate origin of the left image plane is greater than the horizontal offset of the imaging point on the right image plane relative to the coordinate origin of the right image plane.
  • The embodiment of the present invention provides a second possible implementation manner of the foregoing first aspect, wherein, in step A2, for the feature point on the base image plane in step A1, all matching points that match the feature point are found, on the image plane longitudinally adjacent to the base image plane, according to matching condition 2). Matching condition 2) is that the imaging points of one viewed point on two longitudinally adjacent image planes satisfy: the imaging point on the upper image plane and the corresponding imaging point on the lower image plane lie on the same line parallel to the longitudinal coordinate axis, and the vertical offset of the imaging point on the upper image plane relative to the coordinate origin of the upper image plane is greater than the vertical offset of the imaging point on the lower image plane relative to the coordinate origin of the lower image plane.
  • The embodiment of the present invention provides a third possible implementation manner of the foregoing first aspect, wherein, in step A3, all matching points found in step A1 and all matching points found in step A2 are re-matched according to matching condition 3); matching points that do not satisfy matching condition 3) are excluded, and matching points that satisfy matching condition 3) are paired by the size relationship of their lateral or vertical offset values to form sub-matching point groups. Matching condition 3) is that the matching point group of one viewed point on the four image planes of a group of four cameras satisfies: the ratio between the lateral offset value and the longitudinal offset value is equal to the ratio of the length to the width of the reference rectangle. The lateral offset value is the difference between the horizontal offset of the viewed point's imaging point on the left image plane relative to the coordinate origin of the left image plane and the horizontal offset of its imaging point on the corresponding right image plane relative to the coordinate origin of the right image plane; the longitudinal offset value is the difference between the vertical offset of the imaging point on the upper image plane relative to the coordinate origin of the upper image plane and the vertical offset of the corresponding imaging point on the lower image plane relative to the coordinate origin of the lower image plane. The reference rectangle is the rectangle formed by the four focal points of a group of four cameras.
  • The embodiment of the present invention provides a fourth possible implementation manner of the foregoing first aspect, wherein, in step A4, according to the feature point on the base image plane and all the sub-matching point groups found in step A3, matching points on the diagonal-position image plane are found, in line with matching condition 1) and matching condition 2), for each sub-matching point group. From the coordinates of a sub-matching point group found in step A3, the coordinates of the corresponding matching point on the diagonal-position image plane are obtained: its abscissa is equal to the abscissa of the matching point on the image plane longitudinally adjacent to the diagonal-position image plane, and its ordinate is equal to the ordinate of the matching point on the image plane laterally adjacent to the diagonal-position image plane. After the coordinates are determined, image-feature similarity matching is performed between the matching point on the diagonal-position image plane and the matching points on the other three image planes. If the matching succeeds, the four imaging points on the four image planes form a matching point group; otherwise the match fails, and the sub-matching point group found in step A3 that corresponds to the matching point on the diagonal-position image plane is excluded.
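As a hedged illustration of the coordinate rule in this step (the point names and numeric values are ours, not the patent's), with plane a as the base image plane and plane d as the diagonal-position image plane, the candidate's abscissa comes from the matching point on plane b (longitudinally adjacent to d) and its ordinate from the matching point on plane c (laterally adjacent to d):

```python
def diagonal_candidate(p_b, p_c):
    """Candidate matching point on the diagonal-position image plane d:
    abscissa taken from the matching point on the b image plane (longitudinally
    adjacent to d), ordinate taken from the matching point on the c image plane
    (laterally adjacent to d)."""
    bx, _by = p_b
    _cx, cy = p_c
    return (bx, cy)

# Illustrative sub-matching points for one candidate group:
p_b = (3.12, 2.46)  # matching point on the b image plane
p_c = (3.52, 2.26)  # matching point on the c image plane
print(diagonal_candidate(p_b, p_c))  # (3.12, 2.26)
```

The candidate is then confirmed or rejected by image-feature similarity matching against the other three points, as the step describes.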
  • The embodiment of the present invention provides a fifth possible implementation manner of the foregoing first aspect, wherein, in step A5, the unique matching point group corresponding to the same viewed point on the four image planes satisfies matching condition 4). Matching condition 4) is that the imaging points of one viewed point on the four image planes of a group of four cameras form a rectangle; the ratio of the lateral length to the longitudinal length of this rectangle is equal to the ratio of the length to the width of the reference rectangle, and the two imaging points on the two diagonally positioned image planes lie respectively on two straight lines parallel to the two diagonals of the reference rectangle.
  • The embodiment of the present invention provides a sixth possible implementation manner of the foregoing first aspect, wherein, in step A5, if only one matching point group satisfies matching condition 4), the matching result is unique.
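A minimal sketch of checking matching condition 4), assuming a common axis convention for the four image planes and an added numeric tolerance (both are our assumptions, not stated in the claims); m and n are the reference rectangle's length and width:

```python
def satisfies_cond4(p_a, p_b, p_c, p_d, m, n, tol=1e-6):
    """Matching condition 4) sketch: overlaid on a common axis convention, the
    four imaging points of one viewed point form a rectangle whose ratio of
    lateral to longitudinal side lengths equals m/n, the length-to-width ratio
    of the reference rectangle."""
    dx = p_a[0] - p_b[0]          # lateral offset value
    dy = p_a[1] - p_c[1]          # longitudinal offset value
    rectangle = (abs(p_a[1] - p_b[1]) <= tol and abs(p_c[1] - p_d[1]) <= tol and
                 abs(p_a[0] - p_c[0]) <= tol and abs(p_b[0] - p_d[0]) <= tol)
    return rectangle and dx > 0 and dy > 0 and abs(dx * n - dy * m) <= tol

# A consistent group for m:n = 2:1 (offsets 0.4 and 0.2):
print(satisfies_cond4((3.52, 2.46), (3.12, 2.46),
                      (3.52, 2.26), (3.12, 2.26), m=100, n=50))  # True
```

Equal side ratios imply diagonals parallel to those of the reference rectangle, so the diagonal clause of the condition is covered by the ratio test.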
  • The embodiment of the present invention provides a seventh possible implementation manner of the foregoing first aspect, wherein a base rectangle is constructed that has the feature point as an end point. The extension of the diagonal of the base rectangle that has the feature point as an end point passes through the diagonal-position image plane, and the length of that diagonal is equal to the distance, on the diagonal-position image plane, between the two matching points of any two matching point groups corresponding to the feature point. The base rectangle is similar to the reference rectangle, and the other end points of the base rectangle have image features similar to those of the feature point and are matching points of the feature point on the base image plane.
  • The method for determining the unique matching point of the feature point on the diagonal-position image plane is as follows: since the feature point on the base image plane is an end point of a diagonal of the base rectangle, the unique matching point on the diagonal-position image plane is one of the two end points on the diagonal of the unique corresponding rectangle on the diagonal-position image plane through which the extension of the feature point's diagonal passes. If the feature point is the upper end point of the base rectangle, the corresponding matching point on the diagonal-position image plane is the upper end point of the unique corresponding rectangle; otherwise it is the lower end point. For a non-unique matching point on the diagonal-position image plane, it can be determined that its unique matching point on the base image plane is the feature point on the base image plane.
  • The two groups of matching point groups corresponding to the feature point correspond to two different viewed points, and the two different viewed points are spatially located on the pixel projection line of the feature point relative to the base image plane and its extension.
  • The multiple groups of matching point groups correspond to multiple different viewed points, and the multiple different viewed points are spatially located on the pixel projection line of the feature point relative to the base image plane and its extension.
  • An embodiment of the present invention provides an eighth possible implementation manner of the foregoing first aspect, wherein the feature point refers to an imaging point corresponding to one or more matching points, the imaging point having image features different from those of other imaging points.
  • The present invention further provides a measurement method based on the four-camera-group planar array feature point matching method, comprising the following steps:
  • b3. According to the spatial position coordinates of the viewed points obtained in step b2, form three-dimensional point cloud data and create a three-dimensional point cloud graphic to reproduce the three-dimensional stereoscopic image.
  • The embodiment of the present invention provides a first possible implementation manner of the foregoing second aspect, wherein, in step b2, the focal points of the a camera in the upper-left position, the b camera in the upper-right position, the c camera in the lower-left position and the d camera in the lower-right position of the group of four cameras are O_a, O_b, O_c and O_d respectively. The four focal points lie on the same plane and form a rectangle whose length O_aO_b is m and whose width O_aO_c is n. The center point of the rectangle is set to O, and a three-dimensional Cartesian coordinate system is built with O as the origin, where the X axis is parallel to the O_aO_b and O_cO_d sides of the rectangle and the Y axis is parallel to the O_aO_c and O_bO_d sides of the rectangle.
  • The Z axis is perpendicular to the plane of the focal points and parallel to the optical-axis directions of the four cameras.
  • The configurations of the four cameras are identical. For a viewed point P of the object, the spatial position coordinates are P(P_x, P_y, P_z), and the image coordinates of its imaging points on the image planes corresponding to the a, b, c and d cameras are P_a(P_ax, P_ay), P_b(P_bx, P_by), P_c(P_cx, P_cy) and P_d(P_dx, P_dy) respectively. The expression for the spatial position coordinates of the point P is given in terms of the following quantities:
  • f is the focal length of the four cameras;
  • u is the target-surface length of the image sensor;
  • v is the target-surface width of the image sensor;
  • Δx is the lateral-matching disparity, i.e. the offset of the imaged point on the b image plane relative to the imaged point on the a image plane in the horizontal direction (Δx = P_ax − P_bx);
  • Δy is the longitudinal-matching disparity, i.e. the offset of the imaged point on the c image plane relative to the imaged point on the a image plane in the vertical direction (Δy = P_ay − P_cy).
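The coordinate expression for P itself is not reproduced in this extract. The following reconstruction assumes a standard frontal pinhole model consistent with the definitions above (image origins at the upper-left corner, image X rightward, image Y downward, world Y upward); it is a sketch under those assumptions, not the patent's verbatim formula:

```python
def triangulate(P_a, P_b, P_c, m, n, f, u, v):
    """Reconstruct the spatial coordinates of a viewed point P from its
    imaging points P_a, P_b, P_c on the a, b, c image planes. dx and dy are
    the lateral and longitudinal matching disparities defined in the text."""
    P_ax, P_ay = P_a
    P_bx, _ = P_b
    _, P_cy = P_c
    dx = P_ax - P_bx              # lateral disparity, > 0 by condition 1)
    dy = P_ay - P_cy              # longitudinal disparity, > 0 by condition 2)
    P_z = f * m / dx              # equivalently f * n / dy, by condition 3)
    P_x = (P_ax + P_bx - u) * P_z / (2 * f)
    P_y = (v - P_ay - P_cy) * P_z / (2 * f)
    return (P_x, P_y, P_z)

# Round-trip check with illustrative rig values m=100, n=50, f=8, u=6.4, v=4.8:
result = triangulate((3.52, 2.46), (3.12, 2.46), (3.52, 2.26),
                     m=100, n=50, f=8, u=6.4, v=4.8)
print(tuple(round(c, 6) for c in result))  # (30.0, 10.0, 2000.0)
```

Note that Δx/Δy = m/n falls out of this model, which is exactly matching condition 3).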
  • The four-camera-group planar array feature point matching method can, according to the positions of the imaging points of a viewed point of the measured object on the four image planes of a group of four cameras, quickly match the unique group of imaging points corresponding to that viewed point in the four image planes, achieving versatile and unique matching of all viewed points that can be imaged on all four cameras.
  • The same measurement method is adopted for any object: three-dimensional measurement of the object can be realized without any calibration of the field of view, and the measurement accuracy and resolution are related only to the measurement system and are independent of the object, so fully autonomous measurement can be realized.
  • Figure 1 is a schematic diagram of a spatial coordinate system established based on a group of four cameras;
  • FIG. 2 is a schematic diagram of the projection of a viewed point of an object and its imaged points on the image planes of two laterally adjacent cameras of a group of four cameras;
  • FIG. 3 is a schematic view showing the projection of the viewpoint and the corresponding imaging point of FIG. 2 on the OXZ coordinate plane;
  • FIG. 4 is a perspective view showing an ambiguity point when laterally matching two laterally adjacent image planes of a group of four camera sets;
  • Figure 5 is a perspective view of an ideal straight line imaged in four image planes of a group of four camera sets
  • Figure 6 is a plan view of the imaged points on the four image planes of Figure 5;
  • Figure 7 is a schematic diagram showing the projection of uniqueness of matching
  • Figure 8 is an exemplary perspective view showing the uniqueness of matching
  • Figure 9 is a plan view of the imaged points on the four image planes of Figure 8.
  • Figure 10 is another exemplary perspective view showing the uniqueness of matching
  • Figure 11 is a plan view of the imaged points on the four image planes of Figure 10;
  • Figure 12 is still another exemplary plan view showing the uniqueness of matching
  • FIG. 13 is a flowchart of a four-camera group planar array feature point matching method according to an embodiment of the present invention.
  • FIG. 14 is a flow chart of a measurement method based on a four-camera group planar array feature point matching method according to an embodiment of the present invention.
  • The imaged point refers to the image of the corresponding pixel position of the viewed object on the image plane (or image); each viewed point of the object is imaged respectively on the four image planes of a group of four cameras.
  • A matching point refers to, for an imaging point on one image plane, an imaging point found on that image plane or another image plane that satisfies a certain matching condition with the imaging point and has image features (e.g. texture, color or grayscale) similar to it. There may be one or more matching points corresponding to one imaging point.
  • A feature point refers to an imaging point corresponding to one or more matching points, the imaging point having image features different from those of other imaging points, such as different textures, colors or gray values; this is not specifically limited in the embodiments of the present invention, and different image features can be selected according to the actual situation as the basis for judgment. Generally, the imaging points corresponding to viewed points at the edge of the measured object or in texture-transition regions have sharp image features.
  • The matching, matching operation and operation rules for feature points are used for comparative analysis of imaging points at two or more positions, giving image-feature similarity indexes of the two or more imaging points. If the result reaches the predetermined value of the index, the matching is successful; different image-feature similarity indexes can be selected according to the actual situation.
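The text leaves the choice of image-feature similarity index open. One common choice, shown here purely as an illustrative assumption (the metric and threshold are ours), is normalized cross-correlation of grayscale patches:

```python
import math

def ncc(patch1, patch2):
    """Normalized cross-correlation of two equal-size grayscale patches,
    one possible image-feature similarity index."""
    k = len(patch1)
    m1 = sum(patch1) / k
    m2 = sum(patch2) / k
    num = sum((a - m1) * (b - m2) for a, b in zip(patch1, patch2))
    den = math.sqrt(sum((a - m1) ** 2 for a in patch1) *
                    sum((b - m2) ** 2 for b in patch2))
    return num / den if den else 0.0

def is_match(patch1, patch2, threshold=0.9):
    # Matching succeeds when the similarity index reaches a predetermined value.
    return ncc(patch1, patch2) >= threshold

print(is_match([10, 40, 90, 40], [12, 42, 91, 41]))  # True: nearly identical patches
```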
  • The matching conditions 1), 2), 3) and 4) and the uniqueness matching method proposed in the present invention are based on current image processing methods; besides matching on the similarity of image features, matching is performed according to the geometric positional relationship, given by the matching conditions proposed by the present invention, between the imaging points on the four image planes of the four-camera group, thereby verifying and excluding those ambiguous matching points whose image features are similar but whose positional relationship is wrong, guaranteeing the uniqueness of the matching result.
  • A four-camera planar array three-dimensional measurement system that satisfies the requirements, and its corresponding three-dimensional coordinate system, are established in the following manner:
  • FIG. 1 is a schematic diagram of a spatial coordinate system established based on a group of four cameras. A group of four cameras is arranged in a 2 × 2 array, comprising the a camera in the upper-left position, the b camera in the upper-right position, the c camera in the lower-left position and the d camera in the lower-right position; the configurations of the four cameras are identical.
  • The same configuration means that the parameters of the four cameras, including the lens, the image sensor and the focal length, are exactly the same.
  • The focal points O_a, O_b, O_c, O_d of the four cameras lie on the same plane, the optical axes of the four cameras are perpendicular to that plane, and O_a, O_b, O_c, O_d form a rectangle of length m and width n. The center point of the rectangle is set to O, and a three-dimensional Cartesian coordinate system is established with O as the origin.
  • The X axis is parallel to the O_aO_b and O_cO_d sides of the rectangle; the X-axis direction is referred to as the length direction, lateral direction or horizontal direction, and the X axis extends left and right (the direction indicated by the X-axis arrow is to the right, defined here as the positive direction);
  • The Y axis is parallel to the O_aO_c and O_bO_d sides of the rectangle; the Y-axis direction is called the width direction, vertical direction or longitudinal direction, and the Y axis extends up and down (the direction indicated by the Y-axis arrow is upward, defined here as the positive direction);
  • The Z axis is perpendicular to the plane of the focal points and parallel to the optical-axis directions of the four cameras; the Z-axis direction is called the depth direction (the direction indicated by the Z-axis arrow is defined as the positive direction of the depth direction).
  • A plane parallel to the plane in which the focal points O_a, O_b, O_c, O_d lie, at a distance from it equal to the focal length f, is set, and the a, b, c, d image planes are set on that plane.
  • The center of each image plane is the point where the optical axis of the corresponding camera passes through it, and the plane coordinate origin of each image plane is its upper-left corner, denoted O_a', O_b', O_c' and O_d' respectively. The two-dimensional rectangular coordinate axes of the a image plane are set to O_a'X_aY_a, those of the b image plane to O_b'X_bY_b, those of the c image plane to O_c'X_cY_c, and those of the d image plane to O_d'X_dY_d.
  • The length of each image plane (corresponding to the target-surface length of each camera's image sensor) is u, and the width (corresponding to the target-surface width of each camera's image sensor) is v (not shown).
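The rig parameters described above can be collected in a small container; the class name and field names below are ours, for illustration only:

```python
from dataclasses import dataclass

@dataclass
class FourCameraRig:
    m: float  # reference-rectangle length (O_a to O_b), along X
    n: float  # reference-rectangle width (O_a to O_c), along Y
    f: float  # common focal length of the four cameras
    u: float  # image-sensor target-surface length
    v: float  # image-sensor target-surface width

    def focal_points(self):
        # Origin O at the rectangle centre; Z along the common optical-axis direction.
        h, w = self.m / 2, self.n / 2
        return {"Oa": (-h,  w, 0.0), "Ob": (h,  w, 0.0),
                "Oc": (-h, -w, 0.0), "Od": (h, -w, 0.0)}

rig = FourCameraRig(m=100, n=50, f=8, u=6.4, v=4.8)  # illustrative values
print(rig.focal_points()["Oa"])  # (-50.0, 25.0, 0.0)
```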
  • A group of four cameras (i.e. a four-camera-group planar array) arranged as above, as proposed in the present invention, is the minimum basic three-dimensional measurement unit. A multi-camera planar array three-dimensional measurement system composed of more cameras (2N cameras, N being a positive integer greater than or equal to 2) can be constructed according to similar rules and methods, and a three-dimensional coordinate system corresponding to each group of four cameras, or a unified three-dimensional coordinate system, can be constructed in the same way.
  • The matching method and the measurement method of the embodiments of the present invention are applicable to the matching and the measurement of the three-dimensional coordinate position of any viewed point of the measured object that can be imaged on all four cameras of a group of four cameras arranged as above.
  • The rectangle formed by the four focal points of a group of four cameras is referred to as the reference rectangle.
  • the a image plane and the d image plane are mutually diagonal position image planes
  • the b image plane and the c image plane are mutually diagonal position image planes.
  • FIG. 2 is a schematic diagram of the projection of a certain viewed point P of the object and the imaged points of that point on the a image plane and the b image plane.
  • The projections of the P point, the P_a point and the P_b point onto the plane that is parallel to the OXZ coordinate plane and contains the two focal points O_a and O_b are the P' point, the P_a' point and the P_b' point, respectively.
  • The line O_aO_b is parallel to the imaging plane composed of the a and b image planes; the plane of the triangle formed by the P, O_a and O_b points intersects the plane in which the a and b image planes lie, and the intersection line is the straight line P_aP_b, so the line P_aP_b is parallel to the line O_aO_b.
  • FIG. 3 is a schematic view showing the projections of the P point, the P_a point and the P_b point of FIG. 2 on the OXZ coordinate plane.
  • m is the length of O_aO_b;
  • u is the target-surface length of each image sensor;
  • the P' point, P_a' point and P_b' point are the projections of the P point, P_a point and P_b point on the OXZ coordinate plane, respectively; P_ax and P_bx are the X-axis coordinate values of the P_a point on the a image plane and of the P_b point on the b image plane, respectively.
  • P_ax is greater than P_bx; that is, the horizontal offset of the imaged point of the P point on the a image plane relative to the coordinate origin of the a image plane is greater than the horizontal offset of its imaged point on the b image plane relative to the coordinate origin of the b image plane.
  • Matching condition 1): when one viewed point is imaged on the four image planes of a group of four cameras, the imaging points on the two laterally adjacent image planes satisfy: the imaging point of the viewed point on the left image plane and the corresponding imaging point on the right image plane lie on the same line parallel to the horizontal coordinate axis, and the horizontal offset of the imaging point on the left image plane relative to the coordinate origin of the left image plane is greater than the horizontal offset of the imaging point on the right image plane relative to the coordinate origin of the right image plane.
  • Matching condition 1) applies to the lateral matching of the a, b image planes, and equally to the lateral matching of the c, d image planes.
  • The two laterally adjacent image planes of a group of four cameras are corresponding left and right planes: the right image plane corresponding to the left image plane a is b, and the right image plane corresponding to the left image plane c is d.
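A minimal sketch of matching condition 1) as a predicate; the numeric tolerance for the same-line test is an added assumption, and image points are (x, y) pairs in their own image-plane coordinates:

```python
def satisfies_cond1(p_left, p_right, tol=1e-6):
    """Matching condition 1): the imaging points of one viewed point on two
    laterally adjacent image planes (a,b or c,d) lie on the same horizontal
    line, and the left-plane point has the larger horizontal offset from its
    image-plane origin."""
    return abs(p_left[1] - p_right[1]) <= tol and p_left[0] > p_right[0]

print(satisfies_cond1((3.52, 2.46), (3.12, 2.46)))  # True
print(satisfies_cond1((3.12, 2.46), (3.52, 2.46)))  # False: left offset not larger
```

In practice the condition acts as a filter: for a feature point on the left plane, every point on the same image row of the right plane with a smaller horizontal offset is a candidate.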
  • Matching condition 2): when one viewed point is imaged on the four image planes of a group of four cameras, the imaging points on the two longitudinally adjacent image planes satisfy: the imaging point of the viewed point on the upper image plane and the corresponding imaging point on the lower image plane lie on the same line parallel to the longitudinal coordinate axis, and the vertical offset of the imaging point on the upper image plane relative to the coordinate origin of the upper image plane is greater than the vertical offset of the imaging point on the lower image plane relative to the coordinate origin of the lower image plane.
  • Matching condition 2) applies to the longitudinal matching of the a, c image planes, and equally to the longitudinal matching of the b, d image planes.
  • The two longitudinally adjacent image planes of a group of four cameras are corresponding upper and lower planes: the lower image plane corresponding to the upper image plane a is c, and the lower image plane corresponding to the upper image plane b is d.
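The longitudinal counterpart can be sketched the same way, again with an assumed tolerance for the same-column test:

```python
def satisfies_cond2(p_upper, p_lower, tol=1e-6):
    """Matching condition 2): the imaging points of one viewed point on two
    longitudinally adjacent image planes (a,c or b,d) lie on the same vertical
    line, and the upper-plane point has the larger vertical offset from its
    image-plane origin."""
    return abs(p_upper[0] - p_lower[0]) <= tol and p_upper[1] > p_lower[1]

print(satisfies_cond2((3.52, 2.46), (3.52, 2.26)))  # True
```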
  • (P_cx − P_dx) is the lateral offset value between the imaged point on the c image plane and the imaged point on the d image plane in the lateral matching.
  • (P_by − P_dy) is the longitudinal offset value between the imaged point on the b image plane and the imaged point on the d image plane in the longitudinal matching.
  • Matching condition 3): a matching point group or sub-matching point group of one viewed point corresponding to the four image planes of a group of four cameras satisfies: the ratio between the lateral offset value and the longitudinal offset value is equal to the ratio of the length to the width of the reference rectangle. The lateral offset value is the difference between the horizontal offset of the imaged point on the left image plane relative to the coordinate origin of the left image plane and the horizontal offset of the viewed point's imaged point on the corresponding right image plane relative to the coordinate origin of the right image plane; the longitudinal offset value is the difference between the vertical offset of the imaged point on the upper image plane relative to the coordinate origin of the upper image plane and the vertical offset of the imaged point on the corresponding lower image plane relative to the coordinate origin of the lower image plane. A matching point group means four imaging points on the four image planes that satisfy matching condition 3) and have similar image features; a sub-matching point group specifically refers to imaging points on the two image planes adjacent to the base image plane that satisfy matching condition 3) and have image features similar to the feature point.
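Matching condition 3) can be sketched as a cross-multiplied ratio test, again with an assumed tolerance; m and n are the reference rectangle's length and width:

```python
def satisfies_cond3(p_a, p_b, p_c, m, n, tol=1e-6):
    """Matching condition 3): the lateral offset value (left-plane horizontal
    offset minus right-plane horizontal offset) and the longitudinal offset
    value (upper minus lower) are in the ratio m:n of the reference rectangle.
    Cross-multiplication avoids dividing by a small disparity."""
    lateral = p_a[0] - p_b[0]
    longitudinal = p_a[1] - p_c[1]
    return lateral > 0 and longitudinal > 0 and abs(lateral * n - longitudinal * m) <= tol

# With m:n = 2:1, offsets 0.4 and 0.2 satisfy the condition:
print(satisfies_cond3((3.52, 2.46), (3.12, 2.46), (3.52, 2.26), m=100, n=50))  # True
```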
  • Unique matching of the feature points can be achieved by using the four-camera-group planar array matching method, and the matching method has good universality and applicability, which is demonstrated as follows:
  • FIG. 4 is a perspective view showing the ambiguity point that arises when only the a, b image planes are laterally matched.
  • The P point is a viewed point of the spatial object to be measured, with coordinates P(P_x, P_y, P_z); the imaged points of this viewed point on the a, b image planes are P_a and P_b, respectively.
  • Taking the P_a point (a feature point) as the reference, the imaging point matching that point is sought on the b image plane.
  • The imaging point on the b image plane matching the P_a point is located on a line passing through the P_a point and parallel to the X axis, and the horizontal coordinate value of the matched imaging point on the b image plane is smaller than the horizontal coordinate value of the P_a point on the a image plane.
  • Once the position of P_a on the a image plane is determined, the viewpoint corresponding to P_a lies on the extension of the ray on which O_aP_a lies (also called the pixel projection line, i.e., the ray determined by the focus of a camera and any imaging point).
  • Matching between the a and b image planes is binocular matching; according to the known operation rules, methods, and constraint conditions of binocular matching, when the image features of P_a are known, the matching imaging point determined for it on the b image plane is generally not unique but may be multiple, causing a uniqueness problem in binocular matching.
  • The problem to be solved at present is to remove the one or more ambiguous points among the multiple matching points satisfying the matching condition, and to determine the imaging point P_b on the b image plane that uniquely matches P_a.
  • Suppose P_b' is another imaging point on the b image plane that matches P_a.
  • According to matching condition 1), P_b' lies on the same straight line as P_aP_b, the ray determined by the line connecting P_b' and O_b lies in the same plane as the ray on which O_aP_a lies, and the two rays intersect; let the intersection be P'. It can therefore also be considered that P_b' is the imaging point of P' on the b image plane, which likewise corresponds to P_a on the a image plane.
  • FIG. 5 is a perspective view of an ideal straight line PP' imaged on the four image planes.
  • FIG. 6 is a plan view of the imaging points on the four image planes of FIG. 5.
  • Referring to FIG. 5 and FIG. 6, the imaging points of P and P' on the c image plane and the d image plane can be seen.
  • The imaging points P_a and P_a' of P and P' on the a image plane coincide.
  • (P_a, P_b, P_c, P_d) is the matching point group corresponding to point P and is the unique matching point group corresponding to P.
  • (P_a', P_b', P_c', P_d') is the matching point group corresponding to point P' and is the unique matching point group corresponding to P'.
  • The uniqueness of the four-camera group planar array matching algorithm lies in that, according to the matching conditions, with P_a as the reference point, the imaging points uniquely matching it on the b, c, and d image planes are found respectively.
  • First, the matching imaging point P_b and another arbitrarily assumed matching imaging point P_b' are found on the b image plane. If both imaging points are matching points of P_a, then according to the image imaging principle, the imaging points shown in FIG. 5 are obtained on the a, b, c, and d image planes.
  • It can further be deduced that P_a(P_a'), P_bP_b', P_cP_c', and P_dP_d' in FIG. 6 are the projection images of the line PP' on the four image planes, where P_a coincides with P_a'.
  • Matching condition 4): the imaging points of one viewpoint on the four image planes of a group of four camera groups form a rectangle whose ratio of lateral length to longitudinal length equals the ratio of the length to the width of the reference rectangle, and the two pairs of imaging points on the diagonally opposite image planes lie respectively on two straight lines parallel to the two diagonals of the reference rectangle.
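When the four imaging points are laid out in one global frame (image y increasing downward), condition 4) reduces to an axis-aligned rectangle test with the reference aspect ratio; for such a rectangle the a-d and b-c diagonals are automatically parallel to the diagonals of the reference rectangle. The sketch below is an illustrative assumption, not the patent's algorithm:

```python
def satisfies_condition4(pa, pb, pc, pd, m, n, tol=1e-6):
    """pa, pb, pc, pd: (x, y) positions of the four imaging points in a single
    global frame (a upper-left, b upper-right, c lower-left, d lower-right,
    y increasing downward). Checks that they form an axis-aligned rectangle
    whose lateral-to-longitudinal side ratio equals m/n, the aspect ratio of
    the reference rectangle formed by the four camera foci."""
    # a and b must share a row, c and d must share a row
    if abs(pa[1] - pb[1]) > tol or abs(pc[1] - pd[1]) > tol:
        return False
    # a and c must share a column, b and d must share a column
    if abs(pa[0] - pc[0]) > tol or abs(pb[0] - pd[0]) > tol:
        return False
    width = pb[0] - pa[0]    # lateral length of the imaging rectangle
    height = pc[1] - pa[1]   # longitudinal length of the imaging rectangle
    if width <= 0 or height <= 0:
        return False
    return abs(width / height - m / n) < tol
```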
  • From the above, the unique matching point of P' on each image plane, and the unique matching point group corresponding to each point on the line PP', can be found, and the spatial position coordinates of the points on the line PP' can then be calculated according to the above coordinate calculation formulas. This is analogous to the three views required in mechanical drawing: with two views the understanding of an object can be ambiguous, but with three views this does not happen.
  • In the first case, suppose P_b' is another point on the b image plane whose image features match those of P_a, while P_b' is not the projected imaging point of P' on the b image plane but the projection of another arbitrary spatial point, denoted P_n, that produces the imaging point P_b' on the b image plane.
  • FIG. 7 is a projection diagram demonstrating the uniqueness of the match. Referring to FIG. 7, the imaging points of P', P_n, and P'' on the b image plane coincide. According to the imaging principle, a perspective view of P_n projected on the four image planes is shown in FIG. 8.
  • If the projection points of P_n on the a, b, c, and d image planes are P_na, P_nb, P_nc, and P_nd, respectively, then FIG. 9 shows the imaging points of P and P_n on the a, b, c, and d image planes.
  • According to the four-camera group planar array feature point matching method, it can be seen from FIG. 9 that P_nc and P_a are not on one longitudinal straight line, which does not satisfy matching condition 2), and P_nd and P_a are not on a straight line parallel to a diagonal of the rectangle formed by the foci of the group of four camera groups, which does not satisfy matching condition 4). Therefore, although the ambiguous imaging points P_b' and P_nb appear in binocular matching, according to the four-camera group planar array feature point matching method it can be determined that the arbitrarily assumed matching point P_nb is not an imaging point matching the imaging point of P on the a image plane. In this case, the four-camera group planar array feature point matching method can effectively remove the ambiguity that easily arises in binocular matching.
  • In fact, the matching point groups corresponding to the viewpoints P, P1, P2, and P3 are (P_a, P_b, P_c, P_d), (P_a1, P_b1, P_c1, P_d1), (P_a2, P_b2, P_c2, P_d2), and (P_a3, P_b3, P_c3, P_d3), respectively; however, owing to the existence of point P3, the multiple matching point groups found for P according to the matching method include, in addition to (P_a, P_b, P_c, P_d), an ambiguous matching point group (P_a, P_b1, P_c2, P_d3).
  • The following method can be used to find such an ambiguous matching point group: for the above case, according to the matching conditions, multiple matching point groups (P_a, P_b, P_c, P_d) and (P_a, P_b1, P_c2, P_d3) are found for the imaging point P_a in the base image plane; there then exists on the a image plane a rectangle P_aP_a1P_a2P_a3 similar to the reference rectangle with P_a as its upper-left end point. It then suffices to determine that a unique corresponding congruent rectangle exists on the d image plane, whereupon the unique matching point of P_a is P_d and that of P_a3 is P_d3, and the ambiguous matching point group is excluded.
  • The viewpoints that can be imaged on the four image planes of a group of four camera groups are matched according to the matching conditions, thereby obtaining the spatial position coordinates of the viewpoints and forming three-dimensional point cloud data, from which three-dimensional point cloud graphics are built for three-dimensional reproduction of the object. Owing to the uniqueness of four-camera group matching, the spatial position coordinates of each viewpoint are also unique, with no one-to-many or many-to-one ambiguity.
  • A four-camera group planar array feature point matching method includes the following steps:
  • a1. taking any one of the four image planes corresponding to the four cameras of a group of four camera groups as the base image plane, and, for a feature point on the base image plane, finding all matching points that match the feature point on the image plane laterally adjacent to the base image plane;
  • In step a1, for a feature point in the base image plane, all matching points matching the feature point on the image plane laterally adjacent to the base image plane are found according to matching condition 1).
  • In step a2, for the feature point on the base image plane in step a1, all matching points matching the feature point on the image plane longitudinally adjacent to the base image plane are found according to matching condition 2).
  • In step a3, all the matching points found in step a1 and all the matching points found in step a2 are re-matched according to matching condition 3), and matching points that do not satisfy matching condition 3) are excluded.
  • The matching points satisfying matching condition 3) are paired according to the magnitude relationship of their lateral or longitudinal offset values to form sub-matching point groups.
  • In step a4, according to the feature point in the base image plane and all the sub-matching point groups found in step a3, for any sub-matching point group the matching point on the diagonal-position image plane is found according to matching condition 1) and matching condition 2), and the coordinates of the matching point on the diagonal-position image plane are obtained from the coordinates of the sub-matching point group found in step a3:
  • the abscissa of the matching point equals the abscissa of the matching point on the image plane longitudinally adjacent to the diagonal-position image plane,
  • and its ordinate equals the ordinate of the matching point on the image plane laterally adjacent to the diagonal-position image plane.
  • After the coordinates are determined, image feature similarity matching is performed between the matching point on the diagonal-position image plane and the matching points on the other three image planes; if the matching succeeds, the four imaging points on the four image planes form a matching point group;
  • otherwise they cannot be matched, and the matching point on the diagonal-position image plane together with the corresponding sub-matching point group found in step a3 is excluded.
  • a5. determining, according to the feature point in the base image plane, all the sub-matching point groups found in step a3, and the matching points found in step a4, the unique matching point group on the four image planes corresponding to the same viewpoint.
  • In step a5, the unique matching point group corresponding to the same viewpoint on the four image planes satisfies matching condition 4): the imaging points of one viewpoint on the four image planes corresponding to a group of four camera groups form a rectangle whose ratio of lateral length to longitudinal length equals the ratio of the length to the width of the reference rectangle, and the imaging points on the two pairs of diagonally opposite image planes lie respectively on two straight lines parallel to the two diagonals of the reference rectangle.
  • In step a5, if only one matching point group satisfies matching condition 4), the matching result is unique.
  • In step a5, if there are multiple matching point groups satisfying matching condition 4), it is determined whether a base rectangle satisfying the following conditions exists on the base image plane: the base rectangle has the feature point as one end point; the extension of the diagonal of the base rectangle having the feature point as an end point passes through the diagonal-position image plane, and the length of this diagonal equals the distance between the two matching points, on the diagonal-position image plane, of any two matching point groups corresponding to the feature point; the base rectangle is similar to the reference rectangle; and the other end points of the base rectangle have image features similar to those of the feature point and are matching points of the feature point on the base image plane.
  • The rectangle uniquely corresponding to any base rectangle on the diagonal-position image plane means a rectangle that is congruent with the base rectangle and whose four end points satisfy the matching conditions with, and have image features similar to, the four end points at the corresponding positions of the base rectangle.
  • Various existing matching operation methods or rules can be combined to determine whether such a uniquely corresponding rectangle exists on the diagonal-position image plane; the specific matching operation method or rule does not belong to the inventive content of the present invention, and those skilled in the art can conceive of combining a plurality of different ways to find the uniquely corresponding rectangle.
  • The method of determining the unique matching point of the feature point on the diagonal-position image plane is as follows: if the feature point on the base image plane is an end point of a diagonal of the base rectangle, the unique matching point on the diagonal-position image plane is one of the two end points of the diagonal, in the uniquely corresponding rectangle on the diagonal-position image plane, whose extension passes through the feature point; if the feature point is at the upper end of the base rectangle, the corresponding unique matching point on the diagonal-position image plane is the upper end point of that diagonal of the uniquely corresponding rectangle, and otherwise the lower end point. For a non-unique matching point on the diagonal-position image plane, it can be determined that its unique matching point on the base image plane is the other end point of the diagonal on which the feature point on the base image plane lies.
  • If no such congruent rectangle exists, the two matching point groups corresponding to the feature point correspond to two different viewpoints, and the two different viewpoints are spatially located on the extension of the pixel projection line of the feature point relative to the base image plane;
  • if no such base rectangle exists on the base image plane, the multiple matching point groups correspond to multiple different viewpoints, and the multiple different viewpoints are spatially located on the extension of the pixel projection line of the feature point relative to the base image plane.
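The coordinate rule of step a4 above, whereby the matching point on the diagonal-position image plane takes its abscissa from the longitudinally adjacent image plane and its ordinate from the laterally adjacent one, can be sketched as follows, with a as the base plane and d as the diagonal-position plane (the function name is an assumption):

```python
def predict_diagonal_point(pb, pc):
    """Given a sub-matching point group for base plane a: pb = (x, y) on the
    b image plane (laterally adjacent to a) and pc = (x, y) on the c image
    plane (longitudinally adjacent to a), predict the coordinates of the
    matching point on the diagonal-position image plane d. Plane d is
    longitudinally adjacent to b and laterally adjacent to c, so the predicted
    point inherits its abscissa from pb and its ordinate from pc."""
    return (pb[0], pc[1])
```

The predicted point is then confirmed or rejected by image feature similarity matching against the other three imaging points, as described in step a4.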
  • A measurement method based on the four-camera group planar array feature point matching method includes the following steps:
  • b1. After image acquisition is completed, the above matching method is used to find the unique matching point groups corresponding to all the feature points in the base image plane.
  • b2. If the foci of the a camera at the upper-left position, the b camera at the upper-right position, the c camera at the lower-left position, and the d camera at the lower-right position of the group of four camera groups are O_a, O_b, O_c, and O_d, respectively, the four foci lie on the same plane and form a rectangle whose length O_aO_b is m and whose width O_aO_c is n. The center point of the rectangle is set as O, and a three-dimensional rectangular coordinate system is established with O as the origin, in which the X axis is parallel to the O_aO_b and O_cO_d sides of the rectangle, the Y axis is parallel to the O_aO_c and O_bO_d sides of the rectangle, and the Z axis is perpendicular to the plane of the foci and parallel to the optical-axis directions of the four cameras; the configurations of the four cameras are exactly identical. Let the spatial position coordinates of a viewpoint P of the viewed object be P(P_x, P_y, P_z), and let the image coordinates of the corresponding imaging points of P on the image planes of the a, b, c, and d cameras be P_a(P_ax, P_ay), P_b(P_bx, P_by), P_c(P_cx, P_cy), and P_d(P_dx, P_dy). The spatial position coordinates of P are then:
  • lateral matching of the a and b image planes: P_x = m(P_ax + P_bx - u)/(2Δx), P_z = m·f/Δx;
  • lateral matching of the c and d image planes: P_x = m(P_cx + P_dx - u)/(2Δx), P_z = m·f/Δx;
  • longitudinal matching of the a and c image planes: P_y = n(v - P_ay - P_cy)/(2Δy), P_z = n·f/Δy;
  • longitudinal matching of the b and d image planes: P_y = n(v - P_by - P_dy)/(2Δy), P_z = n·f/Δy;
  • where f is the focal length of the four cameras, u is the target-surface length of the image sensor, and v is the target-surface width of the image sensor;
  • Δx is defined, during lateral matching, as the lateral offset value of the imaging point on the b image plane relative to the imaging point on the a image plane and of the imaging point on the d image plane relative to the imaging point on the c image plane;
  • Δy is defined, during longitudinal matching, as the longitudinal offset value of the imaging point on the c image plane relative to the imaging point on the a image plane and of the imaging point on the d image plane relative to the imaging point on the b image plane.
  • b3. According to the spatial position coordinates of the viewpoints obtained in step b2, three-dimensional point cloud data is formed, and a three-dimensional point cloud graphic is created to reproduce the three-dimensional stereoscopic image.
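The coordinate formulas of step b2 can be collected into one small triangulation routine. This is a sketch under the camera model stated above; the function and variable names are assumptions, not the patent's code:

```python
def viewpoint_coordinates(pa, pb, pc, pd, m, n, f, u, v):
    """Recover the spatial coordinates P(Px, Py, Pz) of a viewpoint from its
    image coordinates pa=(Pax, Pay), pb=(Pbx, Pby), pc=(Pcx, Pcy),
    pd=(Pdx, Pdy) on the a/b/c/d image planes, using the step b2 formulas.
    m, n: length and width of the reference rectangle of the four foci;
    f: focal length; u, v: target-surface length and width of the sensor.
    pd is accepted because the redundant c,d and b,d formulas could equally
    be evaluated from it as a consistency check."""
    dx = pa[0] - pb[0]   # lateral offset value (also equals Pcx - Pdx)
    dy = pa[1] - pc[1]   # longitudinal offset value (also equals Pby - Pdy)
    px = m * (pa[0] + pb[0] - u) / (2 * dx)
    py = n * (v - pa[1] - pc[1]) / (2 * dy)
    pz = m * f / dx      # equivalently n * f / dy
    return (px, py, pz)
```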
  • The matching method of the present invention and the three-dimensional measuring system built on it constitute an autonomous matching method and measuring system: the matching conditions and methods do not change with changes in the external environment.
  • The field of view, measurement accuracy, spatial resolution, environmental adaptability, and so on of the system can be determined by its configuration; for example, changing the focal length allows distant objects to be measured, changing the camera (such as using an infrared camera) enables nighttime measurement, and using a microscope head enables microscopic stereo measurement.
  • With the method, calculation and measurement of the three-dimensional coordinate positions of feature points can be realized; however, since the image planes are affected by the field of view, resolution, environment, and illumination conditions, by camera parameters such as aperture, shutter, and exposure time and by the adjustment of shooting parameters, and since the geometric features, edge features, surface reflection, and texture features of the object also affect the matching of imaging points, situations in which matching cannot be completed are inevitable.
  • Because image features also include continuity and other constraint conditions, any known image processing methods and means may be used to analyze and process the parts that cannot be completely matched, which can solve most problems of three-dimensional object imaging; on the basis of existing image processing technology, the invention provides basic matching methods and measurement methods for three-dimensional vision measurement technology.
  • The methods or functions of the present invention, if implemented in the form of software functional modules and sold or used as standalone products, may be stored in a computer-readable storage medium.
  • The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the various embodiments of the present invention.
  • The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A four-camera group planar array feature point matching method and a measurement method based on the four-camera group planar array feature point matching method, belonging to the field of optical-electronic measurement. The matching method comprises: taking one of the four image planes (a, b, c, d) as the base image plane (a); for a feature point (P_a) on the base image plane (a), finding all matching points (P_b, P_b1) matching the feature point (P_a) on the image plane (b) laterally adjacent to the base image plane (a); for the feature point (P_a) on the base image plane (a), finding all matching points (P_c, P_c2) matching the feature point (P_a) on the image plane (c) longitudinally adjacent to the base image plane (a); re-matching all the matching points (P_b, P_b1, P_c, P_c2) found in the lateral and longitudinal directions to find all sub-matching point groups ((P_b, P_c), (P_b1, P_c2)); finding the matching points (P_d, P_d3) on the diagonal-position image plane (d) corresponding to the feature point (P_a) on the base image plane (a) and to all the sub-matching point groups ((P_b, P_c), (P_b1, P_c2)) found; and determining the unique matching point group (P_a, P_b, P_c, P_d) in the four image planes (a, b, c, d) corresponding to the same viewpoint (P). For each unique matching point group (P_a, P_b, P_c, P_d), the three-dimensional spatial coordinates of the viewpoint (P) can be calculated from the image coordinates of the matching point group (P_a, P_b, P_c, P_d) and the parameters of the camera system itself. Under any illumination conditions, provided the acquired images are sufficiently clear, three-dimensional measurement of any viewed object that is imaged on the images of the four-camera group planar array and has certain image features can be achieved with exactly the same matching method and measurement method.

Description

Four-camera group planar array feature point matching method and measurement method based thereon
Technical Field
The present invention relates to the technical field of optical-electronic measurement, and in particular to a four-camera group planar array feature point matching method and a measurement method based on the four-camera group planar array feature point matching method.
Background Art
At present, three-dimensional stereo vision measurement generally uses line-laser light-section measurement technology or binocular measurement combined with structured-light illumination. The main reason line lasers and structured light are widely used in three-dimensional measurement is that their indication makes the corresponding matching points in the images explicit, reduces matching ambiguity, and achieves definite and unique matching. If the indication of the line laser or structured light is removed, however, binocular matching cannot avoid the ambiguity of multi-point matching and thus cannot meet measurement requirements; at the same time, a line laser or structured light can only measure the parts it illuminates, which limits the applicable range of the technology, and its use can adversely affect the measured object, for example a person.
In addition, binocular matching at present also often attaches marker points to the surface of the viewed object, likewise to improve matching accuracy. However, attaching marker points to the surface of the viewed object has the drawback of requiring manual processing of and intervention on the measured object in advance.
Summary of the Invention
In view of this, the purpose of the embodiments of the present invention is to provide a four-camera group planar array feature point matching method and a measurement method based on the four-camera group planar array feature point matching method, which reduce the complexity of the matching and measurement methods, simplify the spatial dimension calculation process, and reduce system measurement error.
In a first aspect, an embodiment of the present invention provides a four-camera group planar array feature point matching method, the method comprising the following steps:
a1. taking any one of the four image planes corresponding to the four cameras of a group of four camera groups as the base image plane, and, for a feature point on the base image plane, finding all matching points that match the feature point on the image plane laterally adjacent to the base image plane;
a2. for the feature point on the base image plane in step a1, finding all matching points that match the feature point on the image plane longitudinally adjacent to the base image plane;
a3. re-matching all the matching points found in step a1 with all the matching points found in step a2 to find all corresponding sub-matching point groups;
a4. according to the feature point on the base image plane and all the sub-matching point groups found in step a3, finding the matching points on the diagonal-position image plane corresponding to the feature point on the base image plane and to all the sub-matching point groups found in step a3, wherein the diagonal-position image plane is the image plane located at the diagonal position of the base image plane;
a5. determining, according to the feature point in the base image plane, all the sub-matching point groups found in step a3, and the matching points found in step a4, the unique matching point group on the four image planes corresponding to the same viewpoint.
With reference to the first aspect, an embodiment of the present invention provides a first possible implementation of the first aspect, wherein, in step a1, for the feature point on the base image plane, all matching points matching the feature point on the image plane laterally adjacent to the base image plane are found according to matching condition 1); matching condition 1) is that the imaging points of one viewpoint on two laterally adjacent image planes satisfy: the imaging point of the viewpoint on the left image plane and the imaging point of the viewpoint on the corresponding right image plane lie on one straight line parallel to the lateral coordinate axis, and the horizontal offset of the imaging point on the left image plane relative to the coordinate origin of the left image plane is greater than the horizontal offset of the imaging point on the right image plane relative to the coordinate origin of the right image plane.
With reference to the first aspect, an embodiment of the present invention provides a second possible implementation of the first aspect, wherein, in step a2, for the feature point on the base image plane in step a1, all matching points matching the feature point on the image plane longitudinally adjacent to the base image plane are found according to matching condition 2); matching condition 2) is that the imaging points of one viewpoint on two longitudinally adjacent image planes satisfy: the imaging point of the viewpoint on the upper image plane and the imaging point of the viewpoint on the corresponding lower image plane lie on one straight line parallel to the longitudinal coordinate axis, and the vertical offset of the imaging point on the upper image plane relative to the coordinate origin of the upper image plane is greater than the vertical offset of the imaging point on the lower image plane relative to the coordinate origin of the lower image plane.
With reference to the first aspect, an embodiment of the present invention provides a third possible implementation of the first aspect, wherein, in step a3, all the matching points found in step a1 and all the matching points found in step a2 are re-matched according to matching condition 3); matching points that do not satisfy matching condition 3) are excluded, and the matching points satisfying matching condition 3) are paired according to the magnitude relationship of their lateral or longitudinal offset values to form sub-matching point groups; matching condition 3) is that the matching point group corresponding to one viewpoint on the four image planes of a group of four camera groups satisfies: the ratio of the lateral offset value to the longitudinal offset value equals the ratio of the length to the width of the reference rectangle, where the lateral offset value is the difference between the horizontal offset of the viewpoint's imaging point on a left image plane relative to the coordinate origin of that left image plane and the horizontal offset of the viewpoint's imaging point on the corresponding right image plane relative to the coordinate origin of that right image plane, the longitudinal offset value is the difference between the vertical offset of the viewpoint's imaging point on an upper image plane relative to the coordinate origin of that upper image plane and the vertical offset of the viewpoint's imaging point on the corresponding lower image plane relative to the coordinate origin of that lower image plane, and the reference rectangle is the rectangle formed by the four foci of a group of four camera groups.
With reference to the first aspect, an embodiment of the present invention provides a fourth possible implementation of the first aspect, wherein, in step a4, according to the feature point on the base image plane and all the sub-matching point groups found in step a3, for any sub-matching point group the matching point on the diagonal-position image plane is found according to matching condition 1) and matching condition 2), and the coordinates of the matching point on the diagonal-position image plane are obtained from the coordinates of the sub-matching point group found in step a3: the abscissa of the matching point equals the abscissa of the matching point on the image plane longitudinally adjacent to the diagonal-position image plane, and its ordinate equals the ordinate of the matching point on the image plane laterally adjacent to the diagonal-position image plane. After the coordinates are determined, image feature similarity matching is performed between the matching point on the diagonal-position image plane and the matching points on the other three image planes; if the matching succeeds, the four imaging points on the four image planes form a matching point group; otherwise they cannot be matched, and the matching point on the diagonal-position image plane together with the corresponding sub-matching point group found in step a3 is excluded.
With reference to the first aspect, an embodiment of the present invention provides a fifth possible implementation of the first aspect, wherein, in step a5, the unique matching point group corresponding to the same viewpoint on the four image planes satisfies matching condition 4); matching condition 4) is that the imaging points of one viewpoint on the four image planes corresponding to a group of four camera groups form a rectangle whose ratio of lateral length to longitudinal length equals the ratio of the length to the width of the reference rectangle, and the two pairs of imaging points on the diagonally opposite image planes lie respectively on two straight lines parallel to the two diagonals of the reference rectangle.
With reference to the first aspect, an embodiment of the present invention provides a sixth possible implementation of the first aspect, wherein, in step a5, if only one matching point group satisfies matching condition 4), the matching result is unique.
With reference to the first aspect, an embodiment of the present invention provides a seventh possible implementation of the first aspect, wherein, in step a5, if there are multiple matching point groups satisfying matching condition 4), it is determined whether a base rectangle satisfying the following conditions exists on the base image plane: the base rectangle has the feature point as one end point; the extension of the diagonal of the base rectangle having the feature point as an end point passes through the diagonal-position image plane, and the length of this diagonal equals the distance between the two matching points, on the diagonal-position image plane, of any two matching point groups corresponding to the feature point; the base rectangle is similar to the reference rectangle; and the other end points of the base rectangle have image features similar to those of the feature point and are matching points of the feature point on the base image plane;
if such a rectangle exists on the base image plane, it is determined for any base rectangle whether a uniquely corresponding rectangle congruent with that base rectangle exists on the diagonal-position image plane;
if it exists, the unique matching point of the feature point on the diagonal-position image plane is first determined, and the unique matching points of the feature point on the remaining two image planes are then determined according to matching condition 1) and matching condition 2), excluding the ambiguous matching point groups; the method of determining the unique matching point of the feature point on the diagonal-position image plane is: if the feature point on the base image plane is an end point of a diagonal of the base rectangle, the unique matching point on the diagonal-position image plane is one of the two end points of the diagonal, in the uniquely corresponding rectangle on the diagonal-position image plane, whose extension passes through the feature point; if the feature point is at the upper end of the base rectangle, the corresponding unique matching point on the diagonal-position image plane is the upper end point of that diagonal of the uniquely corresponding rectangle, and otherwise the lower end point; for a non-unique matching point on the diagonal-position image plane, it can be determined that its unique matching point on the base image plane is the other end point of the diagonal on which the feature point on the base image plane lies;
if it does not exist, the two matching point groups corresponding to the feature point correspond to two different viewpoints, and the two different viewpoints are spatially located on the extension of the pixel projection line of the feature point relative to the base image plane;
if no such rectangle exists on the base image plane, the multiple matching point groups correspond to multiple different viewpoints, and the multiple different viewpoints are spatially located on the extension of the pixel projection line of the feature point relative to the base image plane.
With reference to the first aspect, an embodiment of the present invention provides an eighth possible implementation of the first aspect, wherein a feature point refers to an imaging point corresponding to one or more matching points, the imaging point having image features that distinguish it from other imaging points.
In a second aspect, the present invention provides a measurement method based on the four-camera group planar array feature point matching method, comprising the following steps:
b1. after image acquisition is completed, using the above matching method to find the unique matching point groups corresponding to all feature points in the base image plane;
b2. calculating the spatial position coordinates of the viewpoints according to the image coordinates of the unique matching point groups obtained in step b1;
b3. forming three-dimensional point cloud data according to the spatial position coordinates of the viewpoints obtained in step b2, building a three-dimensional point cloud graphic, and reproducing the three-dimensional stereoscopic image.
With reference to the second aspect, an embodiment of the present invention provides a first possible implementation of the second aspect, wherein, in step b2, if the foci of the a camera at the upper-left position, the b camera at the upper-right position, the c camera at the lower-left position, and the d camera at the lower-right position of a group of four camera groups are O_a, O_b, O_c, and O_d, respectively, the four foci lie on the same plane and form a rectangle whose length O_aO_b is m and whose width O_aO_c is n; the center point of the rectangle is set as O, and a three-dimensional rectangular coordinate system is established with O as the origin, in which the X axis is parallel to the O_aO_b and O_cO_d sides of the rectangle, the Y axis is parallel to the O_aO_c and O_bO_d sides of the rectangle, and the Z axis is perpendicular to the plane of the foci and parallel to the optical-axis directions of the four cameras; the configurations of the four cameras are exactly identical; the spatial position coordinates of a viewpoint P of the viewed object are P(P_x, P_y, P_z), and the image coordinates of the imaging points of P on the image planes corresponding to the a, b, c, and d cameras are P_a(P_ax, P_ay), P_b(P_bx, P_by), P_c(P_cx, P_cy), and P_d(P_dx, P_dy), respectively; the expressions for the spatial position coordinates of P are then:
Lateral matching of the a and b image planes gives the P_x coordinate calculation formula:
P_x = m(P_ax + P_bx - u) / (2Δx)
Lateral matching of the a and b image planes gives the P_z coordinate calculation formula:
P_z = m·f / Δx
Lateral matching of the c and d image planes gives the P_x coordinate calculation formula:
P_x = m(P_cx + P_dx - u) / (2Δx)
Lateral matching of the c and d image planes gives the P_z coordinate calculation formula:
P_z = m·f / Δx
Longitudinal matching of the a and c image planes gives the P_y coordinate calculation formula:
P_y = n(v - P_ay - P_cy) / (2Δy)
Longitudinal matching of the a and c image planes gives the P_z coordinate calculation formula:
P_z = n·f / Δy
Longitudinal matching of the b and d image planes gives the P_y coordinate calculation formula:
P_y = n(v - P_by - P_dy) / (2Δy)
Longitudinal matching of the b and d image planes gives the P_z coordinate calculation formula:
P_z = n·f / Δy
where f is the focal length of the four cameras, u is the target-surface length of the image sensor, and v is the target-surface width of the image sensor; Δx is defined, during lateral matching, as the lateral offset value of the imaging point on the b image plane relative to the imaging point on the a image plane and of the imaging point on the d image plane relative to the imaging point on the c image plane, and Δy is defined, during longitudinal matching, as the longitudinal offset value of the imaging point on the c image plane relative to the imaging point on the a image plane and of the imaging point on the d image plane relative to the imaging point on the b image plane.
The present invention has at least the following beneficial effects:
1. The four-camera group planar array feature point matching method can, from the positions of the imaging points of one viewpoint of the viewed object on the four image planes of a group of four camera groups, quickly match the unique imaging point group corresponding to the viewpoint on the four image planes, achieving universal and unique matching of viewpoints that can be imaged on all four cameras.
2. With the measurement method based on the four-camera group planar array feature point matching method, under any illumination conditions, provided the acquired images are sufficiently clear, and without prior knowledge of the viewed object, three-dimensional measurement of any viewed object can be achieved with exactly the same measurement method; the measurement method requires no calibration of the field of view, and its measurement accuracy and resolution depend only on the measurement system and not on the viewed object, so fully autonomous measurement can be realized.
3. Owing to the universality and reliability of the matching method and the measurement method, program optimization and embedded-level and chip-level computation are facilitated, enabling fast three-dimensional perception and measurement.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of the present invention more clearly, the drawings required by the embodiments are briefly introduced below. It should be understood that the following drawings show only certain embodiments of the present invention and should therefore not be regarded as limiting the scope; for those of ordinary skill in the art, other related drawings or embodiments can be obtained from these drawings or embodiments without creative effort.
FIG. 1 is a schematic view of a spatial coordinate system established on the basis of a group of four camera groups;
FIG. 2 is a schematic projection view of a viewpoint of the viewed object and its imaging points on the image planes of the two cameras in the lateral direction of a group of four camera groups;
FIG. 3 is a schematic projection view of the viewpoint of FIG. 2 and the corresponding imaging points on the OXZ coordinate plane;
FIG. 4 is a schematic perspective view of ambiguity points arising when two laterally adjacent image planes of a group of four camera groups are laterally matched;
FIG. 5 is a schematic perspective view of an ideal straight line imaged on the four image planes of a group of four camera groups;
FIG. 6 is a schematic plan view of the imaging points on the four image planes of FIG. 5;
FIG. 7 is a schematic projection view demonstrating the uniqueness of matching;
FIG. 8 is an exemplary schematic perspective view demonstrating the uniqueness of matching;
FIG. 9 is a schematic plan view of the imaging points on the four image planes of FIG. 8;
FIG. 10 is another exemplary schematic perspective view demonstrating the uniqueness of matching;
FIG. 11 is a schematic plan view of the imaging points on the four image planes of FIG. 10;
FIG. 12 is a further exemplary schematic plan view demonstrating the uniqueness of matching;
FIG. 13 is a flowchart of a four-camera group planar array feature point matching method according to an embodiment of the present invention;
FIG. 14 is a flowchart of a measurement method based on the four-camera group planar array feature point matching method according to an embodiment of the present invention.
Detailed Description
To make the purposes, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings of the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. The following detailed description of the embodiments of the present invention provided in the drawings is therefore not intended to limit the scope of the claimed invention but merely represents selected embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art on the basis of the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
In the description of the present invention, it should be understood that orientations or positional relationships indicated by terms such as "origin", "center", "longitudinal", "lateral", "length", "width", "depth", "upper", "lower", "front", "rear", "left", "right", "upper left", "lower left", "upper right", "lower right", "vertical", and "horizontal" are based on the orientations or positional relationships shown in the drawings; they are used only for convenience and simplification of description and do not indicate or imply that the referred devices or elements must have a specific orientation or be constructed and operated in a specific orientation, and therefore cannot be understood as limiting the present invention.
In the present invention, an imaging point refers to the image, at the corresponding pixel position on an image plane (or image), of a viewpoint of the viewed object; each viewpoint of the viewed object corresponds to one imaging point on each of the four image planes of a group of four camera groups. A matching point refers to, for an imaging point on one image plane, an imaging point found on that image plane or another image plane that satisfies a certain matching condition with the imaging point and has image features (such as texture, color, or gray level) similar to those of the imaging point. An imaging point may correspond to one or more matching points.
In the present invention, a feature point refers to an imaging point corresponding to one or more matching points that has image features distinguishing it from other imaging points, such as different texture, color, or gray values; this is not specifically limited in the embodiments of the present invention, and different image features can be chosen as the basis of judgment according to the actual situation. In general, the imaging points corresponding to viewpoints at positions such as the edges of the measured object or texture transition regions have distinct image features.
In the present invention, the matching of feature points and the matching operations and operation rules are used to compare and analyze the imaging points at two or more positions and give an image feature similarity index of the two or more imaging points; matching succeeds if the comparison result reaches a predetermined index value, and different image feature similarity indices can be chosen according to the actual situation. The matching conditions 1), 2), 3), and 4) and the uniqueness matching method proposed in the present invention build on current image processing methods: in addition to matching on the similarity of image features, matching is performed, according to the geometric positional relationships among the imaging points on the four image planes of the four camera group, in accordance with the matching conditions proposed by the present invention, thereby verifying and excluding ambiguous matching points whose image features are similar but whose positional relationships are wrong, and guaranteeing the uniqueness of the matching result.
1. Establishment of the three-dimensional coordinate system of the four-camera group planar array front-image model and proof of the matching conditions
① Establishing the four-camera group planar array three-dimensional measurement system and three-dimensional coordinate system
To facilitate the explanation of the four-camera group planar array feature point matching method and measurement method of the embodiments of the present invention, a four-camera group planar array three-dimensional measurement system meeting the requirements and its corresponding three-dimensional coordinate system need to be established. To apply the method of the embodiments more conveniently, the forward projection model is used, and the four-camera group planar array three-dimensional measurement system and the corresponding three-dimensional coordinate system are established as follows:
FIG. 1 is a schematic view of a spatial coordinate system established on the basis of a group of four camera groups. A group of four camera groups is arranged in a 2×2 array and includes four identically configured cameras: an a camera at the upper-left position, a b camera at the upper-right position, a c camera at the lower-left position, and a d camera at the lower-right position; identical configuration means that the lenses, image sensors, focal lengths, and other parameters of the four cameras are exactly the same. The foci O_a, O_b, O_c, O_d of the four cameras lie on the same plane, the optical axes of the four cameras are all perpendicular to this plane, and O_a, O_b, O_c, O_d form a rectangle of length m and width n; the center point of the rectangle is set as O, and a three-dimensional rectangular coordinate system is established with O as the origin. The X axis is parallel to the O_aO_b and O_cO_d sides of the rectangle; the direction of the X axis is called the length direction, horizontal direction, or lateral direction, and the X axis extends in the left-right direction (the direction of the X-axis arrow points rightward and is defined here as the positive direction). The Y axis is parallel to the O_aO_c and O_bO_d sides of the rectangle; the direction of the Y axis is called the width direction, vertical direction, or longitudinal direction, and the Y axis extends in the up-down direction (the direction of the Y-axis arrow points upward and is defined here as the positive direction). The Z axis is perpendicular to the plane of the foci and parallel to the optical-axis directions of the four cameras; the direction of the Z axis is called the depth direction (the direction of the Z-axis arrow is defined as the positive depth direction). In the positive direction of the Z axis, a plane parallel to the plane of the foci O_a, O_b, O_c, O_d at a distance equal to the focal length f is set, and the four image planes a, b, c, d are set on this plane; according to the imaging principle, the center of each image plane is the point through which the optical axis of its corresponding camera passes. The plane coordinate origin of each image plane is set at the upper-left corner of that image plane, namely O_a', O_b', O_c', and O_d'; the two-dimensional rectangular coordinate axes of the a image plane are set as O_a'X_aY_a, those of the b image plane as O_b'X_bY_b, those of the c image plane as O_c'X_cY_c, and those of the d image plane as O_d'X_dY_d; the length of each image plane (corresponding to the target-surface length of each camera's image sensor) is set as u, and the width (corresponding to the target-surface width of each camera's image sensor) as v (not shown in the figure).
The group of four camera groups (i.e., the four-camera group planar array) adopting the above arrangement proposed in the present invention is a minimal basic three-dimensional stereo measurement unit; a multi-camera group planar array three-dimensional measurement system composed of more cameras (2N cameras, where N is a positive integer greater than or equal to 2) can be built according to similar rules and methods, and the three-dimensional coordinate systems corresponding to each group of four camera groups, or a unified three-dimensional coordinate system, can likewise be built according to similar rules and methods. The matching method and measurement method of the embodiments of the present invention are applicable to the matching and measurement of the three-dimensional coordinate positions of viewpoints of a measured object that can be imaged on all four cameras of a group of four camera groups arranged as above.
For convenience of description, the rectangle formed by the four foci of a group of four camera groups is called the reference rectangle. Meanwhile, the a image plane and the d image plane are diagonal-position image planes of each other, and the b image plane and the c image plane are diagonal-position image planes of each other.
② Lateral and longitudinal matching of feature points of the four-camera group planar array and calculation of the spatial position coordinates of feature points
FIG. 2 is a schematic projection view of a viewpoint P of the viewed object and its imaging points on the a and b image planes. Referring to FIG. 2, according to the imaging principle, the imaging points of P on the a and b image planes are P_a and P_b, respectively, and the projections of P, P_a, and P_b on the plane through the two foci O_a, O_b parallel to the OXZ coordinate plane are P', P_a', and P_b', respectively.
Since the line O_aO_b is parallel to the imaging plane formed by the a and b image planes, the triangle formed by the three points P, O_a, and O_b intersects the plane of the a and b image planes in the line P_aP_b, so the line P_aP_b is parallel to the line O_aO_b.
FIG. 3 is a schematic projection view of the points P, P_a, and P_b of FIG. 2 on the OXZ coordinate plane.
Referring to FIG. 3, m is the length of O_aO_b, u is the target-surface length of each image sensor, P', P_a', and P_b' are the projection points of P, P_a, and P_b on the OXZ coordinate plane, respectively, and P_ax and P_bx are the X-direction coordinate values of P_a' and P_b' on the a image plane and the b image plane, respectively.
Obviously, P_ax is greater than P_bx; that is, the horizontal offset of the imaging point of P on the a image plane relative to the coordinate origin of the a image plane is greater than the horizontal offset of the imaging point of P on the b image plane relative to the coordinate origin of the b image plane.
Therefore, the following matching condition is obtained:
Matching condition 1): when one viewpoint is imaged on the four image planes of a group of four camera groups, the imaging points on two laterally adjacent image planes satisfy: the imaging point of the viewpoint on the left image plane and the imaging point of the viewpoint on the corresponding right image plane lie on one straight line parallel to the lateral coordinate axis, and the horizontal offset of the imaging point on the left image plane relative to the coordinate origin of the left image plane is greater than the horizontal offset of the imaging point on the right image plane relative to the coordinate origin of the right image plane.
Matching condition 1) applies to the lateral matching of the a and b image planes and likewise to the lateral matching of the c and d image planes. The two laterally adjacent image planes of a group of four camera groups are a corresponding left plane and right plane; that is, the right image plane corresponding to left image plane a is b, and the right image plane corresponding to left image plane c is d.
Similarly, the following matching condition can be obtained:
Matching condition 2): when one viewpoint is imaged on the four image planes of a group of four camera groups, the imaging points on two longitudinally adjacent image planes satisfy: the imaging point of the viewpoint on the upper image plane and the imaging point of the viewpoint on the corresponding lower image plane lie on one straight line parallel to the longitudinal coordinate axis, and the vertical offset of the imaging point on the upper image plane relative to the coordinate origin of the upper image plane is greater than the vertical offset of the imaging point on the lower image plane relative to the coordinate origin of the lower image plane.
Matching condition 2) applies to the longitudinal matching of the a and c image planes and likewise to the longitudinal matching of the b and d image planes. The two longitudinally adjacent image planes of a group of four camera groups are a corresponding upper plane and lower plane; that is, the lower image plane corresponding to upper image plane a is c, and the lower image plane corresponding to upper image plane b is d.
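As a hedged numerical illustration of matching conditions 1) and 2), the forward projection of a world point onto the four image planes of the FIG. 1 layout can be simulated as follows (the function names and the returned dictionary layout are assumptions, not part of the patent):

```python
def project(P, m, n, f, u, v):
    """Project a world point P = (Px, Py, Pz), Pz > f, onto the a, b, c, d
    image planes of the FIG. 1 four-camera group (forward projection model).
    Each image plane has its origin at its upper-left corner, x to the right
    and y downward, with the plane center (u/2, v/2) on the optical axis.
    Returns a dict of (x, y) image coordinates keyed by camera name."""
    Px, Py, Pz = P
    def cam(fx, fy):
        # camera focus at (fx, fy, 0); image offset scales with f / Pz
        x = u / 2 + f * (Px - fx) / Pz
        y = v / 2 - f * (Py - fy) / Pz
        return (x, y)
    return {
        "a": cam(-m / 2, n / 2), "b": cam(m / 2, n / 2),
        "c": cam(-m / 2, -n / 2), "d": cam(m / 2, -n / 2),
    }
```

For any such point, the projections satisfy condition 1) (equal ordinates on a and b, with the abscissa on a larger) and condition 2) (equal abscissas on a and c, with the ordinate measured from the top edge larger on a).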
Based on the above and the principle of similar triangles, for FIG. 3:
① (P_ax - u/2) / f = (P_x + m/2) / P_z
② (P_bx - u/2) / f = (P_x - m/2) / P_z
③ (P_ax - P_bx) / f = m / P_z (obtained by subtracting ② from ①)
From ① and ② it is derived that:
P_x = m(P_ax + P_bx - u) / (2(P_ax - P_bx))
From ① and ③ it is derived that:
P_z = m·f / (P_ax - P_bx)
During lateral matching, define (P_ax - P_bx) as the lateral offset value of the imaging point on the b image plane relative to the imaging point on the a image plane, denoted Δx. Then the following is obtained:
P_x coordinate calculation formula for lateral matching of the a and b image planes (Formula 1):
P_x = m(P_ax + P_bx - u) / (2Δx)
P_z coordinate calculation formula for lateral matching of the a and b image planes (Formula 2):
P_z = m·f / Δx
Similarly, the coordinate calculation formulas for lateral matching of the c and d image planes can be derived:
P_x coordinate calculation formula for lateral matching of the c and d image planes (Formula 3):
P_x = m(P_cx + P_dx - u) / (2Δx)
P_z coordinate calculation formula for lateral matching of the c and d image planes (Formula 4):
P_z = m·f / Δx
That is, (P_ax - P_bx) = (P_cx - P_dx) = Δx, where (P_cx - P_dx) is the lateral offset value of the imaging point on the d image plane relative to the imaging point on the c image plane during lateral matching.
As with lateral matching, the coordinate calculation formulas for longitudinal matching can also be derived; define (P_ay - P_cy) as the longitudinal offset value of the imaging point on the c image plane relative to the imaging point on the a image plane during longitudinal matching, denoted Δy.
P_y coordinate calculation formula for longitudinal matching of the a and c image planes (Formula 5):
P_y = n(v - P_ay - P_cy) / (2Δy)
P_z coordinate calculation formula for longitudinal matching of the a and c image planes (Formula 6):
P_z = n·f / Δy
P_y coordinate calculation formula for longitudinal matching of the b and d image planes (Formula 7):
P_y = n(v - P_by - P_dy) / (2Δy)
P_z coordinate calculation formula for longitudinal matching of the b and d image planes (Formula 8):
P_z = n·f / Δy
That is, (P_ay - P_cy) = (P_by - P_dy) = Δy, where (P_by - P_dy) is the longitudinal offset value of the imaging point on the d image plane relative to the imaging point on the b image plane during longitudinal matching.
③ Further extension of lateral and longitudinal matching of feature points of the four-camera group planar array
For the same viewpoint, from Formula 2, Formula 4, Formula 6, and Formula 8 it can be derived that:
P_z = m·f / Δx = n·f / Δy
which yields the following formula (Formula 9):
Δx / Δy = m / n
From this, the following matching condition is obtained:
Matching condition 3): the matching point group or sub-matching point group corresponding to one viewpoint on the four image planes of a group of four camera groups satisfies: the ratio of the lateral offset value to the longitudinal offset value equals the ratio of the length to the width of the reference rectangle, where the lateral offset value is the difference between the horizontal offset of the viewpoint's imaging point on a left image plane relative to the coordinate origin of that left image plane and the horizontal offset of the viewpoint's imaging point on the corresponding right image plane relative to the coordinate origin of that right image plane, and the longitudinal offset value is the difference between the vertical offset of the viewpoint's imaging point on an upper image plane relative to the coordinate origin of that upper image plane and the vertical offset of the viewpoint's imaging point on the corresponding lower image plane relative to the coordinate origin of that lower image plane; a matching point group refers to four imaging points on the four image planes that satisfy matching condition 3) and have similar image features, and a sub-matching point group refers specifically to imaging points on the two image planes adjacent to the base image plane that satisfy matching condition 3).
2. Proof of uniqueness of the four-camera group planar array feature point matching method
Unique matching of feature points can be achieved by the four-camera group planar array matching method; the matching method has good universality and applicability, which is proven as follows:
First, FIG. 4 is a perspective view showing only the ambiguity points that arise when the a and b image planes are laterally matched. Referring to FIG. 4, P is a viewpoint of the measured spatial object with coordinates P(P_x, P_y, P_z), and its imaging points on the a and b image planes are P_a and P_b, respectively. In the matching operation, suppose P_a is taken as the reference point (feature point) and the imaging point matching it is sought on the b image plane. Once P_a is determined, according to matching condition 1), the imaging point on the b image plane matching P_a lies on a straight line passing through P_a and parallel to the X axis, and the horizontal coordinate value of the matching imaging point on the b image plane is smaller than the horizontal coordinate value of P_a on the a image plane. For P_a, once its position on the a image plane is determined, the viewpoint corresponding to P_a lies on the extension of the ray on which O_aP_a lies (also called the pixel projection line, i.e., the ray determined by the focus of one camera and any imaging point). Matching between the a and b image planes is binocular matching; according to the currently known operation rules, operation methods, and constraint conditions of binocular matching, when the image features of P_a are known, the matching imaging point determined for it on the b image plane is generally not unique but may be multiple, so a problem arises with the uniqueness of binocular matching. The problem to be solved at present is, among the multiple matching points satisfying the matching condition, to remove the one or more ambiguous points and determine the imaging point P_b on the b image plane that uniquely matches P_a.
To this end, suppose P_b' is another imaging point on the b image plane matching P_a. According to matching condition 1), P_b' lies on the same straight line as P_aP_b, the ray determined by the line connecting P_b' and O_b lies in the same plane as the ray on which O_aP_a lies, and the two rays intersect; let the intersection be P'. It can therefore also be considered that P_b' is the imaging point of P' on the b image plane, and this point also corresponds to P_a on the a image plane.
FIG. 5 is a perspective view of an ideal straight line PP' imaged on the four image planes; FIG. 6 is a plan view of the imaging points on the four image planes of FIG. 5. Referring to FIG. 5 and FIG. 6, the imaging points of P and P' on the c and d image planes can be seen, and the imaging points P_a and P_a' of P and P' on the a image plane coincide. (P_a, P_b, P_c, P_d) is the matching point group corresponding to P and is the unique matching point group corresponding to P; (P_a', P_b', P_c', P_d') is the matching point group corresponding to P' and is the unique matching point group corresponding to P'.
The uniqueness of the four-camera group planar array matching algorithm lies in that, according to the matching conditions, with P_a as the reference point, the imaging points uniquely matching it on the b, c, and d image planes are found respectively. First, the matching imaging point P_b and another arbitrarily assumed matching imaging point P_b' are found on the b image plane; if both imaging points are matching points of P_a, then according to the image imaging principle, the imaging points shown in FIG. 5 are obtained on the a, b, c, and d image planes. It can further be deduced that P_a(P_a'), P_bP_b', P_cP_c', and P_dP_d' in FIG. 6 are the projection images of the line PP' on the four image planes, where P_a coincides with P_a'.
If an ideal straight line identical or similar to PP' in FIG. 5 exists, its imaging points on the four image planes of a group of four camera groups are as shown in FIG. 5 and FIG. 6. PP' is only an example; in fact, the imaging points on the a image plane of all viewpoints between P and P' are the point P_a. In other words, in this case, if the a image plane is taken as the base image plane, P_a corresponds to multiple matching point groups, but each matching point group corresponds to a unique viewpoint.
According to FIG. 6, combined with Formula 9:
Δx / Δy = m / n
the following can be obtained:
Matching condition 4): the imaging points of one viewpoint on the four image planes of a group of four camera groups form a rectangle, the ratio of the lateral length to the longitudinal length of this rectangle equals the ratio of the length to the width of the reference rectangle, and the two pairs of imaging points on the two diagonally opposite image planes lie respectively on two straight lines parallel to the two diagonals of the reference rectangle.
From the above it can be seen that, when a line similar to PP' as shown above appears on the viewed object, single-point-to-multi-point matching occurs if binocular matching is performed with the a and b image planes, which cannot meet the requirement of unique matching and thus produces matching ambiguity. But if the c and d image planes are used as references for the matching of the a and b image planes, and in particular on the d image plane, where the line PP' becomes an inclined line, then the unique matching point of P' on each of the four image planes, and the unique matching point group corresponding to each point on the line PP', can be found, and the spatial position coordinates of the points on the line PP' can then be calculated from the above coordinate calculation formulas. This is analogous to the three views required in mechanical drawing: with two views the understanding of an object can be ambiguous, but with three views this does not happen.
To further prove the uniqueness of the matching, the following cases are considered:
In the first case, suppose P_b' is another point on the b image plane whose image features match those of P_a, while P_b' is not the projected imaging point of P' on the b image plane but another arbitrary point in space that can produce the projected imaging point P_b' on the b image plane, denoted P_n. FIG. 7 is a projection view demonstrating the uniqueness of matching. Referring to FIG. 7, the imaging points of P', P_n, and P'' on the b image plane coincide. According to the imaging principle, a perspective view of P_n projected on the four image planes is shown in FIG. 8. If the projection points of P_n on the a, b, c, and d image planes are P_na, P_nb, P_nc, and P_nd, respectively, then FIG. 9 shows the imaging points of P and P_n on the a, b, c, and d image planes.
According to the four-camera group planar array feature point matching method, it can be seen from FIG. 9 that P_nc and P_a are not on one longitudinal straight line, which does not satisfy matching condition 2), and P_nd and P_a are not on a straight line parallel to a diagonal of the rectangle formed by the foci of the group of four camera groups, which does not satisfy matching condition 4). Therefore, although the ambiguous imaging points P_b' and P_nb appear in binocular matching, according to the four-camera group planar array feature point matching method it can be determined that the arbitrarily assumed matching point P_nb is not an imaging point matching the imaging point of P on the a image plane. In this case, the four-camera group planar array feature point matching method can effectively remove the ambiguity that easily arises in binocular matching.
In the second case, suppose that point Pb′ is not the projection of point P′ on the b image plane but, specifically, the projection on the b image plane of an arbitrary point P″ on the straight line PP″ parallel to the straight line OaOb, denoted Pb″. FIG. 10 shows a perspective schematic view of the projections of point P″ onto the four image planes, and FIG. 11 shows the imaging points of points P and P″ on the a, b, c and d image planes, where point Pb″ coincides with point Pb′.
According to the feature point matching method of the planar array of the four-camera group, it can be seen from FIG. 11 that point Pc″ and point Pa are not on one longitudinal straight line, so matching condition 2) is not satisfied, and that point Pd″ and point Pa are not on a straight line parallel to a diagonal of the rectangle formed by the focal points of the four-camera group, so matching condition 4) is not satisfied. Therefore, it can be determined that the imaging point on the b image plane of an arbitrary point P″ on the straight line PP″ parallel to the straight line OaOb is not the imaging point matching the imaging point of P on the a image plane.
However, suppose there is a plane parallel to the rectangle formed by the focal points of a four-camera group on which there exist four viewed points P, P1, P2 and P3 with similar image features, arranged in a rectangle similar in size to that rectangle, as shown in FIG. 12. Then, when the matching points corresponding to point P are sought, point P3 also satisfies the feature point matching conditions of the planar array of the four-camera group. This is a further special case within the second case.
That is, the matching point groups actually corresponding to the viewed points P, P1, P2 and P3 are (Pa, Pb, Pc, Pd), (Pa1, Pb1, Pc1, Pd1), (Pa2, Pb2, Pc2, Pd2) and (Pa3, Pb3, Pc3, Pd3), respectively; but owing to the existence of point P3, among the matching point groups found for point P by the matching method there exists, besides (Pa, Pb, Pc, Pd), an ambiguous matching point group (Pa, Pb1, Pc2, Pd3).
In this situation, four points Pa, Pa1, Pa2 and Pa3 corresponding to the viewed points P, P1, P2 and P3 can be found on the a image plane, where (Pa, Pa1, Pa2, Pa3), (Pb, Pb1, Pb2, Pb3), (Pc, Pc1, Pc2, Pc3) and (Pd, Pd1, Pd2, Pd3) each form a rectangle similar to the rectangle formed by the focal points of the four-camera group.
The ambiguous matching point group can be found as follows. In the above situation, when for the imaging point Pa in the base image plane the matching point groups (Pa, Pb, Pc, Pd) and (Pa, Pb1, Pc2, Pd3) are found according to the matching conditions, there exists on the a image plane a rectangle PaPa1Pa2Pa3 similar to the reference rectangle with point Pa as its upper-left endpoint. It then suffices to determine that there exists on the d image plane a uniquely corresponding rectangle PdPd1Pd2Pd3 congruent with the rectangle PaPa1Pa2Pa3; it can then be determined that the unique matching point corresponding to Pa is Pd and the unique matching point corresponding to Pa3 is Pd3, and the ambiguous matching point group is excluded.
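This exclusion test can be sketched as code. The names, the (x, y, descriptor) tuple representation, and the caller-supplied similarity predicate are illustrative assumptions; the key geometric fact used is that congruent, equally oriented rectangles on the a and d planes share the same diagonal vector, so the remaining corners of the base rectangle can be predicted from the two ambiguous d-plane matching points and looked up directly:

```python
def base_rectangle_exists(pa, pd, pd3, feats_a, similar, eps=1e-6):
    """Return True when the a plane holds a rectangle of feature points
    similar to pa, with pa as one endpoint, congruent with the d-plane
    rectangle spanned by the ambiguous matching points pd and pd3."""
    dx, dy = pd3[0] - pd[0], pd3[1] - pd[1]   # diagonal vector Pd -> Pd3
    # predicted corners Pa1, Pa2 and the diagonal endpoint Pa3 on the a plane
    corners = [(pa[0] + dx, pa[1]), (pa[0], pa[1] + dy),
               (pa[0] + dx, pa[1] + dy)]
    def found(p):
        return any(abs(q[0] - p[0]) <= eps and abs(q[1] - p[1]) <= eps
                   and similar(pa[2], q[2]) for q in feats_a)
    return all(found(c) for c in corners)
```

If the rectangle exists, the endpoint rule of the description assigns Pd to Pa and Pd3 to Pa3; if it does not, the two matching point groups belong to genuinely different viewed points on the pixel projection line of Pa.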
Similarly, when the b, c or d image plane is taken as the base image plane, analogous reasoning can be used to exclude the one or more ambiguous matching point groups that are found.
In the embodiments of the present invention, the viewed points that can be imaged on the four image planes of a four-camera group are matched according to the matching conditions, so that the spatial position coordinates of the viewed points are obtained and three-dimensional point cloud data are formed; a three-dimensional point cloud graph can then be built and the viewed object reproduced three-dimensionally. Owing to the uniqueness of four-camera-group matching, the spatial position coordinates of a viewed point are likewise unique, without one-to-many or many-to-one ambiguity.
3. Feature point matching method and measuring method of the planar array of a four-camera group
According to the foregoing matching principle, matching conditions, matching formulas and matching method, in one aspect, referring to FIG. 13, a feature point matching method of a planar array of a four-camera group comprises the following steps:
a1. Taking any one of the four image planes corresponding to the four cameras of a four-camera group as the base image plane, and for a feature point on the base image plane, finding all matching points on the image plane laterally adjacent to the base image plane that match the feature point;
In step a1, for a feature point in the base image plane, all matching points on the image plane laterally adjacent to the base image plane that match the feature point are found according to matching condition 1).
a2. For the feature point on the base image plane in step a1, finding all matching points on the image plane longitudinally adjacent to the base image plane that match the feature point;
In step a2, for the feature point on the base image plane in step a1, all matching points on the image plane longitudinally adjacent to the base image plane that match the feature point are found according to matching condition 2).
a3. Re-matching all the matching points found in step a1 with all the matching points found in step a2 to find all corresponding sub matching point groups;
In step a3, all the matching points found in step a1 and all the matching points found in step a2 are re-matched according to matching condition 3); the matching points that do not satisfy matching condition 3) are excluded, and the matching points that satisfy matching condition 3) are paired according to the magnitude relations of their lateral or longitudinal offset values to form sub matching point groups.
a4. According to the feature point on the base image plane and all the sub matching point groups found in step a3, finding the matching points on the diagonally located image plane that correspond to the feature point on the base image plane and to all the sub matching point groups found in step a3, the diagonally located image plane being the image plane located diagonally to the base image plane;
In step a4, according to the feature point in the base image plane and all the sub matching point groups found in step a3, the matching point on the diagonally located image plane is found for any sub matching point group according to matching condition 1) and matching condition 2); the coordinates of the matching point on the diagonally located image plane are obtained from the coordinates of the sub matching point group found in step a3, the abscissa of this matching point being equal to the abscissa of the matching point on the image plane longitudinally adjacent to the diagonally located image plane, and its ordinate being equal to the ordinate of the matching point on the image plane laterally adjacent to the diagonally located image plane. After the coordinates are determined, image-feature similarity matching is performed between the matching point on the diagonally located image plane and the matching points on the other three image planes; if the matching succeeds, the four imaging points on the four image planes form a matching point group; otherwise they cannot match, and the matching point on the diagonally located image plane and the matching point group found in step a3 corresponding to that matching point are excluded.
a5. According to the feature point in the base image plane, all the sub matching point groups found in step a3 and the matching points found in step a4, determining the unique matching point group on the four image planes corresponding to the same viewed point.
In step a5, the unique matching point group on the four image planes corresponding to the same viewed point satisfies matching condition 4), which is: the imaging points of a viewed point on the four image planes corresponding to a four-camera group form a rectangle; the ratio of the lateral length to the longitudinal length of this rectangle equals the ratio of the length to the width of the reference rectangle, and the two pairs of imaging points on the two diagonally located image planes lie respectively on two straight lines parallel to the two diagonals of the reference rectangle.
In step a5, if only one matching point group satisfies matching condition 4), the matching result is unique.
In step a5, if there are several matching point groups satisfying matching condition 4), it is judged whether there exists on the base image plane a base rectangle satisfying the following conditions: the base rectangle has the feature point as one endpoint; the extension of the diagonal of the base rectangle having the feature point as one endpoint passes through the diagonally located image plane, and the length of this diagonal equals the distance between the two matching points, on the diagonally located image plane, of any two matching point groups corresponding to the feature point; the base rectangle is similar to the reference rectangle; and the other endpoints of the base rectangle are similar in image features to the feature point and are matching points of the feature point on the base image plane;
If such a rectangle exists on the base image plane, then for any base rectangle it is determined whether there exists on the diagonally located image plane a uniquely corresponding rectangle congruent with that base rectangle;
(In the embodiments of the present invention, a uniquely corresponding rectangle on the diagonally located image plane congruent with any base rectangle means a rectangle that is congruent with the base rectangle and whose four endpoints and the four endpoints at the corresponding positions of the base rectangle mutually satisfy the matching conditions and are similar in image features. Various existing matching operation methods or rules may be combined to determine whether such a uniquely corresponding rectangle exists on the diagonally located image plane; the related matching operation methods or rules are not part of the inventive content of the present invention and are therefore not described in detail here. A person of ordinary skill in the art can conceive of combining various different ways to find the uniquely corresponding rectangle.)
If it exists, the unique matching point of the feature point on the diagonally located image plane is determined first, and then the unique matching points of the feature point on the remaining two image planes are determined according to matching condition 1) and matching condition 2), excluding the ambiguous matching point groups. The method of determining the unique matching point of the feature point on the diagonally located image plane is as follows: if the feature point on the base image plane is an endpoint of a diagonal of the base rectangle, the unique matching point on the diagonally located image plane is one of the two endpoints of that diagonal of the uniquely corresponding rectangle on the diagonally located image plane whose extension passes through the feature point; if the feature point is at the upper end of the base rectangle, the corresponding unique matching point on the diagonally located image plane is the endpoint at the upper end of the uniquely corresponding rectangle, and otherwise the endpoint at the lower end. For a non-unique matching point on the diagonally located image plane, it can be determined that its unique matching point on the base image plane is the other endpoint of the diagonal on which the feature point on the base image plane lies;
If it does not exist, the two matching point groups corresponding to the feature point correspond to two different viewed points, the two different viewed points being located in space on the extension of the pixel projection line of the feature point relative to the base image plane;
If no such rectangle exists on the base image plane, the several matching point groups correspond to several different viewed points, the several viewed points being located in space on the extension of the pixel projection line of the feature point relative to the base image plane.
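Steps a1 to a5 can be sketched end to end. Everything below is illustrative scaffolding rather than the patent's implementation: feature points are (x, y, descriptor) tuples in per-plane coordinates, similar() is a stand-in for whatever image-feature similarity test is actually used, and exact coordinate equality is relaxed to a tolerance eps:

```python
from itertools import product

def similar(d1, d2, tol=0.1):
    # stand-in image-feature similarity test on scalar descriptors
    return abs(d1 - d2) <= tol

def match_groups(pa, feats_b, feats_c, feats_d, m, n, eps=1e-6):
    """Candidate matching point groups for base-plane feature point pa,
    with the a image plane as the base, following steps a1-a5."""
    ax, ay, da = pa
    # a1: condition 1) candidates on the laterally adjacent b plane
    B = [q for q in feats_b
         if abs(q[1] - ay) <= eps and ax - q[0] > eps and similar(da, q[2])]
    # a2: condition 2) candidates on the longitudinally adjacent c plane
    C = [q for q in feats_c
         if abs(q[0] - ax) <= eps and ay - q[1] > eps and similar(da, q[2])]
    # a3: condition 3): keep (b, c) pairs whose offset ratio is m : n
    pairs = [(b, c) for b, c in product(B, C)
             if abs((ax - b[0]) * n - (ay - c[1]) * m) <= eps]
    groups = []
    for b, c in pairs:
        # a4: the diagonal d-plane point takes its abscissa from b
        # (longitudinally adjacent to d) and its ordinate from c
        dpt = next((q for q in feats_d
                    if abs(q[0] - b[0]) <= eps and abs(q[1] - c[1]) <= eps
                    and similar(da, q[2])), None)
        if dpt is not None:
            groups.append(((ax, ay), b[:2], c[:2], dpt[:2]))
    # a5: a single surviving group is the unique matching point group;
    # several groups trigger the base-rectangle disambiguation of the text
    return groups
```

A single returned group is the unique matching point group of step a5; several returned groups correspond to the ambiguous configurations handled by the base-rectangle test.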
A detailed description is given below with an embodiment:
Referring to FIG. 12, briefly: suppose the a image plane is taken as the base image plane. For a feature point Pa on the a image plane, all matching points on the b image plane matching it are found, e.g., Pb and Pb1; next, for the feature point Pa, all matching points on the c image plane matching it are found, e.g., Pc and Pc2; then Pb and Pb1 are re-matched with Pc and Pc2, and the sub matching point groups (Pb, Pc) and (Pb1, Pc2) are determined; subsequently, Pd and its coordinates corresponding to (Pb, Pc), and Pd3 and its coordinates corresponding to (Pb1, Pc2), are determined. If it is judged that the image features of the four imaging points in (Pa, Pb, Pc, Pd) are similar and those of the four imaging points in (Pa, Pb1, Pc2, Pd3) are also similar, then Pa corresponds to two matching point groups (Pa, Pb, Pc, Pd) and (Pa, Pb1, Pc2, Pd3). It is then determined that a base rectangle PaPa1Pa2Pa3 exists on the a image plane which satisfies: Pa is one endpoint; the extension of the diagonal PaPa3 passes through the d image plane and the length of PaPa3 equals the distance PdPd3; the rectangle is similar to the reference rectangle; and the image features of Pa, Pa1, Pa2 and Pa3 are similar. It is further determined that a uniquely corresponding rectangle PdPd1Pd2Pd3 congruent with this rectangle exists on the d image plane, where Pa and Pd, Pa1 and Pd1, Pa2 and Pd2, and Pa3 and Pd3 are mutually corresponding matching points with similar image features. It can then be determined that the unique matching point on the d image plane corresponding to Pa is Pd, while the non-unique matching point Pd3 on the d image plane corresponds to the feature point Pa3 on the a image plane. The matching point group (Pa, Pb1, Pc2, Pd3) is thereby excluded, and (Pa, Pb, Pc, Pd) is determined to be the unique matching point group corresponding to the same viewed point.
In another aspect, referring to FIG. 14, a measuring method based on the feature point matching method of the planar array of a four-camera group comprises the following steps:
b1. After image acquisition is completed, using the above matching method to find the unique matching point groups corresponding to all feature points in the base image plane.
b2. Calculating the spatial position coordinates of the viewed points from the image coordinates of the unique matching point groups obtained in step b1.
In step b2, let the focal points of camera a at the upper-left position, camera b at the upper-right position, camera c at the lower-left position and camera d at the lower-right position of a four-camera group be Oa, Ob, Oc and Od, respectively, the four focal points lying in one plane and forming a rectangle whose length OaOb is m and width OaOc is n; let the center of the rectangle be O, and establish a three-dimensional rectangular coordinate system with O as the origin, the X axis parallel to the sides OaOb and OcOd of the rectangle, the Y axis parallel to the sides OaOc and ObOd of the rectangle, and the Z axis perpendicular to the plane of the focal points and parallel to the optical axes of the four cameras; the configurations of the four cameras are identical. Let the spatial position coordinates of a viewed point P of the viewed object be P(Px, Py, Pz), and let the image coordinates of the imaging points of P on the image planes of cameras a, b, c and d be Pa(Pax, Pay), Pb(Pbx, Pby), Pc(Pcx, Pcy) and Pd(Pdx, Pdy), respectively. Then the spatial position coordinates of point P are expressed as:
Formula for the Px coordinate obtained from lateral matching of the a and b image planes:

Px = m(Pax + Pbx − u) / (2(Pax − Pbx))

Formula for the Pz coordinate obtained from lateral matching of the a and b image planes:

Pz = f·m / (Pax − Pbx)

Formula for the Px coordinate obtained from lateral matching of the c and d image planes:

Px = m(Pcx + Pdx − u) / (2(Pcx − Pdx))

Formula for the Pz coordinate obtained from lateral matching of the c and d image planes:

Pz = f·m / (Pcx − Pdx)

Formula for the Py coordinate obtained from longitudinal matching of the a and c image planes:

Py = n(v − Pay − Pcy) / (2(Pay − Pcy))

Formula for the Pz coordinate obtained from longitudinal matching of the a and c image planes:

Pz = f·n / (Pay − Pcy)

Formula for the Py coordinate obtained from longitudinal matching of the b and d image planes:

Py = n(v − Pby − Pdy) / (2(Pby − Pdy))

Formula for the Pz coordinate obtained from longitudinal matching of the b and d image planes:

Pz = f·n / (Pby − Pdy)
where f is the focal length of the four cameras, u is the target surface length of the image sensor, and v is the target surface width of the image sensor; Δx is defined, for lateral matching, as the lateral offset value of the imaging point on the b image plane relative to the imaging point on the a image plane and of the imaging point on the d image plane relative to the imaging point on the c image plane, and Δy is defined, for longitudinal matching, as the longitudinal offset value of the imaging point on the c image plane relative to the imaging point on the a image plane and of the imaging point on the d image plane relative to the imaging point on the b image plane.
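Under one set of assumed conventions (a pinhole model with each plane's image origin at the top-left corner of its target surface, image x rightward and image y downward; these conventions are illustrative assumptions, since the text fixes only the world coordinate frame), the coordinate formulas can be checked by a projection/triangulation round trip:

```python
def project(P, m, n, f, u, v):
    """Image the world point P = (Px, Py, Pz) on the four planes of the
    assumed pinhole model described above."""
    Px, Py, Pz = P
    foci = {'a': (-m / 2,  n / 2), 'b': (m / 2,  n / 2),
            'c': (-m / 2, -n / 2), 'd': (m / 2, -n / 2)}
    return {k: (f * (Px - ox) / Pz + u / 2,   # horizontal image coordinate
                v / 2 - f * (Py - oy) / Pz)   # vertical, increasing downward
            for k, (ox, oy) in foci.items()}

def triangulate(pts, m, n, f, u, v):
    """Recover (Px, Py, Pz) from a unique matching point group using the
    a,b lateral and a,c longitudinal coordinate formulas."""
    (Pax, Pay), (Pbx, _), (_, Pcy) = pts['a'], pts['b'], pts['c']
    dx = Pax - Pbx                     # lateral offset value
    dy = Pay - Pcy                     # longitudinal offset value
    Px = m * (Pax + Pbx - u) / (2 * dx)
    Py = n * (v - Pay - Pcy) / (2 * dy)
    Pz = f * m / dx                    # equals f * n / dy for a true match
    return Px, Py, Pz
```

For a consistent matching point group the two Pz estimates agree and Δx/Δy equals m/n, which is exactly matching condition 3); an inconsistent group reveals itself by a mismatch between the lateral and longitudinal depth estimates.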
b3. From the spatial position coordinates of the viewed points obtained in step b2, forming three-dimensional point cloud data, building a three-dimensional point cloud graph, and reproducing a three-dimensional stereoscopic image.
The matching method of the present invention and the three-dimensional measuring system constructed from it are an autonomous matching method and measuring system; the matching conditions and method do not change with changes in the external environment. Once the parameters of the measuring system are determined, the field of view measured by the system, the measurement accuracy, the spatial resolution, the adaptability to the environment and so on can be determined. For example, changing the focal length allows distant objects to be measured, changing the camera (e.g., using an infrared camera) enables night-time measurement, and using a microscope lens enables microscopic stereoscopic measurement.
The calculation and measurement of the three-dimensional coordinate positions of the feature points of a viewed object can be achieved by the above methods and steps. However, since the image planes are affected by the field of view, resolution, environment and illumination conditions, since shooting parameters of the camera itself such as aperture, shutter and exposure time are adjusted, and since the geometric features, edge features, surface reflection and texture features of the viewed object affect the matching of imaging points, cases that cannot be completely matched inevitably exist. Because image features also include continuity and other constraints, after the measurement of the key points with distinct image edges and other features has been solved by the method provided in the embodiments of the present invention, any known image processing method and means can be used to analyze and process the parts that cannot be completely matched, so that most problems of imaging three-dimensional objects can be solved. On the basis of existing image processing technology, this provides a basic matching method and measuring method for three-dimensional visual measurement technology.
In the several embodiments provided by the present invention, it should be understood that the provided methods can also be implemented in other ways. It should also be noted that in some alternative implementations, the method steps or flows may occur in an order different from that noted in the claims or drawings. For example, two consecutive steps or flows may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the specific situation.
If the methods or functions of the present invention are implemented in the form of software function modules and sold or used as independent products, they can be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention in essence, or the part contributing to the prior art, or part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to execute all or some of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory, a random access memory, a magnetic disk or an optical disk.
The above are merely preferred embodiments of the present invention and are not intended to limit it; for a person of ordinary skill in the art, the present invention may have various modifications and variations. Any modification, equivalent replacement, variation and the like made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (11)

  1. A feature point matching method of a planar array of a four-camera group, characterized by comprising the following steps:
    a1. taking any one of the four image planes corresponding to the four cameras of a four-camera group as a base image plane, and for a feature point on the base image plane, finding all matching points on the image plane laterally adjacent to the base image plane that match the feature point;
    a2. for the feature point on the base image plane in step a1, finding all matching points on the image plane longitudinally adjacent to the base image plane that match the feature point;
    a3. re-matching all the matching points found in step a1 with all the matching points found in step a2 to find all corresponding sub matching point groups;
    a4. according to the feature point on the base image plane and all the sub matching point groups found in step a3, finding the matching points on a diagonally located image plane that correspond to the feature point on the base image plane and to all the sub matching point groups found in step a3, the diagonally located image plane being the image plane located diagonally to the base image plane;
    a5. according to the feature point on the base image plane, all the sub matching point groups found in step a3 and the matching points found in step a4, determining the unique matching point group on the four image planes corresponding to the same viewed point.
  2. The matching method according to claim 1, characterized in that
    in step a1, for a feature point on the base image plane, all matching points on the image plane laterally adjacent to the base image plane that match the feature point are found according to matching condition 1), wherein matching condition 1) is: the imaging points of a viewed point on two laterally adjacent image planes satisfy that the imaging point of the viewed point on the left image plane and the imaging point of the viewed point on the corresponding right image plane lie on one straight line parallel to the lateral coordinate axis, and the horizontal offset of the imaging point on the left image plane relative to the coordinate origin of that left image plane is greater than the horizontal offset of the imaging point on the right image plane relative to the coordinate origin of that right image plane.
  3. The matching method according to claim 1, characterized in that
    in step a2, for the feature point on the base image plane in step a1, all matching points on the image plane longitudinally adjacent to the base image plane that match the feature point are found according to matching condition 2), wherein matching condition 2) is: the imaging points of a viewed point on two longitudinally adjacent image planes satisfy that the imaging point of the viewed point on the upper image plane and the imaging point of the viewed point on the corresponding lower image plane lie on one straight line parallel to the longitudinal coordinate axis, and the vertical offset of the imaging point on the upper image plane relative to the coordinate origin of that upper image plane is greater than the vertical offset of the imaging point on the lower image plane relative to the coordinate origin of that lower image plane.
  4. The matching method according to claim 1, characterized in that
    in step a3, all the matching points found in step a1 and all the matching points found in step a2 are re-matched according to matching condition 3); the matching points that do not satisfy matching condition 3) are excluded, and the matching points that satisfy matching condition 3) are paired according to the magnitude relations of their lateral or longitudinal offset values to form sub matching point groups, wherein matching condition 3) is: the matching point group corresponding to a viewed point on the four image planes of a four-camera group satisfies that the ratio between the lateral offset value and the longitudinal offset value equals the ratio of the length to the width of the reference rectangle, the lateral offset value being the difference between the horizontal offset of the imaging point of the viewed point on a left image plane relative to the coordinate origin of that left image plane and the horizontal offset of the imaging point of the viewed point on the corresponding right image plane relative to the coordinate origin of that right image plane, and the longitudinal offset value being the difference between the vertical offset of the imaging point of the viewed point on an upper image plane relative to the coordinate origin of that upper image plane and the vertical offset of the imaging point of the viewed point on the corresponding lower image plane relative to the coordinate origin of that lower image plane, the reference rectangle being the rectangle formed by the four focal points of a four-camera group.
  5. The matching method according to claim 1, characterized in that
    in step a4, according to the feature point on the base image plane and the sub matching point groups found in step a3, the matching point on the diagonally located image plane is found for any sub matching point group according to matching condition 1) and matching condition 2); the coordinates of the matching point located on the diagonally located image plane are obtained from the coordinates of all the sub matching point groups found in step a3, the abscissa of this matching point being equal to the abscissa of the matching point on the image plane longitudinally adjacent to the diagonally located image plane, and its ordinate being equal to the ordinate of the matching point on the image plane laterally adjacent to the diagonally located image plane; after the coordinates are determined, image-feature similarity matching is performed between the matching point on the diagonally located image plane and the matching points on the other three image planes; if the matching succeeds, the four imaging points on the four image planes constitute a matching point group; otherwise they cannot match, and the matching point on the diagonally located image plane and the sub matching point group found in step a3 corresponding to that matching point are excluded.
  6. The matching method according to claim 1, characterized in that
    in step a5, the unique matching point group on the four image planes corresponding to the same viewed point satisfies matching condition 4), wherein matching condition 4) is: the imaging points of a viewed point on the four image planes corresponding to a four-camera group form a rectangle, the ratio of the lateral length to the longitudinal length of the rectangle equals the ratio of the length to the width of the reference rectangle, and the two pairs of imaging points on the two diagonally located image planes lie respectively on two straight lines parallel to the two diagonals of the reference rectangle.
  7. The matching method according to claim 1, characterized in that
    in step a5, if only one matching point group satisfies matching condition 4), the matching result is unique.
  8. The matching method according to claim 1, characterized in that
    in step a5, if there are several matching point groups satisfying matching condition 4), it is judged whether there exists on the base image plane a base rectangle satisfying the following conditions: the base rectangle has the feature point as one endpoint; the extension of the diagonal of the base rectangle having the feature point as one endpoint passes through the diagonally located image plane, and the length of this diagonal equals the distance between the two matching points, on the diagonally located image plane, of any two matching point groups corresponding to the feature point; the base rectangle is similar to the reference rectangle; and the other endpoints of the base rectangle are similar in image features to the feature point and are matching points of the feature point on the base image plane;
    if such a rectangle exists on the base image plane, then for any base rectangle it is determined whether there exists on the diagonally located image plane a uniquely corresponding rectangle congruent with that base rectangle;
    if it exists, the unique matching point of the feature point on the diagonally located image plane is determined first, and then the unique matching points of the feature point on the remaining two image planes are determined according to matching condition 1) and matching condition 2), excluding the ambiguous matching point groups, wherein the method of determining the unique matching point of the feature point on the diagonally located image plane is: if the feature point on the base image plane is an endpoint of a diagonal of the base rectangle, the unique matching point on the diagonally located image plane is one of the two endpoints of that diagonal of the uniquely corresponding rectangle on the diagonally located image plane whose extension passes through the feature point; if the feature point is at the upper end of the base rectangle, the corresponding unique matching point on the diagonally located image plane is the endpoint at the upper end of the uniquely corresponding rectangle, and otherwise the endpoint at the lower end; for a non-unique matching point on the diagonally located image plane, it can be determined that its unique matching point on the base image plane is the other endpoint of the diagonal on which the feature point on the base image plane lies;
    if it does not exist, the two matching point groups corresponding to the feature point correspond to two different viewed points, the two different viewed points being located in space on the extension of the pixel projection line of the feature point relative to the base image plane;
    if no such rectangle exists on the base image plane, the several matching point groups correspond to several different viewed points, the several different viewed points being located in space on the extension of the pixel projection line of the feature point relative to the base image plane.
  9. The matching method according to any one of claims 1-8, characterized in that the feature point refers to an imaging point corresponding to one or more matching points, the imaging point having image features distinguishing it from other imaging points.
  10. A measuring method based on a feature point matching method of a planar array of a four-camera group, comprising the following steps:
    b1. after image acquisition is completed, using the matching method according to any one of claims 1-9 to find the unique matching point groups corresponding to all feature points on the base image plane;
    b2. calculating the spatial position coordinates of the viewed points from the image coordinates of the unique matching point groups obtained in step b1;
    b3. from the spatial position coordinates of the viewed points obtained in step b2, forming three-dimensional point cloud data, building a three-dimensional point cloud graph, and reproducing a three-dimensional stereoscopic image.
  11. The measuring method according to claim 10, characterized in that, in step b2, if the focal points of camera a at the upper-left position, camera b at the upper-right position, camera c at the lower-left position and camera d at the lower-right position of a four-camera group are Oa, Ob, Oc and Od, respectively, the four focal points lying in one plane and forming a rectangle whose length OaOb is m and width OaOc is n, the center of the rectangle being O, and a three-dimensional rectangular coordinate system is established with O as the origin, the X axis parallel to the sides OaOb and OcOd of the rectangle, the Y axis parallel to the sides OaOc and ObOd of the rectangle, and the Z axis perpendicular to the plane of the focal points and parallel to the optical axes of the four cameras, the configurations of the four cameras being identical, then, with the spatial position coordinates of a viewed point P of the viewed object being P(Px, Py, Pz) and the image coordinates of the imaging points of P on the image planes corresponding to cameras a, b, c and d being Pa(Pax, Pay), Pb(Pbx, Pby), Pc(Pcx, Pcy) and Pd(Pdx, Pdy), respectively, the spatial position coordinates of point P are expressed as:
    formula for the Px coordinate obtained from lateral matching of the a and b image planes:

    Px = m(Pax + Pbx − u) / (2(Pax − Pbx))

    formula for the Pz coordinate obtained from lateral matching of the a and b image planes:

    Pz = f·m / (Pax − Pbx)

    formula for the Px coordinate obtained from lateral matching of the c and d image planes:

    Px = m(Pcx + Pdx − u) / (2(Pcx − Pdx))

    formula for the Pz coordinate obtained from lateral matching of the c and d image planes:

    Pz = f·m / (Pcx − Pdx)

    formula for the Py coordinate obtained from longitudinal matching of the a and c image planes:

    Py = n(v − Pay − Pcy) / (2(Pay − Pcy))

    formula for the Pz coordinate obtained from longitudinal matching of the a and c image planes:

    Pz = f·n / (Pay − Pcy)

    formula for the Py coordinate obtained from longitudinal matching of the b and d image planes:

    Py = n(v − Pby − Pdy) / (2(Pby − Pdy))

    formula for the Pz coordinate obtained from longitudinal matching of the b and d image planes:

    Pz = f·n / (Pby − Pdy)
    where f is the focal length of the four cameras, u is the target surface length of the image sensor, and v is the target surface width of the image sensor; Δx is defined, for lateral matching, as the lateral offset value of the imaging point on the b image plane relative to the imaging point on the a image plane and of the imaging point on the d image plane relative to the imaging point on the c image plane, and Δy is defined, for longitudinal matching, as the longitudinal offset value of the imaging point on the c image plane relative to the imaging point on the a image plane and of the imaging point on the d image plane relative to the imaging point on the b image plane.
PCT/CN2016/088420 2016-07-04 2016-07-04 Feature point matching method of planar array of four-camera group and measuring method based on the same WO2018006246A1 (zh)

Priority Applications (6)

Application Number Priority Date Filing Date Title
US15/310,066 US10107617B2 (en) 2016-07-04 2016-07-04 Feature point matching method of planar array of four-camera group and measuring method based on the same
JP2016566782A JP6453908B2 (ja) 2016-07-04 2016-07-04 4カメラ組の平面アレイの特徴点のマッチング方法、及びそれに基づく測定方法
PCT/CN2016/088420 WO2018006246A1 (zh) 2016-07-04 2016-07-04 Feature point matching method of planar array of four-camera group and measuring method based on the same
EP16790253.5A EP3285232B1 (en) 2016-07-04 2016-07-04 Method for matching feature points of planar array of four-phase unit and measurement method on basis thereof
KR1020167033144A KR101926953B1 (ko) 2016-07-04 2016-07-04 4카메라 그룹 평면 어레이의 특징점의 매칭 방법 및 그에 기초한 측정 방법
CN201680000645.2A CN107850419B (zh) 2016-07-04 2016-07-04 Feature point matching method of planar array of four-camera group and measuring method based on the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/088420 WO2018006246A1 (zh) 2016-07-04 2016-07-04 Feature point matching method of planar array of four-camera group and measuring method based on the same

Publications (1)

Publication Number Publication Date
WO2018006246A1 true WO2018006246A1 (zh) 2018-01-11

Family

ID=60901500

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/088420 WO2018006246A1 (zh) 2016-07-04 2016-07-04 Feature point matching method of planar array of four-camera group and measuring method based on the same

Country Status (6)

Country Link
US (1) US10107617B2 (zh)
EP (1) EP3285232B1 (zh)
JP (1) JP6453908B2 (zh)
KR (1) KR101926953B1 (zh)
CN (1) CN107850419B (zh)
WO (1) WO2018006246A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109631829A (zh) * 2018-12-17 2019-04-16 南京理工大学 Binocular distance measuring method with adaptive fast matching

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3360023A4 (en) * 2015-10-09 2018-10-10 SZ DJI Technology Co., Ltd. Salient feature based vehicle positioning
DE102016217792A1 (de) * 2016-09-16 2018-03-22 Xion Gmbh Justiersystem
GB201803286D0 (en) 2018-02-28 2018-04-11 3D Oscopy Ltd Imaging system and method
CN109798831A (zh) * 2018-12-28 2019-05-24 辽宁红沿河核电有限公司 Binocular vision measuring method for fuel assemblies
WO2021097744A1 (zh) * 2019-11-21 2021-05-27 北京机电研究所有限公司 Dynamic measuring device for three-dimensional dimensions and measuring method thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08285532A (ja) * 1995-04-17 1996-11-01 Mitsubishi Heavy Ind Ltd Method and device for measuring the position of a moving body
US20050075585A1 (en) * 2002-08-26 2005-04-07 Mun-Sang Kim Apparatus and method for measuring jaw motion
CN102679961A (zh) * 2012-05-23 2012-09-19 武汉大学 Portable four-eye stereoscopic photogrammetric system and method
CN105627926A (zh) * 2016-01-22 2016-06-01 尹兴 Three-dimensional measuring system and measuring method for feature points of a planar array of a four-camera group

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5818959A (en) * 1995-10-04 1998-10-06 Visual Interface, Inc. Method of producing a three-dimensional image from two-dimensional images
JP4723825B2 (ja) 2004-07-15 2011-07-13 株式会社若本製作所 Frame structure
JP2007043466A (ja) * 2005-08-03 2007-02-15 Mitsubishi Electric Corp Image synthesizing device and multi-camera monitoring system
US8253740B2 (en) 2006-02-27 2012-08-28 Koninklijke Philips Electronics N.V. Method of rendering an output image on basis of an input image and a corresponding depth map
CN101033966A (zh) 2007-04-12 2007-09-12 上海交通大学 Photogrammetric method for traffic accident scenes
JP4985241B2 (ja) 2007-08-31 2012-07-25 オムロン株式会社 Image processing device
US8351685B2 (en) 2007-11-16 2013-01-08 Gwangju Institute Of Science And Technology Device and method for estimating depth map, and method for generating intermediate image and method for encoding multi-view video using the same
KR20090055803A (ko) 2007-11-29 2009-06-03 광주과학기술원 Method and apparatus for generating multi-view depth maps, and method for generating disparity values in multi-view images
US8069190B2 (en) 2007-12-27 2011-11-29 Cloudscale, Inc. System and methodology for parallel stream processing
JP5121477B2 (ja) 2008-01-30 2013-01-16 株式会社ユニバーサルエンターテインメント Banknote handling device and authenticity judging method
JP5173536B2 (ja) 2008-04-02 2013-04-03 シャープ株式会社 Imaging device and optical axis control method
CN101382417B (zh) * 2008-10-08 2010-07-07 北京信息科技大学 Non-contact six-degree-of-freedom displacement measuring device
US8395642B2 (en) 2009-03-17 2013-03-12 Mitsubishi Electric Research Laboratories, Inc. Method for virtual image synthesis
US8502862B2 (en) * 2009-09-30 2013-08-06 Disney Enterprises, Inc. Method and system for utilizing pre-existing image layers of a two-dimensional image to create a stereoscopic image
US8643701B2 (en) 2009-11-18 2014-02-04 University Of Illinois At Urbana-Champaign System for executing 3D propagation for depth image-based rendering
CN102668537B (zh) * 2009-12-24 2015-03-11 夏普株式会社 Multi-eye imaging device and multi-eye imaging method
WO2011089528A2 (en) * 2010-01-22 2011-07-28 DenCT Ltd Methods and apparatus for multi-camera x-ray flat panel detector
CN101782386B (zh) 2010-01-28 2011-05-25 南京航空航天大学 Non-visual-geometry video positioning method and system for a camera array
US20110249889A1 (en) 2010-04-08 2011-10-13 Sreenivas Kothandaraman Stereoscopic image pair alignment apparatus, systems and methods
US20120105574A1 (en) * 2010-10-28 2012-05-03 Henry Harlyn Baker Panoramic stereoscopic camera
WO2012061571A1 (en) * 2010-11-03 2012-05-10 The Trustees Of Columbia University In The City Of New York Camera systems and methods for gigapixel computational imaging
CN202109886U (zh) * 2011-05-31 2012-01-11 吴江迈为技术有限公司 Battery cell position detecting device
US9300946B2 (en) 2011-07-08 2016-03-29 Personify, Inc. System and method for generating a depth map and fusing images from a camera array
US9329365B2 (en) * 2011-09-23 2016-05-03 Goodrich Corporation Wide field of view monocentric lens system for infrared aerial reconnaissance camera systems
CN102447934B (zh) * 2011-11-02 2013-09-04 吉林大学 Synthesis method of stereo elements in a combined stereoscopic image system captured with sparse lenses
CN103148806A (zh) 2011-12-07 2013-06-12 史金龙 Dynamic three-dimensional measuring system for ship steel plates based on projection and multi-view vision
WO2014074202A2 (en) * 2012-08-20 2014-05-15 The Regents Of The University Of California Monocentric lens designs and associated imaging systems having wide field of view and high resolution
WO2014171438A1 (ja) * 2013-04-19 2014-10-23 凸版印刷株式会社 Three-dimensional shape measuring device, three-dimensional shape measuring method and three-dimensional shape measuring program
CN103513295B (zh) 2013-09-25 2016-01-27 青海中控太阳能发电有限公司 Weather monitoring system and method based on multi-camera real-time shooting and image processing
JP2015158827A (ja) * 2014-02-25 2015-09-03 株式会社リコー Coordinate detection system, information processing device, coordinate detection method and program
US9262801B2 (en) * 2014-04-01 2016-02-16 Gopro, Inc. Image taping in a multi-camera array
CN105825493B (zh) * 2015-01-09 2019-05-03 华为技术有限公司 Image registration method and apparatus


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
FENG, ZEXI ET AL.: "3D Reconstruction Algorithm for Computer Vision Using Four Camera Array", JOURNAL OF COMPUTER APPLICATIONS (CHINA), vol. 31, no. 4, 30 April 2011 (2011-04-30), pages 1043 - 1046 *
See also references of EP3285232A4 *
TU, LIFEN ET AL.: "Real Scene 3D Modelling Based on Four-Camera Vision System", JOURNAL OF APPLIED OPTICS (CHINA), vol. 37, no. 1, 31 January 2016 (2016-01-31), pages 13 - 14 *


Also Published As

Publication number Publication date
EP3285232A1 (en) 2018-02-21
US20180073857A1 (en) 2018-03-15
CN107850419B (zh) 2018-09-04
JP2018526689A (ja) 2018-09-13
KR101926953B1 (ko) 2018-12-07
EP3285232B1 (en) 2019-01-23
KR20180019049A (ko) 2018-02-23
EP3285232A4 (en) 2018-02-21
CN107850419A (zh) 2018-03-27
US10107617B2 (en) 2018-10-23
JP6453908B2 (ja) 2019-01-16

Legal Events

Date Code Title Description
ENP Entry into the national phase: Ref document number: 2016566782; Country of ref document: JP; Kind code of ref document: A
WWE Wipo information: entry into national phase: Ref document number: 15310066; Country of ref document: US
REEP Request for entry into the european phase: Ref document number: 2016790253; Country of ref document: EP
ENP Entry into the national phase: Ref document number: 20167033144; Country of ref document: KR; Kind code of ref document: A
WWE Wipo information: entry into national phase: Ref document number: 2016790253; Country of ref document: EP
NENP Non-entry into the national phase: Ref country code: DE