WO2010061861A1 - Système et procédé de traitement des concordances stéréoscopiques, et support d'enregistrement - Google Patents

Système et procédé de traitement des concordances stéréoscopiques, et support d'enregistrement

Info

Publication number
WO2010061861A1
Authority
WO
WIPO (PCT)
Prior art keywords
parallax
image
point
stereo matching
images
Prior art date
Application number
PCT/JP2009/069888
Other languages
English (en)
Japanese (ja)
Inventor
小泉 博一
神谷 俊之
弘之 柳生
Original Assignee
Necシステムテクノロジー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Necシステムテクノロジー株式会社 filed Critical Necシステムテクノロジー株式会社
Priority to KR1020117011815A priority Critical patent/KR101260132B1/ko
Priority to CN2009801472168A priority patent/CN102239503B/zh
Publication of WO2010061861A1 publication Critical patent/WO2010061861A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures
    • G01C11/06Interpretation of pictures by comparison of two or more pictures of the same area
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing

Definitions

  • the present invention relates to a stereo matching processing device, a stereo matching processing method, and a recording medium. More particularly, the present invention relates to a method of automatically generating three-dimensional data from stereo images.
  • As a method of automatically generating three-dimensional data from stereo images, a method of generating three-dimensional data [DSM (Digital Surface Model) data] representing terrain by stereo matching, based on images obtained from an artificial satellite or aircraft, is widely used. In stereo matching, corresponding points that capture the same ground point are determined in two images taken from different viewpoints, a so-called stereo pair, and the resulting parallax is used with the principle of triangulation to obtain the depth and shape of the object.
  • Patent Document 1 discloses a method using the area correlation method, which is in wide general use.
  • In this area correlation method, a correlation window is set in the left image and used as a template, and a search window is moved in the right image while the cross-correlation coefficient with the template is calculated; the position with the highest degree of coincidence is taken as the corresponding point.
  • In this case, the movement range of the search window is limited to the epipolar line direction in the image, so that for each point in the left image the displacement in the x direction of the corresponding point in the right image, that is, the parallax, can be obtained.
  • Here, the epipolar line is the straight line in the other image of a stereo pair along which the point corresponding to a given point in one image must lie.
  • The epipolar line is described in the "Image Analysis Handbook" (Mikio Takagi and Yoshihisa Shimoda, University of Tokyo Press, January 1991, pp. 597-599).
  • In general, the epipolar line direction differs from the scanning line direction of the image.
  • By a coordinate transformation, however, the epipolar line direction can be made to coincide with the scanning line direction, and the image can be resampled accordingly.
  • the method of this coordinate conversion is described in the above-mentioned "image analysis handbook".
  • In a stereo image rearranged as described above, the movement range of the search window can be limited to the scanning line, so the parallax is obtained as the difference between the x coordinates of corresponding points in the left and right images.
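  • As a concrete illustration of this scanline-constrained area correlation, here is a minimal Python sketch; it assumes rectified grayscale images given as NumPy arrays, and the window size and disparity bounds are arbitrary illustrative values, not figures from this publication.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a.astype(np.float64) - a.mean()
    b = b.astype(np.float64) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_point_on_scanline(left, right, x, y, half=7, d_min=0, d_max=64):
    """For the point (x, y) of the rectified left image, move a search window
    along the same scanline of the right image and return the disparity
    (x_left - x_right) with the highest normalized cross-correlation."""
    template = left[y - half:y + half + 1, x - half:x + half + 1]
    best_d, best_score = None, -1.0
    for d in range(d_min, d_max + 1):
        xr = x - d                                   # candidate column in the right image
        if xr - half < 0 or xr + half + 1 > right.shape[1]:
            continue
        window = right[y - half:y + half + 1, xr - half:xr + half + 1]
        score = ncc(template, window)
        if score > best_score:
            best_d, best_score = d, score
    return best_d, best_score
```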
  • Patent Document 3 describes a technique of a stereo image processing apparatus capable of automatically obtaining three-dimensional data with respect to a complicated object from a satellite stereo image or an aerial stereo image without an operator.
  • In this technique, erroneous data such as noise or defects in the three-dimensional data obtained by the stereo processing means are automatically corrected by using external information, such as building outlines, obtained from the map data held in the map data storage means.
  • The map data storage means supplies map data such as building outlines to the DSM data automatic correction means.
  • the present invention has been made in view of the above-described circumstances, and its object is to improve the speed and accuracy of stereo matching processing.
  • A stereo matching processing device according to the first aspect of the present invention comprises: an image data acquisition unit that acquires image data of a plurality of images obtained by photographing a predetermined area from a plurality of different positions; a reference parallax setting unit that sets a reference parallax corresponding to the plurality of images; a search range setting unit that sets, as the search range for stereo matching, a predetermined range smaller than the range of the image, based on the point of the image giving the reference parallax set by the reference parallax setting unit; and a search unit that, for an arbitrary point in one of the plurality of images, searches for the corresponding point in the other image within the search range set by the search range setting unit, based on the point of the other image giving the reference parallax set by the reference parallax setting unit.
  • A stereo matching processing method according to the second aspect of the present invention comprises: an image data acquisition step of acquiring image data of a plurality of images obtained by photographing a predetermined area from a plurality of different positions; a reference parallax setting step of setting a reference parallax corresponding to the plurality of images; a search range setting step of setting, as the search range for stereo matching, a predetermined range smaller than the range of the image data, based on the point of the image giving the reference parallax set in the reference parallax setting step; and a searching step of, for an arbitrary point in one of the plurality of images, searching for the corresponding point in the other image within the search range set in the search range setting step, based on the point of the other image giving the reference parallax set in the reference parallax setting step.
  • A computer-readable recording medium according to the third aspect of the present invention records a program that causes a computer to function as: an image data acquisition unit that acquires image data of a plurality of images obtained by photographing a predetermined area from a plurality of different positions; a reference parallax setting unit that sets a reference parallax corresponding to the plurality of images; a search range setting unit that sets, as the search range for stereo matching, a predetermined range smaller than the range of the image, based on the point of the image giving the reference parallax set by the reference parallax setting unit; and a search unit that, for an arbitrary point in one of the plurality of images, searches for the corresponding point in the other image within the search range set by the search range setting unit, based on the point of the other image giving the reference parallax set by the reference parallax setting unit.
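  • As a rough structural sketch only (the class, method names, and the matcher hook are illustrative assumptions, not an implementation from this publication), the four claimed units can be pictured as follows in Python:

```python
from typing import Callable, List, Tuple
import numpy as np


class StereoMatchingDevice:
    """Illustrative skeleton of the four claimed units."""

    def __init__(self) -> None:
        self.images: List[np.ndarray] = []
        self.reference_parallax: float = 0.0
        self.search_range: Tuple[float, float] = (0.0, 0.0)

    def acquire_image_data(self, images: List[np.ndarray]) -> None:
        # Image data acquisition unit: images of one area taken from different positions.
        self.images = images

    def set_reference_parallax(self, parallax: float) -> None:
        # Reference parallax setting unit.
        self.reference_parallax = parallax

    def set_search_range(self, half_width: float) -> None:
        # Search range setting unit: a band narrower than the image,
        # centred on the point giving the reference parallax.
        self.search_range = (self.reference_parallax - half_width,
                             self.reference_parallax + half_width)

    def search_corresponding_point(self, x: int, y: int,
                                   matcher: Callable[..., tuple]) -> tuple:
        # Search unit: the supplied matcher looks for the corresponding point
        # only within the limited disparity band.
        lo, hi = (int(round(v)) for v in self.search_range)
        return matcher(self.images[0], self.images[1], x, y, lo, hi)
```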
  • According to the present invention, the speed and accuracy of stereo matching processing can be improved in a method of automatically generating three-dimensional data from stereo images.
  • FIG. 10 is a flowchart showing an example of the operation of the height positioning processing according to Embodiment 2.
  • FIG. 11 is a block diagram showing an example of the configuration of a stereo image processing device 1 according to a third embodiment.
  • FIG. 12 is a flowchart showing an example of the operation of the height positioning processing according to Embodiment 3.
  • FIG. 13 is a block diagram showing an example of the configuration of a stereo image processing device 1 according to a fourth embodiment.
  • FIG. 14 is a flowchart showing an example of the operation of the height positioning processing according to Embodiment 4.
  • FIG. 15 is a block diagram showing an example of the configuration of a stereo image processing device 1 according to a fifth embodiment.
  • FIG. 16 is a flowchart showing an example of the operation of the height positioning processing according to Embodiment 5.
  • FIG. 17 is a block diagram showing an example of a physical configuration when the stereo image processing apparatus 1 is mounted on a computer.
  • FIG. 1 is a block diagram showing an example of the configuration of a stereo image processing apparatus according to Embodiment 1 of the present invention.
  • the stereo image processing device 1 includes an image data input unit 10, a stereo matching unit 11, a reference disparity setting unit 12, and a search range setting unit 13.
  • the stereo image processing device 1 is connected to the height calculator 2.
  • the image data input unit 10 has a function of inputting image data, and inputs a plurality of image data used for stereo matching processing.
  • the image data is, for example, an aerial photograph image converted into a digital image.
  • the image data includes the position at which the image was captured, the direction in which the image was captured, and the angle of view.
  • FIG. 2 schematically shows an example of an aerial photograph to be converted into image data.
  • the aerial photograph shown in FIG. 2 is composed of an aerial photograph 101A and an aerial photograph 101B taken continuously from an aircraft flying above.
  • the aerial photograph 101A and the aerial photograph 101B are taken with 60% overlap in the traveling direction of the aircraft.
  • the overlap part is an image obtained by photographing the same area from different positions.
  • the image in the present embodiment is an image generated by digital conversion of an aerial photograph such as the aerial photograph 101A and the aerial photograph 101B.
  • The image used in the present invention is not limited to an aerial photograph; it may be a digitized satellite image, a digital image taken with an ordinary digital camera, or a digital image obtained by digitizing an analog picture taken with an ordinary analog camera.
  • The stereo matching unit 11 in FIG. 1 detects, for a plurality of pieces of image data obtained by photographing the same area from different positions, the positions in the images that capture the same point; that is, it detects sets of points corresponding to the same point across the plurality of images. Such a set of corresponding points is usually detected by correlating the images of corresponding small areas in the two images and taking the position where the correlation coefficient is maximum.
  • There are various methods for performing stereo matching processing, such as a method of extracting and matching common feature amounts and a method of obtaining the correlation between the left and right images, and the present embodiment does not limit the method used. For example, the stereo matching processing described in Japanese Patent Publication No. 8-16930 may be used.
  • The height calculator 2 generates DSM data from the parallax of the corresponding points obtained by the stereo matching process, according to the principle of triangulation. For example, since the position of a given feature shows a positional deviation (parallax) between the pair of aerial photographs 101A and 101B, measuring this deviation in the stereo matching process determines the height of the surface layer of the feature, including its elevation value and horizontal coordinates, that is, three-dimensional data.
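  • As a rough numerical illustration of this triangulation step, the sketch below uses the standard parallax equation for vertical aerial photographs; the flying height, air base, focal length, and parallax values are made-up assumptions, not figures from this publication.

```python
def height_from_parallax(p, flying_height, air_base, focal_length):
    """Standard parallax equation for vertical aerial photos:
    p = B * f / (H - h)  =>  h = H - B * f / p
    where H is the flying height above the datum, B the air base,
    f the focal length, and p the measured x-parallax at image scale."""
    return flying_height - air_base * focal_length / p

# Hypothetical numbers for illustration only.
H = 3000.0         # flying height above the datum [m]
B = 1800.0         # air base [m]
f = 0.15           # focal length [m]
p_ground = 0.0900  # measured parallax of a ground point [m at image scale]
p_roof = 0.0925    # measured parallax of a nearby roof point [m]

print(height_from_parallax(p_ground, H, B, f))  # ~0 m: ground near the datum
print(height_from_parallax(p_roof, H, B, f))    # ~81 m: the roof is higher, so its parallax is larger
```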
  • FIG. 3 is a schematic view showing an example of the real world ground state.
  • FIG. 3 shows a cross section of a part of the real world, in which features such as buildings stand on the terrain.
  • FIG. 4 is a schematic view showing the DSM data generated by stereo matching processing from images of the part of the real world shown in FIG. 3.
  • the DSM data represents the top surface height data, and therefore, for the ground surface hidden by a roof or the like, it represents the height of the roof including the elevation value.
  • The reference parallax setting unit 12 of FIG. 1 sets a parallax that serves as the reference for searching for corresponding points in the two images. For example, it is set based on the parallax that gives a reference elevation.
  • the search range setting unit 13 sets a range in which the stereo matching unit 11 searches for a corresponding point based on the point of the image to which the reference parallax set by the reference parallax setting unit 12 is given. Usually, a predetermined range smaller than the range of the image is set as a search range for stereo matching.
  • FIG. 4 conceptually shows heights corresponding to the reference elevation and the search range.
  • In FIG. 4, a surface represented by the line SL, at a predetermined height H above the height G that serves as the origin of elevation, is taken as the reference elevation.
  • The range between the line BL and the line UL is set as the height range SH corresponding to the search range in real space.
  • FIG. 5 shows a search plane that describes the search range.
  • FIG. 5 shows the scanning line (epipolar line) A of the image converted from the aerial photograph 101A and the corresponding scanning line (epipolar line) B of the image converted from the aerial photograph 101B, arranged perpendicular to each other.
  • the plane defined by the axes AB is generally referred to as the search plane.
  • the central positions of the blocks on A and B are represented by vertical lines and horizontal lines, respectively, and one correspondence between the two images is represented by the intersection of vertical lines and horizontal lines.
  • the stereo matching unit 11 searches corresponding points of the two images on the scan line A and the scan line B.
  • A line at a 45-degree angle in the search plane indicates a constant parallax, i.e., a constant height, in the two images.
  • The line gl is, for example, the line of parallax corresponding to a ground reference height, and corresponds to the origin of elevation G.
  • The line sp is the line giving the reference parallax and corresponds to the line SL in FIG. 4.
  • The width SP corresponds to the height H in FIG. 4.
  • The line u in FIG. 5 corresponds to the line UL in FIG. 4, and the line l corresponds to the line BL in FIG. 4.
  • The range of width R enclosed by the line u and the line l indicates the search range.
  • The stereo matching unit 11 searches for combinations of corresponding points between A and B within, for example, the range between the line u and the line l in FIG. 5.
  • The point on scanning line B that corresponds to a point on scanning line A is searched for on the vertical line passing through that point of scanning line A, in the portion between the line u and the line l.
  • Conversely, for a point on scanning line B, the portion between the line u and the line l on the line passing through that point is searched.
  • FIG. 6 shows an example of the search range.
  • In this case, a search range can be defined that covers features up to the maximum assumed height.
  • The parallax corresponding to the terrain relief is represented by the line ls.
  • The search range is indicated by the line u and the line l of a range R that covers the elevation difference of the relief, with reference to the parallax sp obtained by adding the assumed average height of the features to the average elevation of the relief.
  • By setting a search range that covers the elevation difference and the assumed height of the features in this way, the stereo matching process can be performed quickly.
  • Since the search range is limited, the possibility of erroneously recognizing corresponding points between the two images is also reduced.
  • FIG. 7 is a flowchart showing an example of the operation of the height positioning process according to the first embodiment.
  • the image data input unit 10 inputs a plurality of images to be subjected to stereo matching processing (step S11).
  • the reference parallax setting unit 12 sets, for example, a reference parallax from the average elevation of the regions indicated by the plurality of input images and the height of the assumed feature (step S12).
  • the search range setting unit 13 sets a search range for the reference disparity from the elevation difference of the area and the height of the feature (step S13).
  • The stereo matching unit 11 then searches, for each point in one image, for the corresponding point in the other image within the set search range, taking the point in the other image that gives the reference parallax as the reference (step S14).
  • The height calculator 2 calculates, from the positions in the images of each pair of associated points, the height of the point and its coordinates on the map (step S15).
  • As described above, by setting the reference parallax and a search range that covers the elevation difference and the assumed height of features, the stereo matching process can be performed quickly.
  • Since the search range is limited, the possibility of erroneously recognizing corresponding points between the two images is also reduced.
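  • The following sketch ties steps S12 and S13 to the same assumed vertical-photo geometry as the earlier height example; the scene numbers and variable names are illustrative only. It derives a reference parallax from the average elevation plus an assumed feature height, and a parallax band wide enough to cover the relief.

```python
def parallax_from_height(h, flying_height, air_base, focal_length):
    """Inverse of the parallax equation: p = B * f / (H - h)."""
    return air_base * focal_length / (flying_height - h)

# Hypothetical scene parameters (same assumed geometry as the previous sketch).
H, B, f = 3000.0, 1800.0, 0.15
avg_elevation = 40.0       # average terrain elevation in the scene [m]
relief = 60.0              # elevation difference (min to max terrain) [m]
max_feature_height = 30.0  # assumed maximum height of buildings etc. [m]

# Reference parallax: average elevation plus half the assumed feature height.
p_ref = parallax_from_height(avg_elevation + max_feature_height / 2, H, B, f)

# Search band: wide enough to cover the relief and the tallest assumed feature.
p_low = parallax_from_height(avg_elevation - relief / 2, H, B, f)
p_high = parallax_from_height(avg_elevation + relief / 2 + max_feature_height, H, B, f)

print(f"reference parallax {p_ref:.4f} m, search band [{p_low:.4f}, {p_high:.4f}] m")
```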
  • FIG. 8 is a block diagram showing an example of the configuration of a stereo image processing apparatus 1 according to Embodiment 2 of the present invention.
  • In the second embodiment, a reference parallax and a search range are set based on the elevation data of map data.
  • the stereo image processing apparatus 1 of FIG. 8 includes a map data input unit 14 and a region division unit 15 in addition to the configuration of the first embodiment.
  • the map data input unit 14 inputs map data of a target area of the input image data.
  • the map data contains elevation data at each point of the map mesh.
  • The area dividing unit 15 divides the input image data according to the elevation data. If the elevation difference in the elevation data is small, the entire image may be treated as a single area without division. If the elevation difference in the map data over the range of the image exceeds a predetermined value, the image is divided into a plurality of regions.
  • the reference disparity setting unit 12 sets, for each of the areas divided by the area dividing unit 15, a reference disparity based on the elevation data.
  • The search range setting unit 13 also sets a search range for each of the divided areas, taking into account the elevation difference in the elevation data and the assumed height of features.
  • the stereo matching unit 11 extracts corresponding pairs of points by searching corresponding points of the plurality of images in accordance with the reference disparity and the search range set for each of the divided regions.
  • the height calculator 2 generates DSM data according to the principle of triangulation from the parallax of the corresponding points obtained by the stereo matching process.
  • FIG. 9 shows an example of a search plane in which a reference disparity and a search range are set for each divided area.
  • In FIG. 9, the image is divided into four areas, and the reference parallax in each area is sp1, sp2, sp3, and sp4.
  • A range of width R1 bounded by the lines u1 and l1 is set as the search range for the area with reference parallax sp1.
  • Similarly, a range of width R3 bounded by u3 and l3 and a range of width R4 bounded by u4 and l4 are set as the respective search ranges.
  • Each of these search ranges is smaller than the search range set when the image is not divided.
  • the range of the image is divided into a plurality of areas, and the search range is set to each divided area, so that the search range can be limited to the range in which corresponding points exist.
  • stereo matching processing can be performed quickly and accurately.
  • Region division can be set so that the height difference in the region is equal to or less than a predetermined value.
  • Alternatively, the number of divisions may be determined according to the elevation difference over the image range.
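  • A minimal sketch of such elevation-based region division follows; the regular-grid splitting rule, the threshold, and the function names are assumptions for illustration, since the division method itself is left open here.

```python
import numpy as np

def divide_regions_by_elevation(elevation, max_relief):
    """Divide an elevation grid (aligned with the image) into a regular grid of
    regions, with the number of divisions growing with the overall relief, so
    that the elevation difference inside each region stays roughly small.
    Returns a list of (row_slice, col_slice, mean_elevation) tuples."""
    total_relief = float(elevation.max() - elevation.min())
    n = max(1, int(np.ceil(total_relief / max_relief)))   # divisions per axis
    n = min(n, min(elevation.shape))                       # never more pieces than cells
    rows = np.array_split(np.arange(elevation.shape[0]), n)
    cols = np.array_split(np.arange(elevation.shape[1]), n)
    regions = []
    for r in rows:
        for c in cols:
            block = elevation[r[0]:r[-1] + 1, c[0]:c[-1] + 1]
            regions.append((slice(r[0], r[-1] + 1),
                            slice(c[0], c[-1] + 1),
                            float(block.mean())))
    return regions

# Per-region reference parallax under the same assumed geometry as before:
# for rs, cs, mean_h in divide_regions_by_elevation(elevation, max_relief=30.0):
#     p_ref = parallax_from_height(mean_h, H, B, f)   # defined in the earlier sketch
```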
  • FIG. 10 is a flowchart showing an example of the operation of the height positioning process according to the second embodiment.
  • the map data input unit 14 inputs map data of the target area (step S22).
  • the area dividing unit 15 divides the image into areas based on the elevation data included in the map data (step S23).
  • the reference disparity setting unit 12 sets a reference disparity for each of the divided areas based on the elevation data and the divided areas (step S24).
  • the search range setting unit 13 sets a search range for each area according to the height difference of the divided areas (step S25).
  • The stereo matching unit 11 then searches, for each point in one image, for the corresponding point in the other image within the search range set for its region, taking the point in the other image that gives the reference parallax as the reference (step S26).
  • The height calculator 2 calculates, from the positions in the images of each pair of associated points, the height of the point and its coordinates on the map (step S27).
  • the search range can be limited to a small range in which the corresponding points exist in accordance with the elevation data of the map data.
  • stereo matching processing can be performed quickly and accurately.
  • FIG. 11 is a block diagram showing a configuration example of the stereo image processing device 1 according to the third embodiment.
  • the stereo image processing apparatus 1 of the third embodiment includes a reference point setting unit 16 in addition to the configuration of the first embodiment.
  • The reference point setting unit 16 selects, from the image input by the image data input unit 10, reference points for setting the reference parallax according to a predetermined method. For example, a number of points may be selected at random according to the size and scale of the image, or points in a predetermined distribution may be selected.
  • The reference point setting unit 16 causes the stereo matching unit 11 to extract the sets of corresponding points in the plurality of images for the selected points, calculates three-dimensional data for the selected points, and sends the three-dimensional data to the reference parallax setting unit 12.
  • The reference parallax setting unit 12 sets a reference parallax from the three-dimensional data received from the reference point setting unit 16 by a predetermined method. For example, the average or the median of the heights in the three-dimensional data is used as the reference elevation, and the parallax corresponding to the reference elevation is used as the reference parallax. Alternatively, a reference parallax may be set for each of predetermined divided areas from the average within that area.
  • the search range setting unit 13 sets a search range in accordance with the reference disparity.
  • the search range can be set in consideration of the variance of the height of the three-dimensional data of the selected point.
  • the search range may be set for each of the divided areas.
  • The stereo matching unit 11 searches for corresponding points in the plurality of images in accordance with the set reference parallax and search range, and extracts the sets of corresponding points.
  • That is, for each point in one image, a corresponding point is searched for within the search range.
  • the height calculator 2 generates DSM data according to the principle of triangulation from the parallax of the corresponding points obtained by the stereo matching process.
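  • A minimal sketch of this reference-point sampling strategy is given below; it reuses the match_point_on_scanline() function from the earlier area-correlation sketch, and the point count, score threshold, and 3-sigma margin are assumptions, not values from this publication.

```python
import numpy as np

def estimate_reference_parallax(left, right, n_points=100, half=7,
                                d_max=128, rng=None):
    """Pick random reference points in the (grayscale, rectified) left image,
    stereo-match each one over the full disparity range, and derive a
    reference parallax (median) and a search half-width (spread) from the
    resulting disparities."""
    rng = rng or np.random.default_rng(0)
    h, w = left.shape
    disparities = []
    for _ in range(n_points):
        y = int(rng.integers(half, h - half))
        x = int(rng.integers(half + d_max, w - half))
        # match_point_on_scanline() is the earlier area-correlation sketch
        d, score = match_point_on_scanline(left, right, x, y,
                                           half=half, d_min=0, d_max=d_max)
        if d is not None and score > 0.8:      # keep only confident matches
            disparities.append(d)
    disparities = np.asarray(disparities, dtype=float)
    d_ref = float(np.median(disparities))      # reference parallax
    half_width = float(3 * disparities.std())  # assumed margin around the reference
    return d_ref, half_width
```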
  • FIG. 12 is a flowchart showing an example of the operation of the height positioning process according to the third embodiment.
  • the reference point setting unit 16 selects a reference point in the image by a predetermined method (step S32).
  • The stereo matching unit 11 performs stereo matching processing on the selected reference points, and extracts the sets of corresponding points across the plurality of images (step S33).
  • the reference point setting unit 16 calculates the height of the reference point from the corresponding set of selected points (step S34).
  • the reference disparity setting unit 12 sets a reference disparity based on the height of the reference point (step S35).
  • Note that the reference parallax setting unit 12 may set the reference parallax directly from the sets of corresponding points of the selected points, in which case the reference point setting unit 16 need not calculate the heights of the reference points.
  • the search range setting unit 13 sets a search range in accordance with the reference disparity (step S36).
  • The stereo matching unit 11 then searches, for each point in one image, for the corresponding point in the other image within the set search range, taking the point in the other image that gives the reference parallax as the reference (step S37).
  • The height calculator 2 calculates, from the positions in the images of each pair of associated points, the height of the point and its coordinates on the map (step S38).
  • In this way, the search range can be limited to a small range, suited to the image, in which the corresponding points exist.
  • stereo matching processing can be performed quickly and accurately.
  • FIG. 13 is a block diagram showing a configuration example of the stereo image processing device 1 according to the fourth embodiment.
  • In the stereo image processing apparatus 1 of the fourth embodiment, the reference points of the third embodiment are input externally.
  • the stereo image processing apparatus 1 of FIG. 13 includes a reference point input unit 17 in addition to the configuration of the third embodiment.
  • the reference point input unit 17 inputs data indicating the position of the reference point in the image.
  • the image data may be displayed on a display device (not shown) and the point selected on the screen may be input.
  • coordinates in the image may be input as data.
  • the operations of the reference point setting unit 16, the reference disparity setting unit 12, the search range setting unit 13, and the stereo matching unit 11 are the same as in the third embodiment.
  • In this way, a reference point considered appropriate can be selected from the input image; for example, a point suitable for the reference elevation, such as an aviation obstruction light on a high-rise building, a feature point of a steel tower, or a feature point of an artificial structure such as a building, can be selected.
  • FIG. 14 is a flowchart showing an example of the operation of the height positioning processing according to the fourth embodiment.
  • the reference point input unit 17 inputs data indicating the position of the reference point in the image (step S42). There may be a plurality of reference points to be selected.
  • The reference point setting unit 16 sends the positions of the input points to the stereo matching unit 11, and the stereo matching unit 11 performs stereo matching processing on the selected reference points to extract the sets of corresponding points across the plurality of images (step S43).
  • The reference point setting unit 16 calculates the heights of the reference points from the sets of corresponding points of the selected points (step S44). Thereafter, the processing from the reference parallax setting (step S45) to the height calculation (step S48) is the same as steps S35 to S38 in FIG. 12.
  • With the stereo image processing apparatus 1 of the fourth embodiment, in addition to the effects of the third embodiment, it is possible to select points for which the stereo matching process can be performed accurately and which give a reference parallax suited to the image. As a result, stereo matching processing can be performed quickly and accurately.
  • FIG. 15 is a block diagram showing a configuration example of the stereo image processing device 1 according to the fifth embodiment.
  • In the stereo image processing device 1 according to the fifth embodiment, the reference point of the third embodiment and the point that stereo-matches with it are both input externally as a corresponding pair.
  • The stereo image processing device 1 of FIG. 15 includes a reference corresponding point input unit 18 in place of the reference point setting unit 16 and the reference point input unit 17 of the fourth embodiment.
  • The reference corresponding point input unit 18 inputs data indicating the position of a reference point in one image and the position of the stereo-matched corresponding point in the other image. For example, the two images may be displayed on a display device (not shown) and a pair of corresponding points selected on the screen may be input. Alternatively, the coordinates of a pair of corresponding points in the two images may be input as data.
  • the operations of the reference disparity setting unit 12, the search range setting unit 13, and the stereo matching unit 11 are the same as in the first embodiment.
  • In this way, reference corresponding points considered appropriate can be selected from the input images; for example, stereo-matched corresponding points suitable for the reference elevation, such as an aviation obstruction light on a high-rise building, a feature point of a steel tower, or a feature point of an artificial structure such as a building, can be selected.
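  • A trivial sketch of deriving the reference parallax from one manually selected pair of corresponding points follows; the function name and pixel coordinates are hypothetical, for illustration only.

```python
def reference_parallax_from_pair(left_point, right_point):
    """Given a manually selected pair of corresponding points in the two
    rectified images, the reference parallax is simply the difference of
    their x coordinates (the points are assumed to lie on the same scanline)."""
    (xl, yl), (xr, yr) = left_point, right_point
    assert abs(yl - yr) <= 1, "points should lie on (almost) the same scanline"
    return xl - xr

# e.g. a tower-top marker light picked on screen in both images
# (hypothetical pixel coordinates):
d_ref = reference_parallax_from_pair((812, 344), (746, 344))
print(d_ref)   # 66 pixels
```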
  • FIG. 16 is a flowchart showing an example of the operation of the height positioning process according to the fifth embodiment.
  • First, the reference corresponding point input unit 18 inputs data indicating the positions, in two of the images, of reference corresponding points that are stereo-matched with each other (step S52). A plurality of pairs of reference corresponding points may be selected.
  • Thereafter, the processing from the reference parallax setting (step S53) to the height calculation (step S56) is the same as steps S35 to S38 in FIG. 12.
  • With the stereo image processing apparatus 1 of the fifth embodiment, the reference parallax and the search range can be set by selecting stereo-matched corresponding points that give a reference parallax appropriate to the image. As a result, stereo matching processing can be performed quickly and accurately.
  • FIG. 17 is a block diagram showing an example of a physical configuration when the stereo image processing apparatus 1 is mounted on a computer.
  • the stereo image processing device 1 according to the present embodiment can be realized by the same hardware configuration as a general computer device.
  • the stereo image processing apparatus 1 includes a control unit 21, a main storage unit 22, an external storage unit 23, an operation unit 24, a display unit 25, and an input / output unit 26, as shown in FIG.
  • the main storage unit 22, the external storage unit 23, the operation unit 24, the display unit 25, and the input / output unit 26 are all connected to the control unit 21 via the internal bus 20.
  • the control unit 21 includes a CPU (Central Processing Unit) or the like, and executes stereo matching processing in accordance with the control program 30 stored in the external storage unit 23.
  • the main storage unit 22 comprises a RAM (Random-Access Memory) or the like, loads the control program 30 stored in the external storage unit 23, and is used as a work area of the control unit 21.
  • The external storage unit 23 comprises a non-volatile memory such as a flash memory, a hard disk, a DVD-RAM (Digital Versatile Disc Random-Access Memory), or a DVD-RW (Digital Versatile Disc Rewritable); it stores in advance the control program 30 that makes the control unit 21 perform the above processing, supplies data stored by the control program 30 to the control unit 21 in accordance with instructions from the control unit 21, and stores data supplied from the control unit 21.
  • the operation unit 24 includes a keyboard and a pointing device such as a mouse, and an interface device for connecting the keyboard and the pointing device to the internal bus 20.
  • Input of image data, instructions for transmission / reception, designation of an image to be displayed, a position in an image of a reference point for setting a reference altitude, and the like are input through the operation unit 24 and supplied to the control unit 21.
  • the display unit 25 is configured of a CRT (Cathode Ray Tube), an LCD (Liquid Crystal Display), or the like, and displays an image or a result of stereo matching processing.
  • the input / output unit 26 includes a wireless transceiver, a wireless modem or a network termination device, and a serial interface or a LAN (Local Area Network) interface connected with them. Image data can be received through the input / output unit 26 and the calculated result can be transmitted.
  • The processing of the image data input unit 10, stereo matching unit 11, reference parallax setting unit 12, search range setting unit 13, map data input unit 14, area dividing unit 15, reference point setting unit 16, reference point input unit 17, and reference corresponding point input unit 18 of the stereo image processing apparatus 1 shown in FIG. 1, 8, 11, 13, or 15 is executed by the control program 30 using the control unit 21, the main storage unit 22, the external storage unit 23, the operation unit 24, the display unit 25, and the input/output unit 26 as resources.
  • Such a configuration is also included as a suitable modification of the present invention.
  • the reference disparity setting unit divides the plurality of images into two or more corresponding regions, and sets the reference disparity in each of the regions.
  • The search range setting unit sets a predetermined range smaller than the range of the image as the search range for stereo matching, based on the point of the image giving the reference parallax set in each of the regions.
  • The device may further comprise a map data acquisition unit for acquiring elevation data of a map corresponding to the plurality of images, and the reference parallax setting unit may set the reference parallax based on the elevation data acquired by the map data acquisition unit.
  • the reference parallax setting unit may calculate parallax by stereo matching for points selected from the plurality of images by a predetermined method, and set the reference parallax based on the calculated parallax.
  • The device may further comprise a reference point input unit configured to acquire the position, in one of the plurality of images, of a point for which parallax is to be obtained, and the reference parallax setting unit may calculate the parallax by stereo matching for the point acquired by the reference point input unit and set the reference parallax based on the calculated parallax.
  • The device may further comprise a corresponding point input unit for inputting a set of corresponding points for stereo matching from the plurality of images, and the reference parallax setting unit may set the reference parallax based on the parallax given by the set of corresponding points input by the corresponding point input unit.
  • the reference parallax setting step divides one set of images to be subjected to the stereo matching process into two or more corresponding areas, and sets the reference parallax in each of the areas.
  • The search range setting step sets a predetermined range smaller than the range of the image as the search range for stereo matching, based on the point of the image giving the reference parallax set in each of the regions.
  • the method includes a map data acquisition step of acquiring elevation data of a map corresponding to the plurality of images;
  • the reference parallax setting step sets the reference parallax based on the elevation data acquired in the map data acquisition step.
  • parallax may be calculated by stereo matching for a point selected from the plurality of images by a predetermined method, and the reference parallax may be set based on the calculated parallax.
  • the reference parallax setting step may calculate the parallax by stereo matching for the point acquired in the reference point input step, and set the reference parallax based on the calculated parallax.
  • the method further comprises a corresponding point input step of inputting a set of corresponding points for stereo matching from the plurality of images
  • the reference parallax setting step may set the reference parallax based on the parallax given by the set of corresponding points input in the corresponding point input step.
  • The central part of the stereo image processing apparatus 1, consisting of the control unit 21, the main storage unit 22, the external storage unit 23, the operation unit 24, the input/output unit 26, the internal bus 20, and so on, can be realized using an ordinary computer system rather than a dedicated system.
  • For example, a computer program for executing the above-described operations may be stored in and distributed on a computer-readable recording medium (flexible disk, CD-ROM, DVD-ROM, etc.), and the stereo image processing apparatus 1 that executes the above-described processing may be configured by installing the computer program on a computer.
  • Alternatively, the computer program may be stored in a storage device of a server device on a communication network such as the Internet, and the stereo image processing device 1 may be configured by an ordinary computer system downloading the program.
  • When the above functions are realized by an OS (operating system) and an application program sharing the work, or by the OS and the application program working together, only the application program portion may be stored in the recording medium or the storage device.
  • the computer program may be posted on a bulletin board (BBS, Bulletin Board System) on a communication network, and the computer program may be distributed via the network. Then, the computer program may be activated and executed in the same manner as other application programs under the control of the OS so that the above-described processing can be executed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
  • Measurement Of Optical Distance (AREA)
  • Processing Or Creating Images (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

An image input unit (10) acquires digital image data of a plurality of images obtained by photographing a predetermined region from a plurality of different positions. A reference parallax setting unit (12) sets a reference parallax corresponding to the plurality of images. A search range setting unit (13) sets, as the search range for stereo matching, a predetermined range smaller than the range of the image, based on the image point giving the reference parallax set by the reference parallax setting unit (12). For an arbitrary point in one of the images, a stereo matching unit (11) searches for the corresponding point in the other image within the search range set by the search range setting unit (13), based on the point of the other image giving the reference parallax set by the reference parallax setting unit (12).
PCT/JP2009/069888 2008-11-25 2009-11-25 Système et procédé de traitement des concordances stéréoscopiques, et support d'enregistrement WO2010061861A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020117011815A KR101260132B1 (ko) 2008-11-25 2009-11-25 스테레오 매칭 처리 장치, 스테레오 매칭 처리 방법, 및 기록 매체
CN2009801472168A CN102239503B (zh) 2008-11-25 2009-11-25 立体匹配处理设备、立体匹配处理方法和记录介质

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008-300221 2008-11-25
JP2008300221A JP5229733B2 (ja) 2008-11-25 2008-11-25 ステレオマッチング処理装置、ステレオマッチング処理方法およびプログラム

Publications (1)

Publication Number Publication Date
WO2010061861A1 true WO2010061861A1 (fr) 2010-06-03

Family

ID=42225734

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/069888 WO2010061861A1 (fr) 2008-11-25 2009-11-25 Système et procédé de traitement des concordances stéréoscopiques, et support d'enregistrement

Country Status (4)

Country Link
JP (1) JP5229733B2 (fr)
KR (1) KR101260132B1 (fr)
CN (1) CN102239503B (fr)
WO (1) WO2010061861A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102650518A (zh) * 2011-02-25 2012-08-29 株式会社理光 测量方法和设备
US9563937B2 (en) 2013-07-12 2017-02-07 Mitsubishi Electric Corporation High-resolution image generation apparatus, high-resolution image generation method, and high-resolution image generation program

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5780413B2 (ja) * 2011-02-18 2015-09-16 国際航業株式会社 堆積量推定方法、堆積量推定図、及び堆積量推定プログラム
JP5769248B2 (ja) * 2011-09-20 2015-08-26 Necソリューションイノベータ株式会社 ステレオマッチング処理装置、ステレオマッチング処理方法、及び、プログラム
US9374571B2 (en) 2011-10-11 2016-06-21 Panasonic Intellectual Property Management Co., Ltd. Image processing device, imaging device, and image processing method
CN104025153B (zh) * 2011-12-30 2017-09-15 英特尔公司 粗到细多个视差候选立体匹配
JP5698301B2 (ja) * 2013-04-22 2015-04-08 株式会社パスコ 図化システム、図化方法及びプログラム
KR101690645B1 (ko) 2015-09-21 2016-12-29 경북대학교 산학협력단 다단계 시차영상 분할이 적용된 시차탐색범위 추정 방법 및 이를 이용한 스테레오 영상 정합장치
CN108377376B (zh) * 2016-11-04 2021-01-26 宁波舜宇光电信息有限公司 视差计算方法,双摄像头模组和电子设备
KR102468897B1 (ko) * 2017-10-16 2022-11-21 삼성전자주식회사 깊이 값을 추정하는 방법 및 장치
CN111311667B (zh) * 2020-02-14 2022-05-17 苏州浪潮智能科技有限公司 一种内容自适应双目匹配方法和装置
KR102520189B1 (ko) * 2021-03-02 2023-04-10 네이버랩스 주식회사 무인 비행체 또는 항공기에 의해 촬영된 항공 영상에 기반하여 hd 맵을 생성하는 방법 및 시스템

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002063580A (ja) * 2000-08-22 2002-02-28 Asia Air Survey Co Ltd 不定形窓を用いた画像間拡張イメージマッチング方法
JP2006113832A (ja) * 2004-10-15 2006-04-27 Canon Inc ステレオ画像処理装置およびプログラム

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4480212B2 (ja) * 1999-11-05 2010-06-16 アジア航測株式会社 空中写真の位置及び姿勢の計算方法
CN1264062C (zh) * 2002-12-31 2006-07-12 清华大学 一种多视角x射线立体成像的方法与系统
KR100739730B1 (ko) * 2005-09-03 2007-07-13 삼성전자주식회사 3d 입체 영상 처리 장치 및 방법
JP4813263B2 (ja) * 2006-06-05 2011-11-09 株式会社トプコン 画像処理装置及びその処理方法

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002063580A (ja) * 2000-08-22 2002-02-28 Asia Air Survey Co Ltd 不定形窓を用いた画像間拡張イメージマッチング方法
JP2006113832A (ja) * 2004-10-15 2006-04-27 Canon Inc ステレオ画像処理装置およびプログラム

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102650518A (zh) * 2011-02-25 2012-08-29 株式会社理光 测量方法和设备
US9563937B2 (en) 2013-07-12 2017-02-07 Mitsubishi Electric Corporation High-resolution image generation apparatus, high-resolution image generation method, and high-resolution image generation program

Also Published As

Publication number Publication date
KR20110087303A (ko) 2011-08-02
JP5229733B2 (ja) 2013-07-03
JP2010128622A (ja) 2010-06-10
CN102239503A (zh) 2011-11-09
KR101260132B1 (ko) 2013-05-02
CN102239503B (zh) 2013-11-13

Similar Documents

Publication Publication Date Title
WO2010061861A1 (fr) Système et procédé de traitement des concordances stéréoscopiques, et support d'enregistrement
US11783543B2 (en) Method and system for displaying and navigating an optimal multi-dimensional building model
US8831335B2 (en) Stereo matching processing apparatus, stereo matching processing method and computer-readable recording medium
US8577139B2 (en) Method of orthoimage color correction using multiple aerial images
JP7003594B2 (ja) 3次元点群表示装置、3次元点群表示システム、3次元点群表示方法および3次元点群表示プログラム、記録媒体
JP5311465B2 (ja) ステレオマッチング処理システム、ステレオマッチング処理方法、及びプログラム
CN113409459A (zh) 高精地图的生产方法、装置、设备和计算机存储介质
KR102204043B1 (ko) 실시간 지물이미지 위치 보정을 통한 영상 정밀도 향상 기능의 자동영상처리 시스템
KR20040055510A (ko) 소수의 지상 기준점을 이용한 이코노스 알피씨 데이터갱신방법
CN112967344A (zh) 相机外参标定的方法、设备、存储介质及程序产品
CN110889899A (zh) 一种数字地表模型的生成方法及装置
JP6396982B2 (ja) 空間モデル処理装置
CN111429548B (zh) 数字地图生成方法及系统
JP4592510B2 (ja) 立体地図画像生成装置および方法
CN107478235A (zh) 网络环境下基于模板的动态地图获取系统
CN116805277B (zh) 一种视频监控目标节点像素坐标转换方法及系统
CN108731644B (zh) 基于铅直辅助线的倾斜摄影测图方法及其系统
JP5126020B2 (ja) 画像処理装置、画像処理方法、及びプログラム
JP6876484B2 (ja) データ処理装置、データ処理方法、及び、プログラム
JPH1132250A (ja) 音声案内型景観ラベリング装置およびシステム
JPH1166355A (ja) 変形ラベル型景観ラベリング装置およびシステム
JP2021173801A (ja) 情報処理装置、制御方法、プログラム及び記憶媒体
CN115657072A (zh) 基于航拍实时图像的空间目标定位方法
JPH1131238A (ja) 差分型景観ラベリング装置およびシステム
JPH1131237A (ja) ノイズ除去型景観ラベリング装置およびシステム

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980147216.8

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09829109

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20117011815

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09829109

Country of ref document: EP

Kind code of ref document: A1