WO2010061861A1 - Stereo matching process device, stereo matching process method, and recording medium - Google Patents

Stereo matching process device, stereo matching process method, and recording medium

Info

Publication number
WO2010061861A1
WO2010061861A1 (PCT/JP2009/069888)
Authority
WO
WIPO (PCT)
Prior art keywords
parallax
image
point
stereo matching
images
Prior art date
Application number
PCT/JP2009/069888
Other languages
French (fr)
Japanese (ja)
Inventor
小泉 博一
神谷 俊之
弘之 柳生
Original Assignee
Necシステムテクノロジー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Necシステムテクノロジー株式会社 filed Critical Necシステムテクノロジー株式会社
Priority to KR1020117011815A priority Critical patent/KR101260132B1/en
Priority to CN2009801472168A priority patent/CN102239503B/en
Publication of WO2010061861A1 publication Critical patent/WO2010061861A1/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 General purpose image data processing
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04 Interpretation of pictures
    • G01C 11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/593 Depth or shape recovery from multiple images from stereo images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/10012 Stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing

Definitions

  • the present invention relates to a stereo matching processing device, a stereo matching processing method, and a recording medium. More particularly, the present invention relates to a method of automatically generating three-dimensional data from stereo images.
  • As a method of automatically generating three-dimensional data from stereo images, a method of generating three-dimensional data [DSM (Digital Surface Model) data] representing terrain by stereo matching, based on images obtained from artificial satellites or aircraft, is widely used. Here, stereo matching processing means determining, for two images captured from different viewpoints (so-called stereo images), the corresponding points in each image that capture the same point, and using their parallax to find the depth and shape of the object by the principle of triangulation.
  • Various methods have already been proposed for this stereo matching processing. For example, Patent Document 1 discloses a technique using the generally and widely used area correlation method.
  • In this area correlation method, a correlation window is set in the left image and used as a template, a search window in the right image is moved while the cross-correlation coefficient with the template is calculated, and corresponding points are obtained by searching for the position where this coefficient, regarded as the degree of coincidence, is highest.
  • In the above method, in order to reduce the amount of processing, the movement range of the search window is limited to the epipolar line direction in the image, so that for each point in the left image, the amount of positional displacement in the x direction of the corresponding point in the right image, that is, the parallax, can be obtained.
  • Here, the epipolar line is the straight line that can be drawn, for a given point in one image of a stereo pair, as the range in which the corresponding point can exist in the other image.
  • The epipolar line is described in "Image Analysis Handbook" (edited by Mikio Takagi and Haruhisa Shimoda, University of Tokyo Press, January 1991, pp. 597-599).
  • Usually, the epipolar line direction differs from the scanning line direction of the image, but by performing a coordinate transformation, the epipolar line direction can be made to coincide with the scanning line direction and the image can be rearranged.
  • The method of this coordinate transformation is described in the above-mentioned "Image Analysis Handbook".
  • In a stereo image rearranged in this way, the movement range of the search window for corresponding points can be limited to the scanning line, so the parallax is obtained as the difference between the x coordinates of corresponding points in the left and right images.
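As a rough illustration of the area correlation search described above, the following sketch matches a single point along a scan line of an epipolar-rectified pair using the normalized cross-correlation coefficient. It is a minimal, hypothetical implementation: the images are assumed to be grayscale NumPy arrays, and the window size, disparity bounds, sign convention, and function name are illustrative choices, not taken from the patent.

```python
import numpy as np

def match_point_ncc(left, right, x, y, half_win=5, d_min=0, d_max=64):
    """Search along scan line y of the rectified right image for the position that
    best matches the (2*half_win+1)^2 template around (x, y) in the left image.
    Returns (disparity, correlation) of the best match; the disparity is the
    x-coordinate difference between the two corresponding points."""
    template = left[y - half_win:y + half_win + 1,
                    x - half_win:x + half_win + 1].astype(np.float64)
    t = template - template.mean()
    best_d, best_ncc = d_min, -1.0
    for d in range(d_min, d_max + 1):       # candidate disparities along the epipolar line
        xr = x - d                          # assumed sign convention: shift to the left
        if xr - half_win < 0:
            break
        window = right[y - half_win:y + half_win + 1,
                       xr - half_win:xr + half_win + 1].astype(np.float64)
        w = window - window.mean()
        denom = np.sqrt((t * t).sum() * (w * w).sum())
        if denom == 0:
            continue
        ncc = (t * w).sum() / denom         # cross-correlation coefficient (degree of coincidence)
        if ncc > best_ncc:
            best_ncc, best_d = ncc, d
    return best_d, best_ncc
```

Note that this sketch scans the whole interval [d_min, d_max]; the point of the invention described below is to shrink that interval around a reference parallax.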
  • Patent Document 3 describes a technique of a stereo image processing apparatus capable of automatically obtaining three-dimensional data with respect to a complicated object from a satellite stereo image or an aerial stereo image without an operator.
  • In the technique of Patent Document 3, erroneous data such as noise and defects in the three-dimensional data obtained by the stereo processing means are automatically corrected using outline information of buildings and the like obtained from the map data in the map data storage means. The map data storage means provides map data, such as building outlines, to the DSM data automatic correction means.
  • the present invention has been made in view of the above-described circumstances, and its object is to improve the speed and accuracy of stereo matching processing.
  • A stereo matching processing device according to a first aspect of the present invention comprises: an image data acquisition unit that acquires image data of a plurality of images obtained by photographing a predetermined area from a plurality of different positions; a reference parallax setting unit that sets a reference parallax corresponding to the plurality of images; a search range setting unit that sets, as a search range for stereo matching, a predetermined range smaller than the range of the image, based on the point of the image that gives the reference parallax set by the reference parallax setting unit; and a search unit that, for an arbitrary point in one of the plurality of images, searches for the corresponding point in another image within the search range set by the search range setting unit, based on the point of the other image that gives the reference parallax set by the reference parallax setting unit.
  • A stereo matching processing method according to a second aspect of the present invention comprises: an image data acquisition step of acquiring image data of a plurality of images obtained by photographing a predetermined area from a plurality of different positions; a reference parallax setting step of setting a reference parallax corresponding to the plurality of images; a search range setting step of setting, as a search range for stereo matching, a predetermined range smaller than the range of the image data, based on the point of the image that gives the reference parallax set in the reference parallax setting step; and a search step of, for an arbitrary point in one of the plurality of images, searching for the corresponding point in another image within the search range set in the search range setting step, based on the point of the other image that gives the reference parallax set in the reference parallax setting step.
  • A computer-readable recording medium according to a third aspect of the present invention records a program for causing a computer to function as: an image data acquisition unit that acquires image data of a plurality of images obtained by photographing a predetermined area from a plurality of different positions; a reference parallax setting unit that sets a reference parallax corresponding to the plurality of images; a search range setting unit that sets, as a search range for stereo matching, a predetermined range smaller than the range of the image, based on the point of the image that gives the reference parallax set by the reference parallax setting unit; and a search unit that, for an arbitrary point in one of the plurality of images, searches for the corresponding point in another image within the search range set by the search range setting unit, based on the point of the other image that gives the reference parallax set by the reference parallax setting unit.
  • According to the present invention, the speed and accuracy of stereo matching processing can be improved in a method of automatically generating three-dimensional data from stereo images.
  • FIG. 10 is a flowchart showing an example of the operation of the height positioning processing according to Embodiment 2.
  • FIG. 11 is a block diagram showing a configuration example of the stereo image processing device 1 according to Embodiment 3.
  • FIG. 12 is a flowchart showing an example of the operation of the height positioning processing according to Embodiment 3.
  • FIG. 13 is a block diagram showing a configuration example of the stereo image processing device 1 according to Embodiment 4.
  • FIG. 14 is a flowchart showing an example of the operation of the height positioning processing according to Embodiment 4.
  • FIG. 15 is a block diagram showing a configuration example of the stereo image processing device 1 according to Embodiment 5.
  • FIG. 16 is a flowchart showing an example of the operation of the height positioning processing according to Embodiment 5.
  • FIG. 17 is a block diagram showing an example of the physical configuration when the stereo image processing apparatus 1 is implemented on a computer.
  • FIG. 1 is a block diagram showing an example of the configuration of a stereo image processing apparatus according to Embodiment 1 of the present invention.
  • the stereo image processing device 1 includes an image data input unit 10, a stereo matching unit 11, a reference disparity setting unit 12, and a search range setting unit 13.
  • the stereo image processing device 1 is connected to the height calculator 2.
  • the image data input unit 10 has a function of inputting image data, and inputs a plurality of image data used for stereo matching processing.
  • the image data is, for example, an aerial photograph image converted into a digital image.
  • the image data includes the position at which the image was captured, the direction in which the image was captured, and the angle of view.
  • FIG. 2 schematically shows an example of an aerial photograph to be converted into image data.
  • the aerial photograph shown in FIG. 2 is composed of an aerial photograph 101A and an aerial photograph 101B taken continuously from an aircraft flying above.
  • the aerial photograph 101A and the aerial photograph 101B are taken with 60% overlap in the traveling direction of the aircraft.
  • the overlap part is an image obtained by photographing the same area from different positions.
  • the image in the present embodiment is an image generated by digital conversion of an aerial photograph such as the aerial photograph 101A and the aerial photograph 101B.
  • The images used in the present invention are not limited to aerial images; they may be digitized satellite photographs, digital images taken with an ordinary digital camera, or digital images obtained by scanning analog photographs taken with an ordinary analog camera.
  • The stereo matching unit 11 in FIG. 1 detects, for a plurality of pieces of image data obtained by photographing the same area from different positions, the positions in the images that capture the same point; that is, it detects sets of points in the plurality of images that correspond to the same point. Such a set is usually detected by correlating small corresponding regions of the two images and finding the position where the correlation coefficient is maximum.
  • There are various methods for performing stereo matching processing, such as methods that extract and match common feature quantities and methods that compute the correlation between the left and right images; the method used in this embodiment is not limited. For example, the stereo matching processing described in Japanese Patent Publication No. 8-16930 may be used.
  • The height calculator 2 generates DSM data from the parallax of the corresponding points obtained by the stereo matching processing, according to the principle of triangulation. For example, because the position of a corresponding feature is displaced by a certain amount (the parallax) between the pair of aerial photographs 101A and 101B, measuring this positional displacement allows the height of the surface of the feature, including the elevation value, and its horizontal coordinates, that is, three-dimensional data, to be determined.
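As a rough numerical illustration of the triangulation step, the following sketch assumes an idealized epipolar-rectified, near-vertical stereo pair with a known baseline B (in meters), focal length f (in pixels), and flying height H above the elevation datum. Under that pinhole model the depth is Z = B * f / d for a disparity of d pixels, and the elevation is H - Z. The symbols and the simplified camera model are assumptions made for illustration, not the patent's exact computation.

```python
def elevation_from_disparity(d_pixels, baseline_m, focal_px, flying_height_m):
    """Idealized nadir pinhole model: depth Z = B * f / d, elevation = H - Z."""
    if d_pixels <= 0:
        raise ValueError("disparity must be positive under this model")
    depth = baseline_m * focal_px / d_pixels    # distance from the camera plane to the point
    return flying_height_m - depth              # height of the point above the datum
```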
  • FIG. 3 is a schematic view showing an example of the real world ground state.
  • FIG. 3 shows a cross section of a part of the real world, in which features exist on undulating terrain.
  • FIG. 4 is a schematic view showing DSM data generated by stereo matching processing from an image of a part of the real world shown in FIG.
  • the DSM data represents the top surface height data, and therefore, for the ground surface hidden by a roof or the like, it represents the height of the roof including the elevation value.
  • The reference parallax setting unit 12 of FIG. 1 sets a reference parallax used when searching for corresponding points in the two images. For example, the parallax corresponding to the reference height of the range to be searched by stereo matching in real space is set as the reference.
  • The search range setting unit 13 sets the range in which the stereo matching unit 11 searches for corresponding points, based on the point of the image that gives the reference parallax set by the reference parallax setting unit 12. Usually, a predetermined range smaller than the range of the image is set as the search range for stereo matching.
  • FIG. 4 conceptually shows heights corresponding to the reference elevation and the search range.
  • In FIG. 4, the surface represented by the line SL, at a predetermined height H above the height G of the elevation origin, is taken as the reference elevation.
  • The range between the line BL and the line UL is the height range SH corresponding to the search range in real space.
  • FIG. 5 shows a search plane that describes the search range.
  • FIG. 5 shows the scanning line (epipolar line) A of the image converted from the aerial photograph 101A and the corresponding scanning line (epipolar line) B of the image converted from the aerial photograph 101B, with A taken along the horizontal axis and B along the vertical axis.
  • The plane defined by the axes A and B is generally referred to as the search plane.
  • The central positions of the blocks on A and B are represented by vertical lines and horizontal lines, respectively, and one correspondence between the two images is represented by the intersection of a vertical line and a horizontal line.
  • the stereo matching unit 11 searches corresponding points of the two images on the scan line A and the scan line B.
  • A line at 45 degrees in the search plane represents a constant parallax, that is, a constant height, in the two images.
  • The line gl is, for example, the line of parallax corresponding to the height of the ground reference, and represents the elevation origin.
  • The line sp is the line of the reference parallax and corresponds to the line SL in FIG. 4.
  • The width SP corresponds to the height H in FIG. 4.
  • The line u in FIG. 5 corresponds to the line UL in FIG. 4, and the line l corresponds to the line BL in FIG. 4.
  • The range of width R enclosed by the line u and the line l indicates the search range.
  • The stereo matching unit 11 searches for combinations of corresponding points between A and B within, for example, the range between the line u and the line l in FIG. 5.
  • The point on scanning line B corresponding to a point on scanning line A is searched for on the portion of the vertical line through that point of A that lies between the line u and the line l; conversely, for a point on scanning line B, the portion of the horizontal line through that point between the line u and the line l is searched.
  • FIG. 6 shows an example of the search range.
  • When the photographed terrain has relief, a search range can be defined that covers the relief and features up to the maximum assumed height.
  • In FIG. 6, the parallax corresponding to the relief of the terrain is represented by the line ls.
  • The search range is indicated by the lines u and l bounding a range R that includes the height differences of the relief, taking as the reference the parallax sp obtained by adding the assumed average height of the features to the average height of the relief.
  • By setting a search range that covers the elevation differences of the terrain and the assumed heights of the features in this way, the stereo matching processing can be performed quickly.
  • In addition, because the search range is limited, the possibility of erroneously matching corresponding points between the two images is reduced.
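The relationship between a reference elevation, an assumed height band (terrain relief plus feature height), and the disparity search window can be sketched as follows, using the same idealized camera model as above; the helper names and parameters are illustrative assumptions.

```python
def disparity_for_elevation(elev_m, baseline_m, focal_px, flying_height_m):
    """Inverse of the idealized model above: the disparity that a point at the
    given elevation would produce."""
    return baseline_m * focal_px / (flying_height_m - elev_m)

def search_window(ref_elev_m, band_m, baseline_m, focal_px, flying_height_m):
    """Reference disparity sp for the reference elevation, plus a window [l, u]
    covering elevations ref_elev_m +/- band_m (relief and assumed feature heights)."""
    sp = disparity_for_elevation(ref_elev_m, baseline_m, focal_px, flying_height_m)
    u = disparity_for_elevation(ref_elev_m + band_m, baseline_m, focal_px, flying_height_m)
    l = disparity_for_elevation(ref_elev_m - band_m, baseline_m, focal_px, flying_height_m)
    return sp, l, u   # the matcher then only scans disparities between l and u
```

A matcher such as `match_point_ncc` above would then be called with `d_min` and `d_max` set to this narrow interval instead of the full scan line.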
  • FIG. 7 is a flowchart showing an example of the operation of the height positioning process according to the first embodiment.
  • the image data input unit 10 inputs a plurality of images to be subjected to stereo matching processing (step S11).
  • Next, the reference parallax setting unit 12 sets the reference parallax, for example, from the average elevation of the area shown in the input images and the assumed heights of the features (step S12).
  • The search range setting unit 13 sets a search range around the reference parallax from the elevation differences of the area and the heights of the features (step S13).
  • For each point of one image, the stereo matching unit 11 searches for the corresponding point of the other image within the set search range, with reference to the point of the other image that gives the reference parallax (step S14).
  • The height calculator 2 calculates, from the positions in the images of each set of associated points, the height corresponding to that point and its coordinates on the map (step S15).
  • As described above, by setting the reference parallax and a search range that covers the elevation differences and the assumed heights of the features, the stereo matching processing can be performed quickly.
  • Moreover, because the search range is limited, the possibility of erroneously matching corresponding points between the two images is reduced.
  • FIG. 8 is a block diagram showing an example of the configuration of a stereo image processing apparatus 1 according to Embodiment 2 of the present invention.
  • In Embodiment 2, the reference parallax and the search range are set based on elevation data from map data.
  • the stereo image processing apparatus 1 of FIG. 8 includes a map data input unit 14 and a region division unit 15 in addition to the configuration of the first embodiment.
  • the map data input unit 14 inputs map data of a target area of the input image data.
  • the map data contains elevation data at each point of the map mesh.
  • The area dividing unit 15 divides the input image data according to the elevation data. If the elevation differences in the elevation data are small, the entire image may be treated as a single area without division. If the elevation differences within the image range in the map data exceed a predetermined range, the image is divided into a plurality of areas.
  • the reference disparity setting unit 12 sets, for each of the areas divided by the area dividing unit 15, a reference disparity based on the elevation data.
  • The search range setting unit 13 also sets a search range for each of the divided areas, taking into account the elevation differences in the elevation data and the assumed heights of the features.
  • the stereo matching unit 11 extracts corresponding pairs of points by searching corresponding points of the plurality of images in accordance with the reference disparity and the search range set for each of the divided regions.
  • the height calculator 2 generates DSM data according to the principle of triangulation from the parallax of the corresponding points obtained by the stereo matching process.
  • FIG. 9 shows an example of a search plane in which a reference parallax and a search range are set for each divided area. In this example, the range of the image is divided into a plurality of areas, and the reference parallaxes of the respective areas are sp1, sp2, sp3 and sp4.
  • For the reference parallax sp1, a range of width R1 bounded by the lines u1 and l1 is set as the search range.
  • Similarly, a range of width R3 bounded by the lines u3 and l3 and a range of width R4 bounded by the lines u4 and l4 are set as search ranges.
  • Each of these search ranges is smaller than the single search range set when the image is not divided.
  • the range of the image is divided into a plurality of areas, and the search range is set to each divided area, so that the search range can be limited to the range in which corresponding points exist.
  • stereo matching processing can be performed quickly and accurately.
  • Region division can be set so that the height difference in the region is equal to or less than a predetermined value.
  • Alternatively, the number of divisions may be determined according to the elevation differences within the range of the image.
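A minimal sketch of the region division of Embodiment 2 follows, assuming the elevation data are available as a grid (DEM) aligned with the image. The recursive quad-split strategy and the threshold value are illustrative choices; the patent only requires that the elevation difference within each region stay within a predetermined range.

```python
import numpy as np

def divide_by_elevation(dem, max_diff_m=30.0):
    """Recursively split the image extent into rectangular regions until the
    elevation difference inside each region is at most max_diff_m.
    dem: 2-D array of elevations aligned with the image.
    Returns a list of (row_slice, col_slice, reference_elevation) tuples; the
    reference elevation of each region would then be converted to a reference
    disparity (e.g. with disparity_for_elevation above)."""
    regions = []

    def split(rs, cs):
        tile = dem[rs, cs]
        if tile.max() - tile.min() <= max_diff_m or tile.shape[0] < 2 or tile.shape[1] < 2:
            regions.append((rs, cs, float(tile.mean())))
            return
        rmid = (rs.start + rs.stop) // 2
        cmid = (cs.start + cs.stop) // 2
        for r in (slice(rs.start, rmid), slice(rmid, rs.stop)):
            for c in (slice(cs.start, cmid), slice(cmid, cs.stop)):
                split(r, c)

    split(slice(0, dem.shape[0]), slice(0, dem.shape[1]))
    return regions
```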
  • FIG. 10 is a flowchart showing an example of the operation of the height positioning process according to the second embodiment.
  • the map data input unit 14 inputs map data of the target area (step S22).
  • the area dividing unit 15 divides the image into areas based on the elevation data included in the map data (step S23).
  • the reference disparity setting unit 12 sets a reference disparity for each of the divided areas based on the elevation data and the divided areas (step S24).
  • the search range setting unit 13 sets a search range for each area according to the height difference of the divided areas (step S25).
  • For each point of one image, the stereo matching unit 11 searches for the corresponding point of the other image within the search range set for the corresponding area, with reference to the point of the other image that gives the reference parallax (step S26).
  • The height calculator 2 calculates, from the positions in the images of each set of associated points, the height corresponding to that point and its coordinates on the map (step S27).
  • As described above, according to the elevation data of the map data, the search range can be limited to a small range in which the corresponding points actually exist.
  • stereo matching processing can be performed quickly and accurately.
  • FIG. 11 is a block diagram showing a configuration example of the stereo image processing device 1 according to the third embodiment.
  • the stereo image processing apparatus 1 of the third embodiment includes a reference point setting unit 16 in addition to the configuration of the first embodiment.
  • The reference point setting unit 16 selects, from the images input by the image data input unit 10, reference points for setting the reference parallax by a predetermined method. For example, a number of points corresponding to the size and scale of the image are selected at random, or points in a predetermined distribution may be selected.
  • The reference point setting unit 16 causes the stereo matching unit 11 to extract, for each selected point, the set of corresponding points in the plurality of images, calculates the three-dimensional data of the selected points, and sends these three-dimensional data to the reference parallax setting unit 12.
  • The reference parallax setting unit 12 sets the reference parallax from the three-dimensional data received from the reference point setting unit 16 by a predetermined method. For example, the average or the median of the heights in the three-dimensional data is used as the reference elevation, and the parallax corresponding to that reference elevation is used as the reference parallax. Alternatively, a reference parallax may be set for each of predetermined divided areas from the average within each area.
  • the search range setting unit 13 sets a search range in accordance with the reference disparity.
  • the search range can be set in consideration of the variance of the height of the three-dimensional data of the selected point.
  • the search range may be set for each of the divided areas.
  • The stereo matching unit 11 searches for the corresponding points of the plurality of images in accordance with the set reference parallax and search range, and extracts the sets of corresponding points; for each point, the corresponding point is searched for within that search range.
  • the height calculator 2 generates DSM data according to the principle of triangulation from the parallax of the corresponding points obtained by the stereo matching process.
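The reference-point sampling of Embodiment 3 can be sketched as follows, reusing the hypothetical `match_point_ncc` matcher from earlier. Sampling 200 random points, keeping only confident matches, and using the median disparity and an inter-percentile spread are illustrative choices for the "predetermined method", not prescribed by the patent.

```python
import numpy as np

def reference_from_samples(left, right, n_points=200, margin=16, d_max=128, seed=0):
    """Pick random reference points, match them over the full disparity range, and
    derive a reference disparity and a reduced search window from the results."""
    rng = np.random.default_rng(seed)
    h, w = left.shape
    disparities = []
    for _ in range(n_points):
        y = int(rng.integers(margin, h - margin))
        x = int(rng.integers(margin + d_max, w - margin))
        d, ncc = match_point_ncc(left, right, x, y, d_min=0, d_max=d_max)
        if ncc > 0.8:                        # keep only confident matches (assumed threshold)
            disparities.append(d)
    if not disparities:
        raise RuntimeError("no reliable reference points found")
    d_ref = float(np.median(disparities))    # reference disparity
    lo, hi = np.percentile(disparities, [5, 95])
    return d_ref, int(lo), int(hi)           # reduced window for the full matching pass
```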
  • FIG. 12 is a flowchart showing an example of the operation of the height positioning process according to the third embodiment.
  • the reference point setting unit 16 selects a reference point in the image by a predetermined method (step S32).
  • The stereo matching unit 11 performs stereo matching processing on the selected reference points and extracts the sets of corresponding points in the plurality of images (step S33).
  • The reference point setting unit 16 calculates the heights of the reference points from these sets of corresponding points (step S34).
  • the reference disparity setting unit 12 sets a reference disparity based on the height of the reference point (step S35).
  • Alternatively, the reference parallax setting unit 12 may set the reference parallax directly from the sets of corresponding points of the selected points (in which case the reference point setting unit 16 need not calculate the heights of the reference points).
  • the search range setting unit 13 sets a search range in accordance with the reference disparity (step S36).
  • For each point of one image, the stereo matching unit 11 searches for the corresponding point of the other image within the set search range, with reference to the point of the other image that gives the reference parallax (step S37).
  • The height calculator 2 calculates, from the positions in the images of each set of associated points, the height corresponding to that point and its coordinates on the map (step S38).
  • As described above, the search range can be limited to a small range, matched to the image, in which the corresponding points exist.
  • stereo matching processing can be performed quickly and accurately.
  • FIG. 13 is a block diagram showing a configuration example of the stereo image processing device 1 according to the fourth embodiment.
  • In the stereo image processing apparatus 1 of the fourth embodiment, the reference points of the third embodiment are input from outside.
  • the stereo image processing apparatus 1 of FIG. 13 includes a reference point input unit 17 in addition to the configuration of the third embodiment.
  • the reference point input unit 17 inputs data indicating the position of the reference point in the image.
  • the image data may be displayed on a display device (not shown) and the point selected on the screen may be input.
  • coordinates in the image may be input as data.
  • the operations of the reference point setting unit 16, the reference disparity setting unit 12, the search range setting unit 13, and the stereo matching unit 11 are the same as in the third embodiment.
  • In this way, a reference point considered appropriate can be selected from the input image. For example, points considered suitable for the reference elevation can be selected, such as aviation obstruction lights on high-rise buildings, feature points of steel towers, and feature points of artificial structures such as buildings.
  • FIG. 14 is a flowchart showing an example of the operation of the height positioning process according to the fourth embodiment.
  • the reference point input unit 17 inputs data indicating the position of the reference point in the image (step S42). There may be a plurality of reference points to be selected.
  • The reference point setting unit 16 sends the positions of the input points to the stereo matching unit 11, and the stereo matching unit 11 performs stereo matching processing on the selected reference points to extract the sets of corresponding points in the plurality of images (step S43).
  • The reference point setting unit 16 calculates the heights of the reference points from these sets of corresponding points (step S44). Thereafter, the processing from the reference parallax setting (step S45) to the height calculation (step S48) is the same as steps S35 to S38 in FIG. 12.
  • With the stereo image processing apparatus 1 of the fourth embodiment, in addition to the effects of the third embodiment, points can be selected for which stereo matching can be performed accurately with a reference parallax suited to the image. As a result, stereo matching processing can be performed quickly and accurately.
  • FIG. 15 is a block diagram showing a configuration example of the stereo image processing device 1 according to the fifth embodiment.
  • In the stereo image processing device 1 of the fifth embodiment, the reference points of the third embodiment are input from outside together with their corresponding point pairs for stereo matching.
  • The stereo image processing device 1 of FIG. 15 includes a reference corresponding point input unit 18 in place of the reference point setting unit 16 of the third embodiment and the reference point input unit 17 of the fourth embodiment.
  • The reference corresponding point input unit 18 inputs data indicating the position of a reference point in one image and the position of the corresponding point, under stereo matching, in the other image. For example, the two images may be displayed on a display (not shown) and a pair of corresponding points selected on the screen may be input, or the coordinates of pairs of corresponding points in the two images may be input as data.
  • the operations of the reference disparity setting unit 12, the search range setting unit 13, and the stereo matching unit 11 are the same as in the first embodiment.
  • In this way, reference corresponding points considered appropriate can be selected from the input images. For example, it is possible to select stereo-matched corresponding points considered suitable for the reference elevation, such as aviation obstruction lights on high-rise buildings, feature points of steel towers, and feature points of artificial structures such as buildings.
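Since the operator of Embodiment 5 supplies corresponding point pairs directly, the reference parallax can be derived from those pairs without any matching. The small helper below and its margin parameter are illustrative assumptions.

```python
def reference_from_pairs(pairs, margin_px=8):
    """pairs: list of ((x_left, y), (x_right, y)) corresponding points picked by the
    operator on rectified images. The reference disparity is their mean x-difference;
    the search window is that mean plus or minus a margin (in pixels)."""
    diffs = [xl - xr for (xl, _yl), (xr, _yr) in pairs]
    d_ref = sum(diffs) / len(diffs)
    return d_ref, d_ref - margin_px, d_ref + margin_px
```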
  • FIG. 16 is a flowchart showing an example of the operation of the height positioning process according to the fifth embodiment.
  • The reference corresponding point input unit 18 inputs data indicating the positions, in two of the images, of reference corresponding points that correspond under stereo matching (step S52). A plurality of reference corresponding points may be selected.
  • Thereafter, the processing from the reference parallax setting (step S53) to the height calculation (step S56) is the same as steps S35 to S38 in FIG. 12.
  • With the stereo image processing device 1 of the fifth embodiment, the reference parallax and the search range can be set by selecting stereo-matched corresponding points that give a reference parallax suited to the image. As a result, stereo matching processing can be performed quickly and accurately.
  • FIG. 17 is a block diagram showing an example of a physical configuration when the stereo image processing apparatus 1 is mounted on a computer.
  • the stereo image processing device 1 according to the present embodiment can be realized by the same hardware configuration as a general computer device.
  • the stereo image processing apparatus 1 includes a control unit 21, a main storage unit 22, an external storage unit 23, an operation unit 24, a display unit 25, and an input / output unit 26, as shown in FIG.
  • the main storage unit 22, the external storage unit 23, the operation unit 24, the display unit 25, and the input / output unit 26 are all connected to the control unit 21 via the internal bus 20.
  • the control unit 21 includes a CPU (Central Processing Unit) or the like, and executes stereo matching processing in accordance with the control program 30 stored in the external storage unit 23.
  • the main storage unit 22 comprises a RAM (Random-Access Memory) or the like, loads the control program 30 stored in the external storage unit 23, and is used as a work area of the control unit 21.
  • The external storage unit 23 comprises a non-volatile memory such as a flash memory, a hard disk, a DVD-RAM (Digital Versatile Disc Random-Access Memory) or a DVD-RW (Digital Versatile Disc Rewritable), stores in advance the control program 30 for causing the control unit 21 to perform the above processing, supplies the data stored by the control program 30 to the control unit 21 in accordance with instructions from the control unit 21, and stores data supplied from the control unit 21.
  • the operation unit 24 includes a keyboard and a pointing device such as a mouse, and an interface device for connecting the keyboard and the pointing device to the internal bus 20.
  • Input of image data, instructions for transmission / reception, designation of an image to be displayed, a position in an image of a reference point for setting a reference altitude, and the like are input through the operation unit 24 and supplied to the control unit 21.
  • the display unit 25 is configured of a CRT (Cathode Ray Tube), an LCD (Liquid Crystal Display), or the like, and displays an image or a result of stereo matching processing.
  • the input / output unit 26 includes a wireless transceiver, a wireless modem or a network termination device, and a serial interface or a LAN (Local Area Network) interface connected with them. Image data can be received through the input / output unit 26 and the calculated result can be transmitted.
  • The processing of the image data input unit 10, the stereo matching unit 11, the reference parallax setting unit 12, the search range setting unit 13, the map data input unit 14, the area dividing unit 15, the reference point setting unit 16, the reference point input unit 17 and the reference corresponding point input unit 18 of the stereo image processing apparatus 1 shown in FIG. 1, 8, 11, 13 or 15 is performed by the control program 30 using the control unit 21, the main storage unit 22, the external storage unit 23, the operation unit 24, the display unit 25 and the input / output unit 26 as resources.
  • Such a configuration is included as a suitable modification of the present invention.
  • Preferably, the reference parallax setting unit divides the plurality of images into two or more corresponding regions and sets a reference parallax in each of the regions, and the search range setting unit sets, as the search range for stereo matching, a predetermined range smaller than the range of the image, based on the point of the image that gives the reference parallax set in each of the regions.
  • The device may further comprise a map data acquisition unit for acquiring elevation data of a map corresponding to the plurality of images, and the reference parallax setting unit may set the reference parallax based on the elevation data acquired by the map data acquisition unit.
  • the reference parallax setting unit may calculate parallax by stereo matching for points selected from the plurality of images by a predetermined method, and set the reference parallax based on the calculated parallax.
  • The device may further comprise a reference point input unit for acquiring the position, in the images, of a point of the plurality of images for which parallax is to be obtained, and the reference parallax setting unit may calculate the parallax by stereo matching for the point acquired by the reference point input unit and set the reference parallax based on the calculated parallax.
  • The device may further comprise a corresponding point input unit for inputting a set of corresponding points for stereo matching from the plurality of images, and the reference parallax setting unit may set the reference parallax based on the parallax given by the set of corresponding points input by the corresponding point input unit.
  • In the stereo matching processing method, the reference parallax setting step may divide the set of images to be subjected to stereo matching into two or more corresponding areas and set a reference parallax in each of the areas, and the search range setting step may set, as the search range for stereo matching, a predetermined range smaller than the range of the image, based on the point of the image that gives the reference parallax set in each of the areas.
  • The method may further include a map data acquisition step of acquiring elevation data of a map corresponding to the plurality of images, and the reference parallax setting step may set the reference parallax based on the elevation data acquired in the map data acquisition step.
  • In the reference parallax setting step, parallax may be calculated by stereo matching for points selected from the plurality of images by a predetermined method, and the reference parallax may be set based on the calculated parallax.
  • The method may further include a reference point input step of acquiring the position, in the images, of a point for which parallax is to be obtained, and the reference parallax setting step may calculate the parallax by stereo matching for the point acquired in the reference point input step and set the reference parallax based on the calculated parallax.
  • The method may further include a corresponding point input step of inputting a set of corresponding points for stereo matching from the plurality of images, and the reference parallax setting step may set the reference parallax based on the parallax given by the set of corresponding points input in the corresponding point input step.
  • The central part of the stereo image processing apparatus 1 that performs the above processing, comprising the control unit 21, the main storage unit 22, the external storage unit 23, the operation unit 24, the input / output unit 26, the internal bus 20 and so on, can be realized using an ordinary computer system rather than a dedicated system.
  • For example, the computer program for executing the above operations may be stored and distributed on a computer-readable recording medium (a flexible disk, CD-ROM, DVD-ROM or the like), and the stereo image processing apparatus 1 that executes the above processing may be configured by installing the computer program on a computer.
  • Alternatively, the computer program may be stored in a storage device of a server on a communication network such as the Internet, and the stereo image processing apparatus 1 may be configured by an ordinary computer system downloading the program or the like.
  • Only the application program portion may be stored in the recording medium or the storage device.
  • The computer program may also be posted on a bulletin board system (BBS) on a communication network and distributed via the network. The program may then be started and executed under the control of the OS in the same manner as other application programs, so that the above-described processing can be executed.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)
  • Image Analysis (AREA)
  • Processing Or Creating Images (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

An image input unit (10) acquires image data relating to a plurality of images obtained by capturing a predetermined region from a plurality of different positions. A reference parallax setting unit (12) sets a reference parallax corresponding to the plurality of images. A search range setting unit (13) sets, as a stereo matching search range, a predetermined range smaller than the image range, based on the image point giving the reference parallax set by the reference parallax setting unit (12). For an arbitrary point within one of the images, a stereo matching unit (11) searches for the corresponding point of the other image in the search range set by the search range setting unit (13), based on the point of the other image giving the reference parallax set by the reference parallax setting unit (12).

Description

Stereo matching processing device, stereo matching processing method, and recording medium
 The present invention relates to a stereo matching processing device, a stereo matching processing method, and a recording medium. More particularly, it relates to a method of automatically generating three-dimensional data from stereo images.
 As a method of automatically generating three-dimensional data from stereo images, a method of generating three-dimensional data [DSM (Digital Surface Model) data] representing terrain by stereo matching, based on images obtained from artificial satellites or aircraft, is widely used. Here, stereo matching processing means determining, for two images captured from different viewpoints (so-called stereo images), the corresponding points in each image that capture the same point, and using their parallax to find the depth and shape of the object by the principle of triangulation.
 Various methods have already been proposed for this stereo matching processing. For example, Patent Document 1 discloses a technique using the generally and widely used area correlation method. In this area correlation method, a correlation window is set in the left image and used as a template, a search window in the right image is moved while the cross-correlation coefficient with the template is calculated, and corresponding points are obtained by searching for the position where this coefficient, regarded as the degree of coincidence, is highest.
 In the above method, in order to reduce the amount of processing, the movement range of the search window is limited to the epipolar line direction in the image, so that for each point in the left image, the amount of positional displacement in the x direction of the corresponding point in the right image, that is, the parallax, can be obtained. Here, the epipolar line is the straight line that can be drawn, for a given point in one image of a stereo pair, as the range in which the corresponding point can exist in the other image. The epipolar line is described in "Image Analysis Handbook" (edited by Mikio Takagi and Haruhisa Shimoda, University of Tokyo Press, January 1991, pp. 597-599).
 Usually, the epipolar line direction differs from the scanning line direction of the image, but by performing a coordinate transformation, the epipolar line direction can be made to coincide with the scanning line direction and the image can be rearranged. The method of this coordinate transformation is described in the above-mentioned "Image Analysis Handbook". In a stereo image rearranged in this way, the movement range of the search window for corresponding points can be limited to the scanning line, so the parallax is obtained as the difference between the x coordinates of corresponding points in the left and right images.
 In each image of a stereo pair, portions that fall in the shadow of objects (occlusion regions) occur. A matching method has been proposed that gives accurate correspondences by not assigning stereo matching correspondences to these occlusion regions (see Patent Document 2).
 Patent Document 3 describes a technique for a stereo image processing apparatus capable of automatically obtaining three-dimensional data for complicated objects from satellite or aerial stereo images without requiring an operator. In the technique of Patent Document 3, erroneous data such as noise and defects in the three-dimensional data obtained by the stereo processing means are automatically corrected using outline information of buildings and the like obtained from the map data in the map data storage means. The map data storage means provides map data, such as building outlines, to the DSM data automatic correction means.
Patent Document 1: Japanese Patent Application Laid-Open No. 3-167678
Patent Document 2: Japanese Patent Application Laid-Open No. 4-299474
Patent Document 3: Japanese Patent Application Laid-Open No. 2002-157576
 When searching, for a given point in one image of a pair of images to be stereo-matched, for the corresponding point in the other image, searching over the entire scanning line (epipolar line) of the image takes time and increases the probability of mismatching.
 The present invention has been made in view of the above circumstances, and its object is to improve the speed and accuracy of stereo matching processing.
 A stereo matching processing device according to a first aspect of the present invention comprises:
an image data acquisition unit that acquires image data of a plurality of images obtained by photographing a predetermined area from a plurality of different positions;
a reference parallax setting unit that sets a reference parallax corresponding to the plurality of images;
a search range setting unit that sets, as a search range for stereo matching, a predetermined range smaller than the range of the image, based on the point of the image that gives the reference parallax set by the reference parallax setting unit; and
a search unit that, for an arbitrary point in one of the plurality of images, searches for the corresponding point in another image within the search range set by the search range setting unit, based on the point of the other image that gives the reference parallax set by the reference parallax setting unit.
 A stereo matching processing method according to a second aspect of the present invention comprises:
an image data acquisition step of acquiring image data of a plurality of images obtained by photographing a predetermined area from a plurality of different positions;
a reference parallax setting step of setting a reference parallax corresponding to the plurality of images;
a search range setting step of setting, as a search range for stereo matching, a predetermined range smaller than the range of the image data, based on the point of the image that gives the reference parallax set in the reference parallax setting step; and
a search step of, for an arbitrary point in one of the plurality of images, searching for the corresponding point in another image within the search range set in the search range setting step, based on the point of the other image that gives the reference parallax set in the reference parallax setting step.
 A computer-readable recording medium according to a third aspect of the present invention records a program for causing a computer to function as:
an image data acquisition unit that acquires image data of a plurality of images obtained by photographing a predetermined area from a plurality of different positions;
a reference parallax setting unit that sets a reference parallax corresponding to the plurality of images;
a search range setting unit that sets, as a search range for stereo matching, a predetermined range smaller than the range of the image, based on the point of the image that gives the reference parallax set by the reference parallax setting unit; and
a search unit that, for an arbitrary point in one of the plurality of images, searches for the corresponding point in another image within the search range set by the search range setting unit, based on the point of the other image that gives the reference parallax set by the reference parallax setting unit.
 According to the present invention, the speed and accuracy of stereo matching processing can be improved in a method of automatically generating three-dimensional data from stereo images.
FIG. 1 is a block diagram showing a configuration example of a stereo image processing apparatus according to Embodiment 1 of the present invention.
FIG. 2 is a diagram schematically showing an example of an aerial photograph to be converted into image data.
FIG. 3 is a schematic diagram showing an example of the ground state in the real world.
FIG. 4 is a schematic diagram showing DSM data generated by stereo matching processing from images of the part of the real world shown in FIG. 3.
FIG. 5 is a diagram showing a search plane for explaining the search range.
FIG. 6 is a diagram showing an example of the search range.
FIG. 7 is a flowchart showing an example of the operation of the height positioning processing according to Embodiment 1.
FIG. 8 is a block diagram showing a configuration example of a stereo image processing apparatus according to Embodiment 2 of the present invention.
FIG. 9 is a diagram showing an example of a search plane in which a reference elevation and a search range are set for each divided area.
FIG. 10 is a flowchart showing an example of the operation of the height positioning processing according to Embodiment 2.
FIG. 11 is a block diagram showing a configuration example of the stereo image processing device 1 according to Embodiment 3.
FIG. 12 is a flowchart showing an example of the operation of the height positioning processing according to Embodiment 3.
FIG. 13 is a block diagram showing a configuration example of the stereo image processing device 1 according to Embodiment 4.
FIG. 14 is a flowchart showing an example of the operation of the height positioning processing according to Embodiment 4.
FIG. 15 is a block diagram showing a configuration example of the stereo image processing device 1 according to Embodiment 5.
FIG. 16 is a flowchart showing an example of the operation of the height positioning processing according to Embodiment 5.
FIG. 17 is a block diagram showing an example of the physical configuration when the stereo image processing apparatus 1 is implemented on a computer.
 Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In the drawings, identical or corresponding parts are denoted by the same reference numerals.
(Embodiment 1)
 FIG. 1 is a block diagram showing an example of the configuration of a stereo image processing apparatus according to Embodiment 1 of the present invention. The stereo image processing device 1 includes an image data input unit 10, a stereo matching unit 11, a reference parallax setting unit 12, and a search range setting unit 13. The stereo image processing device 1 is connected to the height calculator 2.
 The image data input unit 10 has the function of inputting image data, and inputs the plurality of image data used for stereo matching processing. The image data are, for example, aerial photograph images converted into digital images. The image data include the position at which the image was captured, the direction in which it was captured, the angle of view, and so on.
 FIG. 2 schematically shows an example of an aerial photograph to be converted into image data. The aerial photograph shown in FIG. 2 consists of an aerial photograph 101A and an aerial photograph 101B taken in succession from an aircraft flying overhead. The aerial photograph 101A and the aerial photograph 101B were taken with a 60% overlap in the traveling direction of the aircraft. The overlapping part is an image of the same area photographed from different positions.
 The images in the present embodiment are images generated by digital conversion of aerial photographs such as the aerial photographs 101A and 101B. The images used in the present invention are not limited to aerial images; they may be digitized satellite photographs, digital images taken with an ordinary digital camera, or digital images obtained by scanning analog photographs taken with an ordinary analog camera.
The stereo matching unit 11 in FIG. 1 detects, for a plurality of pieces of image data capturing the same area from different positions, the positions within the images at which the same ground point appears. In other words, it detects sets of points in the plurality of images that correspond to the same ground point. A set of points corresponding to the same ground point is usually detected by computing the image correlation of small corresponding neighborhoods in the two images and finding the position at which the correlation coefficient is maximized.
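As an illustration of this correlation-based search, the following is a minimal sketch, not the patent's implementation: it slides a small window along the corresponding scan line of the second image and keeps the column at which the normalized cross-correlation with the window around the query point is largest. The grayscale NumPy image layout, the window half-size, and the names ncc() and find_correspondence() are assumptions introduced here for illustration.

import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized windows."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else -1.0

def find_correspondence(img_a, img_b, row, col_a, half=5, x_min=0, x_max=None):
    """Return the column in img_b on the same scan line with maximum correlation."""
    _, w = img_b.shape
    if x_max is None:
        x_max = w - half - 1
    win_a = img_a[row - half:row + half + 1, col_a - half:col_a + half + 1]
    best_col, best_score = -1, -np.inf
    for col_b in range(max(half, x_min), min(w - half, x_max + 1)):
        win_b = img_b[row - half:row + half + 1, col_b - half:col_b + half + 1]
        score = ncc(win_a, win_b)
        if score > best_score:
            best_score, best_col = score, col_b
    return best_col, best_score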
Various methods exist for performing the stereo matching processing, such as methods that extract and match general feature quantities and methods that compute the correlation between the left and right images; the method used for the stereo matching processing in this embodiment is not limited. For example, the stereo matching processing described in Japanese Examined Patent Publication No. H8-16930 may be used.
The height calculation unit 2 generates DSM data from the parallax of the corresponding points obtained by the stereo matching processing, based on the principle of triangulation. For example, between the pair of aerial photographs 101A and 101B, the positions of corresponding features are displaced by a certain amount (the parallax); by measuring this displacement, the stereo matching processing obtains the surface height of each feature, including its elevation value, together with its horizontal coordinates, that is, three-dimensional data.
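The following is a simplified nadir-view sketch of that triangulation, under assumptions that are not the patent's photogrammetric formulation: for a rectified pair with baseline B and focal length f (expressed in pixels), the distance from the camera is Z = f * B / d, and the surface height is the flying height minus Z. The inverse direction gives the parallax expected for a chosen reference elevation, which is what the reference parallax setting relies on. All parameter names are hypothetical.

def disparity_to_height(d_pixels, focal_px, baseline_m, flying_height_m):
    """Surface height (m above the datum) for a parallax measured in pixels."""
    depth_m = focal_px * baseline_m / d_pixels      # camera-to-ground distance
    return flying_height_m - depth_m

def height_to_disparity(height_m, focal_px, baseline_m, flying_height_m):
    """Parallax (pixels) expected for a point at the given height."""
    depth_m = flying_height_m - height_m
    return focal_px * baseline_m / depth_m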
FIG. 3 is a schematic diagram showing an example of the state of the ground in the real world. FIG. 3 shows a cross section of part of the real world, in which features stand on undulating terrain.
FIG. 4 is a schematic diagram showing DSM data generated by stereo matching processing from images capturing the part of the real world shown in FIG. 3. Because DSM data represent the height of the uppermost surface, for ground hidden by a roof or the like they represent the height of the roof, including the elevation value.
The reference parallax setting unit 12 in FIG. 1 sets a reference parallax that serves as the basis for searching for corresponding points in the two images. For example, the parallax corresponding to a reference height of the range to be searched by stereo matching in real space is set as the reference. The search range setting unit 13 sets the range within which the stereo matching unit 11 searches for corresponding points, using as a basis the point in the image that gives the reference parallax set by the reference parallax setting unit 12. Normally, a predetermined range smaller than the extent of the image is set as the stereo matching search range.
FIG. 4 conceptually shows the heights corresponding to the reference elevation and the search range. For example, for undulating terrain such as that in FIG. 3, the surface represented by a line SL at a predetermined height H above the height G, which is the origin of the aircraft's altitude, is taken as the reference elevation. Then, with the surface of the line SL as a basis, the range between a line BL and a line UL is taken as the height SH corresponding to the search range in real space.
FIG. 5 shows a search plane used to explain the search range. In FIG. 5, a scan line (epipolar line) A of the image converted from the aerial photograph 101A and the scan line (epipolar line) B of the image converted from the aerial photograph 101B that corresponds to the scan line A are arranged perpendicular to each other. The plane defined by the axes A and B is generally called the search plane. The center positions of the blocks on A and B are represented by vertical and horizontal lines, respectively, and a single correspondence between the two images is represented by the intersection of a vertical line and a horizontal line. The stereo matching unit 11 searches for corresponding points of the two images on the scan line A and the scan line B. A line at 45 degrees in the search plane indicates a constant parallax between the two images, that is, a constant height.
In FIG. 5, the line gl is, for example, the parallax line corresponding to a reference ground height and is taken to represent the origin of the aircraft's altitude. The line sp is the height giving the reference parallax and corresponds to the line SL in FIG. 4. The width SP corresponds to the height H in FIG. 4. The line u in FIG. 5 corresponds to the line UL in FIG. 4, and the line l corresponds to the line BL in FIG. 4. The range of width R bounded by the line u and the line l indicates the search range.
The stereo matching unit 11 searches for corresponding combinations between A and B, for example, between the line u and the line l in FIG. 5. The point on the scan line B corresponding to a given point on the scan line A is searched for on the portion of the vertical line through that point of A that lies between the line u and the line l. Conversely, when a point on the scan line B is fixed and the corresponding point on the scan line A is searched for, the search is performed on the portion of the horizontal line through that point of B that lies between the line u and the line l.
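A sketch of restricting the search to the band between the lines u and l is shown below; it reuses the hypothetical find_correspondence() from the earlier sketch and only examines columns whose parallax stays within half the range R of the reference parallax. The sign convention col_b = col_a - d is an assumption made for this illustration.

def match_with_reference(img_a, img_b, row, col_a, d_ref, search_range, half=5):
    """Search img_b only where the parallax stays near the reference value."""
    lo = int(round(col_a - d_ref - search_range / 2.0))   # one bounding line of FIG. 5
    hi = int(round(col_a - d_ref + search_range / 2.0))   # the other bounding line
    return find_correspondence(img_a, img_b, row, col_a, half=half, x_min=lo, x_max=hi)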
FIG. 6 shows an example of the search range. When the range of elevation of the ground undulation is known, a search range can be defined that covers features up to the maximum expected height, including that undulation. In FIG. 6, the parallax corresponding to the undulation is represented by a line ls. The search range is indicated by the lines u and l of a range R that includes the elevation differences of the undulation, with the parallax sp, obtained by adding the assumed average height of the features to the average height of the undulation, as the reference.
When the elevation of the ground surface is known, setting the search range to cover the elevation differences and the assumed heights of the features allows the stereo matching processing to be performed quickly. In addition, because the search range is limited, the possibility of erroneously recognizing corresponding points in the two images is reduced.
FIG. 7 is a flowchart showing an example of the operation of the height determination processing according to Embodiment 1. The image data input unit 10 inputs a plurality of images to be subjected to the stereo matching processing (step S11). The reference parallax setting unit 12 sets the reference parallax, for example, from the average elevation of the area shown in the input images and the assumed height of the features (step S12). The search range setting unit 13 sets the search range around the reference parallax from the elevation differences of the area and the heights of the features (step S13).
For a point in one image, the stereo matching unit 11 searches for the corresponding point in the other image within the set search range, using as a basis the point in the other image that gives the reference parallax (step S14). The height calculation unit 2 calculates, from the positions within the images of each pair of matched points, the height corresponding to that point and its coordinates on the map (step S15).
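A compact sketch of this flow (steps S12 to S15) under the assumptions stated earlier is given below; the reference parallax is derived from the area's average elevation plus an assumed feature height, and the search range from the elevation spread. The helpers height_to_disparity(), disparity_to_height(), and match_with_reference() are the hypothetical ones sketched above, and the 30 m default feature height is an arbitrary assumption.

def embodiment1_flow(img_a, img_b, focal_px, baseline_m, flying_height_m,
                     mean_elev_m, elev_spread_m, feature_height_m=30.0, half=5):
    # Step S12: reference parallax from average elevation plus assumed feature height.
    d_ref = height_to_disparity(mean_elev_m + feature_height_m,
                                focal_px, baseline_m, flying_height_m)
    # Step S13: range wide enough to cover the elevation spread and the features.
    d_low = height_to_disparity(mean_elev_m - elev_spread_m / 2.0,
                                focal_px, baseline_m, flying_height_m)
    d_high = height_to_disparity(mean_elev_m + elev_spread_m / 2.0 + feature_height_m,
                                 focal_px, baseline_m, flying_height_m)
    search_range = abs(d_high - d_low)
    heights = {}
    h, w = img_a.shape
    for row in range(half, h - half):             # steps S14 and S15 for every pixel
        for col_a in range(half, w - half):
            col_b, _ = match_with_reference(img_a, img_b, row, col_a,
                                            d_ref, search_range, half=half)
            d = col_a - col_b
            if col_b >= 0 and d > 0:
                heights[(row, col_a)] = disparity_to_height(
                    d, focal_px, baseline_m, flying_height_m)
    return heights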
According to the stereo image processing device 1 of Embodiment 1, setting the reference parallax and the search range so as to cover the elevation differences and the assumed heights of the features allows the stereo matching processing to be performed quickly. In addition, because the search range is limited, the possibility of erroneously recognizing corresponding points in the two images is reduced.
(Embodiment 2)
FIG. 8 is a block diagram showing a configuration example of the stereo image processing device 1 according to Embodiment 2 of the present invention. In Embodiment 2, the reference parallax and the search range are set based on the elevation data of map data.
In addition to the configuration of Embodiment 1, the stereo image processing device 1 of FIG. 8 includes a map data input unit 14 and a region division unit 15. The map data input unit 14 inputs the map data for the area covered by the input image data. The map data include elevation data at each point of the map mesh.
The region division unit 15 divides the input image data according to the elevation data. If the variation in the elevation data is small, the entire image may be treated as a single region without division. If the elevation differences in the map data covering the image exceed a predetermined range, the image is divided into a plurality of regions.
The reference parallax setting unit 12 sets a reference parallax for each region divided by the region division unit 15, based on the elevation data. The search range setting unit 13 likewise sets a search range for each divided region, taking into account the elevation differences in the elevation data and the assumed heights of the features.
The stereo matching unit 11 searches for corresponding points in the plural images in accordance with the reference parallax and the search range set for each divided region, and extracts the sets of corresponding points. The height calculation unit 2 generates DSM data from the parallax of the corresponding points obtained by the stereo matching processing, based on the principle of triangulation.
FIG. 9 shows an example of a search plane in which a reference parallax and a search range are set for each divided region. In the example of FIG. 9, the plane is divided into four regions in accordance with the ground undulation ls. The reference parallaxes in the respective regions are sp1, sp2, sp3, and sp4. In the first region, the range of width R1 bounded by u1 and l1 is set as the search range. Similarly, the range of width R2 bounded by u2 and l2, the range of width R3 bounded by u3 and l3, and the range of width R4 bounded by u4 and l4 are set as the respective search ranges.
As shown in FIG. 9, the search ranges are smaller than in FIG. 6. When there are large elevation differences, dividing the image into a plurality of regions and setting a search range for each divided region narrows the search range down to the range in which corresponding points actually exist. As a result, the stereo matching processing can be performed quickly and accurately.
The region division can be set so that the elevation difference within each region is at most a predetermined value. Alternatively, the number of divisions may be determined according to the elevation differences over the image.
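One illustrative division strategy, which is an assumption rather than the patent's exact rule, is sketched below: the elevation grid is cut into horizontal strips, and neighboring strips are merged as long as the relief inside the merged region stays under a threshold. The 50 m threshold, the strip count, and the function name divide_by_elevation() are hypothetical.

import numpy as np

def divide_by_elevation(elevation_grid, max_relief_m=50.0, strips=16):
    """Group strips of the elevation grid into regions of limited relief."""
    rows = np.array_split(np.arange(elevation_grid.shape[0]), strips)
    regions, current = [], [rows[0]]
    for strip in rows[1:]:
        merged = np.concatenate(current + [strip])
        block = elevation_grid[merged, :]
        if block.max() - block.min() <= max_relief_m:
            current.append(strip)        # relief still acceptable: keep growing the region
        else:
            regions.append(np.concatenate(current))
            current = [strip]            # start a new region
    regions.append(np.concatenate(current))
    return regions                       # each entry: row indices belonging to one region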
FIG. 10 is a flowchart showing an example of the operation of the height determination processing according to Embodiment 2. When the image data input unit 10 inputs a plurality of images to be subjected to the stereo matching processing (step S21), the map data input unit 14 inputs the map data of the corresponding area (step S22). The region division unit 15 divides the images into regions based on the elevation data contained in the map data (step S23).
The reference parallax setting unit 12 sets a reference parallax for each divided region based on the elevation data and the divided regions (step S24). The search range setting unit 13 sets a search range for each region according to the elevation differences of the divided regions (step S25).
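A sketch of these per-region settings is shown below; it takes the region list produced by the hypothetical divide_by_elevation() above and the elevation-to-parallax helper sketched earlier, and assigns each row of a region the same reference parallax and search range. The per-region statistics and the 30 m feature allowance are assumptions.

def settings_per_region(regions, elevation_grid, focal_px, baseline_m,
                        flying_height_m, feature_height_m=30.0):
    settings = {}
    for rows in regions:
        block = elevation_grid[rows, :]
        mean_elev = float(block.mean())
        relief = float(block.max() - block.min())
        d_ref = height_to_disparity(mean_elev + feature_height_m,
                                    focal_px, baseline_m, flying_height_m)   # step S24
        d_lo = height_to_disparity(mean_elev, focal_px, baseline_m, flying_height_m)
        d_hi = height_to_disparity(mean_elev + relief + feature_height_m,
                                   focal_px, baseline_m, flying_height_m)
        for row in rows:
            settings[int(row)] = (d_ref, abs(d_hi - d_lo))                   # step S25
    return settings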
For a point in one image, the stereo matching unit 11 searches for the corresponding point in the other image within the search range set for the region, using as a basis the point in the other image that gives the reference parallax (step S26). The height calculation unit 2 calculates, from the positions within the images of each pair of matched points, the height corresponding to that point and its coordinates on the map (step S27).
According to the stereo image processing device 1 of Embodiment 2, the search range can be narrowed down, in accordance with the elevation data of the map data, to the range in which corresponding points exist. As a result, the stereo matching processing can be performed quickly and accurately.
(Embodiment 3)
FIG. 11 is a block diagram showing a configuration example of the stereo image processing device 1 according to Embodiment 3. In addition to the configuration of Embodiment 1, the stereo image processing device 1 of Embodiment 3 includes a reference point setting unit 16.
The reference point setting unit 16 selects, from the images input by the image data input unit 10, reference points at which the reference parallax is to be set, using a predetermined method. For example, a number of reference points corresponding to the size and scale of the images may be selected at random, or points following a predetermined distribution may be selected.
For the selected points, the reference point setting unit 16 has the stereo matching unit 11 extract the sets of corresponding points in the plural images, calculates the three-dimensional data of the selected points, and sends the three-dimensional data of the selected points to the reference parallax setting unit 12.
The reference parallax setting unit 12 sets the reference parallax from the three-dimensional data received from the reference point setting unit 16 using a predetermined method. For example, the mean or median of the heights of the three-dimensional data may be taken as the reference elevation, and the parallax corresponding to that reference elevation taken as the reference parallax. Alternatively, the heights may be averaged for each of predetermined divided regions, and a reference parallax set for each region.
The search range setting unit 13 sets the search range in accordance with the reference parallax. For example, the search range can be set taking into account the variance of the heights of the three-dimensional data of the selected points. When the image is divided into a plurality of regions, a search range may be set for each divided region.
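A minimal sketch of this sampling approach follows: a few reference points are drawn at random, matched over a wide range with the hypothetical find_correspondence() sketched earlier, and the reference parallax and search range are derived from the median and spread of the measured parallaxes. The sample count, the 0.8 correlation cutoff, and the three-sigma margin are assumptions, not values from the patent.

import random
import statistics

def reference_from_samples(img_a, img_b, n_samples=50, margin_sigma=3.0, half=5):
    h, w = img_a.shape
    disparities = []
    for _ in range(n_samples):
        row = random.randrange(half, h - half)
        col_a = random.randrange(half, w - half)
        col_b, score = find_correspondence(img_a, img_b, row, col_a, half=half)
        if col_b >= 0 and score > 0.8:            # keep only confident matches
            disparities.append(col_a - col_b)
    if not disparities:
        raise ValueError("no reliable reference matches found")
    d_ref = statistics.median(disparities)        # reference parallax (cf. step S35)
    spread = statistics.pstdev(disparities)       # spread of the sampled parallaxes
    return d_ref, 2.0 * margin_sigma * spread     # reference parallax and range R (cf. step S36)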
The stereo matching unit 11 searches for corresponding points in the plural images in accordance with the set reference parallax and search range, and extracts the sets of corresponding points. When a reference parallax and a search range have been set for each divided region, the corresponding points are searched for within that region's search range. The height calculation unit 2 generates DSM data from the parallax of the corresponding points obtained by the stereo matching processing, based on the principle of triangulation.
FIG. 12 is a flowchart showing an example of the operation of the height determination processing according to Embodiment 3. When the image data input unit 10 inputs a plurality of images to be subjected to the stereo matching processing (step S31), the reference point setting unit 16 selects reference points in the images by a predetermined method (step S32). The stereo matching unit 11 performs stereo matching processing on the selected reference points and extracts the corresponding sets of points in the plural images (step S33).
The reference point setting unit 16 calculates the heights of the reference points from the corresponding sets of selected points (step S34). The reference parallax setting unit 12 sets the reference parallax based on the heights of the reference points (step S35). Here, the reference parallax setting unit 12 may instead set the reference parallax directly from the corresponding sets of selected points, in which case the reference point setting unit 16 need not calculate the heights of the reference points. The search range setting unit 13 sets the search range in accordance with the reference parallax (step S36).
For a point in one image, the stereo matching unit 11 searches for the corresponding point in the other image within the search range set for the region, using as a basis the point in the other image that gives the reference parallax (step S37). The height calculation unit 2 calculates, from the positions within the images of each pair of matched points, the height corresponding to that point and its coordinates on the map (step S38).
According to the stereo image processing device 1 of Embodiment 3, even when no map data are available, the search range can be narrowed down to the range in which corresponding points exist so as to fit the images. As a result, the stereo matching processing can be performed quickly and accurately.
(Embodiment 4)
FIG. 13 is a block diagram showing a configuration example of the stereo image processing device 1 according to Embodiment 4. The stereo image processing device 1 of Embodiment 4 receives the reference points of Embodiment 3 from outside. In addition to the configuration of Embodiment 3, the stereo image processing device 1 of FIG. 13 includes a reference point input unit 17.
The reference point input unit 17 inputs data indicating the positions of the reference points within the images. For example, the image data may be displayed on a display device (not shown) and points selected on the screen may be input, or the coordinates within the images may be input as data.
The operations of the reference point setting unit 16, the reference parallax setting unit 12, the search range setting unit 13, and the stereo matching unit 11 are the same as in Embodiment 3.
In Embodiment 4, reference points considered appropriate can be set from the input images. For example, points that are considered suitable for the reference elevation and at which the stereo matching processing can be performed accurately can be selected, such as aviation obstruction lights on high-rise buildings, feature points of steel towers, and feature points of other artificial features such as buildings.
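A short sketch of this variant is given below: the operator supplies only the positions of good reference points in one image, and the device matches them automatically, as in Embodiment 3, before deriving the reference parallax and a rough range. The point format, the averaging, and the helper find_correspondence() are assumptions carried over from the earlier sketches.

def reference_from_input_points(img_a, img_b, points, half=5):
    """points: list of (row, col_a) positions picked by the operator in image A."""
    disparities = []
    for row, col_a in points:
        col_b, _ = find_correspondence(img_a, img_b, row, col_a, half=half)  # cf. step S43
        if col_b >= 0:
            disparities.append(col_a - col_b)
    d_ref = sum(disparities) / len(disparities)           # cf. steps S44 and S45
    spread = max(disparities) - min(disparities)
    return d_ref, spread                                  # reference parallax and a rough range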
FIG. 14 is a flowchart showing an example of the operation of the height determination processing according to Embodiment 4. When the image data input unit 10 inputs a plurality of images to be subjected to the stereo matching processing (step S41), the reference point input unit 17 inputs data indicating the positions of the reference points within the images (step S42). A plurality of reference points may be selected.
The reference point setting unit 16 sends the positions of the input points to the stereo matching unit 11, and the stereo matching unit 11 performs stereo matching processing on the selected reference points and extracts the corresponding sets of points in the plural images (step S43). The reference point setting unit 16 calculates the heights of the reference points from the corresponding sets of selected points (step S44). The subsequent steps, from setting the reference parallax (step S45) to calculating the corresponding heights (step S48), are the same as steps S35 to S38 in FIG. 12.
According to the stereo image processing device 1 of Embodiment 4, in addition to the effects of Embodiment 3, points can be selected that give a reference parallax suited to the images and at which the stereo matching processing can be performed accurately. As a result, the stereo matching processing can be performed quickly and accurately.
(Embodiment 5)
FIG. 15 is a block diagram showing a configuration example of the stereo image processing device 1 according to Embodiment 5. The stereo image processing device 1 of Embodiment 5 receives from outside the reference points of Embodiment 3 together with the points that correspond to them under stereo matching. The stereo image processing device 1 of FIG. 15 includes a reference corresponding point input unit 18 in place of the reference point setting unit 16 of Embodiment 3 and the reference point input unit 17 of Embodiment 4.
The reference corresponding point input unit 18 inputs data indicating the position of a reference point within one image and the position of the point in the other image that corresponds to it under stereo matching. For example, the two images may be displayed on a display device (not shown) and pairs of corresponding points selected on the screen may be input, or the coordinates of the pairs of corresponding points in the two images may be input as data.
The operations of the reference parallax setting unit 12, the search range setting unit 13, and the stereo matching unit 11 are the same as in Embodiment 1.
In Embodiment 5, reference corresponding points considered appropriate can be set from the input images. For example, corresponding points that are considered suitable for the reference elevation and that are already matched between the images can be selected, such as aviation obstruction lights on high-rise buildings, feature points of steel towers, and feature points of other artificial features such as buildings.
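Because the operator supplies already-matched pairs, the reference parallax can be read off directly without any automatic matching of reference points; a minimal sketch, assuming the pair format and the simple averaging shown here, is:

def reference_from_pairs(pairs):
    """pairs: list of ((row_a, col_a), (row_b, col_b)) picked by the operator."""
    disparities = [col_a - col_b for (_, col_a), (_, col_b) in pairs]
    return sum(disparities) / len(disparities)    # reference parallax (cf. step S53)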
FIG. 16 is a flowchart showing an example of the operation of the height determination processing according to Embodiment 5. When the image data input unit 10 inputs a plurality of images to be subjected to the stereo matching processing (step S51), the reference corresponding point input unit 18 inputs data indicating the positions of reference corresponding points that match under stereo matching in two of those images (step S52). A plurality of reference corresponding points may be selected.
The subsequent steps, from setting the reference parallax (step S53) to calculating the corresponding heights (step S56), are the same as steps S35 to S38 in FIG. 12.
According to the stereo image processing device 1 of Embodiment 5, the reference parallax and the search range can be set by selecting corresponding points that give a reference parallax suited to the images and that are already matched between the images. As a result, the stereo matching processing can be performed quickly and accurately.
FIG. 17 is a block diagram showing an example of the physical configuration when the stereo image processing device 1 is implemented on a computer. The stereo image processing device 1 according to the present embodiments can be realized with a hardware configuration similar to that of a general computer. As shown in FIG. 17, the stereo image processing device 1 includes a control unit 21, a main storage unit 22, an external storage unit 23, an operation unit 24, a display unit 25, and an input/output unit 26. The main storage unit 22, the external storage unit 23, the operation unit 24, the display unit 25, and the input/output unit 26 are all connected to the control unit 21 via an internal bus 20.
The control unit 21 consists of a CPU (Central Processing Unit) or the like and executes the stereo matching processing in accordance with a control program 30 stored in the external storage unit 23.
The main storage unit 22 consists of a RAM (Random-Access Memory) or the like; the control program 30 stored in the external storage unit 23 is loaded into it, and it is used as the work area of the control unit 21.
The external storage unit 23 consists of a non-volatile memory such as a flash memory, a hard disk, a DVD-RAM (Digital Versatile Disc Random-Access Memory), or a DVD-RW (Digital Versatile Disc ReWritable). It stores in advance the control program 30 that causes the control unit 21 to perform the processing described above, supplies the data stored by this control program 30 to the control unit 21 in accordance with instructions from the control unit 21, and stores data supplied from the control unit 21.
The operation unit 24 consists of a keyboard, a pointing device such as a mouse, and an interface device connecting the keyboard and the pointing device to the internal bus 20. Through the operation unit 24, instructions such as inputting, transmitting, and receiving image data, the designation of the image to be displayed, and the positions within the images of the reference points used to set the reference elevation are input and supplied to the control unit 21.
The display unit 25 consists of a CRT (Cathode Ray Tube), an LCD (Liquid Crystal Display), or the like, and displays the images and the results of the stereo matching processing.
The input/output unit 26 consists of a wireless transceiver, a wireless modem or a network terminating device, and a serial interface or LAN (Local Area Network) interface connected to them. Through the input/output unit 26, image data can be received and calculated results can be transmitted.
The processing of the image data input unit 10, the stereo matching unit 11, the reference parallax setting unit 12, the search range setting unit 13, the map data input unit 14, the region division unit 15, the reference point setting unit 16, the reference point input unit 17, and the reference corresponding point input unit 18 of the stereo image processing device 1 shown in FIG. 1, 8, 11, 13, or 15 is executed by the control program 30 using the control unit 21, the main storage unit 22, the external storage unit 23, the operation unit 24, the display unit 25, the input/output unit 26, and so on as resources.
In addition, the following configurations are included as preferred modifications of the present invention.
In the stereo matching processing device according to the first aspect of the present invention, preferably, the reference parallax setting unit divides the plurality of images into two or more mutually corresponding regions and sets the reference parallax in each region, and the search range setting unit sets, as the stereo matching search range, a predetermined range smaller than the extent of the image, using as a basis the point in the image that gives the reference parallax set for each region.
Preferably, the device further includes a map data acquisition unit that acquires elevation data of a map corresponding to the plurality of images, and the reference parallax setting unit sets the reference parallax based on the elevation data acquired by the map data acquisition unit.
Alternatively, the reference parallax setting unit may calculate a parallax by stereo matching for points selected from the plurality of images by a predetermined method, and set the reference parallax based on the calculated parallax.
Further, the device may include a reference point input unit that acquires the positions, within the images, of points for which the parallax is to be obtained; the reference parallax setting unit may then calculate a parallax by stereo matching for the points acquired by the reference point input unit and set the reference parallax based on the calculated parallax.
Alternatively, the device may include a corresponding point input unit that inputs, from the plurality of images, sets of points that correspond under stereo matching, and the reference parallax setting unit may set the reference parallax based on the parallax given by the sets of corresponding points input by the corresponding point input unit.
In the stereo matching processing method according to the second aspect of the present invention, preferably, the reference parallax setting step divides the set of images to be subjected to the stereo matching processing into two or more mutually corresponding regions and sets the reference parallax in each region, and the search range setting step sets, as the stereo matching search range, a predetermined range smaller than the extent of the image, using as a basis the point in the image that gives the reference parallax set for each region.
Preferably, the method further includes a map data acquisition step of acquiring elevation data of a map corresponding to the plurality of images, and the reference parallax setting step sets the reference parallax based on the elevation data acquired in the map data acquisition step.
Alternatively, the reference parallax setting step may calculate a parallax by stereo matching for points selected from the plurality of images by a predetermined method, and set the reference parallax based on the calculated parallax.
Further, the method may include a reference point input step of acquiring the positions, within the images, of points for which the parallax is to be obtained; the reference parallax setting step may then calculate a parallax by stereo matching for the points acquired in the reference point input step and set the reference parallax based on the calculated parallax.
Alternatively, the method may include a corresponding point input step of inputting, from the plurality of images, sets of points that correspond under stereo matching, and the reference parallax setting step may set the reference parallax based on the parallax given by the sets of corresponding points input in the corresponding point input step.
The hardware configuration and flowcharts described above are examples, and arbitrary changes and modifications are possible.
The central portion that performs the processing of the stereo image processing device 1, consisting of the control unit 21, the main storage unit 22, the external storage unit 23, the operation unit 24, the input/output unit 26, the internal bus 20, and so on, can be realized using an ordinary computer system rather than a dedicated system. For example, a computer program for executing the operations described above may be stored and distributed on a computer-readable recording medium (a flexible disk, CD-ROM, DVD-ROM, or the like) and installed on a computer to configure a stereo image processing device 1 that executes the processing described above. Alternatively, the computer program may be stored in a storage device of a server on a communication network such as the Internet and downloaded by an ordinary computer system to configure the stereo image processing device 1.
When the functions of the stereo image processing device 1 are realized by dividing them between an OS (operating system) and an application program, or through cooperation between the OS and the application program, only the application program portion may be stored on the recording medium or in the storage device.
It is also possible to superimpose the computer program on a carrier wave and distribute it via a communication network. For example, the computer program may be posted on a bulletin board system (BBS) on a communication network and distributed via the network. The processing described above may then be executed by starting this computer program and running it under the control of the OS in the same way as other application programs.
This application claims priority based on Japanese Patent Application No. 2008-300221, and the entire specification, claims, and drawings of Japanese Patent Application No. 2008-300221 are incorporated herein by reference.
1 stereo image processing device
2 height calculation unit
10 image data input unit
11 stereo matching unit
12 reference parallax setting unit
13 search range setting unit
14 map data input unit
15 region division unit
16 reference point setting unit
17 reference point input unit
18 reference corresponding point input unit
21 control unit
22 main storage unit
23 external storage unit
24 operation unit
25 display unit
26 input/output unit
30 control program

Claims (13)

1. A stereo matching processing device comprising:
an image data acquisition unit that acquires image data of a plurality of images obtained by photographing a predetermined area from a plurality of different positions;
a reference parallax setting unit that sets a reference parallax corresponding to the plurality of images;
a search range setting unit that sets, as a stereo matching search range, a predetermined range smaller than the extent of the image, using as a basis the point in the image that gives the reference parallax set by the reference parallax setting unit; and
a search unit that, for an arbitrary point in one of the plurality of images, searches for the corresponding point in another image within the search range set by the search range setting unit, using as a basis the point in the other image that gives the reference parallax set by the reference parallax setting unit.
2. The stereo matching processing device according to claim 1, wherein the reference parallax setting unit divides the plurality of images into two or more mutually corresponding regions and sets the reference parallax in each region, and the search range setting unit sets, as the stereo matching search range, a predetermined range smaller than the extent of the image, using as a basis the point in the image that gives the reference parallax set for each region.
3. The stereo matching processing device according to claim 2, further comprising a map data acquisition unit that acquires elevation data of a map corresponding to the plurality of images, wherein the reference parallax setting unit sets the reference parallax based on the elevation data acquired by the map data acquisition unit.
4. The stereo matching processing device according to claim 2, wherein the reference parallax setting unit calculates a parallax by stereo matching for points selected from the plurality of images by a predetermined method, and sets the reference parallax based on the calculated parallax.
5. The stereo matching processing device according to claim 4, further comprising a reference point input unit that acquires the positions, within the images, of points for which the parallax is to be obtained, wherein the reference parallax setting unit calculates a parallax by stereo matching for the points acquired by the reference point input unit and sets the reference parallax based on the calculated parallax.
6. The stereo matching processing device according to claim 2, further comprising a corresponding point input unit that inputs, from the plurality of images, sets of points that correspond under stereo matching, wherein the reference parallax setting unit sets the reference parallax based on the parallax given by the sets of corresponding points input by the corresponding point input unit.
7. A stereo matching processing method comprising:
an image data acquisition step of acquiring image data of a plurality of images obtained by photographing a predetermined area from a plurality of different positions;
a reference parallax setting step of setting a reference parallax corresponding to the plurality of images;
a search range setting step of setting, as a stereo matching search range, a predetermined range smaller than the range of the image data, using as a basis the point in the image that gives the reference parallax set in the reference parallax setting step; and
a search step of, for an arbitrary point in one of the plurality of images, searching for the corresponding point in another image within the search range set in the search range setting step, using as a basis the point in the other image that gives the reference parallax set in the reference parallax setting step.
8. The stereo matching processing method according to claim 7, wherein the reference parallax setting step divides the set of images to be subjected to the stereo matching processing into two or more mutually corresponding regions and sets the reference parallax in each region, and the search range setting step sets, as the stereo matching search range, a predetermined range smaller than the extent of the image, using as a basis the point in the image that gives the reference parallax set for each region.
9. The stereo matching processing method according to claim 8, further comprising a map data acquisition step of acquiring elevation data of a map corresponding to the plurality of images, wherein the reference parallax setting step sets the reference parallax based on the elevation data acquired in the map data acquisition step.
10. The stereo matching processing method according to claim 8, wherein the reference parallax setting step calculates a parallax by stereo matching for points selected from the plurality of images by a predetermined method, and sets the reference parallax based on the calculated parallax.
11. The stereo matching processing method according to claim 10, further comprising a reference point input step of acquiring the positions, within the images, of points for which the parallax is to be obtained, wherein the reference parallax setting step calculates a parallax by stereo matching for the points acquired in the reference point input step and sets the reference parallax based on the calculated parallax.
12. The stereo matching processing method according to claim 8, further comprising a corresponding point input step of inputting, from the plurality of images, sets of points that correspond under stereo matching, wherein the reference parallax setting step sets the reference parallax based on the parallax given by the sets of corresponding points input in the corresponding point input step.
13. A computer-readable recording medium recording a program that causes a computer to function as:
an image data acquisition unit that acquires image data of a plurality of images obtained by photographing a predetermined area from a plurality of different positions;
a reference parallax setting unit that sets a reference parallax corresponding to the plurality of images;
a search range setting unit that sets, as a stereo matching search range, a predetermined range smaller than the extent of the image, using as a basis the point in the image that gives the reference parallax set by the reference parallax setting unit; and
a search unit that, for an arbitrary point in one of the plurality of images, searches for the corresponding point in another image within the search range set by the search range setting unit, using as a basis the point in the other image that gives the reference parallax set by the reference parallax setting unit.
PCT/JP2009/069888 2008-11-25 2009-11-25 Stereo matching process device, stereo matching process method, and recording medium WO2010061861A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020117011815A KR101260132B1 (en) 2008-11-25 2009-11-25 Stereo matching process device, stereo matching process method, and recording medium
CN2009801472168A CN102239503B (en) 2008-11-25 2009-11-25 Stereo matching process device, stereo matching process method, and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008-300221 2008-11-25
JP2008300221A JP5229733B2 (en) 2008-11-25 2008-11-25 Stereo matching processing device, stereo matching processing method and program

Publications (1)

Publication Number Publication Date
WO2010061861A1 true WO2010061861A1 (en) 2010-06-03

Family

ID=42225734

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/069888 WO2010061861A1 (en) 2008-11-25 2009-11-25 Stereo matching process device, stereo matching process method, and recording medium

Country Status (4)

Country Link
JP (1) JP5229733B2 (en)
KR (1) KR101260132B1 (en)
CN (1) CN102239503B (en)
WO (1) WO2010061861A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102650518A (en) * 2011-02-25 2012-08-29 株式会社理光 Measuring method and equipment
US9563937B2 (en) 2013-07-12 2017-02-07 Mitsubishi Electric Corporation High-resolution image generation apparatus, high-resolution image generation method, and high-resolution image generation program

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5780413B2 (en) * 2011-02-18 2015-09-16 国際航業株式会社 Deposit amount estimation method, deposit amount estimation diagram, and deposit amount estimation program
JP5769248B2 (en) * 2011-09-20 2015-08-26 Necソリューションイノベータ株式会社 Stereo matching processing device, stereo matching processing method, and program
US9374571B2 (en) 2011-10-11 2016-06-21 Panasonic Intellectual Property Management Co., Ltd. Image processing device, imaging device, and image processing method
CN104025153B (en) * 2011-12-30 2017-09-15 英特尔公司 It is thick to arrive thin multiple parallax candidate Stereo matchings
JP5698301B2 (en) * 2013-04-22 2015-04-08 株式会社パスコ Plotting system, plotting method, and program
KR101690645B1 (en) 2015-09-21 2016-12-29 경북대학교 산학협력단 Method for estimating of disparity search range applied multi-level disparity image partitioning and device for matching of stereo image using thereof
CN108377376B (en) * 2016-11-04 2021-01-26 宁波舜宇光电信息有限公司 Parallax calculation method, double-camera module and electronic equipment
KR102468897B1 (en) * 2017-10-16 2022-11-21 삼성전자주식회사 Method and apparatus of estimating depth value
CN111311667B (en) * 2020-02-14 2022-05-17 苏州浪潮智能科技有限公司 Content self-adaptive binocular matching method and device
KR102520189B1 (en) * 2021-03-02 2023-04-10 네이버랩스 주식회사 Method and system for generating high-definition map based on aerial images captured from unmanned air vehicle or aircraft

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002063580A (en) * 2000-08-22 2002-02-28 Asia Air Survey Co Ltd Inter-image expansion image matching method using indefinite shape window
JP2006113832A (en) * 2004-10-15 2006-04-27 Canon Inc Stereoscopic image processor and program

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4480212B2 (en) * 1999-11-05 2010-06-16 アジア航測株式会社 Calculation method of aerial photo position and orientation
CN1264062C (en) * 2002-12-31 2006-07-12 清华大学 Method of multi viewing angle x-ray stereo imaging and system
KR100739730B1 (en) * 2005-09-03 2007-07-13 삼성전자주식회사 Apparatus and method for processing 3D dimensional picture
JP4813263B2 (en) * 2006-06-05 2011-11-09 株式会社トプコン Image processing apparatus and processing method thereof


Also Published As

Publication number Publication date
KR20110087303A (en) 2011-08-02
JP5229733B2 (en) 2013-07-03
JP2010128622A (en) 2010-06-10
CN102239503A (en) 2011-11-09
KR101260132B1 (en) 2013-05-02
CN102239503B (en) 2013-11-13

Similar Documents

Publication Publication Date Title
WO2010061861A1 (en) Stereo matching process device, stereo matching process method, and recording medium
US11783543B2 (en) Method and system for displaying and navigating an optimal multi-dimensional building model
US8831335B2 (en) Stereo matching processing apparatus, stereo matching processing method and computer-readable recording medium
US8577139B2 (en) Method of orthoimage color correction using multiple aerial images
JP7003594B2 (en) 3D point cloud display device, 3D point cloud display system, 3D point cloud display method, 3D point cloud display program, recording medium
JP5311465B2 (en) Stereo matching processing system, stereo matching processing method, and program
CN113409459A (en) Method, device and equipment for producing high-precision map and computer storage medium
KR102204043B1 (en) System for automatic satellite image processing for improving image accuracy by position correcting of geographic feature
KR20040055510A (en) Ikonos imagery rpc data update method using additional gcp
CN112967344A (en) Method, apparatus, storage medium, and program product for camera external reference calibration
CN110889899A (en) Method and device for generating digital earth surface model
JP6396982B2 (en) Spatial model processor
CN111429548B (en) Digital map generation method and system
JP4592510B2 (en) 3D map image generating apparatus and method
CN107478235A (en) The dynamic map based on template obtains system under network environment
CN116805277B (en) Video monitoring target node pixel coordinate conversion method and system
CN108731644B (en) Oblique photography mapping method and system based on vertical auxiliary line
JP5126020B2 (en) Image processing apparatus, image processing method, and program
JP6876484B2 (en) Data processing equipment, data processing methods, and programs
JPH1132250A (en) Verbal guiding type sight labeling device and system
JPH1166355A (en) Transformational label type scene labeling device and system
JP2021173801A (en) Information processing device, control method, program, and storage medium
CN115657072A (en) Aerial real-time image-based space target positioning method
JPH1131238A (en) Differential landscape labeling device and system
JPH1131237A (en) Noise removal type landscape labeling device and system

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980147216.8

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09829109

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20117011815

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09829109

Country of ref document: EP

Kind code of ref document: A1