WO2008152740A1 - Digital aerial photographing three-dimensional measurement system - Google Patents


Info

Publication number
WO2008152740A1
Authority
WO
WIPO (PCT)
Prior art keywords
point
image
photo
camera
orientation
Prior art date
Application number
PCT/JP2007/062373
Other languages
French (fr)
Japanese (ja)
Inventor
Fumio Shinohara
Original Assignee
Information & Science Techno-System Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Information & Science Techno-System Co., Ltd. filed Critical Information & Science Techno-System Co., Ltd.
Priority to JP2009519130A priority Critical patent/JPWO2008152740A1/en
Priority to PCT/JP2007/062373 priority patent/WO2008152740A1/en
Publication of WO2008152740A1 publication Critical patent/WO2008152740A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G01C11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/593 Depth or shape recovery from multiple images from stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • The present invention relates to a system for photogrammetry performed from an aircraft using a digital camera as the imaging device.

Background art
  • Photogrammetry is a technique that calculates the three-dimensional coordinates of an object from information on the position and orientation of the camera when the object is photographed from multiple positions and angles.
  • Aerial photogrammetry calculates the latitude, longitude and altitude of features or terrain by photographing the direction directly below a flying aircraft, traditionally using chemical film as the photographic recording medium; digital aerial photogrammetry uses electronic media instead.
  • The image recorded on the imaging surface of the camera is a central projection of whatever falls within the camera's angle of view.
  • Along each ray of that projection, the first object the ray hits is what is recorded at the corresponding image point.
  • Conversely, if the optical system of the camera that took a photo and the position and orientation of the imaging surface at the time of shooting are known, the three-dimensional coordinates of each point on the photo can be recovered: each point lies somewhere on the straight line obtained by tracing the projection back through the optical system.
  • As described in Japanese Patent Application Laid-Open No. 2000-107-1089 (Patent Document 1), an invention is disclosed in which a position specifying unit specifies the position based on the amount of error obtained by comparing, at each point, the image data captured by an image pickup unit with the geographical data corresponding to the shooting range of the image pickup unit.
  • As described in Japanese Patent Application Laid-Open No. 2006-079521 (Patent Document 2), an invention is also disclosed in which, using a plurality of images of an object taken with a digital camera from points with different shooting positions together with three-dimensional coordinate information of the object, image coordinate information is obtained for the three-dimensional coordinates at predetermined positions of the object, and the RGB values corresponding to the obtained image coordinates are used to form an orthographic projection image of the object.
  • An invention is also disclosed in which an image from an arbitrary viewpoint is synthesized using images from cameras installed on a car, and road-surface index combining means combines the road-surface projection image with an index distinguishing objects other than the road surface, presenting the driver with objects that are easy to identify.
  • An invention is also disclosed comprising image input means for acquiring two stereo images whose horizontal axes are parallel projections; image matching means for calculating the similarity between the other image and a corrected image obtained by deforming a local region of one of the two images acquired by the image input means; and 3D point cloud generation means for generating a 3D point cloud based on the similarity calculated by the image matching means.
  • An invention has also been made public in which a reference image and a comparison image are associated with each other based on a predetermined correlation; the distance between adjacent windows in the reference image is reduced, or their overlap is cancelled; and stereo matching is performed between the reference image window and the comparison image window to restore the shape of the object.
  • The invention of Japanese Patent Laid-Open No. 2006-079521 (Patent Document 2) forms an orthographic projection image from the three-dimensional coordinate information of the object to be imaged; it does not calculate three-dimensional coordinates by photographing an object from multiple points with a digital camera.
  • The invention of Patent Document 3 displays the surroundings of a vehicle in an easy-to-understand manner by combining images taken with cameras mounted on the vehicle; it is not intended to calculate high-precision 3D coordinates by correcting the images.
  • The invention described in Japanese Patent Application Laid-Open No. 2006-195758 (Patent Document 4) fixes two cameras, photographs while moving in parallel, and performs stereo matching; it cannot handle the case where the position and posture of the camera are indefinite.
  • Moreover, its stereo matching is based on the correspondence between windows; it does not calculate three-dimensional coordinates with high accuracy by finding a correspondence for every point over the entire area where the two images overlap.
  • An object of the present invention is to provide a digital aerial photographing three-dimensional measurement system capable of calculating three-dimensional coordinates with high accuracy from images photographed with a digital camera.

Disclosure of the invention
  • The present invention is composed of: geometric correction means, which inputs to a computer the internal orientation elements based on the specifications of a digital camera together with a plurality of aerial photographs taken with that digital camera, transforms the coordinates on each aerial photograph to remove radial and tangential distortion using the internal orientation elements as coefficients, and stores the corrected photograph data in a storage device;
  • external orientation means, which reads the aerial photographs from the storage device, inputs to the computer the GPS data received when each aerial photograph was taken, and estimates the position and orientation of the imaging surface at each exposure;
  • deviation correction means, which stores in the storage device deviation-corrected photographs re-projected, on the basis of the position and orientation, into the state that would have been photographed in the direct downward direction;
  • stereo matching means, which reads the deviation-corrected photographs from the storage device, pairs photographs whose images overlap each other, finds where a plurality of points on one photograph appear on the other, and stores the resulting three-dimensional coordinate data in the storage device;
  • and orthorectification means, which reads the internal orientation elements, the aerial photographs, the positions and orientations, and the three-dimensional coordinates from the storage device, links each point on an aerial photograph to its three-dimensional coordinates by calculating the correspondence that holds when shooting with the above position, orientation and distortion, converts the image from a central projection to an orthographic projection, and outputs the data of the converted, corrected photographs from the computer. This is the composition of the digital aerial 3D measurement system.
  • In the tie point extraction and stereo matching of the photogrammetry, the frame used to acquire pixel values when matching images is square when the point is near the center of the image, and elongated in the radial direction when the point is near the outer edge of the image.
  • The frame so set is rotated according to the direction of camera movement, distorted according to the camera tilt, or scaled according to the camera altitude, and the pixel-value acquisition positions are adjusted to follow the frame shape.
  • The image matching method is characterized in that each pixel value is calculated from the surrounding pixels based on the distance from their centers. In the camera posture estimation performed using the coplanarity and collinearity conditions of the photogrammetric external orientation for overlapping shooting ranges, two consecutive photographs along the camera's direction of travel are used together with a photograph taken at a position whose shooting range overlaps theirs in the direction orthogonal to the direction of travel.
  • Fig. 1 is a flowchart showing the flow of processing of the digital aerial 3D measurement system of the present invention.
  • Fig. 2 shows the processing of the digital aerial 3D measurement system of the present invention, and the relationship between data and equipment.
  • Fig. 3 shows the configuration of the digital aerial 3D measurement system computer of the present invention, and
  • Fig. 4 shows the deformation of the figure in the image correlation method of the digital aerial 3D measurement system of the present invention.
  • Fig. 5 is a diagram showing the deformation of the figure in the image correlation method of the digital aerial photography 3D measurement system according to the present invention, and
  • Fig. 6 shows the pixel value acquisition of the digital aerial photography 3D measurement system according to the present invention,
  • Figs. 7 and 8 show sets of images for estimating the position and orientation in the 3D measurement system,
  • Fig. 9 shows the deviation correction method of the digital aerial 3D measurement system of the present invention,
  • Fig. 10 shows a sample deviation-corrected photograph of the digital aerial 3D measurement system of the present invention,
  • Fig. 11 shows the stereo matching method of the digital aerial 3D measurement system of the present invention,
  • Fig. 12 is a perspective view of a consumer digital camera used in the digital aerial 3D measurement system of the present invention,
  • Fig. 13 shows a consumer digital camera used in the digital aerial 3D measurement system of the present invention, and
  • Fig. 14 is a front view of a consumer digital camera used in the digital aerial 3D measurement system of the present invention.
  • FIG. 1 is a flowchart showing the flow of processing of the digital aerial photography 3D measurement system according to the present invention.
  • FIG. 2 is a diagram showing the relationship between the processing, data and apparatus of the digital aerial three-dimensional measurement system according to the present invention.
  • FIG. 3 is a diagram showing a configuration of a digital aerial imaging three-dimensional measurement system computer according to the present invention.
  • The digital aerial 3D measurement system 1 consists of the means of geometric correction 2, linkage analysis 3, tie point extraction 4, external orientation 5, displacement correction 6, stereo matching 7 and orthorectification 8.
  • Each photo is shot in synchronization with the GPS signal, but only a single reference photo determines which photo specifically corresponds to which GPS record; the remaining photos are collated with the GPS records by analysis based on that reference.
  • In tie point extraction 4, data for external orientation 5 is generated: characteristic points shared between images whose shooting ranges partly overlap are found using image matching technology, and their coordinates on each image are saved as data. In external orientation 5, the position and orientation of the camera's imaging surface at the time each photo was captured are estimated with high accuracy from the position data measured by GPS and the photos taken by the camera.
  • The initial condition is the assumption that each photograph was taken from the position recorded by GPS. The position and orientation of the imaging surface are then calculated with a solution method such as Newton's method so that the twist between rays, which are obtained by tracing the optical system back from each tie point 4a and should originally lie in the same plane, is minimized.
  • Ground reference information can also be used for this estimation, but it is not essential.
  • In displacement correction 6, each photograph is re-projected to the image that would have been obtained had it been taken in the direct downward direction. Specifically, using the results of external orientation 5 for each photo, the coordinates of each point on the image are converted to the positions onto which they would be projected when shooting straight down.
  • In stereo matching 7, the correspondence between the points on each photo is found (determining where a point on one photo appears on the other), and the coordinates are obtained by deriving the intersection of the straight lines obtained by tracing the optical system back from the corresponding points. Since the photographs are raster images, this processing finds the correspondence for each pixel.
  • Because each photo is a central projection, when a deep (high-altitude) object appears near the outer edge of the photo, an area hidden in its shadow (relief displacement, or "falling down") occurs. Because this hidden range varies with the shooting angle, it differs from photo to photo, and stereo matching 7 cannot be performed on it. Therefore, to obtain 3D information there, the occluded area in each photo must be made up with photos from other positions or angles.
  • Orthorectification 8 reprojects the central projection into an orthographic projection.
  • Since the photo is a central projection, the coordinates of each point on the photo are obtained, each point is mapped to those coordinates, and the depth (altitude) axis is then collapsed so that only the horizontal and vertical axes remain.
  • In this way, an orthographically projected photo is obtained.
  • If 3D data is generated from the coordinates obtained as the result of stereo matching 7 and the photo produced by orthorectification 8 is used as its texture, a 3D photographic map can be obtained.
  • The geometric correction 2 means inputs the data of the internal orientation element 9b and the aerial photograph 10a, and outputs the data of the geometrically corrected photograph 2a.
  • From the internal orientation element 9b, a radial distortion equation and a tangential distortion equation are determined, and their inverse transformation is applied to the aerial photograph 10a.
  • That is, the process transforms the coordinates on the image by the inverse transformation of the distortion and resamples the result into a raster image by an interpolation method such as cubic interpolation.
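As a concrete illustration, the inverse distortion transform described above can be sketched as follows. This is a minimal sketch, not the patented implementation: it assumes the common Brown-Conrady model with hypothetical radial coefficients k1, k2 and tangential coefficients p1, p2 on normalized image coordinates, and inverts the forward distortion by fixed-point iteration.

```python
import numpy as np

def undistort_points(xy, k1, k2, p1, p2):
    """Remove radial (k1, k2) and tangential (p1, p2) distortion from
    normalized image coordinates with the Brown-Conrady model, inverting
    the forward distortion by fixed-point iteration.  Coefficient names
    are illustrative, not taken from the patent."""
    xy = np.asarray(xy, dtype=float)
    und = xy.copy()                      # start from the distorted position
    for _ in range(10):                  # a few iterations suffice for mild lenses
        x, y = und[..., 0], und[..., 1]
        r2 = x * x + y * y
        radial = 1 + k1 * r2 + k2 * r2 * r2
        dx = 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
        dy = p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
        und = np.stack([(xy[..., 0] - dx) / radial,
                        (xy[..., 1] - dy) / radial], axis=-1)
    return und
```

After undistorting a grid of target coordinates, the corrected raster would be filled by interpolation (the text mentions cubic interpolation); that resampling step is omitted here.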
  • The data of the internal orientation element 9b is produced by the internal orientation 9 with the data of the camera specification 9a as input.
  • The data of the aerial photographs 10a are electronic image files in the output format of the consumer digital camera 14. Each photo is taken so that 60% or more of its shooting range overlaps the previous and next photos, and so that, on at least one pair of non-consecutive flight routes (i.e. crossing or parallel paths), photos overlap each other's shooting range by 30% or more.
  • Geometrically corrected photo 2a data is an electronic image file obtained by removing the distortion inherent in the optical system of consumer digital camera 14 from each of aerial photo 10a data.
  • The data of the camera specification 9a is a set of data giving the nominal attributes of the consumer digital camera 14 and its lens, that is, the principal point position, focal length, pixel size, pixel spacing, and so on.
  • The work of internal orientation 9 measures the distortion of the optical system of the consumer digital camera 14, that is, the error between the ideal central projection determined by the camera specification 9a and the actual projection, and determines the parameters to correct it.
  • The central projection in photography is ideally determined by the camera's principal point position, focal length, pixel dimensions and pixel spacing, but that determination itself must rely on approximate conditions.
  • Since the actual lens is not ideal, distortion aberration and the like must also be measured, and the parameters that correct the error from the ideal central projection determined.
  • The consumer digital camera 14 and the aviation GPS receiver 16 are installed on the aircraft via the mounting and control pedestal 15.
  • Using the functions of the mounting and control pedestal 15, a shutter signal synchronized with the recording of the aviation GPS receiver 16 is sent to the consumer digital camera 14 while the camera is held in a position facing directly downward, and photos are taken from the aircraft with the functions of the consumer digital camera 14.
  • The consumer digital camera 14 is a commercially available single-lens reflex digital camera or the like; it has an interface that accepts a shutter signal from the outside, and it is attached to the mounting and control pedestal 15.
  • the mounting part is versatile and can be replaced when a higher-performance camera appears.
  • The mounting and control pedestal 15 is a pedestal mounted on the aircraft that carries the consumer digital camera 14 and the aviation GPS receiver 16.
  • It has a mechanism that absorbs vibration from the aircraft and tilts the mounting part with a motor so that the consumer digital camera 14 always faces directly below. It also has a function to receive the signal from the aviation GPS receiver 16 and record it in built-in memory, and at the same time to send a shutter signal to the consumer digital camera 14 in synchronization with the signal reception.
  • The aviation GPS receiver 16 is a receiver for three-dimensional positioning of the aircraft based on the arrival times of the time-signal radio waves emitted from satellites.
  • Its electronic board is installed in the mounting and control pedestal 15; only the antenna is attached at an appropriate place outside the aircraft, or inside the aircraft where radio waves are easily received, and the antenna and board are wired with a cable.
  • The means of continuity analysis 3 inputs the data of aerial photograph 10a and GPS data 12a, and outputs the data of camera position 3a.
  • First, a reference photo among the aerial photographs 10a and the record of GPS data 12a corresponding to it are entered, and that photo is linked to the GPS record. Then one photo not yet linked to a GPS record is selected from the aerial photographs 10a adjacent to a photo that is already linked; several points are selected on one photo, their locations on the other photo are found by image matching using the image correlation method, and the overlap between the two images is determined.
  • The difference between the shooting positions is converted into an actual distance and compared with the latitude and longitude information of GPS data 12a, and the record of GPS data 12a that should correspond to the geographic coordinates of the other photo is connected to it. This is repeated recursively, and the GPS record corresponding to each photograph of the aerial photographs 10a is taken out of GPS data 12a and connected to it, giving camera position 3a.
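The scale conversion used above (an image-space shift converted to a ground distance, then compared with the GPS positions) might be sketched as below. All parameter names (pixel pitch, focal length, altitude) and the flat-terrain, nadir-view geometry are simplifying assumptions for illustration only.

```python
import math

def ground_offset_m(shift_px, pixel_pitch_m, focal_length_m, altitude_m):
    """Convert an image-space shift between two overlapping nadir photos into
    a ground distance using the central-projection scale altitude/focal length."""
    return shift_px * pixel_pitch_m * altitude_m / focal_length_m

def nearest_gps_record(records, ref_xy, expected_dist_m):
    """Pick the GPS record whose horizontal distance from the reference
    exposure position best matches the distance implied by the image shift.
    `records` is a list of (record_id, x, y) in a metric map projection."""
    def err(rec):
        d = math.hypot(rec[1] - ref_xy[0], rec[2] - ref_xy[1])
        return abs(d - expected_dist_m)
    return min(records, key=err)
```

For example, a 1000-pixel shift at 6 µm pixel pitch, 50 mm focal length and 1000 m flying height corresponds to 120 m on the ground, which is then matched against candidate GPS records.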
  • The GPS data 12a is obtained by improving the accuracy of the aerial GPS data 10b, in which the flight path was recorded at the time of aerial photography 10, by performing the GPS analysis 12.
  • It includes data such as GPS time, geographic coordinates, and moving speed.
  • The data of camera position 3a links, to each photo of the aerial photographs 10a, the record of GPS data 12a (GPS time, geographic coordinates, movement speed, etc.) corresponding to the moment of shooting.
  • The GPS analysis 12 calculates the flight path of aerial photography 10 more accurately by performing RTK-GPS analysis using the aerial GPS data 10b as mobile-station (rover) data and the fixed-point GPS data 11a as base-station data.
  • The aerial GPS data 10b is the data recorded by the aviation GPS receiver 16 along the flight route while aerial photography 10 is carried out. It includes data such as GPS time, geographic coordinates, travel speed, and carrier waves.
  • The fixed-point GPS data 11a is the data observed by the fixed-point GPS receiver 17, installed at a point whose geographical coordinates are known in advance, during the same period as the aerial photography or a period including it, together with the previously known geographical coordinates of that point. It includes data such as GPS time and carrier waves.
  • For fixed-point positioning 11, the fixed-point GPS receiver 17 is installed at a point whose geographical coordinates are known in advance and which is close to the flight path of aerial photography 10 (preferably with the entire flight path within a radius of 10 km), and positioning is performed during the same period as aerial photography 10 or a period including it. If an electronic reference point managed by the Geospatial Information Authority of Japan exists near the flight path of aerial photography 10, observations using that electronic reference point satisfy the requirements of fixed-point positioning 11, and the data provided from that electronic reference point can be used as fixed-point GPS data 11a.
  • The fixed-point GPS receiver 17 is a receiver installed at a location whose geographical coordinates are known in advance; it records the carrier wave at those coordinates based on the arrival times of the time-signal radio waves emitted from satellites.
  • The tie point extraction 4 means inputs the data of the geometrically corrected photograph 2a and the camera position 3a, and outputs the data of tie point 4a. The internal orientation element 9b can also be input. Based on the positional relationship of the camera positions 3a, geometrically corrected photographs 2a that overlap each other are paired. Whether two photos overlap is determined by whether the distance between the camera positions 3a corresponding to the photos is within a threshold given from outside. When the internal orientation element 9b is input, this threshold is calculated by estimating the size of the shooting range from the angle of view in the internal orientation element 9b and the altitude of the camera position 3a. Next, for each pair, several points are selected on one photo, their locations on the other photo are found by image matching using the image correlation method, and the coordinates in each image are recorded as correspondences.
  • The data of tie point 4a records, for each pair of adjacent, mutually overlapping photos of the geometrically corrected photos 2a, the coordinates of each common point in both images.
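The overlap test described above (a distance threshold estimated from the angle of view and the flying height) could look like the sketch below; the 30% minimum overlap and the simple flat-terrain footprint geometry are illustrative assumptions, not values fixed by the patent.

```python
import math

def overlap_threshold_m(fov_deg, altitude_m, min_overlap=0.3):
    """Camera-to-camera distance below which two nadir exposures share at
    least `min_overlap` of the footprint width (flat terrain assumed)."""
    footprint = 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)
    return (1 - min_overlap) * footprint

def pair_photos(positions, threshold_m):
    """Return index pairs of camera positions closer than the threshold."""
    pairs = []
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            dx = positions[i][0] - positions[j][0]
            dy = positions[i][1] - positions[j][1]
            if math.hypot(dx, dy) < threshold_m:
                pairs.append((i, j))
    return pairs
```

With a 90° field of view at 1000 m altitude the footprint is about 2000 m wide, so exposures closer than 1400 m would be paired under a 30% overlap requirement.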
  • The external orientation 5 means inputs the data of the geometrically corrected photo 2a, camera position 3a, tie point 4a and internal orientation element 9b, and outputs the data of position and orientation 5a.
  • The data of GCP data 13b can also be input.
  • For each tie point, the central projection is traced back using the internal orientation element 9b, and the ray incident on the imaging surface is calculated.
  • This ray and the ray incident at the corresponding point on the other photograph should lie in the same plane. Therefore, using a solution method such as Newton's method, the position and orientation of each imaging surface are calculated so that the twist between corresponding rays is minimized, and the result is used as the position and orientation information of the imaging surface at the time of each photograph.
  • When GCP data 13b is input, the incident ray is similarly calculated for each point on a photograph corresponding to a point of GCP data 13b, and with the same solution method the position and orientation of each imaging surface are calculated so that the distance between each such ray and the geographic coordinates associated with its point is simultaneously minimized.
  • Beforehand, the same preprocessing as geometric correction 2 is applied to GCP data 13b using the internal orientation element 9b.
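The "same plane" condition on a pair of corresponding rays is photogrammetry's coplanarity condition. A sketch of its residual is shown below; an iterative solver such as the Newton method mentioned in the text would adjust position and orientation to drive these residuals toward zero over all tie points. The camera centers and ray directions are hypothetical inputs for illustration.

```python
import numpy as np

def coplanarity_residual(c1, c2, ray1, ray2):
    """Scalar triple product of the camera baseline and the two image rays.
    It is exactly zero when the rays to a common tie point lie in one
    epipolar plane; nonzero values measure the 'twist' between the rays."""
    baseline = np.asarray(c2, float) - np.asarray(c1, float)
    return float(np.dot(baseline, np.cross(np.asarray(ray1, float),
                                           np.asarray(ray2, float))))
```

In practice the sum of squared residuals over all tie points would be minimized with Newton or Gauss-Newton iterations, starting from the GPS-recorded positions as the initial condition.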
  • The data of position and orientation 5a gives the position and orientation of the camera imaging surface at the moment each photograph of the geometrically corrected photos 2a was taken.
  • The GCP data 13b combines, for each photo of the aerial photographs 10a that contains a point of GCP information 13a, the coordinates of the point on the image and its geographical coordinates.
  • The processing of GCP designation 13 finds, from the description of the ground reference point in GCP information 13a, the corresponding location on the aerial photograph 10a image, and pairs the geographic coordinates of the ground reference point (latitude, longitude, altitude) with its image coordinates.
  • GCP information 13a is data on ground reference points recorded in the aerial photographs 10a that are easy to interpret on them (with scenery photos or similar clues for image interpretation), together with the geographic coordinates of each point.
  • The displacement correction 6 means inputs the data of the internal orientation element 9b, the aerial photograph 10a and the position and orientation 5a, and outputs the data of the displacement-corrected photo 6a. Based on the position and orientation of position and orientation 5a, each photo is re-projected into the shooting state it would have had with the camera facing directly below.
  • Since the geometrically corrected photograph 2a is an image that has already been resampled once, reprocessing it could degrade image definition.
  • The processing here therefore starts again from the aerial photograph 10a: the coordinates on the image are converted by the inverse transformation of the distortion, the tilt of the position and orientation is cancelled, the coordinates are converted to the direct downward direction, and the result is resampled into a raster image by an interpolation method such as cubic interpolation.
  • The data of the displacement-corrected photo 6a are electronic image files obtained by projecting each of the geometrically corrected photos 2a into the photo that would have been taken in the direct downward direction.
  • In practice each file is generated by reprocessing from the aerial photo 10a so that the inherent distortion of the optical system is removed.
  • The stereo matching 7 means inputs the data of the internal orientation element 9b, the displacement-corrected photo 6a and the position and orientation 5a, and outputs the data of the three-dimensional coordinate 7a.
  • A pair of displacement-corrected photos is selected based on the position information of position and orientation 5a.
  • The image correlation method is used to find where each point on one photo appears on the other photo. Then, for each point on each photo, the straight line obtained by tracing the central projection back into real space is calculated based on the position and orientation of position and orientation 5a.
  • The straight line extending from a certain point in one photo and the straight line extending from the corresponding point in the other photo are at a distance of zero, or very nearly zero, from each other; the midpoint of the shortest segment between them is taken as the geographic coordinate of that point. The correspondence between the coordinates on the image and the geographic coordinates obtained in this way is recorded.
  • The data of the three-dimensional coordinate 7a associates, for each of the displacement-corrected photos 6a, the coordinates on the image and the geographic coordinates of each point shared with the other photo.
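The midpoint construction described above (the point halfway along the shortest segment between two back-projected rays) can be sketched as a small least-squares solve. The ray parameterization c + t·d and the function name are assumptions of this illustration.

```python
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Midpoint of the shortest segment between two rays c + t*d: the point
    reported as the geographic coordinate when the two back-projected rays
    nearly, but not exactly, intersect."""
    c1, d1, c2, d2 = (np.asarray(v, float) for v in (c1, d1, c2, d2))
    # Normal equations for min over (t1, t2) of |(c1 + t1*d1) - (c2 + t2*d2)|^2
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return 0.5 * ((c1 + t1 * d1) + (c2 + t2 * d2))
```

When the two rays truly intersect (distance zero), the midpoint coincides with the intersection point; otherwise it splits the residual skew evenly between the two photos.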
  • The orthorectification 8 means inputs the data of the internal orientation element 9b, the aerial photograph 10a and the three-dimensional coordinate 7a, and outputs the corrected photo 8a.
  • The projection from each point is calculated as the correspondence between image coordinates and geographic coordinates.
  • Geographic coordinates are thus connected to each point on the image of the aerial photograph 10a; based on these geographic coordinates, each image coordinate is converted from a central projection to an orthogonal projection, and the converted coordinates are resampled into a raster image by an interpolation method such as cubic interpolation.
  • The data of the corrected photo 8a are electronic image files obtained by projecting the range shown in each of the aerial photos 10a into an orthogonal projection.
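A heavily simplified sketch of the central-to-orthographic conversion: each pixel, carrying ground (x, y) coordinates from stereo matching, is dropped onto a regular ground grid. Nearest-neighbour assignment here is an illustrative shortcut for the cubic interpolation mentioned in the text, and unfilled cells are simply left at zero.

```python
import numpy as np

def orthorectify(image, geo_xy, grid_x, grid_y):
    """Map each input pixel, via its ground (x, y) from stereo matching,
    to the nearest cell of a regular ground grid, collapsing the altitude
    axis so only horizontal and vertical axes remain."""
    out = np.zeros((len(grid_y), len(grid_x)), dtype=image.dtype)
    step_x = grid_x[1] - grid_x[0]
    step_y = grid_y[1] - grid_y[0]
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            x, y = geo_xy[i, j]
            cx = int(round((x - grid_x[0]) / step_x))
            cy = int(round((y - grid_y[0]) / step_y))
            if 0 <= cx < len(grid_x) and 0 <= cy < len(grid_y):
                out[cy, cx] = image[i, j]
    return out
```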
  • Each means of the imaging 3D measurement system 1 is realized by a computer 18. As shown in FIG. 3, the computer 18 includes an input device 18 a, a central processing device 18 b, a storage device 18 c, and an output device 18 d.
  • The input device 18a is a device that takes input data 18e from the outside into the computer 18. Input data 18e is stored in accordance with instructions from the central processing unit 18b.
  • Input data 18e comprises the internal orientation element 9b, the aerial photographs 10a, the aerial GPS data 10b, and the like.
  • The central processing unit 18b is a device that executes and controls commands by analyzing instructions.
  • The storage device 18c is a device that records programs, input data 18e, internal data 18f, output data 18g, and the like, and may be constructed as a database.
  • The internal data 18f is data created in the computer 18 during processing: the geometrically corrected photo 2a, camera position 3a, tie point 4a, position and orientation 5a, displacement-corrected photo 6a, 3D coordinate 7a, and so on.
  • The output device 18d is a device for taking output data 18g from the computer 18 to the outside.
  • The output data 18g is called from the storage device 18c by instruction of the central processing unit 18b.
  • The output data 18g is data such as the corrected photo 8a; it is output by means such as display on a screen, printing on paper, or recording on a recording medium.
  • FIG. 4 and FIG. 5 are diagrams showing the deformation of a figure in the image correlation method of the digital aerial photography three-dimensional measurement system according to the present invention.
  • FIG. 6 is a diagram showing the acquisition of pixel values of the digital aerial imaging three-dimensional measurement system according to the present invention.
  • image matching is a technique that detects, for features appearing on one of two or more digital images, where they appear on the other.
  • in photogrammetry, for each point on each photo recorded as a raster image, the corresponding point on the other photo is detected using the image correlation method.
  • this method is used by the tie-point extraction means 4 and the stereo matching means 7.
  • a frame B of the same size is placed on the other image B, the pixel values of the points included in it are acquired, and the correlation coefficient with the pixel values in frame A acquired previously is computed. While frame A is fixed on image A, frame B is moved little by little over image B, and the correlation coefficient is computed at each position. When the correlation coefficient is highest, the center point of frame B is considered to be the point that coincides with the center point of frame A on image A.
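The frame search just described is essentially normalized cross-correlation template matching. A minimal sketch (not the patent's implementation; the window half-size and search radius are illustrative choices):

```python
import numpy as np

def ncc(wa, wb):
    """Normalized cross-correlation coefficient between two equal-size windows."""
    a = wa - wa.mean()
    b = wb - wb.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def match_point(img_a, img_b, ya, xa, half=3, search=5):
    """Find the point in img_b whose window best correlates with the window
    centered at (ya, xa) in img_a. Frame A stays fixed; frame B is moved
    over a brute-force search area. Returns (yb, xb, best_cc)."""
    wa = img_a[ya - half:ya + half + 1, xa - half:xa + half + 1]
    best = (ya, xa, -1.0)
    for yb in range(ya - search, ya + search + 1):
        for xb in range(xa - search, xa + search + 1):
            wb = img_b[yb - half:yb + half + 1, xb - half:xb + half + 1]
            if wb.shape != wa.shape:
                continue  # frame B fell partly outside image B
            cc = ncc(wa, wb)
            if cc > best[2]:
                best = (yb, xb, cc)
    return best
```

In practice the search range would be constrained by the approximate camera geometry rather than scanned exhaustively.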
  • the shape of the frame 20 a placed on the image 19 is deformed.
  • the frame 20a is distorted according to the camera's inclination from the nadir direction.
  • the tilt information used at this time can be input from the outside, but if it is not input, it is treated as having no tilt.
  • the frame 20 a is enlarged or reduced according to the camera height.
  • the altitude information used at this time can be obtained from the camera position 3a.
  • the position for obtaining the pixel value 20b for calculating the correlation coefficient is also changed.
  • the pixel values 20b are obtained from the center of each pixel, and the position of each center point is changed according to the deformation of the frame 20a.
  • when the changed acquisition position falls on a pixel center, the pixel value 20b of that pixel is used as it is.
  • otherwise, the value obtained by interpolating the pixel values 20b of the four nearest pixels surrounding the acquisition position, weighted by the distances from their centers to the acquisition position, is used.
  • the equation used for interpolation follows an interpolation method such as cubic interpolation.
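As a concrete illustration of interpolating from the four nearest pixel centers, here is a bilinear sampler (bilinear rather than the cubic interpolation the text mentions, for brevity; the function name is ours):

```python
import numpy as np

def sample_bilinear(img, y, x):
    """Pixel value at a non-integer (y, x) position, interpolated from the
    four nearest pixel centers, weighted by distance along each axis."""
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1 = min(y0 + 1, img.shape[0] - 1)
    x1 = min(x0 + 1, img.shape[1] - 1)
    fy, fx = y - y0, x - x0  # fractional offsets within the cell
    return ((1 - fy) * (1 - fx) * img[y0, x0] +
            (1 - fy) * fx * img[y0, x1] +
            fy * (1 - fx) * img[y1, x0] +
            fy * fx * img[y1, x1])
```

When the acquisition position coincides with a pixel center, the weights collapse and the raw pixel value is returned, matching the rule stated above.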
  • FIG. 7 shows the coplanar condition. The coplanar condition refers to the property that the two points on the imaging surfaces corresponding to the same object point in two photographs, together with the two lens principal points, lie on the same plane.
  • for two photographs whose imaging ranges overlap, the point 23a on imaging surface 23 corresponding to the point 22a on imaging surface 22 is searched for using the image correlation method.
  • the geographic coordinates of points 22a and 23a can be expressed by equations using the camera position and orientation as variables.
  • point 22a, lens principal point 22b, point 23a, and lens principal point 23b lie on the same plane. If many pairs of corresponding points between the two photographs are found, combining these equations and conditions makes it possible to estimate and eliminate the error contained in the position and orientation.
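The coplanar condition can be written as a scalar triple product: the baseline between the two lens principal points and the two rays toward the object point span zero volume when they are coplanar. A sketch (function name and inputs are ours; in practice the rays are parameterized by the unknown orientations being estimated):

```python
import numpy as np

def coplanarity_residual(o1, r1, o2, r2):
    """Scalar triple product of the baseline (o2 - o1) and the two rays
    r1, r2. Zero exactly when ray (o1, r1) and ray (o2, r2) lie in one
    plane, i.e. when the coplanar condition holds."""
    b = np.asarray(o2, float) - np.asarray(o1, float)
    return float(np.dot(b, np.cross(r1, r2)))
```

Exterior orientation would adjust the orientation parameters so that this residual is minimized over many corresponding point pairs.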
  • FIG. 7 also shows the collinear condition.
  • the collinear condition is the property that a point in the imaging range, the point on the imaging surface corresponding to it, and the lens principal point lie on the same straight line.
  • the estimation method based on the coplanar condition can remove variations in position and orientation information between images, but if all the images contain the same common error, it cannot be removed.
  • the error derived from the degree of freedom of rotation about the straight line connecting the lens principal points of the two photographs to which the coplanar condition is applied may remain large. Therefore, in the conventional method, a point visible in one of the photographs whose geographic coordinates are known is used as a ground control point, and the ground control point 24c in the shooting range 24b is recorded as point 24a on the imaging surface 24.
  • using the collinear condition for the recorded point 24a and the lens principal point 24d, the error contained in the position and orientation information of the photograph is estimated and removed.
  • FIG. 8 is a diagram showing a set of images in position and posture estimation of the digital aerial imaging three-dimensional measurement system according to the present invention.
  • the two consecutive images 25 and 25a along the traveling direction, and an image 25b taken at a position orthogonal to the traveling direction, are combined so that their shooting ranges overlap.
  • the position and orientation of each imaging surface are calculated so that the twist of the corresponding rays among the three pairs is minimized; this equals the position and orientation of the imaging surface at the time each photograph was taken.
  • for these three coplanar conditions, the degrees of freedom of rotation around the straight lines connecting the lens principal points cancel each other out.
  • the error resulting from this freedom is negligible. Therefore, the camera position and orientation information can be estimated with very high accuracy without using the collinear condition.
  • applying the coplanar condition to that image also adjusts the position and orientation information of that image.
  • the position and orientation information of the original three images can be estimated with high accuracy.
  • the number of images subject to estimation can be increased inductively: when a group of three or more images satisfies the conditions for estimation by this method, any image not in the group whose shooting range overlaps with an image in the group can also be added to the group and estimated by this method.
  • displacement correction 6 is applied to each photograph.
  • the central projection photograph tilted by the camera tilt at the time of taking a picture is re-projected to the central projection in the direct downward direction.
  • FIG. 9 is a diagram showing a deviation correction method of the digital aerial imaging three-dimensional measurement system according to the present invention.
  • the parallels and meridians locally form square cells, but in a photo taken with a tilted camera 26 the cells are distorted.
  • the image is corrected so that the cells become square, generating a displacement-corrected photo 27.
  • the imaging surface 26a, tilted with respect to the imaging range 26b, is rotated around the lens principal point 26c so that it becomes horizontal, while the light rays from the imaging range 26b remain fixed.
  • the position of each point is then moved to the position where its corresponding light ray intersects the rotated imaging surface 27a.
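Geometrically, each image point defines a ray through the lens principal point, and its rectified position is where that ray meets the leveled imaging surface. A sketch under assumed conventions (camera looks along -z, image planes at distance f from the principal point; the rotation R and function name are ours):

```python
import numpy as np

def rectify_point(p_img, R, f):
    """Re-project an image point (x, y) on the tilted imaging surface onto
    the leveled, nadir-looking surface. R is the 3x3 rotation taking
    tilted-camera axes to leveled-camera axes; both surfaces sit at
    focal distance f from the shared lens principal point."""
    ray = R @ np.array([p_img[0], p_img[1], -f])  # ray direction in leveled frame
    scale = -f / ray[2]                           # intersect the plane z = -f
    return ray[0] * scale, ray[1] * scale
```

With no tilt (R = identity) the point is unchanged; a pitch of angle t shifts the image center by f*tan(t), as expected from the geometry.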
  • FIG. 10 is a diagram showing a resample of an image subjected to displacement correction in the digital aerial photography three-dimensional measurement system according to the present invention.
  • resampling is performed so that the directions of the edges of the images to be stereo matched are aligned.
  • the image is rotated (29a) so that the sides of the resampled image align with the aircraft's direction of travel 28a at the time of each photograph.
  • the position and orientation estimation method described above is used to obtain the position and orientation of the camera when the photographs were taken. Then, for each point on one image, the corresponding point is found in the other image by the image correlation method.
  • FIG. 11 is a diagram showing a stereo-matching method of the digital aerial imaging three-dimensional measurement system according to the present invention.
  • by tracing the optical system in reverse from the point 30a on one imaging surface 30, the straight line 30d that extends through the lens principal point 30b to the shooting range 30c is uniquely determined.
  • similarly, by tracing the optical system in reverse, the equation of the straight line 31d extending from the corresponding point 31a on the other imaging surface 31 through the lens principal point 31b to the imaging range 31c is also uniquely determined.
  • since point 30a and point 31a are projections of the same object point by different central projection systems, line 30d and line 31d should intersect, and solving the equations of lines 30d and 31d simultaneously yields the coordinates of the intersection, that is, the three-dimensional coordinates 32. If lines 30d and 31d do not intersect, the midpoint of the shortest segment between the two lines is found and used as the 3D coordinates 32.
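The intersection-or-midpoint rule above is the standard closest-point triangulation of two rays. A sketch (function name is ours; parallel rays, where the denominator vanishes, are not handled):

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    """3D coordinate from two rays (origin o, direction d): the point where
    they intersect, or the midpoint of the shortest segment between them
    when they are skew. Solves for the parameters t1, t2 minimizing
    |(o1 + t1*d1) - (o2 + t2*d2)|."""
    o1, d1, o2, d2 = (np.asarray(v, float) for v in (o1, d1, o2, d2))
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = o1 - o2
    denom = a * c - b * b          # zero only for parallel rays
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return (o1 + t1 * d1 + o2 + t2 * d2) / 2.0
```

For truly intersecting rays the two closest points coincide, so the midpoint equals the intersection and the rule covers both cases uniformly.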
  • the number of stages of processing accuracy is n, and thresholds th(1) to th(n) and thresholds diff(1) to diff(n) are determined. A moving point Xa is placed on image A and a moving point Xb on image B, and their initial positions and the direction in which they are moved are determined.
  • Stage i (i = 1 to n)
  • first step: the moving point Xa is placed at the point closest to the initial position, with respect to the direction of movement, among the points not yet determined in stages 1 to (i - 1). The moving point Xb is placed at the initial position on image B.
  • the frames centered at the moving point Xa and the moving point Xb are defined as Wa and Wb, respectively, and the correlation coefficient cc between the frame Wa and the frame Wb is obtained by the image correlation method.
  • if the correlation coefficient cc is greater than or equal to the threshold th(i), go to the third step; otherwise go to the fourth step.
  • fourth step: move the moving point Xb to the next point on image B and return to the second step. If there is no next point, proceed to the sixth step.
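One way to read the staged procedure: points whose correlation clears the stricter early-stage thresholds are fixed first, and the remaining points are retried in later, more permissive stages. A sketch under that assumption (the driver function, `match_fn`, and the threshold schedule are all ours, not the patent's):

```python
def staged_matching(points, match_fn, thresholds):
    """Multi-stage acceptance: in stage i, any pending point whose best
    match clears thresholds[i] is fixed; the rest carry over to the next
    stage. match_fn(p) -> (candidate_match, correlation_coefficient)."""
    accepted, pending = {}, list(points)
    for th in thresholds:              # th(1) ... th(n), typically decreasing
        still_pending = []
        for p in pending:
            cand, cc = match_fn(p)
            if cc >= th:
                accepted[p] = cand     # determined in this stage
            else:
                still_pending.append(p)
        pending = still_pending
    return accepted, pending           # pending: never cleared any threshold
```

Fixing confident matches early lets later stages exploit them (e.g. to constrain the search for their neighbors), which is the usual motivation for coarse-to-fine schedules like this.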
  • FIG. 12 is a perspective view of a consumer digital camera used in the digital aerial photography three-dimensional measurement system according to the present invention.
  • FIG. 13 is a plan view of a consumer digital camera used in the digital aerial photography 3D measurement system of the present invention.
  • FIG. 14 is a front view of a consumer digital camera used in the digital aerial photography 3D measurement system of the present invention.
  • the consumer digital camera 14 consists of a camera 14a, a control unit 14b, an outer frame 14c, an inner frame 14d, motors 14e, attachment portions 14f, and coil springs 14g.
  • the mounting and control pedestal 15 for installing the consumer digital camera 14 has a circular hole 15b in the center and is provided with support columns 15a at the four corners to support the consumer digital camera 14.
  • the camera 14a is installed with the lens facing down. When fixed to the mounting and control pedestal 15, photographs can be taken through the hole 15b.
  • the number of pixels is 12,000,000 or more, and a wide-angle lens is attached for use.
  • the control unit 14b is a box provided on the camera 14a. It has a gyro inside, detects the inclination with respect to the direction of gravity, and controls the camera 14a so that it always points in the nadir direction.
  • the control unit 14b is connected to the aerial GPS antenna 16, the computer 18, and the like via cables. It receives the GPS signal and simultaneously sends a signal to release the shutter through the shutter signal transmission cable.
  • the outer frame 14c is a substantially octagonal frame that covers the periphery of the inner frame 14d.
  • the inner frame 14d is connected to the outer frame 14c so that it moves freely about both the pitch axis and the roll axis.
  • the inner frame 14 d is a substantially octagonal frame to which the camera 14 a and the control unit 14 b are attached.
  • the inclination of the inner frame 14 d can be changed by the motor 14 e.
  • the motor 14 e is installed in the four directions of the camera 14 a and operates according to instructions from the control unit 14 b to adjust the inclination of the camera 14 a.
  • the attachment portions 14f are the portions fixed to the support columns 15a at the four corners of the outer frame 14c. Since they are connected via coil springs 14g, vibration can be absorbed and shaking of the camera 14a can be prevented.
  • the digital aerial 3D measurement system uses only a consumer digital camera and a GPS as measuring instruments, and is small, light, and low-cost while maintaining accuracy.
  • aerial photogrammetry can therefore be used even for small-scale projects that were previously difficult in terms of cost.
  • the consumer digital camera can be replaced, and even if the performance of consumer digital cameras changes, the system can easily accommodate it by simply adjusting the software.
  • since the measurement equipment is only a consumer digital camera and a GPS, the system is small, lightweight, and low-cost while maintaining accuracy, making aerial photogrammetry widely available.
  • the software corrects or removes error elements, estimates the direction of the camera's line of sight, and so on, making it possible to calculate with higher accuracy than would otherwise be possible.
  • because the consumer digital camera can be replaced, changes in camera performance can be easily accommodated by adjusting the software.

Abstract

A digital aerial photographing three-dimensional measurement system capable of calculating a three-dimensional coordinate with high accuracy from an image captured by means of a digital camera is provided. The digital aerial photographing three-dimensional measurement system is characterized in that the internal orientation element based on the specification of a digital camera, a plurality of aerial photographs captured by the digital camera, and the GPS data received with the photographing of the aerial photographs are inputted to a computer and that corrected photographs are outputted from the computer by a geometric correction means, a continuous analysis means, a tie-point extraction means, an external orientation means, a deviation correction means, stereo matching means, and an orthographic correction means.

Description

Description: Digital Aerial Photography Three-Dimensional Measurement System

Technical Field

The present invention is a system relating to photogrammetry performed from an aircraft using a digital camera as the imaging device.

Background Art
Photogrammetry is a technique that photographs an object from multiple positions and angles and calculates the three-dimensional coordinates of the object from information about the camera position and orientation at the time the photographs were taken. Among forms of photogrammetry, aerial photogrammetry calculates the latitude, longitude, and altitude of features or terrain by photographing the area directly below a flying aircraft; in particular, aerial photogrammetry that uses electronic media rather than chemical film as the photographic recording medium is called digital aerial photogrammetry.
Normally, the image recorded on the imaging surface of a camera is a central projection of the range that falls within the camera's angle of view. That is, when a straight line is extended from each point on the imaging surface through the principal point of the lens, the first thing that line strikes is what is recorded at that point.
Therefore, if the optical system of the camera that took a photograph and the position and orientation of the imaging surface at the time of shooting are known, the three-dimensional coordinates of each point appearing in the photograph lie somewhere on the straight line obtained by tracing that optical system in reverse. If there is another photograph of the same object taken from a different position and orientation, a straight line can likewise be obtained for each point; for the same object point, the lines obtained from the two photographs are always different and always intersect. That intersection is the coordinate being sought.
Conventional aerial photogrammetry has often used large, purpose-built systems: accuracy is high, but so are introduction and operating costs, making it practical only for large-budget projects. However, the market for consumer digital cameras is growing rapidly, and cameras with resolution high enough for photogrammetry now exist.
As described in Japanese Patent Application Laid-Open No. 2007-108029 (Patent Document 1), an invention has been disclosed in which a position specifying unit specifies the position based on the error between image data captured by an imaging unit and geographic data corresponding to the imaging unit's shooting range at each point, enabling positioning even under conditions where a sufficient number of GPS satellites cannot be acquired.
As described in Japanese Patent Application Laid-Open No. 2006-079521 (Patent Document 2), an invention has been disclosed that uses multiple images of an object taken with a digital camera from multiple points at different shooting positions, together with the three-dimensional coordinate information of the object, to obtain the image coordinates on each image corresponding to a given position on the object, acquire the RGB values corresponding to those image coordinates, and form an orthographic projection image of the object.
As described in Japanese Patent Application Laid-Open No. 2001-114047 (Patent Document 3), an invention has been disclosed in which, when synthesizing an image from an arbitrary viewpoint using images from cameras installed on a vehicle, a road-surface index synthesizing means adds indices distinguishing objects other than the road surface to the road-surface projection image, presenting the result so that the driver can easily identify objects other than the road surface.
As described in Japanese Patent Application Laid-Open No. 2006-195758 (Patent Document 4), an invention has been disclosed comprising image input means for acquiring two stereo images whose horizontal axes are parallel projections, image matching means for calculating the similarity between a corrected image, obtained by deforming a local region of one of the two images, and the other image, and three-dimensional point cloud generation means for generating a three-dimensional point cloud based on the calculated similarity.
As described in Japanese Re-publication of PCT Application No. 2004-038660 (Patent Document 5), an invention has been disclosed that sets multiple windows on a standard image and a reference image, associated with each other based on a predetermined correlation, and, by stretching and deforming the windows formed on the reference image, eliminates gaps and overlaps between adjacent windows on the reference image, performs stereo matching between the windows of the standard image and those of the reference image, and restores the shape of the object.
However, since conventional dedicated systems are built entirely from special-purpose hardware, the equipment tends to be large and maintenance is laborious. Moreover, even as technology improves, partial replacement of components is difficult.
Note that the invention described in Patent Document 1 (JP 2007-108029) captures the entire surroundings in infrared using a sky-image camera, so dedicated hardware is required and it cannot be realized with a consumer digital camera.
The invention described in Patent Document 2 (JP 2006-079521) forms an orthographic projection image from the three-dimensional coordinate information of the object; it does not calculate three-dimensional coordinates by photographing the object from multiple points with a digital camera.
The invention described in Patent Document 3 (JP 2001-114047) combines images taken with vehicle-mounted cameras to display the vehicle's surroundings in an easy-to-understand manner; it does not calculate high-accuracy three-dimensional coordinates by correcting the images.
The invention described in Patent Document 4 (JP 2006-195758) fixes two cameras and performs stereo matching on images taken while moving in parallel; it cannot handle cases where the camera position and orientation are indeterminate.
The invention described in Patent Document 5 (Re-publication No. 2004-038660) divides the standard and reference images of the same object into equal numbers of windows and performs stereo matching based on window-to-window correspondence; it does not calculate high-accuracy three-dimensional coordinates by finding point-by-point correspondences over the entire region where the two images overlap.
Accordingly, an object of the present invention is to provide a digital aerial photography three-dimensional measurement system capable of calculating three-dimensional coordinates with high accuracy from images taken with a digital camera.

Disclosure of the Invention
In order to solve the above problems, the present invention is realized by a digital aerial photography three-dimensional measurement system characterized by comprising: geometric correction means that inputs to a computer internal orientation elements based on the specifications of a digital camera and a plurality of aerial photographs taken with the digital camera, transforms the image coordinates of the aerial photographs for radial and tangential distortion using the internal orientation elements as coefficients, and stores the resulting geometrically corrected photographs in a storage device; continuity analysis means that reads the aerial photographs from the storage device, inputs to the computer the GPS data received while the aerial photographs were being taken, determines the degree of overlap of each two consecutive photographs, calculates the difference in shooting positions based on the altitude information of the GPS data, compares it with the latitude and longitude information of the GPS data, and stores in the storage device the camera position linked to each photograph; tie-point extraction means that reads the geometrically corrected photographs and the camera positions from the storage device, pairs adjacent photographs based on the camera positions, finds by image matching where multiple points on one photograph of a pair appear on the other, and stores the resulting tie points in the storage device; exterior orientation means that reads the internal orientation elements, the geometrically corrected photographs, the camera positions, and the tie points from the storage device, calculates, on the assumption that the tie points were photographed at the camera positions in a nadir-looking attitude, the straight lines that trace the central projection in reverse in real space, adjusts the attitudes so that the twist between the two lines of each photo pair is minimized, and stores the resulting positions and orientations in the storage device; displacement correction means that reads the internal orientation elements, the aerial photographs, and the positions and orientations from the storage device, re-projects the aerial photographs, geometrically corrected with the internal orientation elements, so that they appear as if taken in a nadir-looking attitude based on the positions and orientations, and stores the resulting displacement-corrected photographs in the storage device; stereo matching means that reads the positions and orientations and the displacement-corrected photographs from the storage device, pairs overlapping displacement-corrected photographs, finds by image matching where multiple points on one photograph of a pair appear on the other, calculates the straight lines tracing the central projection in reverse in real space based on the positions and orientations, and stores in the storage device the three-dimensional coordinates obtained from the two lines of each pair; and orthographic correction means that reads the internal orientation elements, the aerial photographs, the positions and orientations, and the three-dimensional coordinates from the storage device, calculates the correspondence that would result if the objects represented by the three-dimensional coordinates were photographed in the given positions and orientations with the geometric distortion of the internal orientation elements, links each point on the aerial photographs to the three-dimensional coordinates, converts the images from central projection to orthographic projection, and outputs the corrected photographs from the computer.

The invention is further realized by: an image matching method in which, when acquiring pixel values for image matching in the tie-point extraction or stereo matching of photogrammetry, a frame set for a point on the image, square when the point is near the center and stretched radially when the point is near the outer edge, is rotated to match the camera's direction of travel, distorted to match the camera's tilt, or scaled to match the camera's altitude, the pixel value acquisition positions are changed to match the shape of the frame, and the pixel values are calculated from the distances to the surrounding pixel centers; an attitude estimation method in which, in camera attitude estimation using the coplanar and collinear conditions of photographs with overlapping shooting ranges in the exterior orientation of photogrammetry, the estimation uses two consecutive photographs along the camera's direction of travel together with a photograph whose shooting range overlaps those photographs and which was taken at a position orthogonal to the direction of travel; a displacement correction method in which, when correcting to a nadir-looking central projection in the displacement correction of photogrammetry, a central-projection photograph tilted by the camera tilt at the time of shooting is rotated about the lens principal point so that the imaging surface becomes horizontal, each point on the imaging surface before rotation is re-projected onto the imaging surface after rotation, and the photographs are then resampled with the directions of their sides aligned; and a stereo matching method in which, when searching for the point on the other image corresponding to each point on one image in the stereo matching of photogrammetry, stages of processing accuracy and thresholds are set and the three-dimensional coordinates of each point are obtained by an inductive procedure.
Fig. 1 is a flowchart showing the flow of processing of the digital aerial three-dimensional measurement system of the present invention; Fig. 2 is a diagram showing the relationship between the processing, the data and the equipment of the system; Fig. 3 is a diagram showing the configuration of the computer of the system; Fig. 4 and Fig. 5 are diagrams showing the deformation of the sampling figure in the image correlation method of the system; Fig. 6 is a diagram showing the acquisition of pixel values in the system; Fig. 7 is a diagram showing the coplanarity condition and the collinearity condition; Fig. 8 is a diagram showing the sets of images used in the position and attitude estimation of the system; Fig. 9 is a diagram showing the deviation correction technique of the system; Fig. 10 is a diagram showing the resampling of a deviation-corrected image in the system; Fig. 11 is a diagram showing the stereo matching technique of the system; Fig. 12 is a perspective view of the consumer digital camera used in the system; Fig. 13 is a plan view of that camera; and Fig. 14 is a front view of that camera. Best Mode for Carrying Out the Invention
The digital aerial three-dimensional measurement system of the present invention will now be described in detail with reference to the accompanying drawings. Fig. 1 is a flowchart showing the flow of processing of the digital aerial three-dimensional measurement system of the present invention. Fig. 2 is a diagram showing the relationship between the processing, the data and the equipment of the system. Fig. 3 is a diagram showing the configuration of the computer of the system.
As shown in Fig. 1, the digital aerial three-dimensional measurement system 1 comprises the means of geometric correction 2, continuity analysis 3, tie point extraction 4, external orientation 5, deviation correction 6, stereo matching 7 and orthorectification 8.
In geometric correction 2, the parameters obtained by internal orientation 9 are used to generate, for each photograph, a photograph in which the distortion aberration it contains has been corrected.
In continuity analysis 3, although the shutter for each photograph is released in synchronization with the GPS signal, it must still be determined which specific photograph corresponds to which GPS record. When one reference photograph and the GPS record corresponding to it are input, the remaining images are collated against the GPS records and analyzed on that basis.
In tie point extraction 4, the data for performing external orientation 5 are generated. Specifically, for pairs of photographs whose photographing ranges partly overlap, characteristic points appearing in both are found by image matching, and their image coordinates are saved as data.
In external orientation 5, the position and attitude of the camera's imaging plane at the moment each photograph was taken are estimated with high accuracy from the position data measured by GPS and the photographs taken by the camera.
Specifically, with the assumption that each photograph was taken from the position in its GPS record facing in the direct downward direction as the initial condition, the position and attitude of each imaging plane are calculated by a solution method such as Newton's method so that the skew between the rays obtained by tracing the optical system backwards from the tie points 4a, rays which should originally lie in the same plane, becomes smallest. If features whose geographic coordinates are known in advance (ground control points) appear in the photographs, that information can also be used in this estimation, but it is not essential.
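The constraint behind this adjustment is the coplanarity condition: the baseline between two exposure stations and the two rays traced back to the same tie point must lie in one plane. A minimal sketch of the residual that a Newton-type solver drives toward zero (variable names are ours; the solver itself is not shown):

```python
def coplanarity_residual(c1, c2, r1, r2):
    """Scalar triple product (c2 - c1) . (r1 x r2).

    c1, c2 -- exposure-station (camera) positions
    r1, r2 -- direction vectors of the rays traced back through the
              optical system to the same tie point
    The value is zero exactly when the two rays and the baseline lie in
    one plane; external orientation adjusts imaging-plane position and
    attitude so these residuals, over all tie points, become smallest.
    """
    b = [c2[i] - c1[i] for i in range(3)]
    cross = [r1[1] * r2[2] - r1[2] * r2[1],
             r1[2] * r2[0] - r1[0] * r2[2],
             r1[0] * r2[1] - r1[1] * r2[0]]
    return sum(b[i] * cross[i] for i in range(3))
```

With the GPS-recorded positions as initial values, the sum of squared residuals over all tie points is minimized; only the residual, not the iteration, is sketched here.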
In deviation correction 6, to simplify the calculation of stereo matching 7, a photograph is generated that has been reprojected into the image that would have been captured if the camera had been facing in the direct downward direction. Specifically, using the result of external orientation 5 for each photograph, the image coordinates of each point on the photograph are transformed to the positions onto which they would be projected if the photograph had been taken facing directly downward, and the photographs to be stereo matched are resampled with their orientations aligned with one another.
In stereo matching 7, among the photographs for which internal orientation 9, external orientation 5 and deviation correction 6 have been completed, the three-dimensional information of the commonly imaged part is calculated for each pair of two photographs with a large overlapping range.
Specifically, the correspondence between the points on the two photographs (where a point appearing in one photograph appears in the other) is obtained by image matching; for each pair of corresponding points, the intersection of the straight lines obtained by tracing the optical system backwards is derived and its coordinates are determined. Since the photographs are raster images, this processing is carried out by finding the correspondence pixel by pixel.
Because a photograph is a central projection, when a deep (high-altitude) object appears near the outer edge of the photograph, a range hidden in its shadow (lean, or relief displacement) arises. The range hidden by this lean changes with the position and angle of photographing, so it differs from photograph to photograph, and stereo matching 7 cannot be performed for this range. Therefore, to obtain three-dimensional information without gaps, photographs from various positions or angles are required so that the leans in the individual photographs compensate for one another.
In orthorectification 8, the central projection is reprojected into an orthographic projection. Although a photograph is a central projection, the coordinates of each point on the photograph are obtained as the result of stereo matching 7; after mapping each point to these coordinates, collapsing the depth axis (the altitude axis) and leaving only the horizontal axes yields an orthographically projected photograph. Furthermore, if three-dimensional data are generated from the coordinates obtained as the result of stereo matching 7 and the orthorectified photographs 8 are used as texture, a three-dimensional photographic map is obtained.
As shown in Fig. 2, the geometric correction 2 means receives the data of the internal orientation elements 9b and the aerial photographs 10a, and outputs the data of the geometrically corrected photographs 2a. Using the values of the internal orientation elements 9b as coefficients, a radial distortion equation and a tangential distortion equation are determined, and their inverse transformation is applied to the aerial photographs 10a. Specifically, this is a process of transforming the image coordinates by the inverse transformation of the distortion and resampling into a raster image by an interpolation method such as cubic interpolation.
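The radial and tangential distortion equations are not written out in the text; the widely used Brown model is one concrete form, sketched here with assumed coefficient names k1, k2 (radial) and p1, p2 (tangential), which the patent does not name:

```python
def distort(x, y, k1, k2, p1, p2):
    """Map ideal (distortion-free) normalized image coordinates (x, y)
    to their distorted position under the Brown model: radial terms
    with coefficients k1, k2 and tangential (decentering) terms with
    p1, p2.  Geometric correction 2 applies the inverse of a mapping
    of this kind before resampling.
    """
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xd, yd
```

At the principal point (x = y = 0) the distortion vanishes, and it grows with the radial distance, which is why the correction matters most near the edges of the frame.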
The data of the internal orientation elements 9b are created by the internal orientation 9 process, which takes the data of the camera specification 9a as input. They consist of the parameters of the camera specification 9a that define the optical system of the consumer digital camera 14, together with the parameters representing the distortion of that optical system as determined by the internal orientation 9 process.
The data of the aerial photographs 10a are captured by carrying out aerial photography 10. They are electronic image files in the output format of the consumer digital camera 14. Each photograph is taken so that 60% or more of its photographing range overlaps with those of the preceding and following photographs, and at least one photograph in the whole flight path is taken so that 30% or more of its photographing range overlaps with that of a photograph taken from a non-consecutive flight path (that is, a crossing or parallel path).
The data of the geometrically corrected photographs 2a are electronic image files obtained by removing the distortion inherent in the optical system of the consumer digital camera 14 from each of the aerial photographs 10a.
The data of the camera specification 9a are a set of data having the principal attributes of the camera and lens of the consumer digital camera 14, namely the principal point position, focal length, pixel dimensions, pixel spacing and so on.
The work of internal orientation 9 measures the distortion of the optical system of the consumer digital camera 14, that is, the error between the ideal central projection determined by the camera specification 9a and the actual central projection, and determines the parameters that correct it. The central projection in photography is ideally determined by the camera's principal point position, focal length, pixel dimensions and pixel spacing; however, because the determining equations themselves use approximations and because of manufacturing tolerances, a real lens is not ideal, and it is necessary to measure the distortion aberration and other errors on the actual unit and determine the parameters for correcting the error from the ideal central projection.
Specifically, for photographs of a plurality of objects with known absolute coordinates taken from a position with known absolute coordinates, the image coordinates at which those objects should appear under ideal conditions (calculated by the above determining equations) and the image coordinates at which they actually appear are fitted to image coordinate transformation equations representing radial distortion and tangential distortion, and by solving these, the coefficients are determined for which those distortion equations reproduce the distortion of the optical system. Any photograph taken with that optical system can then be converted equally into an image taken under ideal conditions by inversely transforming it with the image coordinate transformation determined by giving the coefficients so obtained to the distortion equations.
In the work of aerial photography 10, the consumer digital camera 14 and the aviation GPS receiver 16 are mounted on an aircraft via the mounting and control pedestal 15. While the trajectory of the aircraft is recorded using the function of the aviation GPS receiver 16, a shutter signal synchronized with the records of the aviation GPS receiver 16 is sent to the consumer digital camera 14 using the function of the mounting and control pedestal 15, and photographs in the direct downward direction are taken from the aircraft by the function of the consumer digital camera 14 while the function of the pedestal keeps the camera in an attitude facing directly downward.
The consumer digital camera 14 is a commercially available single-lens reflex digital camera or the like. It has an interface that accepts a shutter signal from the outside, which is connected to the mounting and control pedestal 15. The mounting portion is generic, so the camera can be replaced when a higher-performance model becomes available.
The mounting and control pedestal 15 is a pedestal that carries the consumer digital camera 14 and the aviation GPS receiver 16 and is attached to the aircraft. It absorbs vibration originating from the aircraft and has the function of tilting the mounting portion with a motor so that the consumer digital camera 14 always faces directly downward. It also has the function of receiving signals from the aviation GPS receiver 16 and recording them on a built-in memory card while, in synchronization with that signal reception, sending a shutter signal to the consumer digital camera 14.
The aviation GPS receiver 16 is a receiver for three-dimensional positioning of the aircraft from, among other things, the travel time of the radio waves of the time signals emitted from satellites. Its electronic board is built into the mounting and control pedestal 15; only the antenna is attached at a suitable location outside or inside the aircraft where radio waves are easily received, and the antenna and the board are connected by a cable.
As shown in Fig. 2, the continuity analysis 3 means receives the data of the aerial photographs 10a and the GPS data 12a, and outputs the data of the camera positions 3a. When one reference photograph from the aerial photographs 10a and the GPS record from the GPS data 12a corresponding to it are input, that photograph and that GPS record are first linked. Next, one photograph not yet linked to a GPS record and consecutive to a photograph already linked is taken from the aerial photographs 10a. Several points are selected on one photograph, the positions where they appear on the other photograph are found by image matching using the image correlation method, and the degree of overlap of the two photographs is obtained. From this degree of overlap and the altitude information of the GPS record linked to one of the photographs, the difference between their photographing positions is converted into an actual distance, which is compared with the latitude and longitude information of the GPS data 12a, and the record of the GPS data 12a that should correspond to the geographic coordinates of the other photograph is linked to it. This is repeated recursively, and the GPS record corresponding to each of the aerial photographs 10a is taken from the GPS data 12a and linked to it to form the camera positions 3a.
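The conversion from image overlap to actual distance, and the selection of the matching GPS record, can be sketched as follows. The flat ground-sample-distance model and the planar coordinates are simplifications of the latitude/longitude comparison described above, and all parameter names are ours:

```python
def ground_distance(pixel_shift, altitude_m, focal_mm, pixel_pitch_mm):
    """Ground distance implied by an image shift of `pixel_shift` pixels.

    For a vertical photograph, one pixel covers
    altitude * pixel_pitch / focal_length metres on the ground (the
    ground sample distance), so the overlap found by image matching
    converts directly to a distance between exposure positions.
    """
    gsd = altitude_m * pixel_pitch_mm / focal_mm  # metres per pixel
    return pixel_shift * gsd


def nearest_record(records, expected_dist, ref):
    """Pick the GPS record whose planar distance from `ref` best matches
    the expected ground distance.  `records` is a list of
    (record_id, x, y) tuples in metres."""
    def dist(rec):
        return ((rec[1] - ref[0]) ** 2 + (rec[2] - ref[1]) ** 2) ** 0.5
    return min(records, key=lambda rec: abs(dist(rec) - expected_dist))
```

For example, a 1000-pixel shift seen from 1000 m with an assumed 50 mm lens and 5 µm pixels implies a 100 m baseline, which is then matched against the GPS records.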
The data of the GPS data 12a are data in which the accuracy of the aviation GPS data 10b, which recorded the flight path during aerial photography 10, has been improved by carrying out GPS analysis 12. They include data such as GPS time, geographic coordinates and moving speed.
The data of the camera positions 3a are data in which, for each of the aerial photographs 10a, the record from the GPS data 12a corresponding to the moment of photographing (GPS time, geographic coordinates, moving speed and so on) has been linked to it.
The GPS analysis 12 process carries out an RTK-GPS analysis with the aviation GPS data 10b as rover data and the fixed-point GPS data 11a as base station data, and calculates a higher-accuracy flight path for aerial photography 10.
The data of the aviation GPS data 10b are the data in which the flight path during the execution of aerial photography 10 was recorded by the aviation GPS receiver 16. They include data such as GPS time, geographic coordinates, moving speed and carrier waves.
The data of the fixed-point GPS data 11a are the data observed by installing the fixed-point GPS receiver 17 at a point whose geographic coordinates are known in advance, during a period equal to or including the execution period of aerial photography 10, together with the previously known geographic coordinates of that point. They include data such as GPS time and carrier waves.
In the work of fixed-point positioning 11, the fixed-point GPS receiver 17 is installed at a point that is close to the flight path of aerial photography 10 (preferably, the entire flight path falls within a radius of 10 km) and whose geographic coordinates are known in advance, and positioning is performed during a period equal to or including the execution period of aerial photography 10. If an electronic reference point managed by the Geospatial Information Authority of Japan exists near the flight path of aerial photography 10, observation by that electronic reference point satisfies the requirements of fixed-point positioning 11, and the data provided from it can be used as the fixed-point GPS data 11a.
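Whether a candidate base station satisfies the 10 km guideline can be checked with a great-circle distance; a minimal sketch using the haversine formula (spherical Earth with mean radius 6371 km, which is accurate enough for a 10 km test):

```python
from math import radians, sin, cos, asin, sqrt

def within_baseline(lat1, lon1, lat2, lon2, limit_km=10.0):
    """True if two points are within `limit_km` of each other by the
    haversine great-circle distance.  Used here to check that a point
    on the flight path stays inside the preferred 10 km radius around
    the fixed-point GPS station."""
    R = 6371.0  # mean Earth radius in km
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a)) <= limit_km
```

Applying the check to every recorded flight-path point against the station coordinates verifies the "entire flight path within 10 km" preference.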
The fixed-point GPS receiver 17 is a receiver that is installed at a location whose geographic coordinates are known in advance and records the carrier waves at those geographic coordinates from, among other things, the travel time of the radio waves of the time signals emitted from satellites.
As shown in Fig. 2, the tie point extraction 4 means receives the data of the geometrically corrected photographs 2a and the camera positions 3a, and outputs the data of the tie points 4a. The internal orientation elements 9b can also be input. From the photograph-to-position relationship of the camera positions 3a, the mutually overlapping photographs among the geometrically corrected photographs 2a are paired. Whether two photographs overlap is decided by whether the distance between their corresponding camera positions 3a exceeds an externally given threshold. When the internal orientation elements 9b are input, this threshold is calculated by estimating the size of the photographing range from the angle of view of the internal orientation elements 9b and the altitude of the camera positions 3a. Next, for each pair, several points are selected on one photograph, the positions where they appear on the other photograph are found by image matching using the image correlation method, and the correspondences between the image coordinates are recorded.
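The threshold derived from the angle of view and the altitude can be sketched as the ground footprint of a vertical photograph; the square-footprint simplification and parameter names are ours:

```python
from math import tan, radians

def overlap_threshold(fov_deg, altitude_m):
    """Ground footprint width of a vertical photograph, 2*h*tan(fov/2).
    Exposures whose camera positions are farther apart than this
    cannot overlap, so the value can serve as the pairing threshold
    estimated from the internal orientation elements and altitude."""
    return 2 * altitude_m * tan(radians(fov_deg) / 2)


def is_pair(pos_a, pos_b, fov_deg, altitude_m):
    """True when two camera positions are close enough for their
    photographing ranges to overlap."""
    d = ((pos_a[0] - pos_b[0]) ** 2 + (pos_a[1] - pos_b[1]) ** 2) ** 0.5
    return d < overlap_threshold(fov_deg, altitude_m)
```

With a 90-degree angle of view at 1000 m, for example, the footprint is about 2000 m across, so exposures spaced 1500 m apart are paired and exposures 2500 m apart are not.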
The data of the tie points 4a are data recording, for each pair of adjacent photographs among the geometrically corrected photographs 2a, the image coordinates in each photograph of the points appearing in both.
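The image-correlation matching referred to in tie point extraction 4 (and again in stereo matching 7) is commonly scored with normalized cross-correlation; a minimal sketch, assuming the two candidate windows are given as flattened lists of grey values (the text does not prescribe the score function):

```python
from math import sqrt

def ncc(a, b):
    """Normalized cross-correlation of two equal-size pixel windows.

    Scores near +1 indicate a likely correspondence, near -1 an
    inverted pattern; candidate positions on the second photograph
    are ranked by this score.
    """
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    da = [v - mean_a for v in a]
    db = [v - mean_b for v in b]
    num = sum(x * y for x, y in zip(da, db))
    den = sqrt(sum(x * x for x in da)) * sqrt(sum(y * y for y in db))
    return num / den if den else 0.0
```

Because the score is invariant to brightness offset and gain, it tolerates exposure differences between overlapping photographs.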
As shown in Fig. 2, the external orientation 5 means receives the data of the geometrically corrected photographs 2a, the camera positions 3a, the tie points 4a and the internal orientation elements 9b, and outputs the data of the positions and attitudes 5a. The data of the GCP data 13b can also be input. When, for the image coordinates of the tie points 4a, the central projection is traced backwards using the internal orientation elements 9b and the rays incident on the imaging plane are calculated, the ray to a point on one photograph of each pair and the ray to the corresponding point on the other photograph should lie in the same plane. Therefore, using a solution method such as Newton's method, the position and attitude of each imaging plane at which the skew between the corresponding rays is minimized are calculated and taken as the information on the position and attitude of the imaging plane at the time each photograph was taken.
When the GCP data 13b are used, the incident rays are likewise calculated for the points on the photographs corresponding to the points of the GCP data 13b, and using the above solution method, the information on the position and attitude of each imaging plane is calculated so that the distance between the geographic coordinates linked to each such point and its ray is simultaneously minimized. In that case, since the image coordinates of the GCP data 13b must be deformed to match the geometrically corrected photographs 2a, preprocessing similar to geometric correction 2 is applied to the GCP data 13b using the internal orientation elements 9b.
The data of the positions and attitudes 5a are the data of the position and attitude of the camera's imaging plane at the moment each of the geometrically corrected photographs 2a was taken.
The data of the GCP data 13b are data linking, for those of the aerial photographs 10a in which the points of the GCP information 13a appear, the image coordinates of those points to their geographic coordinates.
The GCP designation 13 process finds, from the appearance of the ground control point in the GCP information 13a, the point corresponding to that location on the image of the aerial photographs 10a, and records the geographic coordinates (latitude, longitude, altitude) of the ground control point linked to the image coordinates of that image.
The data of the GCP information 13a consist of, for points in the area covered by aerial photography 10 that are recorded in the aerial photographs 10a and are easy to identify on them, the appearance of each point (such as a landscape photograph serving as a clue for image interpretation) and the geographic coordinates of that point.
As shown in Fig. 2, the deviation correction 6 means receives the data of the internal orientation elements 9b, the aerial photographs 10a and the positions and attitudes 5a, and outputs the data of the deviation-corrected photographs 6a. From the position and attitude of the positions and attitudes 5a, the photographs of the geometrically corrected photographs 2a are reprojected into the photographing state that would have held if the camera had been facing directly downward.
Since the geometrically corrected photographs 2a are images that have already been resampled once, processing them again could cause a loss of image definition; therefore the calculation is redone from the point where geometric correction 2 is applied to the aerial photographs 10a using the internal orientation elements 9b. That is, the processing here transforms the image coordinates by the inverse transformation of the distortion, cancels the tilt of the position and attitude to transform the image coordinates into the direct downward direction, and resamples into a raster image by an interpolation method such as cubic interpolation.
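The per-point mapping that cancels the camera tilt can be sketched for a single tilt component; a full implementation would use the complete rotation matrix from the estimated attitude, and the sign convention here is ours:

```python
from math import sin, cos

def rectify_point(x, y, f, omega):
    """Reproject image point (x, y) of a photograph whose optical axis
    is tilted by `omega` (rotation about the x axis) onto a level image
    plane with the same principal distance f.

    The ray through the point is rotated about the lens principal point
    into the levelled frame, then intersected with the level plane at
    distance f, which is the reprojection the deviation correction
    performs for every point before resampling.
    """
    yr = y * cos(omega) - f * sin(omega)   # rotated ray components
    zr = y * sin(omega) + f * cos(omega)
    return (f * x / zr, f * yr / zr)
```

With zero tilt the mapping is the identity, and the point that imaged the nadir on the tilted photograph (y = f * tan(omega)) moves to the image center, as expected.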
The data of the deviation-corrected photographs 6a are electronic image files in which each of the geometrically corrected photographs 2a has been reprojected into the photograph that would have been taken in the direct downward direction. In practice, to avoid loss of image definition, they are generated not by processing the geometrically corrected photographs 2a directly but by reprocessing from the point where the distortion inherent in the optical system is removed from the aerial photographs 10a.
As shown in Fig. 2, the stereo matching 7 means receives the data of the internal orientation elements 9b, the deviation-corrected photographs 6a and the positions and attitudes 5a, and outputs the data of the three-dimensional coordinates 7a. Overlapping photographs among the deviation-corrected photographs 6a are selected and paired on the basis of the position information of the positions and attitudes 5a. For the photographing range shared by the photographs of a pair, the position where each point on one photograph appears on the other is found by image matching using the image correlation method. Then, for each point on each photograph, the straight line that traces the central projection backwards from real space is calculated on the basis of the position and attitude of the positions and attitudes 5a. Since the data of the positions and attitudes 5a have been obtained through the external orientation 5 process, the distance between the straight line extending from a point on one photograph and the straight line extending from the same point on the other photograph is zero or very short. The midpoint of that distance is therefore taken as the geographic coordinates of the point. The correspondences between image coordinates and geographic coordinates obtained in this way are recorded.
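The midpoint construction described above, the 3-D point halfway along the shortest segment between the two back-traced rays, can be sketched with the standard closest-approach formula for two lines:

```python
def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between rays p1 + s*d1 and
    p2 + t*d2 (3-vectors).

    When the orientation is exact the rays intersect (distance zero);
    otherwise the midpoint of the closest approach is taken as the
    geographic coordinate, as the text describes.  Parallel rays
    (denom == 0) are not handled in this sketch.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    w0 = [p1[i] - p2[i] for i in range(3)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    q1 = [p1[i] + s * d1[i] for i in range(3)]
    q2 = [p2[i] + t * d2[i] for i in range(3)]
    return [(q1[i] + q2[i]) / 2 for i in range(3)]
```

For two exposures at 100 m whose rays both pass through a ground point, the function returns that ground point; in practice the short residual distance between the rays also serves as a quality measure for the match.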
The data of the three-dimensional coordinates 7a are data linking, for each of the deviation-corrected photographs 6a, the image coordinates of the points appearing in common with other photographs to their geographic coordinates.
As shown in Fig. 2, the orthorectification 8 means receives the data of the internal orientation elements 9b, the aerial photographs 10a and the three-dimensional coordinates 7a, and outputs the corrected photographs 8a. The correspondence between image coordinates and geographic coordinates is calculated by determining from which point each point on the imaging plane is projected when the terrain represented by the geographic coordinates of the three-dimensional coordinates 7a is photographed with the positions and attitudes 5a in a state having the geometric distortion of the internal orientation elements 9b. Through this correspondence, geographic coordinates are linked to each point on the images of the aerial photographs 10a, and on the basis of those geographic coordinates each image coordinate is converted from that of the central projection to that of the orthographic projection. The converted image coordinates are resampled into a raster image by an interpolation method such as cubic interpolation, and the result is recorded.
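The resampling steps throughout the pipeline name cubic interpolation; the same idea in its simplest form is bilinear interpolation, sketched here because it shows the distance-based weighting compactly (the text itself specifies "cubic interpolation or the like", so this is a simplified stand-in):

```python
def bilinear_sample(image, x, y):
    """Sample `image` (a list of pixel rows) at a non-integer
    position (x, y).

    The value is a weighted mean of the four surrounding pixels, each
    weighted by its axis-wise distance to the sampling point; cubic
    interpolation extends the same idea to a 4x4 neighbourhood.
    Positions on the last row or column are not handled in this sketch.
    """
    x0, y0 = int(x), int(y)      # top-left neighbour
    fx, fy = x - x0, y - y0      # fractional offsets in [0, 1)
    top = image[y0][x0] * (1 - fx) + image[y0][x0 + 1] * fx
    bottom = image[y0 + 1][x0] * (1 - fx) + image[y0 + 1][x0 + 1] * fx
    return top * (1 - fy) + bottom * fy
```

Resampling a transformed photograph amounts to evaluating such a sampler at the inverse-mapped position of every output raster cell.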
The corrected photographs 8a are electronic image files in which, for each of the aerial photographs 10a, the range shared with other photographs has been reprojected into an orthogonal projection. Each means of the digital aerial photographing three-dimensional measurement system 1 is realized by a computer 18. As shown in Fig. 3, the computer 18 comprises an input device 18a, a central processing unit 18b, a storage device 18c, an output device 18d, and so on.
The input device 18a takes input data 18e from outside into the computer 18. The input data 18e is saved to the storage device 18c according to instructions from the central processing unit 18b.
The input data 18e comprises the interior orientation parameters 9b, the aerial photographs 10a, the airborne GPS data 12a, the GCP data 13b, and the like. It may be supplied in a state recorded on a recording medium or the like.
The central processing unit 18b interprets instructions and executes and controls processing. As needed, it retrieves data from the storage device 18c and saves data to it.
The storage device 18c records programs, the input data 18e, internal data 18f, output data 18g, and so on, and may be constructed as a database.
The internal data 18f is data created inside the computer 18 in the course of processing, such as the geometrically corrected photographs 2a, the camera positions 3a, the tie points 4a, the position and attitude 5a, the rectified photographs 6a, and the three-dimensional coordinates 7a.
The output device 18d takes the output data 18g out of the computer 18. The output data 18g is read from the storage device 18c according to instructions from the central processing unit 18b.
The output data 18g is data such as the corrected photographs 8a. It is output by displaying it on a screen, printing it on paper, recording it on a recording medium, or the like. Figs. 4 and 5 illustrate the deformation of the frame used in the image correlation method of the digital aerial photographing three-dimensional measurement system of the present invention. Fig. 6 illustrates the acquisition of pixel values in the same system.
Image matching is the technique of detecting, among two or more digital images, where what appears on one image appears on the others. In photogrammetry in particular, for each point of each photograph recorded as a raster image, the corresponding point on another photograph is detected using the image correlation method. In the present invention this technique is used by the tie point extraction means 4 and the stereo matching means 7.
Let the two digital images be image A and image B. Ordinarily, a point A to be matched is first chosen arbitrarily on the source image A, a square frame A with sides of a predetermined length is placed around it, and the pixel values of the points contained in frame A are acquired.
Next, a frame B of the same size is placed on the other image B, the pixel values of the points contained in it are acquired, and the correlation coefficient with the previously acquired pixel values of frame A is computed. With frame A fixed on image A, frame B is moved little by little over image B and the correlation coefficient is computed at each position; the centre of frame B at the position where the correlation coefficient is highest is regarded as the point coinciding with the centre of frame A on image A.
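As an illustration, the sliding-window search just described can be sketched as follows. This is a minimal sketch, not the system's actual implementation: the function and variable names are hypothetical, images are assumed to be row-major lists of grey values, and the frame is an undeformed square of radius r.

```python
import math

def ncc(patch_a, patch_b):
    """Normalized cross-correlation of two equal-size pixel-value lists."""
    n = len(patch_a)
    ma = sum(patch_a) / n
    mb = sum(patch_b) / n
    num = sum((a - ma) * (b - mb) for a, b in zip(patch_a, patch_b))
    da = math.sqrt(sum((a - ma) ** 2 for a in patch_a))
    db = math.sqrt(sum((b - mb) ** 2 for b in patch_b))
    return num / (da * db) if da and db else 0.0

def window(img, cx, cy, r):
    """Pixel values of the (2r+1) x (2r+1) square frame centred on (cx, cy)."""
    return [img[y][x] for y in range(cy - r, cy + r + 1)
                      for x in range(cx - r, cx + r + 1)]

def best_match(img_a, ax, ay, img_b, r):
    """Fix frame A at (ax, ay); slide frame B over image B and return the
    centre with the highest correlation coefficient, as in the text."""
    ref = window(img_a, ax, ay, r)
    best = (-2.0, None)
    for by in range(r, len(img_b) - r):
        for bx in range(r, len(img_b[0]) - r):
            c = ncc(ref, window(img_b, bx, by, r))
            if c > best[0]:
                best = (c, (bx, by))
    return best
```

In practice the search window would be restricted using the approximate camera geometry rather than scanning the whole of image B.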
As shown in Fig. 4, the present invention deforms the shape of the frame 20a placed on the image 19. Even for one and the same object, a point 21 recorded near the outer edge of the image 19 leans outward compared with a point 20 near the centre, so the surrounding features are recorded on the photograph stretched in the radial direction. Accordingly, a square frame 20a is used when the position is near the centre of the image 19, and a quadrilateral frame 21a stretched in the radial direction is used near the outer edge.
In the tie point extraction 4, the rectification 6 has not yet been applied and the orientations of the individual images are not aligned, so, as shown in the upper part of Fig. 5, the frame 20a is rotated so that its orientation agrees between the photographs from which tie points are extracted.
Next, as shown in the middle part of Fig. 5, the frame 20a is skewed according to the camera's tilt from the nadir direction. The tilt information used here can be supplied externally; if it is not supplied, the tilt is treated as zero. Further, as shown in the lower part of Fig. 5, the frame 20a is enlarged or reduced according to the camera altitude, which can be obtained from the camera position 3a.
Also, as shown in Fig. 6, as the frame 20a is deformed, the positions at which the pixel values 20b used to compute the correlation coefficient are acquired are changed accordingly. In the undeformed square frame 20a, the pixel value 20b is considered to have been taken at the centre of each pixel, so each acquisition position is moved in accordance with the deformation of the frame 20a.
When acquiring a pixel value 20b at a changed acquisition position, if the position coincides with some pixel centre, the pixel value 20b of that pixel is used as it is.
If the acquisition position lies on the straight line joining the centres of two adjacent pixels, a value interpolated from the pixel values 20b of those two pixels according to the distances between their centres and the acquisition position is used.
Otherwise, a value interpolated from the pixel values 20b of the four nearest pixels surrounding the acquisition position, according to the distances between their centres and the acquisition position, is used. The interpolation formula follows an interpolation method such as cubic interpolation.
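The three sampling cases above can be sketched as follows. This is a simplified illustration, not the patented implementation: bilinear weights stand in for the cubic interpolation named in the text, coordinates are assumed non-negative and inside the image, and the function name is hypothetical.

```python
def pixel_value(img, x, y):
    """Sample an image at a (possibly non-integer) acquisition position.

    Implements the three cases in the text, using bilinear weights as a
    simplified stand-in for cubic interpolation.
    """
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    if fx == 0 and fy == 0:          # case 1: exact pixel centre
        return float(img[y0][x0])
    if fy == 0:                      # case 2: between two horizontal centres
        return (1 - fx) * img[y0][x0] + fx * img[y0][x0 + 1]
    if fx == 0:                      # case 2: between two vertical centres
        return (1 - fy) * img[y0][x0] + fy * img[y0 + 1][x0]
    # case 3: four nearest centres, distance-weighted
    top = (1 - fx) * img[y0][x0] + fx * img[y0][x0 + 1]
    bot = (1 - fx) * img[y0 + 1][x0] + fx * img[y0 + 1][x0 + 1]
    return (1 - fy) * top + fy * bot
```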
To determine the geographic coordinates of each point on a photograph, the position and attitude of the camera at the moment each photograph was taken are required. Since the position and attitude of the imaging device at each exposure can be obtained with a certain accuracy from measured data, these are used as initial values for estimating the position and attitude at the time each photograph was taken. In the present invention this technique is used by the exterior orientation means 5.
The upper part of Fig. 7 illustrates the coplanarity condition: for two photographs observing the same point, the corresponding point on each imaging surface and each lens principal point all lie on one plane.
In the conventional method, for two photographs whose imaging ranges overlap each other, the point 23a on the imaging surface 23 corresponding to the point 22a on the imaging surface 22 is found by the image correlation method. The geographic coordinates of the points 22a and 23a can be expressed as functions of the camera position and attitude, and by the coplanarity condition the point 22a, the lens principal point 22b, the point 23a, and the lens principal point 23b lie on one plane. If many pairs of corresponding points between the two photographs are found, the error contained in the position and attitude can be estimated, and removed, by solving those equations and conditions simultaneously.
The lower part of Fig. 7 illustrates the collinearity condition: a point in the imaging range, the corresponding point on the imaging surface, and the lens principal point lie on one straight line.
Estimation based on the coplanarity condition can remove the scatter of the position and attitude information between images, but it cannot remove an error that the whole set contains as a constant offset. In particular, a large error may remain from the rotational degree of freedom about the straight line joining the lens principal points of the two photographs to which the coplanarity condition is applied. The conventional method therefore takes a point of known geographic coordinates appearing on one of the photographs as a ground control point and, applying the collinearity condition to the ground control point 24c within the imaging range 24b, the recorded point 24a on the imaging surface 24, and the lens principal point 24d, estimates and removes the error contained in the position and attitude information of that photograph.
The present invention achieves an estimation accuracy comparable to the conventional method from measured position data alone by using three or more photographs. That is, it uses neither measured attitude data nor ground control points, but only position data derived by RTK-GPS.
Fig. 8 shows a set of images used for position and attitude estimation in the digital aerial photographing three-dimensional measurement system of the present invention. In the aerial photographing 10, an image 25b is taken at a position orthogonal to the direction of travel so that its imaging range overlaps those of two consecutive images 25 and 25a taken along the direction of travel.
As shown in the upper part of Fig. 8, the images 25 and 25a are first obtained while flying along a straight line, after which the image 25b is obtained on a flight path 25c running roughly parallel alongside it. Alternatively, as shown in the lower part of Fig. 8, the images 25 and 25a are first obtained while flying along a straight line, after which the image 25b is obtained on a flight path 25d running orthogonal to that line.
For each of the three pairs formed from the images 25, 25a, and 25b obtained in this way, the tie point extraction 4 yields tie points 4a. Tracing the central projection back through the interior orientation parameters 9b from the image coordinates of each tie point gives the ray incident on the imaging surface at each exposure. Within each pair, the ray to a point on one photograph and the ray to the corresponding point on the other photograph are the straight lines joining the points on the respective imaging surfaces, which observed the same ground point, to the respective lens principal points, so by the coplanarity condition they must lie in one plane. Therefore, taking as the initial condition that each image was taken at the position given by the camera position 3a with the camera pointing straight down, and computing by a solution method such as Newton's method the position and attitude of each imaging surface that minimizes the skew between corresponding rays across the three pairs, the result equals the position and attitude of the imaging surface at each exposure. Because the lens principal points of the three photographs are not collinear, the rotational degrees of freedom about the lines joining the lens principal points cancel one another among the three coplanarity conditions, and the error arising from this degree of freedom becomes negligible. The camera position and attitude can therefore be estimated with very high accuracy without using the collinearity condition. If, in addition to these three images, there is an image overlapping any of them, applying the coplanarity condition to that image as well allows its position and attitude to be estimated too, and at the same time raises the accuracy of the estimates for the original three. In the same way, the set of images subject to estimation can be extended inductively: whenever a group of three or more images satisfies the conditions for estimation by this method, any image not in the group that overlaps some image in the group can be added to the group and estimated by the same method. Furthermore, if a point of the GCP data 13b appears in any of these images, the central projection is traced back through the interior orientation parameters 9b to compute the ray incident from that point, and the above solution method computes the position and attitude of each imaging surface so that the skew between the rays is minimized and, at the same time, the distance between that ray and the geographic coordinates bound to the point also becomes minimal. In that case, since the image coordinates of the GCP data 13b must be transformed to match the geometrically corrected photographs 2a, the same preprocessing as the geometric correction 2 is applied to the GCP data 13b using the interior orientation parameters 9b.
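The coplanarity condition used throughout this estimation can be expressed as a single scalar residual: the triple product of the baseline between the two lens principal points and the two back-traced rays. A minimal sketch with hypothetical names; the actual system would minimize such residuals over all tie points with a solver such as Newton's method.

```python
def coplanarity_residual(o1, d1, o2, d2):
    """Scalar triple product (o2 - o1) . (d1 x d2).

    o1, o2: lens principal points; d1, d2: direction vectors of the rays
    traced back from corresponding image points. A value of zero means
    the two rays and the baseline lie in one plane (the coplanarity
    condition); the estimation adjusts positions and attitudes so that
    the residuals over all tie points become minimal.
    """
    b = [o2[i] - o1[i] for i in range(3)]
    cx = [d1[1] * d2[2] - d1[2] * d2[1],
          d1[2] * d2[0] - d1[0] * d2[2],
          d1[0] * d2[1] - d1[1] * d2[0]]
    return sum(b[i] * cx[i] for i in range(3))
```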
In the present invention, the rectification 6 is applied to each photograph prior to the stereo matching 7. That is, a central-projection photograph tilted by the camera's tilt at the time of exposure is reprojected into a central projection in the nadir direction.
Fig. 9 illustrates the rectification technique of the digital aerial photographing three-dimensional measurement system of the present invention. At low latitudes, parallels and meridians locally form square cells, but on a photograph 26 taken with the camera tilted all the cells appear distorted, so the tilt information of the camera is used to correct them back into square cells and generate the rectified photograph 27.
First, the imaging surface 26a, which is tilted with respect to the imaging range 26b, is rotated about the lens principal point 26c until it is horizontal, leaving the rays from the imaging range 26b unchanged. Next, each point of the image before rotation is moved to the position where its corresponding ray intersects the rotated imaging surface 27a.
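The reprojection just described can be illustrated for the simplified case of a tilt about a single axis. This is a sketch under stated assumptions (one tilt angle, image coordinates relative to the principal point, hypothetical function name), not the system's implementation.

```python
import math

def rectify_point(x, y, f, omega):
    """Reproject an image point onto the leveled imaging surface.

    f is the focal length and omega a single tilt angle about the x
    axis. The point (x, y, -f) on the tilted surface is rotated into the
    leveled frame, and the ray through the lens principal point (the
    origin) is then intersected with the leveled surface z = -f.
    """
    # rotate the tilted-surface point into the leveled frame
    yr = y * math.cos(omega) + f * math.sin(omega)
    zr = y * math.sin(omega) - f * math.cos(omega)
    # scale the ray (x, yr, zr) so that it meets the plane z = -f
    s = -f / zr
    return x * s, yr * s
```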
Fig. 10 illustrates the resampling of the rectified image in the digital aerial photographing three-dimensional measurement system of the present invention. To turn the points moved by the leveling 29 of the photograph 28 into a raster image, resampling is performed and the result is recorded as the rectified image. At this time, to simplify the subsequent computation of the stereo matching 7, the resampling aligns the directions of the image edges between the images to be stereo matched. In particular, for consecutive photographs, each image is rotated 29a so that its edges after resampling align with the aircraft's direction of travel 28a at the time of its exposure.
For two or more digital images, image matching gives the image coordinates at which an object appears on one image and those at which the same object appears on another; adding the position and attitude of each camera then yields the three-dimensional coordinates of the object itself. In the present invention this technique is used by the stereo matching means 7.
For two images sharing a wide common range, the position and attitude of the camera at the time each was taken are obtained by the position and attitude estimation technique. Then, for each point on one image, the corresponding point on the other image is found by the image correlation method.
Fig. 11 illustrates the stereo matching technique of the digital aerial photographing three-dimensional measurement system of the present invention. Tracing the optical system back from the point 30a on one imaging surface 30, the equation of the straight line 30d passing through the lens principal point 30b and extending to the imaging range 30c is uniquely determined. Likewise, from the corresponding point 31a on the other imaging surface 31, the equation of the straight line 31d passing through the lens principal point 31b and extending to the imaging range 31c is uniquely determined.
Since the points 30a and 31a are projections of the same object point through different central-projection systems, the lines 30d and 31d should intersect, and solving their equations simultaneously gives the coordinates of the intersection, that is, the three-dimensional coordinates 32. If the lines 30d and 31d do not intersect, the midpoint of the shortest segment between the two lines is taken as the three-dimensional coordinates 32.
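The intersection-or-midpoint rule can be sketched directly with the standard closest-points formula for two lines. A minimal illustration with hypothetical names; the real system derives the rays from the interior orientation parameters and the position-and-attitude data.

```python
def triangulate_midpoint(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two rays p + t*d.

    If the rays intersect, this is the intersection itself; otherwise it
    is the point the text adopts as the three-dimensional coordinate.
    Assumes the rays are not parallel.
    """
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    w0 = [p1[i] - p2[i] for i in range(3)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b            # zero only for parallel rays
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = [p1[i] + t * d1[i] for i in range(3)]   # closest point on ray 1
    q2 = [p2[i] + s * d2[i] for i in range(3)]   # closest point on ray 2
    return [(q1[i] + q2[i]) / 2 for i in range(3)]
```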
In the present invention, for two mutually overlapping images A and B to which the rectification 6 has been applied, an inductive procedure finds the point 31a on image B corresponding to each point 30a on image A and computes the three-dimensional coordinates 32 of each point.
Procedure 0
Let n be the number of accuracy stages, and set thresholds th(1) to th(n) and diff(1) to diff(n). Place a moving point Xa on image A and a moving point Xb on image B, and fix their initial positions and the direction in which they are to be moved.
Procedure i (i = 1 to n)
(Step 1)
Among the points of image A whose correspondence has not yet been fixed in procedures 0 to (i-1), place the moving point Xa at the point closest to the initial position with respect to the direction of movement, and place the moving point Xb at the initial position on image B.
(Step 2)
Let Wa and Wb be the frames centred on the moving points Xa and Xb, and compute the correlation coefficient cc of Wa and Wb by the image correlation method. If cc is at least the threshold th(i), go to Step 3; otherwise go to Step 4.
(Step 3)
Take a small region Sa containing the point Xa and, for each point in Sa, compute the correlation coefficient between the frame Ws centred on that point and the frame Wb; let Xa' be the point with the highest correlation coefficient. If the distance in image coordinates between Xa and Xa' is at most the threshold diff(i), go to Step 5; otherwise go to Step 4.
(Step 4)
Move the moving point Xb to the next point of image B and return to Step 2. If there is no next point, go to Step 6.
(Step 5)
Take Xb as the point corresponding to the moving point Xa.
(Step 6)
Move the moving point Xa to the next point of image A and return to Step 2. If there is no next point, proceed to procedure (i+1).
Procedure (n+1)
Compute the three-dimensional coordinates of the points whose correspondence was fixed in procedures 0 to n.
Procedure (n+2)
For points whose correspondence was not fixed in procedures 0 to n, store either a value indicating that fact or a value interpolated from the three-dimensional coordinates obtained in procedure (n+1) by an interpolation method such as cubic interpolation.
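The staged procedures above can be condensed into the following skeleton. It is an illustrative sketch, not the patented implementation: corr and refine stand in for the image-correlation steps, Manhattan distance stands in for the image-coordinate distance, and all names are hypothetical.

```python
def coarse_to_fine_match(points_a, candidates_b, corr, th, diff, refine):
    """Skeleton of the staged matching (procedures 1 to n).

    corr(pa, pb)   -> correlation of the frames around pa and pb
    refine(pa, pb) -> best point pa' in a small region around pa,
                      matched against the frame at pb
    th[i], diff[i] -> per-stage thresholds on correlation and drift
    Returns {pa: pb} for the points whose correspondence was fixed.
    """
    matched = {}
    for i in range(len(th)):                  # stage i = 1 .. n
        for pa in points_a:
            if pa in matched:                 # fixed at an earlier stage
                continue
            for pb in candidates_b:           # move Xb over image B
                if corr(pa, pb) < th[i]:      # Step 2
                    continue
                pa2 = refine(pa, pb)          # Step 3: search region Sa
                drift = abs(pa2[0] - pa[0]) + abs(pa2[1] - pa[1])
                if drift <= diff[i]:
                    matched[pa] = pb          # Step 5
                    break
    return matched
```

Points left unmatched after all stages would then be flagged or interpolated as in procedure (n+2).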
Depending on the stage of procedures 1 to n at which the correspondence of each point was fixed, the stereo matching accuracy of that point can be classified, and displaying the classes in different colours produces a survey accuracy distribution map.
Fig. 12 is a perspective view, Fig. 13 a plan view, and Fig. 14 a front view of the consumer digital camera used in the digital aerial photographing three-dimensional measurement system of the present invention. The consumer digital camera 14 comprises a camera 14a, a control unit 14b, an outer frame 14c, an inner frame 14d, motors 14e, mounting portions 14f, coil springs 14g, and so on.
The mounting and control pedestal 15 on which the consumer digital camera 14 is installed has a circular hole 15b in its centre and supports 15a at its four corners for holding the consumer digital camera 14.
The camera 14a is installed with its lens facing downward; when fixed to the mounting and control pedestal 15, it can photograph through the hole 15b. It has 12 million pixels or more and is used with a wide-angle lens attached.
The control unit 14b is a box provided on top of the camera 14a. It contains a gyro, detects tilt with respect to the direction of gravity, and can control the camera 14a so that it always points straight down.
The control unit 14b is connected via cables to the airborne GPS antenna 16, the computer 18, and so on. A shutter-signal cable links it to the camera, so that on receiving a GPS signal it simultaneously sends the signal to release the shutter.
The outer frame 14c is a roughly octagonal frame surrounding the inner frame 14d. The inner frame 14d is connected to the outer frame 14c so as to move freely about both the pitch axis and the roll axis.
The inner frame 14d is a roughly octagonal frame to which the camera 14a and the control unit 14b are attached; its tilt can be changed by the motors 14e.
The motors 14e are installed on the four sides of the camera 14a and operate on instructions from the control unit 14b to adjust the tilt of the camera 14a.
The mounting portions 14f fix the four corners of the outer frame 14c to the supports 15a. Since the corners are connected through the coil springs 14g, vibration is absorbed and the camera 14a is prevented from shaking.
As described above, in the digital aerial photographing three-dimensional measurement system of the present invention the measuring equipment consists of nothing but a consumer digital camera and GPS, so the system is small, light, and low in cost while maintaining accuracy, making aerial photogrammetry available even to small-scale projects for which it was previously too costly. In addition, because correction or removal of error factors and estimation of the camera's viewing direction are performed in software, version upgrades for improved accuracy are easy, and computation can be carried out with higher precision than with a hardware implementation.
Furthermore, the consumer digital camera is replaceable, and even if its performance changes the system can easily accommodate this merely by adjusting the software, so it can be operated flexibly.

Industrial applicability
Because the present invention is configured as described above, the following effects are obtained. First, the only measuring instruments are a consumer digital camera and a GPS receiver, so the system is compact, lightweight, and low in cost while maintaining accuracy, making aerial photogrammetry usable even for small-scale projects that were previously cost-prohibitive.
Second, because the correction or removal of error factors and the estimation of the camera's viewing direction are performed in software, upgrades that improve accuracy are easy, and calculations can be carried out with higher precision than with a hardware implementation.
Third, the consumer digital camera is replaceable, and even if the performance of the camera changes, the system can be adapted simply by adjusting the software, so it can be operated flexibly.

Claims

1. A digital aerial photographing three-dimensional measurement system comprising:
geometric correction means for inputting into a computer interior orientation parameters based on the specifications of a digital camera together with a plurality of aerial photographs taken with the digital camera, transforming the image coordinates of the aerial photographs for radial and tangential distortion using the interior orientation parameters as coefficients, and storing the resulting geometrically corrected photographs in a storage device;
continuity analysis means for reading the aerial photographs from the storage device, inputting into the computer the GPS data received while the aerial photographs were taken, determining the overlap of each pair of consecutive photographs, calculating the difference in shooting position from the altitude information of the GPS data, comparing it with the latitude and longitude information of the GPS data, and storing in the storage device the camera position associated with each photograph;
tie point extraction means for reading the geometrically corrected photographs and the camera positions from the storage device, pairing adjacent geometrically corrected photographs on the basis of the camera positions, finding by image matching where a plurality of points on one photograph of a pair appear on the other, and storing the resulting tie points in the storage device;
exterior orientation means for reading the interior orientation parameters, the geometrically corrected photographs, the camera positions, and the tie points from the storage device, calculating, on the assumption that each tie point was photographed from the camera position in a nadir-pointing attitude, the straight line that traces the central projection back into real space, adjusting the attitude so that the skew between the two straight lines of each photograph pair is minimized, and storing the resulting position and attitude in the storage device;
rectification means for reading the interior orientation parameters, the aerial photographs, and the positions and attitudes from the storage device, reprojecting each aerial photograph, after geometric correction with the interior orientation parameters, into the state in which it would have been photographed in a nadir-pointing attitude on the basis of its position and attitude, and storing the resulting rectified photographs in the storage device;
stereo matching means for reading the positions and attitudes and the rectified photographs from the storage device, pairing overlapping rectified photographs, finding by image matching where a plurality of points on one photograph of a pair appear on the other, calculating on the basis of the positions and attitudes the straight lines that trace the central projection back into real space, and storing in the storage device the three-dimensional coordinates obtained from the two straight lines of each pair; and
orthorectification means for reading the interior orientation parameters, the aerial photographs, the positions and attitudes, and the three-dimensional coordinates from the storage device, calculating the correspondence obtained when the object represented by the three-dimensional coordinates is photographed in the stored position and attitude with the geometric distortion of the interior orientation parameters, linking each point of the aerial photographs to the three-dimensional coordinates, and outputting from the computer corrected photographs in which the image has been converted from a central projection to an orthographic projection.
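The geometric correction means above transforms image coordinates for radial and tangential distortion using the interior orientation parameters as coefficients. Below is a minimal sketch of such a transform, assuming the standard Brown-Conrady polynomial; the claim does not name a specific model, and the coefficient names k1, k2, p1, p2 and the function itself are illustrative, not taken from the patent:

```python
import numpy as np

def apply_distortion(xy, k1, k2, p1, p2):
    """Forward Brown-Conrady model: map ideal normalized image
    coordinates to distorted ones using radial (k1, k2) and
    tangential (p1, p2) coefficients.  Correcting a photograph
    inverts this mapping, e.g. by fixed-point iteration."""
    x, y = xy[:, 0], xy[:, 1]
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2             # radial factor
    x_t = 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)  # tangential shift
    y_t = p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return np.column_stack((x * radial + x_t, y * radial + y_t))
```

With all coefficients zero the mapping is the identity, which gives a quick sanity check when wiring this step into a pipeline.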
2. An image matching method for acquiring pixel values when matching images during tie point extraction or stereo matching in photogrammetry, wherein, for a point on an image, a frame is set that is square when the point is near the center of the image and stretched in the radial direction when the point is near the outer edge of the image; the frame is rotated to match the camera's direction of travel, distorted to match the camera's tilt, or scaled to match the camera's altitude; and the pixel value acquisition positions are shifted to follow the shape of the frame, each pixel value being calculated from the distances to the surrounding pixel centers.
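The frame-shaped sampling of this claim reduces, at its core, to reading pixel values at non-integer positions laid out by a transformed window, weighting each value by its distance to the surrounding pixel centers. The sketch below covers only the rotation-and-scale part; the radial stretching near the image edge and the tilt distortion are omitted, and all names are illustrative assumptions:

```python
import numpy as np

def bilinear(img, x, y):
    """Pixel value at a non-integer position, weighted by distance
    to the four surrounding pixel centers (bilinear interpolation)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * img[y0, x0] +
            dx * (1 - dy) * img[y0, x0 + 1] +
            (1 - dx) * dy * img[y0 + 1, x0] +
            dx * dy * img[y0 + 1, x0 + 1])

def sample_window(img, cx, cy, half, theta=0.0, scale=1.0):
    """Sample a square window whose axes are rotated by theta
    (e.g. the camera's direction of travel) and scaled (e.g. by
    altitude) before the pixel values are read off."""
    c, s = np.cos(theta), np.sin(theta)
    vals = np.empty((2 * half + 1, 2 * half + 1))
    for j, v in enumerate(range(-half, half + 1)):
        for i, u in enumerate(range(-half, half + 1)):
            # rotate and scale the offset, then interpolate there
            x = cx + scale * (c * u - s * v)
            y = cy + scale * (s * u + c * v)
            vals[j, i] = bilinear(img, x, y)
    return vals
```

With theta = 0 and scale = 1 the window degenerates to an ordinary axis-aligned patch, so the adaptive version can be validated against plain sampling.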
3. An attitude estimation method for estimating the attitude of a camera in the exterior orientation of photogrammetry using the coplanarity and collinearity conditions of photographs whose coverage overlaps, wherein the attitude is estimated from two consecutive photographs taken along the camera's direction of travel together with a photograph whose coverage overlaps theirs and which was taken at a position orthogonal to the direction of travel.

4. A rectification method for correcting a photograph to a nadir-pointing central projection in the rectification step of photogrammetry, wherein a central-projection photograph tilted by the inclination of the camera at the time of shooting is rotated about the lens principal point so that the image plane becomes horizontal, each point on the image plane before rotation is reprojected onto the image plane after rotation, and the photographs are then resampled with the directions of their sides aligned.
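The rectification described above rotates the image plane about the lens principal point and reprojects each point. In homogeneous pixel coordinates a pure rotation of the camera induces the homography K R K^-1. The sketch below assumes a simple pinhole calibration with focal length f in pixels and principal point (cx, cy); the direction convention of R and all names are illustrative assumptions, not the patent's notation:

```python
import numpy as np

def nadir_reproject(points_px, f, cx, cy, R):
    """Reproject image points through a rotation R of the image plane
    about the lens principal point (camera centre), as when levelling
    a tilted central projection.  f is the focal length in pixels and
    (cx, cy) the principal point."""
    K = np.array([[f, 0.0, cx], [0.0, f, cy], [0.0, 0.0, 1.0]])
    H = K @ R @ np.linalg.inv(K)          # rotation-induced homography
    pts = np.column_stack((points_px, np.ones(len(points_px))))
    out = pts @ H.T
    return out[:, :2] / out[:, 2:3]       # back to inhomogeneous pixels

# The identity rotation leaves every point where it was.
p = np.array([[100.0, 200.0], [320.0, 240.0]])
assert np.allclose(nadir_reproject(p, 1000.0, 320.0, 240.0, np.eye(3)), p)
```

A rotation purely about the optical axis, for instance, spins points around the principal point, which is an easy second check on the sign conventions.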
5. A stereo matching method for finding the point on one image that corresponds to each point on the other image in the stereo matching of photogrammetry, wherein stages of processing accuracy and thresholds are set and the three-dimensional coordinates of each point are obtained by an inductive, stepwise procedure.
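In the stereo matching of claim 1 the three-dimensional coordinate is "obtained from the two straight lines" that trace the central projection back into real space. Because two back-projected rays rarely intersect exactly, one common choice, an assumption here rather than something the claims specify, is the midpoint of the shortest segment between the two lines:

```python
import numpy as np

def ray_midpoint(o1, d1, o2, d2):
    """3-D point for two back-projected rays o + t*d: the midpoint of
    the shortest segment between the (generally skew) lines.
    Assumes the rays are not parallel."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    b = np.dot(d1, d2)                    # cosine between the rays
    w = o1 - o2
    denom = 1.0 - b * b                   # -> 0 for parallel rays
    t1 = (b * np.dot(d2, w) - np.dot(d1, w)) / denom
    t2 = (np.dot(d2, w) - b * np.dot(d1, w)) / denom
    p1 = o1 + t1 * d1                     # closest point on ray 1
    p2 = o2 + t2 * d2                     # closest point on ray 2
    return 0.5 * (p1 + p2)
```

When the two rays genuinely intersect, the midpoint coincides with the intersection, so a synthetic two-camera setup aimed at a known ground point makes a direct test.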
PCT/JP2007/062373 2007-06-13 2007-06-13 Digital aerial photographing three-dimensional measurement system WO2008152740A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2009519130A JPWO2008152740A1 (en) 2007-06-13 2007-06-13 Digital aerial 3D measurement system
PCT/JP2007/062373 WO2008152740A1 (en) 2007-06-13 2007-06-13 Digital aerial photographing three-dimensional measurement system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2007/062373 WO2008152740A1 (en) 2007-06-13 2007-06-13 Digital aerial photographing three-dimensional measurement system

Publications (1)

Publication Number Publication Date
WO2008152740A1 true WO2008152740A1 (en) 2008-12-18

Family

ID=40129357

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/062373 WO2008152740A1 (en) 2007-06-13 2007-06-13 Digital aerial photographing three-dimensional measurement system

Country Status (2)

Country Link
JP (1) JPWO2008152740A1 (en)
WO (1) WO2008152740A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109934911B (en) * 2019-03-15 2022-12-13 鲁东大学 OpenGL-based three-dimensional modeling method for high-precision oblique photography of mobile terminal

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001243450A (en) * 2000-02-29 2001-09-07 Ntt Data Creation Kk Method and device for detecting tie point of satellite image and recording medium
JP2004110479A (en) * 2002-09-19 2004-04-08 Topcon Corp Picture calibration method, picture calibration processor, and picture calibration processing terminal
JP2005156514A (en) * 2003-11-27 2005-06-16 Kokusai Kogyo Co Ltd Constitution method of aerial photographic image data set
JP2005308553A (en) * 2004-04-21 2005-11-04 Topcon Corp Three-dimensional image measuring device and method
JP2006018549A (en) * 2004-07-01 2006-01-19 Kokusai Kogyo Co Ltd Retrieval method, display method and management system for stereoscopic photograph image

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011133321A (en) * 2009-12-24 2011-07-07 Pasuko:Kk Heat dissipation diagnostic device for three-dimensional structure and heat dissipation diagnostic program
JP2013532299A (en) * 2010-05-06 2013-08-15 ヘキサゴン テクノロジー センター ゲゼルシャフト ミット ベシュレンクテル ハフツング Camera, especially for recording aerial photos from aircraft
JP2011242315A (en) * 2010-05-20 2011-12-01 Topcon Corp Electronic level
WO2012127601A1 (en) * 2011-03-22 2012-09-27 株式会社パスコ Standing structure heat dissipation diagnostic device, heat dissipation diagnostic program, and heat dissipation diagnostic method
US9020666B2 (en) 2011-04-28 2015-04-28 Kabushiki Kaisha Topcon Taking-off and landing target instrument and automatic taking-off and landing system
JP2012242321A (en) * 2011-05-23 2012-12-10 Topcon Corp Aerial photograph imaging method and aerial photograph imaging device
US9013576B2 (en) 2011-05-23 2015-04-21 Kabushiki Kaisha Topcon Aerial photograph image pickup method and aerial photograph image pickup apparatus
JP2013061204A (en) * 2011-09-13 2013-04-04 Asia Air Survey Co Ltd Method for setting corresponding point of aerial photographic image data, corresponding point setting apparatus, and corresponding point setting program
EP2597422A3 (en) * 2011-11-24 2014-12-03 Kabushiki Kaisha Topcon Aerial photograph image pickup method and aerial photograph image pickup apparatus
US9007461B2 (en) 2011-11-24 2015-04-14 Kabushiki Kaisha Topcon Aerial photograph image pickup method and aerial photograph image pickup apparatus
US9609282B2 (en) 2012-08-24 2017-03-28 Kabushiki Kaisha Topcon Camera for photogrammetry and aerial photographic device
US9275080B2 (en) 2013-03-15 2016-03-01 Pictometry International Corp. System and method for early access to captured images
WO2014197054A3 (en) * 2013-03-15 2015-02-19 Pictometry International Corp. System and method for early access to captured images
US9805059B2 (en) 2013-03-15 2017-10-31 Pictometry International Corp. System and method for early access to captured images
US10311089B2 (en) 2013-03-15 2019-06-04 Pictometry International Corp. System and method for early access to captured images
JP2015087846A (en) * 2013-10-29 2015-05-07 山九株式会社 Three-dimensional model generation system
WO2017103982A1 (en) * 2015-12-14 2017-06-22 株式会社 ニコン・トリンブル Defect detection apparatus and program
US10665035B1 (en) 2017-07-11 2020-05-26 B+T Group Holdings, LLC System and process of using photogrammetry for digital as-built site surveys and asset tracking
CN109029365A (en) * 2018-06-26 2018-12-18 广东电网有限责任公司 Electric power corridor heteropleural image connecting points extracting method, system, medium and equipment
CN109029365B (en) * 2018-06-26 2021-05-18 广东电网有限责任公司 Method, system, medium and device for extracting different-side image connection points of electric power corridor
CN109685845A (en) * 2018-11-26 2019-04-26 普达迪泰(天津)智能装备科技有限公司 A kind of realtime graphic splicing processing method based on POS system for FOD detection robot
US11151782B1 (en) 2018-12-18 2021-10-19 B+T Group Holdings, Inc. System and process of generating digital images of a site having a structure with superimposed intersecting grid lines and annotations
CN110533766A (en) * 2019-08-06 2019-12-03 陕西土豆数据科技有限公司 It is a kind of based on exempt from as control PPK data oblique photograph image intelligence wiring method
CN110533766B (en) * 2019-08-06 2023-04-11 土豆数据科技集团有限公司 Oblique photography image intelligent writing method based on image control-free PPK data

Also Published As

Publication number Publication date
JPWO2008152740A1 (en) 2010-08-26

Similar Documents

Publication Publication Date Title
WO2008152740A1 (en) Digital aerial photographing three-dimensional measurement system
US5596494A (en) Method and apparatus for acquiring digital maps
EP1242966B1 (en) Spherical rectification of image pairs
US7773799B2 (en) Method for automatic stereo measurement of a point of interest in a scene
US5878174A (en) Method for lens distortion correction of photographic images for texture mapping
EP2791868B1 (en) System and method for processing multi-camera array images
US11887273B2 (en) Post capture imagery processing and deployment systems
US20060215935A1 (en) System and architecture for automatic image registration
JP3541855B2 (en) Method and apparatus for extracting three-dimensional data
JP5134784B2 (en) Aerial photogrammetry
KR101218220B1 (en) Apparatus for drawing digital map
JP2008186145A (en) Aerial image processing apparatus and aerial image processing method
KR100373615B1 (en) Method and device for making map using photograph image and method for correcting distortion of photograph image
Sai et al. Geometric accuracy assessments of orthophoto production from uav aerial images
JP3808833B2 (en) Aerial photogrammetry
KR101224830B1 (en) Portable Multi-Sensor System for Acquiring Georeferenced Images and Method thereof
WO2008020461A1 (en) Method for acquiring, processing and presenting images and multimedia navigating system for performing such method
CN110986888A (en) Aerial photography integrated method
KR102578056B1 (en) Apparatus and method for photographing for aerial photogrammetry using an air vehicle
KR102225321B1 (en) System and method for building road space information through linkage between image information and position information acquired from a plurality of image sensors
Barazzetti et al. Stitching and processing gnomonic projections for close-range photogrammetry
JP3781034B2 (en) Stereo image forming method and apparatus
CN111581322A (en) Method, device and equipment for displaying interest area in video in map window
WO2023127020A1 (en) Information processing system, method, and program
JP6133057B2 (en) Geographic map landscape texture generation based on handheld camera images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07767212

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2009519130

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07767212

Country of ref document: EP

Kind code of ref document: A1