WO2017043078A1 - Distance estimation device, system, distance estimation method, and recording medium - Google Patents

Distance estimation device, system, distance estimation method, and recording medium

Info

Publication number
WO2017043078A1
WO2017043078A1 (application PCT/JP2016/004075)
Authority
WO
WIPO (PCT)
Prior art keywords
distance
unit
image
imaging device
imaging
Prior art date
Application number
PCT/JP2016/004075
Other languages
English (en)
Japanese (ja)
Inventor
高橋 勝彦
Original Assignee
日本電気株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気株式会社
Priority to JP2017538872A
Publication of WO2017043078A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G01C11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes

Definitions

  • the present invention relates to a distance estimation device, a system, a distance estimation method, and a recording medium.
  • Patent Document 1 describes a surveying method in which an object and a movement reference point that moves relative to the object are photographed simultaneously.
  • Patent Document 2 describes that the distance to the measurement object and the three-dimensional coordinates of the measurement object are calculated by stereo measurement using two images.
  • Patent Document 3 describes a method for generating, from images taken with two cameras, an image at a virtual viewpoint with reduced image disturbance.
  • Patent Document 4 describes an example of calculating the distance between the mobile robot and the remote controller using a triangulation method.
  • the movement reference point and the camera described in Patent Document 1 only move relative to the object, and the positional relationship between the cameras and the positional relationship between the movement reference points do not change. In the techniques described in Patent Documents 2 and 3, the distance between the cameras is fixed.
  • In these techniques, the measurement accuracy of the distance from the imaging device to the measurement object may decrease when the positional relationship between the cameras changes.
  • The present invention has been made in view of the above problems, and an object of the present invention is to provide a technique for estimating, with high accuracy, the distance from a plurality of imaging devices to the position of the measurement object even when the distance between the imaging devices is variable.
  • A distance estimation apparatus according to one aspect includes: first distance estimation means for estimating a first distance, which is the distance between a first imaging device that captures a first image and a second imaging device that captures a second image of a second imaging range, at least part of which overlaps the first imaging range captured by the first imaging device, at a timing synchronized with the imaging timing of the first imaging device; and second distance estimation means for estimating, based on the correspondence between a first position in the first image and a second position in the second image and on the first distance, a second distance, which is the distance between a reference position based on the positional relationship between the first imaging device and the second imaging device and the position in real space corresponding to the first position and the second position.
  • A system according to one aspect includes: a first imaging device that captures a first image; a second imaging device that captures a second image of a second imaging range, at least part of which overlaps the first imaging range captured by the first imaging device, at a timing synchronized with the imaging timing of the first imaging device; and the distance estimation device described above.
  • A distance estimation method according to one aspect includes: estimating a first distance, which is the distance between a first imaging device that captures a first image and a second imaging device that captures a second image of a second imaging range, at least part of which overlaps the first imaging range captured by the first imaging device, at a timing synchronized with the imaging timing of the first imaging device; and estimating, based on the correspondence between a first position in the first image and a second position in the second image and on the first distance, a second distance, which is the distance between a reference position based on the positional relationship between the first imaging device and the second imaging device and the position in real space corresponding to the first position and the second position.
  • According to the present invention, the distance from a plurality of imaging devices to the position of the measurement object can be estimated with high accuracy even when the distance between the imaging devices is variable.
  • FIG. 1 is a block diagram illustrating an example of a functional configuration of the distance estimation apparatus 10 according to the present embodiment.
  • the distance estimation apparatus 10 according to the present embodiment includes a first distance estimation unit 11 and a second distance estimation unit 12.
  • the first distance estimation unit 11 estimates a distance (referred to as a first distance) between the first imaging device and the second imaging device.
  • the first distance is a length between the first imaging device and the second imaging device in real space. Such a length in real space is also called an absolute distance.
  • the first imaging device and the second imaging device are devices that capture images. In the present embodiment, an image captured by the first imaging device is referred to as a first image, and an image captured by the second imaging device is referred to as a second image.
  • The range captured by the second imaging device (referred to as the second imaging range) at least partially overlaps the range captured by the first imaging device (referred to as the first imaging range).
  • the second imaging device captures the second imaging range at a timing synchronized with the imaging timing of the first imaging device.
  • The first distance estimation unit 11 may estimate the first distance based on, for example, position information of the first imaging device and the second imaging device. Alternatively, the first distance estimation unit 11 may estimate the first distance based on the time required for radio waves and/or sound waves emitted from one imaging device (for example, the first imaging device) to arrive at the other imaging device (for example, the second imaging device). In addition, the first distance estimation unit 11 may estimate the first distance based on the size of a photographed object included in the first image and the second image.
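As an illustrative sketch of the position-information-based variant (the function name and the local equirectangular approximation are assumptions made here for illustration, not part of the original description), the first distance can be computed from the two devices' positioning-system coordinates:

```python
import math

def estimate_first_distance(lat1, lon1, alt1, lat2, lon2, alt2):
    """Straight-line distance in metres between two imaging devices,
    given their latitude/longitude (degrees) and altitude (metres).

    Uses a local equirectangular approximation, which is adequate for
    the short baselines assumed here (metres to tens of metres)."""
    R = 6378137.0  # WGS-84 equatorial radius in metres
    mean_lat = math.radians((lat1 + lat2) / 2.0)
    dx = math.radians(lon2 - lon1) * R * math.cos(mean_lat)  # east-west
    dy = math.radians(lat2 - lat1) * R                       # north-south
    dz = alt2 - alt1                                         # vertical
    return math.sqrt(dx * dx + dy * dy + dz * dz)
```

Since the first distance enters the later triangulation directly, the error of the positioning fix propagates into the second distance, which is why a high-accuracy system such as RTK is assumed.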
  • the first distance estimation unit 11 supplies the estimated first distance to the second distance estimation unit 12.
  • The second distance estimation unit 12 estimates the second distance based on the correspondence relationship between a position (first position) in the first image captured by the first imaging device and a position (second position) in the second image captured by the second imaging device, and on the first distance.
  • The second distance is the distance between a reference position based on the positional relationship between the first imaging device and the second imaging device and the position in real space (also referred to as the object position) corresponding to the first position and the second position that are in a correspondence relationship.
  • The first position in the first image is a pixel position in the first image, and the second position in the second image is a pixel position in the second image. Since the first position and the second position are positions in images, they are described using a two-dimensional coordinate system.
  • The second distance may be, for example, the distance between the object position and the straight line connecting the first imaging device and the second imaging device.
  • The second distance may also be, for example, the distance in the optical system coordinate system of the main imaging device between the following (a) and (b), where one of the first imaging device and the second imaging device is the main imaging device and the other is the sub imaging device:
  • (a) a plane (first plane) that includes the position of the main imaging device and is perpendicular to its optical axis;
  • (b) a plane (second plane) that includes the object position and is parallel to the first plane.
  • The distance between (a) and (b) in the optical system coordinate system of the main imaging device, that is, the distance between the first plane and the second plane, is the distance in that coordinate system from the main imaging device to the object position projected onto the optical axis. In other words, it can also be said to be the length in the optical axis direction from the main imaging device to the object position.
  • the second distance estimation unit 12 may further calculate the distance between the object position and a straight line connecting the first imaging device and the second imaging device from the distance in the optical system coordinate system.
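This conversion can be sketched as follows, under stated assumptions (a pinhole model with an intrinsic matrix K, camera positions expressed in a common coordinate system, and NumPy; the function names are illustrative, not from the original description): the pixel is back-projected to a 3-D point using the optical-axis depth, and the perpendicular distance from that point to the line through the two camera centres is measured.

```python
import numpy as np

def back_project(K, pixel, depth):
    """3-D point, in the main imaging device's optical coordinate
    system, for a pixel (u, v) at the given depth along the optical
    axis, assuming a pinhole camera with intrinsic matrix K."""
    u, v = pixel
    return depth * (np.linalg.inv(K) @ np.array([u, v, 1.0]))

def distance_to_baseline(P, C1, C2):
    """Perpendicular distance from point P to the straight line
    through the two camera centres C1 and C2."""
    d = C2 - C1
    return np.linalg.norm(np.cross(P - C1, d)) / np.linalg.norm(d)
```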
  • FIG. 2 is a diagram for explaining a method in which the second distance estimation unit 12 of the distance estimation apparatus 10 according to the present embodiment estimates the second distance.
  • In the following description, it is assumed that the second distance is the distance between the straight line connecting the first imaging device and the second imaging device and the position in real space corresponding to the first position and the second position that are in a correspondence relationship.
  • Even when the second distance is defined differently from this assumption, the second distance estimation unit 12 can estimate it by a similar method.
  • The second distance with respect to the object A, which is obtained by the second distance estimation unit 12, is the distance between the object position P_A, which is the position of the object A, and the straight line L connecting the first imaging device and the second imaging device. The object position P_A indicates the position of a certain point in the object A; this point may be a feature point such as a corner or any other point.
  • the second distance with respect to the object B obtained by the second distance estimation unit 12 is the distance between the object position P_B, which is the position of the object B, and the straight line L.
  • the first image and the second image include the object A and the object B.
  • A pixel position in the first image corresponding to the object position P_A of the object A is referred to as a first position F_A, and the corresponding pixel position in the second image is referred to as a second position S_A. Similarly, a pixel position in the first image corresponding to the object position P_B of the object B is referred to as a first position F_B, and the corresponding pixel position in the second image is referred to as a second position S_B.
  • The second distance estimation unit 12 estimates the second distance based on the first distance and on the correspondence between a first position and a second position that correspond to the same position in real space (that is, the correspondence between the two pixel positions), among the plurality of first positions in the first image and the plurality of second positions in the second image.
  • In FIG. 2, the correspondences between first positions and second positions corresponding to the same position in real space are the pair of the first position F_A and the second position S_A, and the pair of the first position F_B and the second position S_B.
  • The second distance estimation unit 12 calculates, based on the first position F_A in the first image and the camera parameters of the first imaging device, an angle θ1_A formed by the straight line L and the straight line connecting the position P_A corresponding to the first position F_A and the first imaging device.
  • the camera parameters of the first imaging device may be stored in a storage device (not shown) in the distance estimation device 10 or may be transmitted from the first imaging device.
  • Similarly, the second distance estimation unit 12 calculates, based on the second position S_A in the second image, which is in a correspondence relationship with the first position F_A, and the camera parameters of the second imaging device, an angle θ2_A formed by the straight line L and the straight line connecting the position P_A corresponding to the second position S_A and the second imaging device. The camera parameters of the second imaging device may be stored in a storage device (not shown) in the distance estimation device 10 or may be transmitted from the second imaging device.
  • The length of the line segment of the straight line L between the first imaging device and the second imaging device is the first distance estimated by the first distance estimation unit 11. Therefore, since the length of one side of the triangle and the angles at both of its ends are known, the second distance estimation unit 12 can estimate the second distance with respect to the object A by the principle of triangulation.
  • The second distance estimation unit 12 obtains the angle θ1_B and the angle θ2_B by the same method, and estimates the second distance with respect to the object B.
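The triangulation step above can be sketched as follows (a minimal illustration of the principle; the function name is an assumption, and the second distance is taken, as assumed earlier, to be the perpendicular distance from the straight line L to the object position):

```python
import math

def second_distance(first_distance, theta1, theta2):
    """Perpendicular distance from the straight line L (connecting the
    two imaging devices) to the object position, by triangulation.

    theta1, theta2: angles (radians) between L and the rays from the
    first and second imaging devices to the object position."""
    # Height of a triangle whose base is the first distance and whose
    # base angles are theta1 and theta2 (law of sines).
    return (first_distance * math.sin(theta1) * math.sin(theta2)
            / math.sin(theta1 + theta2))
```

For example, with a first distance of 2 and both angles equal to 45 degrees, the object position lies at distance 1 from the line L.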
  • the distance estimation apparatus 10 cannot distinguish between the following (1) and (2) only from the correspondence between two pixel positions.
  • (1) a case where the first imaging device and the second imaging device are located relatively far apart from each other and the photographed object is relatively large; and
  • (2) a case where the first imaging device and the second imaging device are relatively close to each other and the photographed object is relatively small.
  • Therefore, the second distance estimation unit 12 of the distance estimation device 10 uses, in the second distance estimation processing, the first distance, which is the distance between the first imaging device and the second imaging device, in addition to the correspondence between the pixel positions.
  • Thereby, the second distance estimation unit 12 can obtain the second distance accurately.
  • FIG. 3 is a flowchart showing an example of an operation flow of the distance estimation apparatus 10 according to the present embodiment.
  • First, the first distance estimation unit 11 estimates a first distance, which is the distance between the first imaging device and the second imaging device (step S1). Thereafter, the second distance estimation unit 12 estimates, based on the correspondence between the pixel position in the first image and the pixel position in the second image and on the first distance, a second distance, which is the distance from a reference position based on the positional relationship between the first imaging device and the second imaging device to the position in real space corresponding to both pixel positions (step S2).
  • the distance estimation apparatus 10 ends the process.
  • As described above, the first distance estimation unit 11 of the distance estimation apparatus 10 estimates the first distance, which is the distance between the first imaging device and the second imaging device. Then, the second distance estimation unit 12 estimates, based on the correspondence between the pixel position in the first image and the pixel position in the second image and on the first distance, the second distance, which is the distance from a reference position based on the positional relationship between the first imaging device and the second imaging device to the position in real space corresponding to both pixel positions.
  • In this manner, the distance estimation device 10 estimates the first distance, which is the distance between the imaging devices, and uses not only the correspondence between the two images but also the first distance for the estimation of the second distance. Thereby, the scale ambiguity described above (the inability to distinguish cases (1) and (2) from the pixel correspondence alone) can be resolved.
  • Therefore, with the distance estimation device 10, the second distance, which is the distance from the plurality of imaging devices to the position of the measurement object, can be estimated with high accuracy even if the distance between the imaging devices is variable.
  • A general stereo camera needs to keep the positional relationship between its two cameras constant, and the baseline length is limited by the constraint of ensuring the rigidity of the housing. Therefore, the accuracy of measuring the distance from the camera to a distant point may decrease. Further, there is a positive relationship between the baseline length and the distance to the farthest point that can be measured. If the baseline length is made very long, however, the positional relationship between the cameras becomes difficult to stabilize, so a complicated procedure of frequently correcting the positional relationship between the cameras using a large number of calibration patterns may be required. In addition, each time the housing is distorted by aging, camera calibration using a new calibration pattern must be performed.
  • With the distance estimation apparatus 10 in the present embodiment, such camera calibration is not necessary. Therefore, the second distance can be measured with high accuracy using an inexpensive device.
  • the distance estimation apparatus 100 has the same function as the distance estimation apparatus 10 described in the first embodiment.
  • FIG. 4 is a diagram for explaining a usage scene of the distance estimation apparatus 100 according to the present embodiment.
  • the distance estimation apparatus 100 includes a first imaging unit 101 and a second imaging unit 102.
  • the first imaging unit 101 and the second imaging unit 102 of the distance estimation apparatus 100 capture an imaging range at least partially overlapping.
  • the first imaging unit 101 and the second imaging unit 102 are provided in the distance estimation apparatus 100 so that the distance between the first imaging unit 101 and the second imaging unit 102 is variable.
  • FIG. 5 is a functional block diagram illustrating an example of a functional configuration of the distance estimation apparatus 100 according to the present embodiment.
  • The distance estimation apparatus 100 according to the present embodiment includes a first imaging unit 101, a second imaging unit 102, a first distance estimation unit 110, a second distance estimation unit 120, an associating unit 130, and a storage unit 140.
  • The first photographing unit 101 captures images successively over time.
  • the first imaging unit 101 is realized by an imaging device such as a camera.
  • the image photographed by the first photographing unit 101 may be a landscape image including a road or an image including a structure such as a bridge.
  • an image photographed by the first photographing unit 101 is referred to as a first image.
  • The first imaging unit 101 is provided with a position information receiving unit 1011 that can receive position information from a highly accurate positioning system (for example, RTK-GPS (Real Time Kinematic Global Positioning System), a quasi-zenith satellite system, or the like). When the first imaging unit 101 captures the first image, the position information receiving unit 1011 receives position information indicating the position of the first imaging unit 101 at the time of image capturing.
  • the position information receiving unit 1011 may be realized by a GPS receiver provided close to the first imaging unit 101.
  • the first photographing unit 101 supplies the photographed first image to the associating unit 130.
  • The first photographing unit 101 also supplies position information indicating the position of the first photographing unit 101 at the time of capturing the first image (referred to as first photographing unit position information) to the first distance estimation unit 110.
  • the first image capturing unit position information may include the image capturing time of the first image.
  • the second imaging unit 102 captures an image at a timing synchronized with the imaging timing of the first imaging unit 101. Similar to the first photographing unit 101, the second photographing unit 102 is realized by an imaging device such as a camera. In the present embodiment, it is assumed that the second imaging unit 102 is installed so as to have a field of view overlapping with the first imaging unit 101. Therefore, at least a part of the range captured by the second imaging unit 102 overlaps the range captured by the first imaging unit 101.
  • the range where the first imaging unit 101 captures an image is referred to as a first imaging range
  • the range where the second imaging unit 102 captures an image is referred to as a second imaging range.
  • An image captured by the second imaging unit 102 is referred to as a second image.
  • the second imaging unit 102 includes a position information receiving unit 1021 having the same function as the above-described position information receiving unit 1011, similarly to the first imaging unit 101.
  • the position information receiving unit 1021 receives position information indicating the position of the second image capturing unit 102 at the time of capturing.
  • the position information receiving unit 1021 may be realized by a GPS receiver provided close to the second imaging unit 102.
  • the second photographing unit 102 supplies the photographed second image to the associating unit 130.
  • The second photographing unit 102 also supplies position information indicating the position of the second photographing unit 102 at the time of capturing the second image (referred to as second photographing unit position information) to the first distance estimation unit 110.
  • the second image capturing unit position information may include the image capturing time of the second image.
  • the method of synchronizing the shooting timing between the first shooting unit 101 and the second shooting unit 102 is not particularly limited, and a general synchronization method may be adopted.
  • The first photographing unit 101 and the second photographing unit 102 need not be fixed to each other via a rigid housing. That is, in the present embodiment, the distance estimation apparatus 100 may be configured so that the positional relationship between the first imaging unit 101 and the second imaging unit 102 varies. In the present embodiment, the first imaging unit 101 and the second imaging unit 102 are described as being incorporated in the distance estimation apparatus 100, but the present invention is not limited to this.
  • the first imaging unit 101 and the second imaging unit 102 may each be realized by an imaging device separate from the distance estimation device 100. Further, one of the first imaging unit 101 and the second imaging unit 102 may be built in the distance estimation apparatus 100 and the other may be realized by an imaging apparatus separate from the distance estimation apparatus 100. Thus, a configuration in which at least one of the first imaging unit 101 and the second imaging unit 102 is realized by an imaging device separate from the distance estimation device 100 will be described in another embodiment.
  • the first distance estimation unit 110 corresponds to the first distance estimation unit 11 in the first embodiment described above.
  • the first distance estimating unit 110 receives the first photographing unit position information and the second photographing unit position information.
  • the shooting timing of the first image shot at the position indicated by the first shooting unit position information is synchronized with the shooting timing of the second image shot at the position indicated by the second shooting unit position information.
  • The first distance estimation unit 110 expresses the first photographing unit position information and the second photographing unit position information in a global coordinate system, and estimates a first distance, which is the distance between the first photographing unit 101 and the second photographing unit 102 at the above-described capturing timing.
  • the first distance is a length between the first photographing unit 101 and the second photographing unit 102 in real space.
  • When the position information includes image capturing times, the first distance estimation unit 110 may estimate the first distance using those capturing times.
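One way to use the capturing times, sketched under stated assumptions (approximately linear motion between position fixes; the function names, tuple-based positions in metres, and the bracketing-fix interface are illustrative, not from the original description), is to interpolate each unit's position to the synchronized capture time before taking the distance:

```python
import math

def interpolate_position(t, t0, p0, t1, p1):
    """Linearly interpolate a receiver position to time t, given fixes
    p0 at time t0 and p1 at time t1 (positions as (x, y, z) tuples)."""
    w = (t - t0) / (t1 - t0)
    return tuple(a + w * (b - a) for a, b in zip(p0, p1))

def first_distance_at(t, fixes1, fixes2):
    """First distance at capture time t, where fixes1 and fixes2 are
    each a pair ((t0, p0), (t1, p1)) of position fixes bracketing t,
    for the first and second photographing units respectively."""
    (t0, p0), (t1, p1) = fixes1
    q1 = interpolate_position(t, t0, p0, t1, p1)
    (t0, p0), (t1, p1) = fixes2
    q2 = interpolate_position(t, t0, p0, t1, p1)
    return math.dist(q1, q2)
```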
  • The first distance estimated by the first distance estimation unit 110 may be, for example, the distance between the image plane of the first photographing unit 101 and the image plane of the second photographing unit 102; the definition is not limited to this and can be selected appropriately according to the required precision.
  • the first distance estimation unit 110 supplies the estimated first distance to the second distance estimation unit 120.
  • The associating unit 130 associates the pixel position (first position) in the first image captured by the first photographing unit 101 with the pixel position (second position) in the second image captured by the second photographing unit 102.
  • The epipolar constraint means that a point in one image is projected onto a straight line in the other image. In general, this constraint is used to obtain that straight line, and only the straight line is searched to determine the point to be matched (referred to as a corresponding point). In this embodiment, however, a different method is used because the positional relationship between the first imaging unit 101 and the second imaging unit 102 is not fixed.
  • the association unit 130 first extracts feature points such as corners and blobs from the first image received from the first imaging unit 101 and the second image received from the second imaging unit 102. Then, the associating unit 130 extracts a feature amount for each of the extracted feature points.
  • the method of extracting feature points and feature amounts may be a generally employed method, for example, using SIFT (Scale-Invariant Feature Transform).
  • The associating unit 130 compares the feature amounts of the plurality of feature points extracted from the first image (referred to as first feature points) with those of the plurality of feature points extracted from the second image (referred to as second feature points), and generates sets of feature points whose feature amounts are similar to each other.
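The generation of similar-feature sets can be sketched as follows (a hedged illustration: descriptor vectors such as SIFT's are assumed to be given as NumPy arrays, and nearest-neighbour search with Lowe's ratio test is a common choice for this step, not something the original description mandates):

```python
import numpy as np

def match_features(desc1, desc2, ratio=0.75):
    """Pair each first-image descriptor with its nearest second-image
    descriptor, keeping only matches that pass the ratio test
    (nearest distance clearly smaller than the second-nearest)."""
    desc1 = np.asarray(desc1, dtype=float)
    desc2 = np.asarray(desc2, dtype=float)
    matches = []
    for i, d in enumerate(desc1):
        dist = np.linalg.norm(desc2 - d, axis=1)
        j, k = np.argsort(dist)[:2]
        if dist[j] < ratio * dist[k]:
            matches.append((i, int(j)))
    return matches
```

The ratio test discards ambiguous matches, which reduces the outlier rate before the coefficient estimation described next.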
  • Next, the associating unit 130 calculates, from the generated sets of feature points, a coefficient that increases the matching accuracy between the first feature points and the second feature points, using the eight-point method and RANSAC (Random Sample Consensus) processing.
  • This coefficient is generally expressed in the form of a fundamental matrix (a 3 × 3 matrix). Once the fundamental matrix is determined, the epipolar line on which a point on the other image corresponding to a point on one image exists can be determined. Accordingly, the associating unit 130 associates points on the first image with points on the second image by performing stereo matching using the calculated coefficient.
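The eight-point computation of this coefficient can be illustrated with the following normalised eight-point sketch (plain NumPy, without the RANSAC outer loop; the function name and the normalisation details are textbook choices assumed for illustration, not taken from the original description):

```python
import numpy as np

def eight_point_fundamental(pts1, pts2):
    """Estimate the 3x3 fundamental matrix from eight or more point
    correspondences (pts1[i] in the first image <-> pts2[i] in the
    second image) with the normalised eight-point algorithm."""
    def normalise(pts):
        # Translate the centroid to the origin and scale so that the
        # mean distance from the origin is sqrt(2).
        c = pts.mean(axis=0)
        s = np.sqrt(2) / np.mean(np.linalg.norm(pts - c, axis=1))
        T = np.array([[s, 0, -s * c[0]],
                      [0, s, -s * c[1]],
                      [0, 0, 1]])
        homog = np.column_stack([pts, np.ones(len(pts))])
        return homog @ T.T, T

    p1, T1 = normalise(np.asarray(pts1, dtype=float))
    p2, T2 = normalise(np.asarray(pts2, dtype=float))
    # Each correspondence contributes one row of the system A f = 0.
    A = np.column_stack([
        p2[:, 0] * p1[:, 0], p2[:, 0] * p1[:, 1], p2[:, 0],
        p2[:, 1] * p1[:, 0], p2[:, 1] * p1[:, 1], p2[:, 1],
        p1[:, 0], p1[:, 1], np.ones(len(p1))])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce the rank-2 constraint of a fundamental matrix.
    U, S, Vt = np.linalg.svd(F)
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt
    # Undo the normalisation.
    return T2.T @ F @ T1
```

In practice this least-squares fit would be wrapped in the RANSAC loop described above, re-estimating from random eight-point samples and keeping the matrix with the most inliers.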
  • the point on the first image and the point on the second image are points represented by two-dimensional coordinates on the respective images. Therefore, it can be said that the association process performed by the association unit 130 is a process of associating the pixel position in the first image with the pixel position in the second image.
  • The points that the associating unit 130 associates are not limited to feature points and may be other points.
  • The associating unit 130 performs the above processing using the first image and the second image that are captured successively. Thereby, even in a situation where the positional relationship between the first imaging unit 101 and the second imaging unit 102 changes over time, the associating unit 130 can associate the first position in the first image with the second position in the second image.
  • the associating unit 130 supplies information indicating the correspondence between the associated first position and second position to the second distance estimating unit 120.
  • the information indicating the correspondence relationship between the first position and the second position is, for example, a set of coordinates indicating the first position and coordinates indicating the second position associated with the first position.
  • the storage unit 140 stores the camera parameters of the first imaging unit 101 and the camera parameters of the second imaging unit 102.
  • the storage unit 140 may store the first image captured by the first imaging unit 101 and the second image captured by the second imaging unit 102.
  • the first distance estimation unit 110 and the association unit 130 may perform the above-described processing using the first image and the second image stored in the storage unit 140.
  • The storage unit 140 may be realized by a storage device separate from the distance estimation device 100.
  • The second distance estimation unit 120 estimates the second distance based on the first distance supplied from the first distance estimation unit 110 and the information indicating the correspondence between the first position and the second position supplied from the associating unit 130.
  • the second distance estimation unit 120 estimates the second distance using the camera parameters of the first imaging unit 101 and the camera parameters of the second imaging unit 102 stored in the storage unit 140.
  • The second distance is the distance between a reference position based on the positional relationship between the first photographing unit 101 and the second photographing unit 102 and the position in real space corresponding to the first position and the second position that are in a correspondence relationship. Since the second distance estimation method of the second distance estimation unit 120 is the same as that of the second distance estimation unit 12 in the first embodiment described above, its description is omitted.
  • FIG. 6 is a flowchart illustrating an example of an operation flow of the distance estimation apparatus 100 according to the present embodiment.
  • the first photographing unit 101 and the second photographing unit 102 respectively photograph the first image and the second image at the synchronized timing (step S61). Then, the position information receiving unit 1011 of the first imaging unit 101 receives first imaging unit position information indicating the position of the first imaging unit 101 at the time of imaging from the positioning system (step S62). Then, the first photographing unit 101 supplies the first photographing unit position information at the time of photographing to the first distance estimating unit 110. Further, the position information receiving unit 1021 of the second imaging unit 102 receives second imaging unit position information indicating the position of the second imaging unit 102 at the time of imaging from the positioning system (step S63). Then, the second photographing unit 102 supplies the second photographing unit position information at the time of photographing to the first distance estimating unit 110. Note that the above steps S61 to S63 are preferably executed at the same timing.
  • the first distance estimation unit 110 estimates a first distance that is a distance between the first imaging unit 101 and the second imaging unit 102 (step S64).
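  • The estimation in step S64 reduces to the straight-line distance between the two position fixes. A minimal sketch, assuming the positioning system reports latitude and longitude in degrees and altitude in metres, and using a local flat-earth approximation that is adequate for short baselines:

```python
import math

def first_distance(lat1, lon1, alt1, lat2, lon2, alt2):
    """Straight-line distance (m) between two nearby position fixes."""
    r = 6371000.0  # mean Earth radius in metres
    lat_mid = math.radians((lat1 + lat2) / 2.0)
    dy = math.radians(lat2 - lat1) * r                     # north-south offset
    dx = math.radians(lon2 - lon1) * r * math.cos(lat_mid)  # east-west offset
    dz = alt2 - alt1                                        # altitude offset
    return math.sqrt(dx * dx + dy * dy + dz * dz)

# Two fixes 0.001 deg of latitude apart (same longitude and altitude)
print(round(first_distance(35.0000, 139.0, 50.0, 35.0010, 139.0, 50.0), 1))  # -> 111.2
```

  For the longer baselines or higher accuracies of a real deployment, a proper geodesic computation on the reference ellipsoid would be used instead.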
  • the associating unit 130 extracts feature points from each of the first image captured by the first imaging unit 101 and the second image captured by the second imaging unit 102 in step S61 (step S65). Then, the associating unit 130 generates sets of first feature points and second feature points based on the feature amounts of the first feature points and the feature amounts of the second feature points (step S66). Thereafter, the associating unit 130 calculates the coefficients expressed as the fundamental matrix (step S67). Then, the associating unit 130 performs stereo matching using the calculated coefficients, and associates the first position, which is a pixel position in the first image, with the second position, which is a pixel position in the second image (step S68).
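  • One classical way to realize the coefficient calculation of step S67 is the normalized eight-point algorithm, sketched below. This is a standard technique offered as an assumption about how step S67 could be implemented, not the exact procedure of this disclosure; it estimates a fundamental matrix F such that associated positions satisfy f2ᵀ F f1 ≈ 0, the epipolar constraint used to constrain the stereo matching of step S68:

```python
import numpy as np

def eight_point(p1, p2):
    """Estimate the fundamental matrix from N >= 8 matched points.

    p1, p2 -- (N, 2) pixel coordinates in the first and second image.
    Returns a rank-2 3x3 matrix F with p2_h.T @ F @ p1_h ~ 0.
    """
    def normalize(p):
        # Translate to the centroid and scale for numerical conditioning.
        mean, scale = p.mean(0), np.sqrt(2) / p.std()
        T = np.array([[scale, 0.0, -scale * mean[0]],
                      [0.0, scale, -scale * mean[1]],
                      [0.0, 0.0, 1.0]])
        ph = np.column_stack([p, np.ones(len(p))])
        return (T @ ph.T).T, T

    q1, T1 = normalize(p1)
    q2, T2 = normalize(p2)
    # Each correspondence gives one linear equation in the 9 entries of F.
    A = np.column_stack([q2[:, [0]] * q1, q2[:, [1]] * q1, q1])
    _, _, vt = np.linalg.svd(A)
    F = vt[-1].reshape(3, 3)
    # Enforce rank 2 (every fundamental matrix is singular).
    u, s, vt = np.linalg.svd(F)
    F = u @ np.diag([s[0], s[1], 0.0]) @ vt
    # Undo the normalization.
    return T2.T @ F @ T1
```

  With exact synthetic correspondences this recovers F up to scale; in practice a robust estimator such as RANSAC is typically layered on top to reject feature mismatches before the matrix is used for stereo matching.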
  • step S64 may be executed at the same timing as any of steps S65 to S68, or may be executed after step S68.
  • the second distance estimation unit 120 estimates the second distance based on the correspondence between the first position and the second position associated in step S68 and the first distance (step S69).
  • the distance estimation apparatus 100 ends the process.
  • the above-described processing of the distance estimation apparatus 100 is performed for each set of a first image and a second image captured at the same timing.
  • the first imaging unit 101 and the second imaging unit 102 capture images moment by moment. Therefore, the distance estimation apparatus 100 performs the process shown in FIG. 6 for each set of a first image and a second image captured moment by moment.
  • the distance estimation apparatus 100 has the same effects as the distance estimation apparatus 10 according to the first embodiment described above. Further, the associating unit 130 performs the association using the first image and the second image captured moment by moment. As a result, even in a situation where the positional relationship between the first imaging unit 101 and the second imaging unit 102 changes with the passage of time, the distance estimation apparatus 100 can estimate the second distance using the associated first and second positions and the first distance.
  • Since the distance estimation apparatus 100 is configured to estimate the first distance using position information from a positioning system such as GPS, the first distance can be estimated appropriately even if the distance between the two cameras changes rapidly.
  • In the above description, the first distance estimation unit 110 estimates the first distance based on the position information of the first imaging unit 101 and the second imaging unit 102.
  • However, the method by which the first distance estimation unit 110 estimates the first distance is not limited to this.
  • FIG. 7 is a functional block diagram illustrating an example of a functional configuration of the distance estimation apparatus 100 according to the present modification.
  • the first imaging unit 101 includes, in place of the position information receiving unit 1011 capable of receiving position information from a high-accuracy positioning system, a radio wave transmission unit 1012 that transmits, at the same time as the first image is captured, a radio wave including information indicating the shooting time.
  • the radio wave transmission unit 1012 may be realized by a radio wave transmitter provided in the vicinity of the first imaging unit 101.
  • the second imaging unit 102 includes, in place of the position information receiving unit 1021 capable of receiving position information from the high-accuracy positioning system, a radio wave receiving unit 1022 that receives the radio wave transmitted from the radio wave transmission unit 1012 of the first imaging unit 101.
  • the radio wave receiving unit 1022 of the second imaging unit 102 acquires the reception time when the radio wave is received.
  • the radio wave receiving unit 1022 may be realized by a radio wave receiver provided in the vicinity of the second imaging unit 102.
  • the second imaging unit 102 measures, from the acquired reception time and the shooting time indicated by the received radio wave, the time the radio wave took from transmission to reception. Then, the second imaging unit 102 supplies measurement information indicating the measured time to the first distance estimation unit 110.
  • the first distance estimation unit 110 estimates the first distance based on the measurement information supplied from the second imaging unit 102.
  • the first distance estimation unit 110 may be configured to receive information indicating the shooting time from the first imaging unit 101 and information indicating the reception time of the radio wave from the second imaging unit 102. In this case, the first distance estimation unit 110 measures the time required for the second imaging unit 102 to receive the radio wave after it was transmitted from the first imaging unit 101, and estimates the first distance based on the measured time.
  • Although the method of estimating the first distance using a radio wave has been described in this modification, the distance estimation apparatus 100 may estimate the first distance using a sound wave instead of a radio wave.
  • the distance estimation apparatus 100 may estimate the first distance using both radio waves and sound waves.
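  • This modification is essentially a one-way time-of-flight measurement: the first distance is the propagation speed multiplied by the measured transmission-to-reception time. A minimal sketch, assuming synchronized clocks at the two imaging units (the function name and constants are illustrative):

```python
def first_distance_tof(transmit_time_s, receive_time_s, use_sound=False):
    """First distance (m) from the one-way transit time of a radio or sound wave.

    Radio waves travel at ~3.0e8 m/s; sound travels at ~343 m/s in air,
    which makes sound far easier to time over short baselines.
    """
    speed = 343.0 if use_sound else 3.0e8
    transit = receive_time_s - transmit_time_s
    if transit < 0:
        raise ValueError("reception cannot precede transmission")
    return speed * transit

# A ~29 ms sound transit corresponds to roughly 10 m between the units
print(first_distance_tof(0.0, 0.02914, use_sound=True))
```

  Using both radio and sound waves, as the text suggests, lets the slow sound wave carry the range while the fast radio wave serves as a near-instantaneous time reference, removing the need for clock synchronization.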
  • a third embodiment of the present invention will be described with reference to the drawings.
  • In the second embodiment described above, the configuration in which the first imaging unit 101 and the second imaging unit 102 are included in the distance estimation apparatus was described.
  • In the present embodiment, a configuration in which the first imaging unit 101 and the second imaging unit 102 are included in devices separate from the distance estimation apparatus will be described.
  • the distance estimation apparatus 200 according to the present embodiment has the same function as the distance estimation apparatus 10 described in the first embodiment.
  • the system 1 according to the present embodiment includes a distance estimation device 200, an unmanned aircraft 210, and an unmanned aircraft 220.
  • FIG. 8 is a diagram for explaining a use scene of the system 1 according to the present embodiment.
  • the distance estimation device 200 and the unmanned aircraft (210, 220) are wirelessly connected.
  • unmanned aerial vehicles (210, 220) are flying objects, for example, flying over the town.
  • the unmanned aerial vehicles (210, 220) may fly by manual operation or may fly autonomously.
  • the unmanned aerial vehicles (210, 220) are each provided with a photographing unit. Each imaging unit captures an image so that a part of the imaging range overlaps. Further, the two photographing units take images at synchronized photographing timings.
  • the photographed image is transmitted from each unmanned aerial vehicle to the distance estimation apparatus 200.
  • the distance estimation apparatus 200 estimates, using the images captured by the imaging units of the unmanned aerial vehicles, the second distance, which is the distance from a reference position based on the positional relationship between the two unmanned aerial vehicles to a target (in FIG. 8, for example, a building).
  • FIG. 9 is a functional block diagram illustrating an example of a functional configuration of the system 1 according to the present embodiment.
  • members having the same functions as the members included in the drawings described in the above-described embodiments are denoted by the same reference numerals and description thereof is omitted.
  • the unmanned aircraft 210 of the system 1 includes a first imaging unit 101 and a transmission unit 211.
  • the unmanned aerial vehicle 220 includes the second imaging unit 102 and a transmission unit 221.
  • the first imaging unit 101 of the unmanned aerial vehicle 210 has the same function as the first imaging unit 101 in the second embodiment described above.
  • the first imaging unit 101 includes a position information receiving unit 1011 in the same manner as the first imaging unit 101 in the second embodiment.
  • the first imaging unit 101 may include the radio wave transmission unit 1012 as in the modification of the second embodiment.
  • the first photographing unit 101 supplies the photographed first image and the first photographing unit position information received by the position information receiving unit 1011 to the transmission unit 211.
  • the first image captured by the first imaging unit 101 may include imaging time information indicating the imaging time.
  • the transmission unit 211 communicates with the distance estimation apparatus 200.
  • the transmission unit 211 transmits the first image supplied from the first imaging unit 101 and the first imaging unit position information to the distance estimation apparatus 200.
  • the second imaging unit 102 of the unmanned aerial vehicle 220 has the same function as the second imaging unit 102 in the second embodiment described above.
  • the second imaging unit 102 includes a position information receiving unit 1021 as in the second imaging unit 102 in the second embodiment.
  • the second imaging unit 102 may include a radio wave receiving unit 1022 as in the modification of the second embodiment.
  • the second imaging unit 102 supplies the captured second image and the second imaging unit location information received by the location information receiving unit 1021 to the transmission unit 221.
  • the second image captured by the second imaging unit 102 may include imaging time information indicating the imaging time.
  • the transmission unit 221 communicates with the distance estimation apparatus 200.
  • the transmission unit 221 transmits the second image and the second imaging unit position information supplied from the second imaging unit 102 to the distance estimation apparatus 200.
  • the distance estimation apparatus 200 includes a first distance estimation unit 110, a second distance estimation unit 120, an association unit 130, a storage unit 140, and a reception unit 150.
  • the receiving unit 150 receives the first image and the first imaging unit position information transmitted from the unmanned aircraft 210.
  • the receiving unit 150 receives the second image and the second imaging unit position information transmitted from the unmanned aircraft 220.
  • the receiving unit 150 supplies the received first image and second image to the associating unit 130.
  • the reception unit 150 supplies the received first imaging unit position information and second imaging unit position information to the first distance estimation unit 110.
  • the receiving unit 150 may store the received first image, first imaging unit position information, second image, and second imaging unit position information in the storage unit 140. At this time, the receiving unit 150 may refer to the shooting time information included in the first image and in the second image, and store in the storage unit 140, in association with each other, the first image and the second image having the same shooting time together with the first imaging unit position information and the second imaging unit position information.
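  • The pairing described above can be sketched as a small store keyed by shooting time; the class and method names below are illustrative, not part of this disclosure:

```python
from collections import defaultdict

class FrameStore:
    """Pair first/second images (plus position info) by shooting time."""

    def __init__(self):
        self._pending = defaultdict(dict)  # shooting time -> partial pair

    def add(self, shooting_time, source, image, position):
        """source is 'first' or 'second'; returns a complete pair or None."""
        self._pending[shooting_time][source] = (image, position)
        if len(self._pending[shooting_time]) == 2:
            # Both images for this shooting time have arrived.
            return self._pending.pop(shooting_time)
        return None

store = FrameStore()
assert store.add(100, "first", "img1", (35.0, 139.0)) is None   # waiting for pair
pair = store.add(100, "second", "img2", (35.0, 139.1))          # pair complete
print(sorted(pair))  # -> ['first', 'second']
```

  A real implementation would also have to tolerate small timestamp differences and drop entries whose counterpart never arrives.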
  • the first distance estimation unit 110 estimates the first distance using the first imaging unit position information and the second imaging unit position information supplied from the receiving unit 150, in the same manner as the first distance estimation unit 110 in the second embodiment.
  • the associating unit 130 associates, using the first image and the second image supplied from the receiving unit 150, the first position in the first image with the second position in the second image, in the same manner as the associating unit 130 in the second embodiment.
  • the second distance estimation unit 120 estimates the second distance in the same manner as the second distance estimation unit 120 in the second embodiment.
  • FIG. 10 is a flowchart showing an example of the operation flow of the system 1 according to the present embodiment.
  • In FIG. 10, the process of the unmanned aerial vehicle 210 is shown on the left, the process of the distance estimation apparatus 200 in the center, and the process of the unmanned aerial vehicle 220 on the right. Broken-line arrows between the flowcharts indicate information transmission.
  • the first imaging unit 101 of the unmanned aerial vehicle 210 captures a first image (step S101). Further, the second imaging unit 102 of the unmanned aerial vehicle 220 captures the second image at the same timing as step S101 (step S102).
  • the position information receiving unit 1011 of the first imaging unit 101 of the unmanned aircraft 210 receives the first imaging unit position information from the positioning system at the same timing as in step S101 (step S103). Further, the position information receiving unit 1021 of the second imaging unit 102 of the unmanned aircraft 220 receives the second imaging unit position information from the positioning system at the same timing as step S102 (step S104).
  • the transmission unit 211 of the unmanned aircraft 210 sends the first image captured by the first imaging unit 101 in step S101 and the first imaging unit position information received by the position information receiving unit 1011 in step S103 to the distance estimation apparatus 200. Transmit (step S105).
  • the receiving unit 150 of the distance estimating apparatus 200 receives the first image and the first photographing unit position information transmitted from the unmanned aircraft 210 (step S106).
  • the transmission unit 221 of the unmanned aircraft 220 sends the second image captured by the second imaging unit 102 in step S102 and the second imaging unit position information received by the position information receiving unit 1021 in step S104 to the distance estimation device 200. Transmit (step S107).
  • the receiving unit 150 of the distance estimating apparatus 200 receives the second image and the second imaging unit position information transmitted from the unmanned aircraft 220 (step S108).
  • step S105 and step S107 may be executed at the same timing.
  • step S106 and step S108 may be performed at the same timing, and may be performed in reverse order.
  • the first distance estimation unit 110 estimates a first distance that is a distance between the first imaging unit 101 and the second imaging unit 102 (step S109).
  • the associating unit 130 extracts feature points from each of the first image and the second image (step S110). Then, the associating unit 130 generates sets of first feature points and second feature points based on the feature amounts of the first feature points and the feature amounts of the second feature points (step S111). Thereafter, the associating unit 130 calculates the coefficients expressed as the fundamental matrix (step S112). Then, the associating unit 130 performs stereo matching using the calculated coefficients, and associates the first position, which is a pixel position in the first image, with the second position, which is a pixel position in the second image (step S113).
  • step S109 may be executed at the same timing as any of steps S110 to S113, or may be executed after step S113.
  • the second distance estimation unit 120 estimates the second distance based on the correspondence between the first position and the second position associated in step S113 and the first distance (step S114).
  • the system 1 finishes the process.
  • the above-described processing of the system 1 is performed for each set of a first image and a second image captured at the same timing.
  • the first imaging unit 101 and the second imaging unit 102 capture images moment by moment. Therefore, the process shown in FIG. 10 is performed for each set of a first image and a second image captured moment by moment.
  • the system 1 in the present embodiment can obtain the same effects as the distance estimation apparatuses (10, 100) according to the first and second embodiments described above. Further, the system 1 according to the present embodiment estimates the first distance and the second distance for each captured image even when the distance between the unmanned aerial vehicle 210 and the unmanned aerial vehicle 220 varies. Therefore, for example, the accuracy of the distance between the target and the reference position based on the positional relationship between the first imaging unit 101 and the second imaging unit 102 at the time the first image and the second image were captured can be improved.
  • the distance estimation device 200 of the system 1 may be configured to control the operations of the unmanned aircraft (210, 220) based on the estimated first distance and / or second distance.
  • the number of unmanned aircraft is not limited to two and may be three or more.
  • For example, among the plurality of unmanned aerial vehicles, one vehicle and another vehicle provided with an imaging unit whose imaging range partially overlaps that of the first vehicle's imaging unit serve as the two unmanned aerial vehicles (210, 220).
  • the unmanned aerial vehicles (210, 220) may fly in formation.
  • the unmanned aerial vehicles (210, 220) of the system 1 in the present embodiment can be applied to scenes such as photographing a place where a disaster has occurred or an infrastructure site that is difficult for people to enter.
  • the distance estimation apparatus 200 can accurately estimate the second distance to the building or the infrastructure.
  • Such a second distance can be suitably used for assessing disaster situations and diagnosing infrastructure deterioration.
  • the second distance can also be suitably used for scenes such as town security and creation of a three-dimensional map.
  • In the present embodiment, the system 1 has been described as including unmanned aerial vehicles each provided with the first imaging unit 101 or the second imaging unit 102. However, the system 1 may instead include mobile terminals such as smartphones in place of the unmanned aerial vehicles.
  • the mobile terminal is provided with the first imaging unit 101 or the second imaging unit 102 described above.
  • the system 1 may be configured to include an unmanned aircraft including the first imaging unit 101 and a mobile terminal including the second imaging unit 102.
  • the first photographing unit 101 and / or the second photographing unit 102 may be provided in a building.
  • the system 1 may include a moving object such as an automobile including the first imaging unit 101 and / or the second imaging unit 102 instead of the unmanned aircraft.
  • the apparatus including the first imaging unit 101 and the apparatus including the second imaging unit 102 are apparatuses that are wirelessly connected to the distance estimation apparatus 200, such as an unmanned aircraft.
  • the apparatus including the first imaging unit 101 and / or the apparatus including the second imaging unit 102 may be an apparatus connected to the distance estimation apparatus 200 by wire.
  • a fourth embodiment of the present invention will be described with reference to the drawings.
  • In the present embodiment, a system 2 will be described in which the unmanned aerial vehicle 300 has the functions of the first imaging unit 101 and the distance estimation apparatus 10 described in the first embodiment, and the unmanned aerial vehicle 310 has the function of the second imaging unit 102.
  • the system 2 according to the present embodiment includes an unmanned aircraft 300 and an unmanned aircraft 310.
  • FIG. 11 is a diagram for explaining a usage scene of the system 2 according to the present embodiment.
  • Unmanned aerial vehicle (distance estimation device) 300 and unmanned aerial vehicle 310 are wirelessly connected.
  • unmanned aerial vehicles (300, 310) are flying objects, like unmanned aerial vehicles (210, 220).
  • One of the unmanned aircraft (300, 310) functions as a master and the other functions as a slave.
  • unmanned aircraft 300 functions as a master and unmanned aircraft 310 functions as a slave.
  • the unmanned aerial vehicle 300 and the unmanned aerial vehicle 310 are each provided with a photographing unit. Each imaging unit captures an image so that a part of the imaging range overlaps. Further, the two photographing units take images at synchronized photographing timings.
  • the image taken by the unmanned aerial vehicle 310 that functions as a slave is transmitted to the unmanned aircraft 300 that functions as a master.
  • the unmanned aerial vehicle 300 that functions as a master has the function of the distance estimation apparatus 10 as described above. Therefore, using the image captured by its own imaging unit and the image transmitted from the unmanned aerial vehicle 310, the unmanned aerial vehicle 300 estimates the second distance, which is the distance from a reference position based on the positional relationship between the two unmanned aerial vehicles to a target (in FIG. 11, for example, a building).
  • FIG. 12 is a functional block diagram illustrating an example of a functional configuration of the system 2 according to the present embodiment.
  • members having the same functions as the members included in the drawings described in the above-described embodiments are denoted by the same reference numerals and description thereof is omitted.
  • the unmanned aerial vehicle 310 has the same function as the unmanned aerial vehicle 220 described in the third embodiment, except that the transmission unit 221 transmits the second image and the second imaging unit position information to the unmanned aircraft 300. Therefore, the description is omitted.
  • the unmanned aerial vehicle 300 includes a first imaging unit 101, a first distance estimation unit 110, a second distance estimation unit 120, an association unit 130, a storage unit 140, and a reception unit 150, as shown in FIG.
  • the first photographing unit 101 has the same function as the first photographing unit 101 in each of the above-described embodiments.
  • the first imaging unit 101 includes a position information receiving unit 1011 in the same manner as the first imaging unit 101 in the second embodiment.
  • the first imaging unit 101 may include the radio wave transmission unit 1012 as in the modification of the second embodiment.
  • the first photographing unit 101 supplies the photographed first image to the associating unit 130.
  • the first imaging unit 101 supplies the first imaging unit position information received by the position information receiving unit 1011 to the first distance estimation unit 110.
  • the first distance estimation unit 110 may store the first image and the first photographing unit position information in the storage unit 140.
  • the receiving unit 150 receives the second image and the second imaging unit position information transmitted from the unmanned aircraft 310.
  • the receiving unit 150 supplies the received second image to the associating unit 130.
  • the receiving unit 150 supplies the received second imaging unit position information to the first distance estimating unit 110.
  • the receiving unit 150 may store the received second image and second photographing unit position information in the storage unit 140.
  • FIG. 13 is a flowchart showing an example of the operation flow of the system 2 according to the present embodiment.
  • In FIG. 13, the process of the unmanned aerial vehicle 300 is shown on the left and the process of the unmanned aerial vehicle 310 on the right. Broken-line arrows between the flowcharts indicate information transmission.
  • the first imaging unit 101 of the unmanned aerial vehicle 300 captures a first image (step S131).
  • the second imaging unit 102 of the unmanned aircraft 310 captures the second image at the same timing as Step S131 (Step S132).
  • the position information receiving unit 1011 of the first imaging unit 101 of the unmanned aircraft 300 receives the first imaging unit position information from the positioning system at the same timing as step S131 (step S133). Further, the position information receiving unit 1021 of the second imaging unit 102 of the unmanned aircraft 310 receives the second imaging unit position information from the positioning system at the same timing as Step S132 (Step S134).
  • the transmission unit 221 of the unmanned aircraft 310 transmits the second image captured by the second imaging unit 102 in step S132 and the second imaging unit position information received by the position information receiving unit 1021 in step S134 to the unmanned aircraft 300. (Step S135).
  • the receiving unit 150 of the unmanned aircraft 300 receives the second image and the second imaging unit position information transmitted from the unmanned aircraft 310 (step S136).
  • unmanned aerial vehicle 300 performs the same processing as steps S109 to S114 described above (steps S137 to S142).
  • In the system 2, no apparatus separate from the unmanned aerial vehicles needs to be provided as the distance estimation apparatus. Therefore, the system 2 can estimate the second distance without having to consider the distance between a distance estimation apparatus and the unmanned aerial vehicles.
  • In the embodiments described above, the first distance is estimated using the position information of the first imaging unit 101 and the second imaging unit 102, or using radio waves or sound waves emitted from the first imaging unit 101.
  • However, the method of estimating the first distance is not limited to these. In the present embodiment, another example of the first distance estimation method will be described.
  • FIG. 14 is a functional block diagram illustrating an example of a functional configuration of the distance estimation apparatus 400 according to the present embodiment.
  • the distance estimation apparatus 400 according to the present embodiment includes a first imaging unit 101, a second imaging unit 102, a first distance estimation unit 410, a second distance estimation unit 120, an associating unit 130, and a storage unit 440.
  • the distance estimation apparatus 400 according to the present embodiment includes a first distance estimation unit 410 in place of the first distance estimation unit 110 of the distance estimation apparatus 100 according to the second embodiment described above, and a storage unit 440 in place of the storage unit 140. Note that the first distance estimation unit 410 of the distance estimation apparatus 400 according to the present embodiment is also applicable to the systems (1, 2) in the third and fourth embodiments.
  • In the present embodiment, the first imaging unit 101 and the second imaging unit 102 do not include the position information receiving unit 1011 and the position information receiving unit 1021.
  • the first photographing unit 101 supplies the photographed first image to the associating unit 130.
  • the second imaging unit 102 supplies the captured second image to the associating unit 130.
  • the storage unit 440 stores the camera parameters of the first imaging unit 101 and the camera parameters of the second imaging unit 102 in the same manner as the storage unit 140 in each embodiment described above.
  • the storage unit 440 stores a size related to the photographing object.
  • the storage unit 440 may store the first image and the second image, similarly to the storage unit 140.
  • the storage unit 440 may be realized by a storage device separate from the distance estimation apparatus 400. The size relating to the photographing object stored in the storage unit 440 will be described later.
  • the associating unit 130 associates the first position in the first image captured by the first imaging unit 101 with the second position in the second image captured by the second imaging unit 102.
  • the associating unit 130 supplies information indicating the correspondence between the associated first position and second position to the first distance estimating unit 410 and the second distance estimating unit 120.
  • the information indicating the correspondence between the first position and the second position includes, for example, a plurality of sets of coordinates indicating the first position and coordinates indicating the second position associated with it, together with information on the feature points at the first position and information on the feature points at the second position.
  • the first distance estimation unit 410 receives information indicating the correspondence between the first position and the second position from the association unit 130.
  • Using the information on the feature points at the first position and at the second position included in the information indicating the correspondence between the first position and the second position, the first distance estimation unit 410 identifies the feature points of the photographing object whose size is stored in the storage unit 440.
  • the first distance estimation unit 410 then estimates the first distance between the first imaging unit 101 and the second imaging unit 102 based on the identified feature points in the first image and in the second image.
  • Suppose the first position in the first image and the second position in the second image associated by the associating unit 130 are described in a two-dimensional coordinate system (x, y) (in a homogeneous coordinate representation).
  • Let f be the point indicating the first position described in the two-dimensional coordinate system,
  • and let f′ be the point indicating the second position.
  • Let F be the point, described in the global coordinate system (X, Y, Z) (also in a homogeneous coordinate representation), indicating the position in real space corresponding to the associated first and second positions. Then:
  • f ≃ PF
  • f′ ≃ P′F
  • where ≃ denotes equality up to a scale factor, and P and P′ are the coefficients calculated by the associating unit 130 and are 3 × 4 matrices.
  • A is a 4 ⁇ 4 matrix expressed by the following equation (1).
  • the first distance estimation unit 410 identifies, among the plurality of feature points of the photographing object, two feature points F1 and F2 for which the distance between them (the size between the feature points) is known.
  • the distance between the feature points in the object to be photographed is the size between the feature points in the object to be photographed.
  • the size related to the photographing target stored in the storage unit 440 is a size between a certain feature point and another feature point in the photographing target. This feature point may be a point at a position where the size of the entire object to be photographed is known, the point having the largest feature amount, or another point.
  • the storage unit 440 stores the size between the feature points together with information (for example, feature amount) that can identify the two feature points. Therefore, the first distance estimation unit 410 identifies two feature points whose sizes are stored in the storage unit 440.
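Identifying the two stored feature points from their stored feature amounts can be pictured as a nearest-neighbour search in descriptor space. This is a hypothetical illustration, not the patent's method; the descriptor format, values, and function names are all assumptions:

```python
def identify_stored_pair(stored, detected):
    """For each of the two stored feature descriptors, pick the detected
    feature point whose feature amount (descriptor) is closest, i.e. a
    nearest-neighbour match by squared Euclidean distance."""
    def nearest(desc):
        return min(detected,
                   key=lambda p: sum((a - b) ** 2 for a, b in zip(desc, p[1])))
    return [nearest(d)[0] for d in stored]

# Hypothetical 2-D descriptors; each detected item is (pixel_pos, descriptor).
stored = [(0.9, 0.1), (0.2, 0.8)]
detected = [((10, 20), (0.85, 0.15)),
            ((40, 25), (0.25, 0.75)),
            ((70, 90), (0.5, 0.5))]
print(identify_stored_pair(stored, detected))  # -> [(10, 20), (40, 25)]
```

A real system would use richer descriptors (e.g. the feature amounts extracted by the association unit 130), but the matching principle is the same.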
• For each captured image, the first distance estimation unit 410 obtains the magnification at which the following equation (2) becomes equal to the size stored in the storage unit 440.
• The first distance estimation unit 410 can thus specify the magnification with respect to the photographing object for each captured image, and estimates the first distance between the first imaging unit 101 and the second imaging unit 102 based on the specified magnifications.
• Instead of, or in addition to, the distance between two feature points, the storage unit 440 may store a size such as the area spanned by a plurality of feature points.
• In that case, the first distance estimation unit 410 identifies the plurality of feature points and may specify the magnification from the area they span on the captured image and the size stored in the storage unit 440.
• The feature points used when calculating the magnification may be the same or different between the first image and the second image.
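A minimal sketch of the magnification idea, assuming a simple pinhole camera model. The patent's equations (1) and (2) are not reproduced here, and the pixel coordinates, the 800-pixel focal length, and the 0.6 m feature-point separation are all assumed values for illustration:

```python
import math

def magnification(p1, p2, real_size):
    """Magnification of the photographing object in one image: ratio of
    the pixel distance between the two identified feature points to their
    known real-world separation."""
    return math.dist(p1, p2) / real_size

def depth_from_magnification(m, focal_px):
    """Under a pinhole model, magnification m = focal / depth, so
    depth = focal / m (a simplifying assumption, not the patent's eq.)."""
    return focal_px / m

# A road sign whose two feature points are known to be 0.6 m apart (assumed).
m1 = magnification((100, 200), (160, 200), 0.6)   # in the first image
m2 = magnification((300, 210), (350, 210), 0.6)   # in the second image
z1 = depth_from_magnification(m1, 800.0)  # object depth from first camera
z2 = depth_from_magnification(m2, 800.0)  # object depth from second camera
print(round(z1, 2), round(z2, 2))  # -> 8.0 9.6
```

From such per-image depths (and the corresponding viewing directions), a geometric relation like the patent's equation (1) could then yield the first distance between the two imaging units.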
• FIG. 15 is a flowchart showing an example of the operation flow of the distance estimation apparatus 400 according to the present embodiment.
• The first photographing unit 101 and the second photographing unit 102 photograph the first image and the second image, respectively, at synchronized timing (step S151).
• The association unit 130 extracts feature points from the first image captured by the first imaging unit 101 and from the second image captured by the second imaging unit 102 in step S151 (step S152).
• The association unit 130 generates pairs of a first feature point and a second feature point based on the feature amount of the first feature point and the feature amount of the second feature point (step S153).
• The association unit 130 calculates the coefficients expressed by the fundamental matrix (step S154).
• Using the calculated coefficients, the association unit 130 performs stereo matching and associates the first position, which is a pixel position in the first image, with the second position, which is a pixel position in the second image (step S155).
• Among the extracted feature points, the first distance estimation unit 410 identifies the feature points of the photographing object whose size is stored in the storage unit 440 (step S156). The first distance estimation unit 410 then estimates the first distance, that is, the distance between the first imaging unit 101 and the second imaging unit 102 (step S157).
• The second distance estimation unit 120 estimates the second distance based on the correspondence between the first position and the second position and on the first distance (step S158).
• The distance estimation apparatus 400 then ends the process.
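The final step S158 can be illustrated with the standard rectified-stereo relation Z = f·B/d, where B is the first distance (baseline) estimated in step S157 and d is the disparity between an associated pair of positions. This is a hedged sketch under a rectified-camera, pinhole-model assumption, not the patent's exact formulation; all numbers are hypothetical:

```python
def second_distance(x1, x2, focal_px, baseline_m):
    """Depth of the real-space point corresponding to an associated pixel
    pair (cf. step S158), assuming rectified images: Z = f * B / d, where
    d = x1 - x2 is the horizontal disparity between the two positions."""
    disparity = x1 - x2
    if disparity <= 0:
        raise ValueError("pair must have positive disparity")
    return focal_px * baseline_m / disparity

# Baseline of 0.5 m assumed to come from step S157; x-coordinates of one
# associated pair in the first and second images (hypothetical).
z = second_distance(420.0, 380.0, 800.0, 0.5)
print(z)  # -> 10.0
```

Larger disparities map to nearer points, which is why an accurate first distance (baseline) is needed for an accurate second distance.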
• As described above, when the size between feature points of the photographing object is known, capturing images that include the object allows the first distance estimation unit 410 to estimate the first distance between the first imaging unit 101 and the second imaging unit 102.
• Even when the first imaging unit 101 is not provided with members such as the position information receiving unit 1011 described above, the same effects as those of the above-described embodiments can be obtained.
• For example, when the first photographing unit 101 and the second photographing unit 102 photograph a road, the size between feature points of a photographing object such as a road sign or a vehicle can be stored in the storage unit 440, and the first distance estimation unit 410 can estimate the first distance using this size.
• Each component of each device or each system described above represents a functional-unit block.
• Some or all of the components of each device or each system are realized by, for example, an arbitrary combination of an information processing device 500 as shown in FIG. 16 and a program.
• As an example, the information processing device 500 includes the following configuration:
• A CPU (Central Processing Unit) 501
• A ROM (Read Only Memory) 502
• A RAM (Random Access Memory) 503
• A program 504 loaded into the RAM 503
• A storage device 505 that stores the program 504
• A drive device 507 that reads from and writes to the recording medium 506
• A communication interface 508 connected to the communication network 509
• An input/output interface 510 that inputs and outputs data
• A bus 511 connecting the components
• Each component of each device or each system in each embodiment is realized by the CPU 501 acquiring and executing the program 504 that implements these functions.
• The program 504 implementing the function of each component is, for example, stored in advance in the storage device 505 or the RAM 503 and read by the CPU 501 as necessary.
• The program 504 may be supplied to the CPU 501 via the communication network 509, or it may be stored in advance in the recording medium 506 and read out and supplied to the CPU 501 by the drive device 507.
• Each device or each system may be realized by an arbitrary combination of an information processing device 500 and a program that differ for each component.
• A plurality of components included in each device may be realized by an arbitrary combination of a single information processing device 500 and a program.
• Some or all of the components of each device or each system may be realized by other general-purpose or dedicated circuits, processors, or the like, or a combination thereof. These may be configured as a single chip or as a plurality of chips connected via a bus.
• Each device or each system may be realized by a combination of the above-described circuits and a program.
• When some or all of the components of each device or each system are realized by a plurality of information processing devices or circuits, these information processing devices or circuits may be arranged centrally or distributed.
• The information processing devices, circuits, and the like may be realized in a form in which they are connected to one another via a communication network, as in a client-server system or a cloud computing system.

Abstract

The invention aims to provide a technique for accurately estimating the distance between a plurality of imaging devices and the position of a measurement subject, even when the distance between the imaging devices is variable. To this end, the invention concerns a distance estimation device comprising: first distance estimation means for estimating a first distance, which is the distance between a first imaging device and a second imaging device, the first imaging device capturing first images and the second imaging device capturing second images in a second imaging area that at least partially overlaps a first imaging area captured by the first imaging device, the second imaging device capturing the second images at a timing synchronized with the timing of capture by the first imaging device; and second distance estimation means which, based on the first distance and on the correspondence between a first position in the first image and a second position in the second image, estimates a second distance, which is the distance between a reference position based on the positional relationship between the first imaging device and the second imaging device and a real-space position corresponding to the first position and the second position.
PCT/JP2016/004075 2015-09-10 2016-09-07 Dispositif d'estimation de distance, système, procédé d'estimation de distance, et support d'enregistrement WO2017043078A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017538872A JPWO2017043078A1 (ja) 2015-09-10 2016-09-07 Distance estimation device, system, distance estimation method, and computer program (距離推定装置、システム、距離推定方法およびコンピュータプログラム)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015178634 2015-09-10
JP2015-178634 2015-09-10

Publications (1)

Publication Number Publication Date
WO2017043078A1 (fr) 2017-03-16

Family

ID=58239408

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/004075 WO2017043078A1 (fr) 2015-09-10 2016-09-07 Dispositif d'estimation de distance, système, procédé d'estimation de distance, et support d'enregistrement

Country Status (2)

Country Link
JP (1) JPWO2017043078A1 (fr)
WO (1) WO2017043078A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009094724A * 2007-10-05 2009-04-30 Fujifilm Corp Imaging device (撮像装置)
JP2014089160A * 2012-10-31 2014-05-15 Topcon Corp Aerial photograph measurement method and aerial photograph measurement system (航空写真測定方法及び航空写真測定システム)


Also Published As

Publication number Publication date
JPWO2017043078A1 (ja) 2018-06-28

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 16843945; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2017538872; Country of ref document: JP; Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 16843945; Country of ref document: EP; Kind code of ref document: A1