WO2017042907A1 - Navigation device and surveying system - Google Patents
Navigation device and surveying system
- Publication number
- WO2017042907A1 (PCT/JP2015/075598)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- image
- coordinate
- image data
- angle
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G01C11/06—Interpretation of pictures by comparison of two or more pictures of the same area
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/002—Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C1/00—Measuring angles
- G01C1/02—Theodolites
- G01C1/04—Theodolites combined with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/08—Control of attitude, i.e. control of roll, pitch, or yaw
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- the present invention relates to a navigation device for estimating the attitude of a moving body equipped with a surveying camera and a laser ranging device, and a surveying system equipped with the navigation device.
- Patent Document 1 describes a surveying system that performs photogrammetry and aviation laser surveying using a camera mounted on a flying object and a laser transmitter / receiver.
- the camera that captures the survey object from the flying object is supported by an attitude stabilization device called a stabilizer, and can maintain the shooting direction vertically downward regardless of the attitude of the flying object during flight.
- the laser transmission / reception apparatus irradiates the survey target with laser light from the flying object at a predetermined cycle, and receives the reflected light from the survey target.
- the control device in this surveying system performs aviation laser surveying using the information of the reflected light received from the survey target by the laser transmission / reception device.
- the laser transmission / reception device corresponds to the laser distance measuring device in the present invention.
- the three-dimensional coordinate data of the flying object is detected by a GNSS (Global Navigation Satellite System) device mounted on the flying object.
- the GNSS device receives GNSS information from the GNSS satellite at regular intervals and analyzes the GNSS information to obtain three-dimensional coordinate data of the flying object.
- the cycle in which the laser transmission/reception apparatus irradiates the survey target with laser light is shorter than the cycle in which the GNSS apparatus receives GNSS information. For this reason, even when reflected light from the survey target is received by the laser transmission/reception device, the control device cannot obtain the three-dimensional coordinate data of the flying object at the measurement times that fall between GNSS reception times.
- conventionally, the three-dimensional coordinate data of the aircraft at fixed intervals between GNSS reception times has been obtained using the information on triaxial acceleration and triaxial angular acceleration measured by an IMU (Inertial Measurement Unit) mounted on the aircraft.
- the surveying system described in Patent Document 1 includes an accelerometer and an angular accelerometer that are cheaper and smaller than the IMU instead of the IMU.
- the triaxial acceleration from the accelerometer and the triaxial angular acceleration from the angular accelerometer are used to obtain the three-dimensional coordinate data of the flying object at fixed intervals between GNSS reception times.
- as the information indicating the attitude of the flying object, the angles of the flying object in the rolling, pitching, and yawing directions (hereinafter referred to as the roll angle, the pitch angle, and the yaw angle) are used, obtained by bundle adjustment of corresponding points between images taken by the camera from two or more different positions.
- based on the attitude of the flying object obtained by the bundle adjustment, the control device uses the acceleration from the accelerometer and the angular acceleration from the angular accelerometer to calculate the attitude of the flying object at each laser scanning time (fixed intervals between GNSS reception times).
- in the prior art, the attitude of the flying object is estimated by bundle adjustment using image data captured from different positions by the camera, and only the image data is used for the estimation. For this reason, there is a limit to the accuracy of attitude estimation.
- the present invention solves the above problem, and aims to provide a navigation device and a surveying system that can estimate the attitude of a moving body with high accuracy without using an IMU or a stabilizer.
- the navigation device includes a data acquisition unit, a coordinate calculation unit, an image matching unit, and an attitude estimation unit.
- the data acquisition unit acquires: distance data indicating the distance, measured by the laser distance measuring device mounted on the moving body, from the laser light irradiation reference point to a distance measuring point; angle data indicating the laser light irradiation angle; coordinate data indicating the three-dimensional coordinates of the irradiation reference point, measured by the coordinate measuring device mounted on the moving body; and image data, captured by the photographing device mounted on the moving body, of a photographing target that includes the distance measuring point.
- the coordinate calculation unit calculates the coordinates of the distance measuring point on the image of the image data, based on the distance data, angle data, and coordinate data acquired by the data acquisition unit and on a parameter indicating the attitude of the moving body.
- the image matching unit performs image matching on a pair of image data captured at different photographing positions by the photographing device, and searches the image of the other image data of the pair for the point corresponding to the coordinates of the distance measuring point, calculated by the coordinate calculation unit, on the image of the one image data of the pair.
- the attitude estimation unit estimates the attitude of the moving body by correcting the value of the parameter indicating the attitude of the moving body so that the difference between the coordinates of the distance measuring point on the image of the other image data of the pair, calculated by the coordinate calculation unit, and the coordinates of the corresponding point found by the image matching unit becomes small.
- since the attitude of the moving body is estimated by correcting the parameter value, the attitude can be estimated without using an IMU or a stabilizer. Moreover, in addition to the image data in which the distance measuring point is photographed, the distance from the laser light irradiation reference point to the distance measuring point, the laser light irradiation angle, and the three-dimensional coordinates of the irradiation reference point are used, so the attitude of the moving body can be estimated with high accuracy.
- FIG. 2 is a block diagram showing a functional configuration of the navigation device according to Embodiment 1.
- FIG. 3 is a block diagram showing a hardware configuration of the navigation device according to Embodiment 1.
- FIG. 3A shows a hardware processing circuit for realizing the function of the navigation device
- FIG. 3B shows a hardware configuration for executing software for realizing the function of the navigation device.
- FIG. 4 is a flowchart showing an outline of the operation of the navigation device according to Embodiment 1. FIG. 5 is a diagram schematically showing the positional relationship among the left camera, the right camera, and the laser distance measuring device.
- FIG. 5A is a perspective view of a unit including a left camera, a right camera, and a laser distance measuring device
- FIG. 5B is a diagram of the unit as viewed in the X-axis direction
- FIG. 5C is a diagram of the unit viewed in the Z-axis direction
- FIG. 5D is a diagram of the unit viewed in the Y-axis direction. FIG. 6 is a diagram showing the position changes of the left camera, the right camera, and the laser distance measuring device accompanying the flight of the aircraft.
- FIG. 6A shows position coordinate data of the laser distance measuring device.
- FIG. 6B is a graph in which the position coordinates of the left camera, the right camera, and the laser distance measuring device are plotted on the XZ plane
- FIG. 6C is a graph in which these position coordinates are plotted on the YZ plane.
- FIG. 6D is a graph in which these position coordinates are plotted on the XY plane. FIG. 7 is a diagram showing the changes in the measurement results of the laser distance measuring device accompanying the flight of the aircraft.
- FIG. 7A shows angle data and distance data at each time
- FIG. 7B is a graph in which the data shown in FIG. 7A is plotted. FIG. 8 is a diagram showing the images taken every second by the left camera and the right camera.
- further figures show a flowchart of the operation of the navigation device according to Embodiment 1, the calculation results of the three-dimensional coordinates of the distance measuring point, the projection center coordinates of the left camera and the right camera, and the coordinates of the distance measuring point on the captured images.
- FIG. 1 is a block diagram showing a configuration of a surveying system 1 according to Embodiment 1 of the present invention.
- the surveying system 1 is a system for surveying terrain from the aircraft 2, and includes the left camera 20a, the right camera 20b, the laser distance measuring device 21, the GNSS device 22, and the memory card 23, which are mounted on the aircraft 2, together with the navigation device 3.
- the navigation device 3 is a device that estimates the attitude of the aircraft 2 in flight, and is provided separately from the aircraft 2 as shown in FIG. 1. However, the navigation device 3 may be mounted on the aircraft 2. The attitude of the aircraft 2 is specified by three parameters, namely the roll angle, the pitch angle, and the yaw angle, which are the attitude angles in the rolling, pitching, and yawing directions of the aircraft 2.
- the aircraft 2 embodies the moving body in the present invention, and can fly with the left camera 20a, the right camera 20b, the laser distance measuring device 21, the GNSS device 22, and the memory card 23 mounted thereon.
- for the aircraft 2, an aircraft operated by a pilot on board may be used, or a UAV (Unmanned Aerial Vehicle) may be used.
- the left camera 20a and the right camera 20b are components embodying the first photographing unit and the second photographing unit in the present invention, and photograph the ground surface including the distance measuring point of the laser distance measuring device 21.
- a device including the left camera 20a and the right camera 20b and a control device that controls these photographing processes corresponds to the photographing device in the present invention.
- the control device instructs the left camera 20a and the right camera 20b to shoot the ground surface at a predetermined cycle, and stores image data in which the image obtained by shooting is associated with the shooting date / time in the memory card 23.
- as the predetermined cycle, for example, photographing may be performed every second.
- the laser distance measuring device 21 irradiates the ground surface to be surveyed with laser light while changing the irradiation angle φ, and receives the reflected light from the distance measuring point on the ground surface, thereby measuring the distance l from the laser light irradiation reference point to the distance measuring point. Each time the laser distance measuring device 21 measures the distance l, it stores in the memory card 23 distance data indicating the distance l and angle data indicating the irradiation angle φ at which that distance l was obtained.
- the GNSS device 22 is a component that embodies the coordinate measuring device according to the present invention, and measures the three-dimensional coordinates of the laser beam irradiation reference point in the laser distance measuring device 21. Further, the GNSS device 22 stores coordinate data indicating the three-dimensional coordinates of the irradiation reference point in the memory card 23 at a predetermined cycle. For example, the coordinates are measured every second in synchronization with photographing by the left camera 20a and the right camera 20b. Note that the difference in position between the GNSS device 22 and the irradiation reference point is within an allowable range with respect to the measurement accuracy of the GNSS device 22. That is, the GNSS device 22 is assumed to be at the same position as the irradiation reference point, and the position of the irradiation reference point has the same meaning as the position of the aircraft 2.
- the memory card 23 is a component that embodies the storage device in the present invention, and stores distance data, angle data, image data, and coordinate data measured during the flight of the aircraft 2.
- as the memory card 23, for example, an SD (Secure Digital) memory card may be used.
- FIG. 2 is a block diagram showing a functional configuration of the navigation device 3.
- the navigation device 3 includes a data acquisition unit 30, a coordinate calculation unit 31, an image matching unit 32, and a posture estimation unit 33.
- the data acquisition unit 30 is a component that acquires the distance data, angle data, coordinate data, and image data stored in the memory card 23 of the aircraft 2. For example, the data acquisition unit 30 reads and acquires these data through a wired or wireless connection to the card drive holding the memory card 23.
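As an illustrative sketch only (the patent does not specify a storage format; the CSV layout, field names, and sample values below are assumptions), reading laser measurement records of the form (measurement time t, irradiation angle φ, distance l) could look like this:

```python
import csv
import io

def read_ranging_records(text):
    """Parse hypothetical (t, phi_deg, l) CSV records into a list of dicts."""
    rows = []
    for t, phi, l in csv.reader(io.StringIO(text)):
        rows.append({"t": float(t), "phi_deg": float(phi), "l": float(l)})
    return rows

# Hypothetical records: four measurements per second, scanning in 18-degree steps.
sample = "0.00,90.0,100.0\n0.25,72.0,105.3\n0.50,54.0,123.6\n"
records = read_ranging_records(sample)
```

Each record then carries everything the later coordinate calculation needs besides the GNSS position and the attitude angles.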
- the coordinate calculation unit 31 calculates the coordinates of the distance measuring point on the images of the image data, based on the distance data, angle data, and coordinate data acquired by the data acquisition unit 30 and on the attitude angles (roll angle, pitch angle, and yaw angle) of the aircraft 2. Specifically, the three-dimensional coordinates of the distance measuring point are calculated based on the distance l from the laser light irradiation reference point to the distance measuring point, the laser light irradiation angle φ, the three-dimensional coordinates of the irradiation reference point, and the roll angle, pitch angle, and yaw angle of the aircraft 2.
- from these three-dimensional coordinates, the coordinates of the distance measuring point on the images of the image data photographed by the left camera 20a and the right camera 20b are calculated.
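For illustration, projecting the computed three-dimensional point onto a camera image can be sketched with a simple pinhole model; the camera is assumed here to look along its own -Z axis (straight down when level, as in FIG. 5), and all names are illustrative rather than taken from the patent:

```python
import numpy as np

def project_to_image(point_w, camera_pos, R, focal):
    """Pinhole projection of a world point into a camera looking along its -Z axis.

    R is the assumed world-from-camera rotation (identity = camera pointing
    straight down). Returns image-plane coordinates (u, v) in units of `focal`.
    """
    q = R.T @ (np.asarray(point_w, float) - np.asarray(camera_pos, float))
    if q[2] >= 0.0:
        raise ValueError("point is behind the camera")
    return focal * q[0] / -q[2], focal * q[1] / -q[2]
```

A point directly below the camera projects to (0, 0); tilting R shifts the projected coordinates, which is exactly the image change that the attitude estimation later exploits.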
- the image matching unit 32 performs image matching on a pair of image data photographed at different photographing positions by at least one of the left camera 20a and the right camera 20b, and searches the image of the other image data of the pair (hereinafter referred to as the second image data) for the point corresponding to the coordinates of the distance measuring point on the image of the one image data of the pair (hereinafter referred to as the first image data).
- as the image matching method, a well-known template matching method that examines the similarity between two images can be used.
- for example, the first image data is used as the template image and the second image data as the target image; the two are compared, and the point corresponding to the coordinates of the distance measuring point on the template image is searched for in the target image.
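A minimal brute-force sketch of such template matching is shown below, using normalized cross-correlation as the similarity measure (one common choice; the patent does not prescribe a specific measure), with illustrative names throughout:

```python
import numpy as np

def match_template_ncc(target, template):
    """Return (row, col) in `target` with the highest normalized cross-correlation."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -2.0, (0, 0)
    for r in range(target.shape[0] - th + 1):
        for c in range(target.shape[1] - tw + 1):
            w = target[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * t_norm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos
```

In practice a library routine (e.g. an FFT-based correlation) would replace the double loop, but the brute-force form makes the similarity search explicit.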
- the pair of image data need only be image data taken at different photographing positions; for example, image data taken at a time i during the flight of the aircraft 2 and image data taken at a time later than time i may be used.
- the pair of image data may be a pair of image data captured at time i by the left camera 20a and the right camera 20b.
- the pair of image data may be a pair of image data captured at time i and time j advanced from time i by at least one of the left camera 20a and the right camera 20b.
- the change of the subject on the image according to the attitude of the aircraft 2 can be used for estimating the attitude of the aircraft 2.
- the left camera 20a and the right camera 20b do not need a stabilizer.
- the attitude estimation unit 33 estimates the attitude of the aircraft 2 by correcting the attitude angle values of the aircraft 2 so that the difference between the coordinates of the distance measuring point on the image of the other image data (the second image data) of the pair, calculated by the coordinate calculation unit 31, and the coordinates of the corresponding point found by the image matching unit 32 becomes small. Specifically, the attitude estimation unit 33 calculates a correction amount for the attitude angle values so that the difference between the coordinates of these two points decreases, and finally adopts the attitude angles that minimize this difference as the estimated attitude angles of the aircraft 2. This makes it possible to estimate the attitude angles of the aircraft 2 with high accuracy based on the distance data, angle data, coordinate data, and image data.
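The patent does not specify the numerical scheme used for this minimization. As one illustrative possibility only, a small Gauss-Newton refinement of the three attitude angles against a caller-supplied residual function (the stacked coordinate differences described above) could be sketched as:

```python
import numpy as np

def estimate_attitude(residual, angles0, step=1e-6, iters=50):
    """Refine (roll, pitch, yaw) by Gauss-Newton so that `residual` shrinks.

    `residual(angles)` must return the stacked differences between the
    calculated ranging-point coordinates on the second image and the matched
    corresponding-point coordinates. Names and scheme are illustrative.
    """
    angles = np.asarray(angles0, dtype=float)
    for _ in range(iters):
        r = residual(angles)
        # Numerical Jacobian of the residual with respect to the three angles.
        J = np.empty((r.size, 3))
        for k in range(3):
            d = np.zeros(3)
            d[k] = step
            J[:, k] = (residual(angles + d) - r) / step
        delta, *_ = np.linalg.lstsq(J, -r, rcond=None)
        angles = angles + delta
        if np.linalg.norm(delta) < 1e-10:
            break
    return angles
```

The correction amount here is the least-squares step `delta`; iterating drives the coordinate difference toward its minimum, matching the estimation criterion described above.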
- FIG. 3 is a block diagram showing a hardware configuration of the navigation device 3.
- FIG. 3A shows a hardware processing circuit 100 that realizes the function of the navigation apparatus 3
- FIG. 3B shows a hardware configuration that executes software that realizes the function of the navigation apparatus 3.
- FIG. 4 is a flowchart showing an outline of the operation of the navigation device 3.
- Each function of the data acquisition unit 30, the coordinate calculation unit 31, the image matching unit 32, and the attitude estimation unit 33 in the navigation device 3 is realized by a processing circuit. That is, the navigation device 3 includes a processing circuit for performing the steps shown in FIG. 4: step ST1 for acquiring the distance data, angle data, coordinate data, and image data; step ST2 for calculating the coordinates of the distance measuring point on the images based on the distance data, the angle data, the coordinate data, and the attitude angles of the aircraft 2; step ST3 for performing image matching on a pair of image data photographed at different photographing positions and searching the image of the other image data of the pair for the point corresponding to the coordinates of the distance measuring point on the image of the one image data of the pair; and step ST4 for estimating the attitude of the aircraft 2 by correcting the attitude angle values of the aircraft 2 so that the difference between the coordinates of the distance measuring point on the image of the other image data of the pair and the coordinates of the corresponding point found in step ST3 becomes small.
- the processing circuit may be dedicated hardware or a CPU (Central Processing Unit) that executes a program stored in the memory.
- when the processing circuit is dedicated hardware, the processing circuit 100 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
- the functions of the data acquisition unit 30, the coordinate calculation unit 31, the image matching unit 32, and the attitude estimation unit 33 may each be realized by a separate processing circuit, or the functions of all the units may be realized collectively by a single processing circuit.
- when the processing circuit is the CPU 101, the functions of the data acquisition unit 30, the coordinate calculation unit 31, the image matching unit 32, and the attitude estimation unit 33 are realized by software, firmware, or a combination of software and firmware. Software and firmware are described as programs and stored in the memory 102.
- the CPU 101 reads out and executes the programs stored in the memory 102, thereby realizing the function of each unit. That is, the navigation device 3 includes the memory 102 for storing programs that, when executed by the CPU 101, result in the processing of steps ST1 to ST4 shown in FIG. 4 being performed.
- these programs cause the computer to execute the procedures or methods of the data acquisition unit 30, the coordinate calculation unit 31, the image matching unit 32, and the posture estimation unit 33.
- the memory 102 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically Erasable Programmable ROM), or to a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), and the like.
- for example, the data acquisition unit 30 may realize its function with a dedicated hardware processing circuit 100, while the coordinate calculation unit 31, the image matching unit 32, and the attitude estimation unit 33 realize their functions through the CPU 101 executing programs stored in the memory 102.
- the processing circuit can realize the above-described functions by hardware, software, firmware, or a combination thereof.
- FIG. 5 is a diagram schematically showing the positional relationship among the left camera 20a, the right camera 20b, and the laser distance measuring device 21.
- FIG. 5A is a perspective view of a unit including the left camera 20a, the right camera 20b, and the laser distance measuring device 21, and
- FIG. 5B is a view of the unit when viewed in the X-axis direction.
- FIG. 5C is a diagram of the unit viewed in the Z-axis direction
- FIG. 5D is a diagram of the unit viewed in the Y-axis direction.
- the left camera 20a is attached to the end of an arm 20c extending to the left side of the laser distance measuring device 21, and the right camera 20b is attached to the end of an arm 20d extending to the right side of the laser distance measuring device 21.
- the lengths of the arms 20c and 20d on both sides are, for example, 1 m.
- the shooting directions of the left camera 20a and the right camera 20b are directed in the direction directly below the aircraft 2 (Z-axis direction).
- as shown in FIG. 5B, the laser distance measuring device 21 irradiates the distance measuring point P0 on the ground surface with laser light from the irradiation reference point 21a while changing the irradiation angle φ, receives the reflected light from the distance measuring point P0, and thereby measures the distance l from the irradiation reference point 21a to the distance measuring point P0.
- the irradiation angle ⁇ when the laser beam is irradiated directly below the irradiation reference point 21a is 90 degrees.
- the left camera 20a and the right camera 20b obtain image data as will be described later with reference to FIG. 8 by performing imaging within a rectangular imaging range.
- as described above, the position of the irradiation reference point 21a and the position of the aircraft 2 are regarded as the same. Accordingly, when the aircraft 2 flies horizontally in the X-axis direction, the unit including the left camera 20a, the right camera 20b, and the laser distance measuring device 21 also moves in the X-axis direction, as shown in the figure. However, even if horizontal flight is intended, in the actual flight environment the aircraft 2 cannot follow a straight path due to the influence of wind and the like. That is, the aircraft 2 flies while tilting in the rolling, pitching, and yawing directions.
- FIG. 6 is a diagram showing changes in the positions of the left camera 20a, the right camera 20b, and the laser distance measuring device 21 accompanying the flight of the aircraft.
- FIG. 6A shows the position coordinate data of the laser distance measuring device 21.
- FIG. 6B is a graph in which the position coordinates of the left camera 20a, the right camera 20b, and the laser distance measuring device 21 are plotted on the XZ plane.
- FIG. 6C is a graph in which these position coordinates are plotted on the YZ plane
- FIG. 6D is a graph in which these position coordinates are plotted on the XY plane.
- in this example, the aircraft 2 flies for 3 seconds.
- the position coordinates of the laser distance measuring device 21 are the position coordinates of the irradiation reference point 21a measured every second by the GNSS device 22.
- the position coordinates of the left camera 20a and the right camera 20b are calculated on the assumption that they are offset from the position of the irradiation reference point 21a in the Y-axis direction by the 1-m arms 20c and 20d.
- as shown in FIG. 6, the position of the aircraft 2 shifts in the Y-axis and Z-axis directions over the 3 seconds, meaning that the aircraft 2 flew while tilted.
- FIG. 7 is a diagram showing the changes in the measurement results of the laser distance measuring device 21 accompanying the flight of the aircraft 2, and shows the measurement results when the aircraft 2 flies in the state shown in FIG. 6.
- FIG. 7A shows angle data and distance data at each time
- FIG. 7B is a graph in which the data shown in FIG. 7A is plotted.
- FIG. 8 is a diagram showing the images taken every second by the left camera 20a and the right camera 20b, and shows the images taken when the aircraft 2 flies in the state shown in FIG. 6.
- the measurement result of the laser distance measuring device 21 is stored in the memory card 23 with the measurement time t, the irradiation angle θ, and the distance l as one record.
- the laser distance measuring device 21 performs measurement four times per second.
- the irradiation angle θ is 90 degrees when the beam points immediately below the irradiation reference point 21a shown in FIG.
- the laser distance measuring device 21 scans the laser beam by rotating the irradiation reference point 21a about the X axis in 18-degree steps, clockwise as viewed from the positive direction of the X axis.
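The per-second scanning pattern described above can be sketched as follows. This is a minimal illustration under stated assumptions: the text does not specify the within-second timestamps or the direction of the angular steps, so the 0.05 s spacing (matching the t = 0.00 to 0.15 labels of FIG. 7) and the decreasing 18-degree steps from 90 degrees are assumptions, and the flat-ground distance model in the usage line is hypothetical.

```python
import math

def scan_records(t0, distance_fn, n_meas=4, theta0_deg=90.0, step_deg=18.0,
                 dt=0.05):
    """Build one second of (time t, irradiation angle theta, distance l)
    records, one record per measurement, in the layout of FIG. 7(a)."""
    records = []
    for k in range(n_meas):
        t = t0 + k * dt                    # assumed: t = 0.00, 0.05, 0.10, 0.15
        theta = theta0_deg - k * step_deg  # assumed: 90, 72, 54, 36 degrees
        records.append((t, theta, distance_fn(theta)))
    return records

# hypothetical flat ground 100 m below: the measured distance grows as the
# beam tilts away from nadir (theta = 90 degrees)
recs = scan_records(0.0, lambda th: 100.0 / math.sin(math.radians(th)))
```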
- when the aircraft 2 flies while tilted, the laser light is also emitted from the laser distance measuring device 21 at an inclination. That is, the distance data and angle data shown in FIGS. 7A and 7B also change according to the attitude angle of the aircraft 2. Further, when the aircraft 2 flies while tilted, the imaging directions of the left camera 20a and the right camera 20b are also tilted. As a result, the left camera image captured by the left camera 20a and the right camera image captured by the right camera 20b, as shown in FIG. 8, also change according to the attitude angle of the aircraft 2.
- when it is assumed that the aircraft 2 flies horizontally, an error corresponding to the attitude angle of the aircraft 2 occurs between the on-image coordinates of the distance measuring point calculated using the distance data, angle data, coordinate data, and image data and the coordinates of the same distance measuring point when the aircraft 2 actually flies at an angle. Therefore, in the present invention, the attitude angle is corrected so that this error decreases, and the attitude angle at which the error is minimized is set as the estimated value of the attitude angle of the aircraft 2.
- FIG. 9 is a diagram showing the images 100a and 100b, which include the distance measuring point P0 of the laser distance measuring device 21, captured by the left camera 20a and the right camera 20b when the aircraft 2 is flying horizontally.
- the aircraft 2 flies horizontally along the positive direction of the X axis.
- every second, the lower part of the aircraft 2 is photographed by the left camera 20a and the right camera 20b, and the laser distance measuring device 21 measures the distance to the distance measuring point P0 directly below the aircraft with the irradiation angle θ set to 90 degrees.
- FIG. 10 is a diagram showing the images 100a and 100c, which include the distance measuring point P0 of the laser distance measuring device 21, captured by the left camera 20a and the right camera 20b when the aircraft 2 is flying tilted in the pitch direction.
- FIG. 11 is a diagram showing the difference between the coordinates of the distance measuring point P0 on the image calculated on the assumption that the aircraft 2 is flying horizontally and the corresponding coordinates on the image when the aircraft 2 is actually flying tilted in the pitch direction.
- Such a difference ΔuL between the coordinates P0b and the coordinates P0b′ arises because the coordinates P0b of the distance measuring point P0 on the image 100c are calculated on the assumption that the aircraft 2 flew without tilting, even though it was tilted in the actual flight. Therefore, the attitude angle of the aircraft 2 that minimizes the difference ΔuL is an estimation result that appropriately represents the actual attitude of the aircraft 2. For example, in the case of FIG. 11, the difference ΔuL is minimized when the aircraft 2 is tilted by the pitch angle φ, so the pitch angle φ is obtained as the attitude estimation result. Note that the actual aircraft 2 tilts in the rolling and yawing directions in addition to the pitching direction; in those cases, the roll angle ω and the yaw angle κ may be estimated in the same way.
- FIG. 12 is a flowchart showing the operation of the navigation device 3 according to the first embodiment, and shows a series of processes for estimating the attitude angle of the aircraft 2 in flight.
- the attitude angle of the aircraft 2 is represented by three parameters, a roll angle ω, a pitch angle φ, and a yaw angle κ, and these angles are estimated every second.
- for convenience, the attitude angles (ω, φ, κ) at time t=0 and time t=3 are assumed to be (0, 0, 0), so the unknown attitude angles (ω, φ, κ) at time t=1 and time t=2 are to be estimated. That is, a total of six attitude angles, which are unknowns, are estimated.
- the data acquisition unit 30 reads and acquires distance data, angle data, coordinate data, and image data from the memory card 23 mounted on the aircraft 2 (step ST1a).
- Distance data is data indicating the distance l from the irradiation reference point 21a of the measurement laser light, measured by the laser distance measuring device 21, to the distance measuring point P0.
- Angle data is data indicating the irradiation angle θ of the laser beam.
- the coordinate data is the three-dimensional coordinates (X0, Y0, Z0) of the irradiation reference point 21a of the laser beam measured by the GNSS device 22.
- the image data is image data in which the subject photographed by the left camera 20a and the right camera 20b includes the distance measuring point P0. In this way, by using the data stored in the memory card 23 during the flight of the aircraft 2, the attitude of the aircraft 2 can be estimated after completion of the flight, and the survey result can also be corrected using the estimated attitude angle.
- the coordinate calculation unit 31 calculates the three-dimensional coordinates (X, Y, Z) of each distance measuring point P0 using the following formula (1) (step ST2a).
- a11 to a33 are elements of a 3×3 rotation matrix representing the tilts of the laser distance measuring device 21, the left camera 20a, and the right camera 20b according to the attitude of the aircraft 2.
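A rotation matrix of this kind can be sketched as below. Since formula (1) appears only as a figure in the original, the factorization is an assumption: the sketch uses the common photogrammetric order R = Rx(ω) · Ry(φ) · Rz(κ), and the patent's actual element layout a11..a33 may differ.

```python
import math

def rotation_matrix(omega, phi, kappa):
    """3x3 attitude rotation matrix (elements a11..a33) from roll omega,
    pitch phi, and yaw kappa; assumed order R = Rx(omega) @ Ry(phi) @ Rz(kappa)."""
    so, co = math.sin(omega), math.cos(omega)
    sp, cp = math.sin(phi), math.cos(phi)
    sk, ck = math.sin(kappa), math.cos(kappa)
    rx = [[1.0, 0.0, 0.0], [0.0, co, -so], [0.0, so, co]]
    ry = [[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]]
    rz = [[ck, -sk, 0.0], [sk, ck, 0.0], [0.0, 0.0, 1.0]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    return matmul(rx, matmul(ry, rz))

# zero attitude (horizontal flight) gives the identity: a11 = a22 = a33 = 1
R0 = rotation_matrix(0.0, 0.0, 0.0)
```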
- (X0, Y0, Z0) is the three-dimensional coordinate of the irradiation reference point 21a of the laser beam indicated by the coordinate data.
- θ is the irradiation angle of the laser light indicated by the angle data.
- l is the distance, indicated by the distance data, from the irradiation reference point 21a of the laser beam to the distance measuring point P0.
- the irradiation angle θ is 90 degrees when the laser beam points vertically downward from the aircraft 2.
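Putting the inputs of formula (1) together (reference point plus attitude-rotated, distance-scaled beam direction) might look like the sketch below. Two assumptions are made, since the formula itself is only shown as a figure: the body-frame beam direction for irradiation angle θ is (0, cos θ, −sin θ), so that θ = 90 degrees points straight down as the text states, and the rotation order is Rx(ω) · Ry(φ) · Rz(κ).

```python
import math

def ranging_point(p0, theta_deg, dist, omega=0.0, phi=0.0, kappa=0.0):
    """3D coordinates of the ranging point P0: irradiation reference point
    (X0, Y0, Z0) plus the attitude-rotated, distance-scaled beam direction."""
    t = math.radians(theta_deg)
    d = (0.0, math.cos(t), -math.sin(t))  # assumed body-frame beam direction
    so, co = math.sin(omega), math.cos(omega)
    sp, cp = math.sin(phi), math.cos(phi)
    sk, ck = math.sin(kappa), math.cos(kappa)
    # R = Rx(omega) @ Ry(phi) @ Rz(kappa), written out element by element
    r = [[cp * ck, -cp * sk, sp],
         [co * sk + so * sp * ck, co * ck - so * sp * sk, -so * cp],
         [so * sk - co * sp * ck, so * ck + co * sp * sk, co * cp]]
    return tuple(p0[i] + dist * sum(r[i][j] * d[j] for j in range(3))
                 for i in range(3))

# zero attitude, theta = 90 degrees: the ranging point lies `dist` metres
# directly below the irradiation reference point
p = ranging_point((10.0, 20.0, 100.0), 90.0, 50.0)
```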
- the coordinate calculation unit 31 calculates, every second, the projection center coordinates (XL, YL, ZL) of the left camera 20a and the projection center coordinates (XR, YR, ZR) of the right camera 20b according to the following formula (2) and formula (3), based on the coordinate data and the setting value of the attitude angle (step ST3a).
- based on the coordinate data, the setting value of the attitude angle, the three-dimensional coordinates of the distance measuring point P0, and the projection center coordinates of the left camera 20a and the right camera 20b, the coordinate calculation unit 31 calculates, according to the following formulas (4) and (5), the coordinates (xL, yL) of the distance measuring point P0 on the left camera image and the coordinates (xR, yR) of the distance measuring point P0 on the right camera image (step ST4a).
- c is the focal length of the left camera 20a and the right camera 20b.
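The projection in step ST4a can be illustrated with the standard photogrammetric collinearity equations. The exact sign and axis conventions of the patent's formulas are given only in its figures, so this stand-in form and the example values are assumptions.

```python
def project(point, center, r, c):
    """Map a 3D point into image coordinates (x, y) for a camera with
    projection center `center`, rotation matrix `r` (elements a11..a33),
    and focal length c, using standard collinearity equations."""
    dx = [point[i] - center[i] for i in range(3)]
    # multiplying by r transposed maps the world-frame offset into the camera frame
    u = sum(r[j][0] * dx[j] for j in range(3))
    v = sum(r[j][1] * dx[j] for j in range(3))
    w = sum(r[j][2] * dx[j] for j in range(3))
    return (-c * u / w, -c * v / w)

# level camera (r = identity) 100 m above the ranging point, which is offset
# 10 m in X: the image x-coordinate comes out as c * 10 / 100
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
x, y = project((10.0, 0.0, 0.0), (0.0, 0.0, 100.0), identity, 0.05)
```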
- the image matching unit 32 extracts, from the image data acquired by the data acquisition unit 30, the left camera image captured at time i and the right camera image captured at time j (= i + 1) as a pair. This processing of extracting a pair of image data to be subjected to image matching is called pairing, and it yields a pair of images shot at different shooting positions. Subsequently, the image matching unit 32 performs template matching between the left camera image at time i and the right camera image at time j, searching the right camera image at time j for the point corresponding to the coordinates (xL, yL) of the distance measuring point P0 on the left camera image at time i calculated by the coordinate calculation unit 31 (step ST5a).
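A minimal sketch of the SCANx/SCANy search described above: sum-of-squared-differences template matching of a small window around the ranging-point coordinates. The window size and exhaustive search strategy are illustrative assumptions; a production implementation would typically use normalized correlation and subpixel refinement.

```python
import numpy as np

def scan_match(left, right, xy, half=4):
    """Find, in `right`, the best SSD match for the small region of `left`
    centred on the ranging-point coordinates xy = (x, y).
    Returns the matched coordinates (SCANx, SCANy)."""
    x, y = xy
    tpl = left[y - half:y + half + 1, x - half:x + half + 1]
    best, best_xy = None, None
    h, w = right.shape
    for yy in range(half, h - half):
        for xx in range(half, w - half):
            win = right[yy - half:yy + half + 1, xx - half:xx + half + 1]
            ssd = float(((win - tpl) ** 2).sum())
            if best is None or ssd < best:
                best, best_xy = ssd, (xx, yy)
    return best_xy

# synthetic check: the "right" image is the "left" image shifted by
# (+3, +2) pixels, so the search recovers the shifted coordinates
rng = np.random.default_rng(0)
left = rng.normal(size=(40, 40))
right = np.roll(np.roll(left, 2, axis=0), 3, axis=1)
```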
- FIG. 16 shows the correspondence between the coordinates of the distance measuring point P 0 on the left camera image at time i and the coordinates of the points on the right camera image at time j obtained by the template matching corresponding thereto.
- SCANx(xLi, yLi) is the x-coordinate value of the corresponding point obtained by performing template matching on the right camera image at time j for a small region centered on the coordinates (xLi, yLi) of the distance measuring point P0 on the left camera image at time i.
- SCANy(xLi, yLi) is the y-coordinate value of the corresponding point obtained by the same template matching.
- the coordinates (xRj, yRj) of the distance measuring point P0 on the right camera image at time j calculated by the coordinate calculation unit 31 differ from the coordinates (SCANx(xLi, yLi), SCANy(xLi, yLi)) of the corresponding point searched by the image matching unit 32 because the attitude angle (ω, φ, κ) of the aircraft 2 has a value other than zero. That is, if an appropriate attitude angle (ω, φ, κ) representing the attitude of the aircraft 2 in flight is set and the coordinates shown in FIG. 16 are recalculated, the coordinates (xRj, yRj) and the coordinates (SCANx(xLi, yLi), SCANy(xLi, yLi)) coincide. Accordingly, the attitude angle that minimizes the difference between these coordinates is the estimated value of the attitude angle of the aircraft 2.
- the posture estimation unit 33 calculates a correction amount for the attitude angle such that the difference between the coordinates (xRj, yRj) of the distance measuring point P0 on the right camera image at time j and the coordinates (SCANx(xLi, yLi), SCANy(xLi, yLi)) of the corresponding point searched by the image matching unit 32 becomes small (step ST6a). For example, the observation equations vx and vy of the following formula (6) are used.
- tilde ω (~ω), tilde φ (~φ), and tilde κ (~κ) are approximate solutions of the unknown roll angle ω, pitch angle φ, and yaw angle κ.
- δω, δφ, and δκ are correction amounts for the approximate solutions ~ω, ~φ, and ~κ.
- ∂Fx/∂ω is the partial derivative of Fx with respect to the roll angle ω, ∂Fx/∂φ is the partial derivative of Fx with respect to the pitch angle φ, and ∂Fx/∂κ is the partial derivative of Fx with respect to the yaw angle κ.
- These partial derivatives are coefficients whose values are obtained by substituting the approximate solutions ~ω, ~φ, and ~κ.
- Similarly, ∂Fy/∂ω is the partial derivative of Fy with respect to the roll angle ω, ∂Fy/∂φ is the partial derivative of Fy with respect to the pitch angle φ, and ∂Fy/∂κ is the partial derivative of Fy with respect to the yaw angle κ.
- These partial derivatives are also coefficients whose values are obtained by substituting the approximate solutions ~ω, ~φ, and ~κ.
- Tilde Fx (~Fx) is the value obtained by substituting SCANx(xL, yL) and the approximate solution of xR into Fx, and tilde Fy (~Fy) is the value obtained by substituting SCANy(xL, yL) and the approximate solution of yR into Fy.
- FIG. 18 shows a 24 ⁇ 6 design matrix composed of partial differential coefficients calculated for each observation equation in this way.
- the posture estimation unit 33 calculates the product of the transpose of the design matrix and the design matrix itself.
- the result of this calculation using the design matrix shown in FIG. 18 is shown in FIG. 19.
- the posture estimation unit 33 then calculates the product of the transpose of the design matrix and the constant vector shown in FIG. 17. The result of this calculation is shown in FIG. 20.
- after that, the posture estimation unit 33 calculates the product of the inverse matrix calculated from the matrix shown in FIG. 19 and the vector shown in FIG. 20.
- this calculation result is the attitude angle correction amount (δω, δφ, δκ) shown in FIG. 21. Initially, the aircraft 2 is assumed to be flying horizontally without tilting, and (0, 0, 0) is set as the initial value of the attitude angle (ω, φ, κ); the correction amount therefore directly serves as the approximate solution of the attitude angle.
- the posture estimation unit 33 adds the correction amount calculated as described above to the previous approximate solution, and sets the corrected approximate solution as the setting value of the attitude angle (step ST7a). At this point, if the predetermined number of iterations has not been reached (step ST8a; NO), the posture estimation unit 33 instructs the coordinate calculation unit 31 to perform the same coordinate calculation as described above. The coordinate calculation unit 31 then performs the processing from step ST2a to step ST4a using the corrected approximate solution as the setting value of the attitude angle, and the image matching unit 32 performs the processing of step ST5a.
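Steps ST6a through ST8a amount to a Gauss-Newton iteration: build the design matrix of partial derivatives (FIG. 18), solve the normal equations for the correction (FIGS. 19 to 21), add it to the approximate solution, and repeat a fixed number of times. The sketch below shows that loop with a generic residual standing in for the patent's observation equations (6), which are given only as figures; the linear example at the bottom is hypothetical.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, iters=5):
    """Iteratively refine an approximate solution x by solving the
    normal equations built from the design matrix."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):                          # fixed repetition count (cf. ST8a)
        a = jacobian(x)                             # design matrix (cf. FIG. 18)
        f = residual(x)
        delta = -np.linalg.solve(a.T @ a, a.T @ f)  # normal equations (cf. FIGS. 19-21)
        x = x + delta                               # correct the approximate solution (ST7a)
    return x

# hypothetical linear stand-in: residual r(x) = M x - b, for which a single
# Gauss-Newton step already reaches the least-squares solution
M = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 0.0],
              [1.0, 1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0, 4.0])
est = gauss_newton(lambda x: M @ x - b, lambda x: M, [0.0, 0.0, 0.0])
```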
- stereo image processing, which obtains three-dimensional information such as the distance and depth to an observation target by searching for corresponding points between images captured from different positions, includes a method called fixed stereo and a method called motion stereo. In fixed stereo, two cameras are placed at an interval; the pairing of the left camera image at time i with the right camera image at the same time i corresponds to fixed stereo. In motion stereo, a camera is moved and images are shot from different positions; the pairing of the left camera image at time i with the right camera image at time j (= i + 1) corresponds to motion stereo.
- the configuration using the left camera 20a and the right camera 20b is shown, but a single camera may be used.
- so far, the unknowns have been the three attitude angle parameters (ω, φ, κ) at each time, but the six parameters obtained by adding the position coordinates (X, Y, Z) of the aircraft 2 may be used instead, and internal parameters such as the focal length c of the camera may also be included.
- the attitude of the aircraft 2 can be estimated without using an IMU or a stabilizer.
- in addition to the image data in which the distance measuring point P0 is captured, the distance l from the irradiation reference point 21a of the laser beam to the distance measuring point P0, the irradiation angle θ of the laser beam, and the three-dimensional coordinates (X, Y, Z) of the irradiation reference point 21a are used to estimate the attitude of the aircraft 2, so the attitude of the aircraft 2 can be estimated with high accuracy.
- the pair of image data consists of the image data captured at time i and the image data captured at time j, later than time i, during the flight of the aircraft 2.
- the change of the subject on the image according to the attitude of the aircraft 2 can be used for the attitude estimation of the aircraft 2.
- the image data pair may be a pair of images captured at time i by the left camera 20a and the right camera 20b, or a pair of images captured at time i and at a time j later than time i by at least one of the left camera 20a and the right camera 20b. Even if such image data is used, the change of the subject on the image according to the attitude of the aircraft 2 can be used for the estimation of the attitude of the aircraft 2.
- the surveying system 1 includes a memory card 23 mounted on the aircraft 2.
- the data acquisition unit 30 reads and acquires distance data, angle data, coordinate data, and image data stored in the memory card 23. In this way, by using the data stored in the memory card 23 during the flight of the aircraft 2, the attitude of the aircraft 2 can be estimated after the flight is completed, and the survey result can be corrected using the estimated attitude angle.
- FIG. 23 is a block diagram showing the configuration of a surveying system 1A according to Embodiment 2 of the present invention.
- the surveying system 1A is a system for surveying the terrain from the aircraft 2A.
- the wireless communication device 24 transmits distance data, angle data, coordinate data, and image data obtained during the flight of the aircraft 2A to the navigation device 3.
- the navigation apparatus 3 is provided separately from the aircraft 2A. However, the navigation device 3 may be mounted on the aircraft 2A.
- the data acquisition unit 30 of the navigation device 3 receives and acquires distance data, angle data, coordinate data, and image data transmitted by the wireless communication device 24.
- the navigation device 3 estimates the attitude of the aircraft 2A based on the data acquired in this way by the same processing as in the first embodiment.
- the surveying system 1A includes the wireless communication device 24 mounted on the aircraft 2A.
- the data acquisition unit 30 receives and acquires distance data, angle data, coordinate data, and image data transmitted by the wireless communication device 24. In this way, by using the data wirelessly transmitted from the wireless communication device 24, it is possible to estimate the attitude during the flight of the aircraft 2A. It is also possible to correct the survey result during the flight of the aircraft 2A using the estimated attitude angle.
- the mobile body in the present invention is a flying body such as the aircraft 2, but the present invention is not limited to this.
- the navigation apparatus according to the present invention can be applied to a mobile mapping system, and a vehicle equipped with this system is a moving body.
- the navigation apparatus according to the present invention can be used as an apparatus for estimating a posture of a railway vehicle, a ship, and a robot as moving bodies. Even in such a moving body, the posture angle ( ⁇ , ⁇ , ⁇ ) of the moving body can be similarly used as a parameter indicating the posture. In some cases, position information may be included in the parameter.
- the navigation device according to the present invention is suitable as, for example, a UAV navigation device because it can accurately estimate the posture of a moving body without a configuration equipped with an IMU and a stabilizer.
- 1, 1A surveying system, 2, 2A aircraft, 3 navigation device, 20a left camera, 20b right camera, 20c, 20d arm, 21 laser distance measuring device, 21a irradiation reference point, 22 GNSS device, 23 memory card, 24 wireless communication device, 30 data acquisition unit, 31 coordinate calculation unit, 32 image matching unit, 33 posture estimation unit, 100 processing circuit, 100a to 100c image, 101 CPU, 102 memory.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Aviation & Aerospace Engineering (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Navigation (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
Description
In this surveying system, the camera that photographs the survey target from the flying body is supported by an attitude stabilization device called a stabilizer, and can keep its shooting direction vertically downward regardless of the attitude of the flying body in flight.
The laser transmitter-receiver irradiates the survey target with laser light from the flying body at a predetermined cycle and receives the light reflected from the survey target. The control device in this surveying system performs aerial laser surveying using the information of the reflected light received from the survey target by the laser transmitter-receiver. The laser transmitter-receiver corresponds to the laser distance measuring device in the present invention.
In contrast, conventional general aerial laser surveying obtains three-dimensional coordinate data of the flying body at fixed intervals other than the cycle at which GNSS information is received, using information on the three-axis acceleration and three-axis angular acceleration measured by an IMU (Inertial Measurement Unit) mounted on the flying body.
The surveying system described in Patent Literature 1 therefore includes, instead of an IMU, an accelerometer and an angular accelerometer that are cheaper and smaller than an IMU.
That is, this surveying system obtains three-dimensional coordinate data of the flying body at fixed intervals other than the GNSS reception cycle, using information on the three-axis acceleration from the accelerometer and the three-axis angular acceleration from the angular accelerometer.
For this reason, a stabilizer that always keeps the shooting direction of the camera vertically downward must be provided, which complicates the system configuration.
In addition to the image data in which the distance measuring point is captured, the distance from the laser irradiation reference point to the distance measuring point, the irradiation angle of the laser light, and the three-dimensional coordinates of the laser irradiation reference point are used to estimate the attitude of the moving body, so the attitude of the moving body can be estimated with high accuracy.
Embodiment 1.
FIG. 1 is a block diagram showing the configuration of a surveying system 1 according to Embodiment 1 of the present invention. The surveying system 1 surveys terrain from an aircraft 2, and includes a left camera 20a, a right camera 20b, a laser distance measuring device 21, a GNSS device 22, and a memory card 23 mounted on the aircraft 2, as well as a navigation device 3. The navigation device 3 estimates the attitude of the aircraft 2 in flight, and is provided separately from the aircraft 2 as shown in FIG. 1. However, the navigation device 3 may be mounted on the aircraft 2. The attitude of the aircraft 2 is specified by three parameters, namely the roll angle ω, the pitch angle φ, and the yaw angle κ, which are the attitude angles in the rolling, pitching, and yawing directions of the aircraft 2.
Each time the laser distance measuring device 21 measures a distance l, it stores in the memory card 23 distance data indicating the distance l and angle data indicating the irradiation angle θ of the laser light with which the distance l was obtained.
The GNSS device 22 stores coordinate data indicating the three-dimensional coordinates of the irradiation reference point in the memory card 23 at a predetermined cycle. For example, it measures the coordinates every second, in synchronization with the shooting by the left camera 20a and the right camera 20b.
The difference between the position of the GNSS device 22 and that of the irradiation reference point is within the tolerance of the measurement accuracy of the GNSS device 22. That is, the GNSS device 22 is regarded as being at the same position as the irradiation reference point, and the position of the irradiation reference point is treated as equivalent to the position of the aircraft 2.
For example, an SD (Secure Digital) memory card may be used as the memory card 23.
For example, the data acquisition unit 30 connects to the card drive of the memory card 23 by wire or wirelessly, and reads out and acquires the above data.
Since the roll angle ω, pitch angle φ, and yaw angle κ of the aircraft 2 are unknown and no correction amount for the attitude angle has been calculated at first, the coordinates are initially calculated with roll angle ω = 0, pitch angle φ = 0, and yaw angle κ = 0 as initial values. The details of this coordinate calculation are described later.
As the image matching method, a well-known template matching method that examines the similarity between two images can be used. For example, the first image data is used as a template image and the second image data as target image data, the two are compared, and a point corresponding to the coordinates of the distance measuring point on the template image is searched for in the image of the target image data.
Furthermore, the pair of image data may be a pair of images respectively captured at time i by the left camera 20a and the right camera 20b.
Furthermore, the pair of image data may be a pair of images respectively captured, by at least one of the left camera 20a and the right camera 20b, at time i and at a time j later than time i.
By using such pairs of image data, the change of the subject on the image according to the attitude of the aircraft 2 can be used for estimating the attitude of the aircraft 2.
In other words, because the present invention uses the change of the subject on the image according to the attitude of the aircraft 2 for attitude estimation, no stabilizer is required for the left camera 20a and the right camera 20b.
Therefore, the posture estimation unit 33 calculates a correction amount for the value of the attitude angle of the aircraft 2 so that the difference between the coordinates of these two points becomes small, and estimates the attitude angle that minimizes the difference between the two coordinates as the final attitude angle of the aircraft 2. This makes it possible to estimate the attitude angle of the aircraft 2 with high accuracy based on the distance data, angle data, coordinate data, and image data.
航法装置3におけるデータ取得部30、座標算出部31、画像マッチング部32、姿勢推定部33の各機能は、処理回路により実現される。
すなわち、航法装置3は、図4に示す、距離データ、角度データ、座標データ、画像データを取得するステップST1、距離データ、角度データ、座標データおよび航空機2の姿勢角に基づいて、画像データの画像上の測距点の座標を算出するステップST2、異なる撮影位置でそれぞれ撮影された画像データのペアの画像マッチングを行って、ペアの一方の画像データの画像上の測距点の座標に対応する点を、ペアのもう一方の画像データの画像から探索するステップST3、ペアのもう一方の画像データの画像上の測距点の座標と画像マッチング部32によって探索された対応する点の座標との差が小さくなるように航空機2の姿勢角の値を補正して、航空機2の姿勢を推定するステップST4を行うための処理回路を備えている。
処理回路は、専用のハードウェアであっても、メモリに格納されるプログラムを実行するCPU(Central Processing Unit)であってもよい。
また、データ取得部30、座標算出部31、画像マッチング部32、姿勢推定部33の各部の機能をそれぞれ処理回路で実現してもよいし、各部の機能をまとめて1つの処理回路で実現してもよい。
ソフトウェアとファームウェアはプログラムとして記述され、メモリ102に格納される。CPU101は、メモリ102に格納されたプログラムを読み出して実行することにより、各部の機能を実現する。
すなわち、航法装置3は、CPU101によって実行されるときに、図4に示すステップST1からステップST4までの処理が結果的に実行されるプログラムを格納するためのメモリ102を備える。また、これらのプログラムは、データ取得部30、座標算出部31、画像マッチング部32、姿勢推定部33の手順または方法をコンピュータに実行させるものである。
例えば、データ取得部30は、専用のハードウェアの処理回路100でその機能を実現し、座標算出部31、画像マッチング部32および姿勢推定部33は、CPU101が、メモリ102に格納されたプログラム実行することによりその機能を実現する。
このように、上記処理回路は、ハードウェア、ソフトウェア、ファームウェア、またはこれらの組み合わせによって前述の機能を実現することができる。
As shown in FIG. 5(c), the left camera 20a and the right camera 20b capture images over a rectangular shooting range, thereby obtaining image data as described later with reference to FIG. 8.
However, even if horizontal flight is intended in the actual flight environment, the aircraft 2 cannot take a straight course due to the influence of wind and the like. That is, the aircraft 2 flies while tilted in each of the rolling, pitching, and yawing directions.
The large square plots with dots are the position coordinates of the left camera 20a at time t=1. The large inverted-triangle plots with dots are the position coordinates of the right camera 20b at time t=1, and the small square plots with dots are the position coordinates of the laser distance measuring device 21 at time t=1.
The large circular plots are the position coordinates of the left camera 20a at time t=2. The large triangular plots are the position coordinates of the right camera 20b at time t=2, and the small circular plots are the position coordinates of the laser distance measuring device 21 at time t=2.
The large circular plots with dots are the position coordinates of the left camera 20a at time t=3. The large triangular plots with dots are the position coordinates of the right camera 20b at time t=3, and the small circular plots with dots are the position coordinates of the laser distance measuring device 21 at time t=3.
On the other hand, after time t=1, the positions of the left camera 20a, the right camera 20b, and the laser distance measuring device 21 deviate in the Z-axis direction.
Considering the graphs of FIG. 6(b) through FIG. 6(d), it can be seen that the position of the aircraft 2 shifted in the Y-axis and Z-axis directions over the 3 seconds, and that the aircraft 2 flew while tilted.
FIG. 7(a) shows the angle data and distance data at each time, and FIG. 7(b) is a graph plotting the data shown in FIG. 7(a).
FIG. 8 is a diagram showing images captured every second by the left camera 20a and the right camera 20b, and shows the images captured when the aircraft 2 flew in the state of FIG. 6.
As shown in FIG. 5(b), the laser distance measuring device 21 scans the laser beam by rotating the irradiation reference point 21a about the X axis in 18-degree steps, clockwise as viewed from the positive direction of the X axis.
In FIG. 7(b), the white triangular plots are the angle data and distance data at times t=0.00 to 0.15, the black triangular plots are the angle data and distance data at times t=1.00 to 1.15, and the white square plots are the angle data and distance data at times t=2.00 to 2.15.
When the aircraft 2 flies while tilted, the imaging directions of the left camera 20a and the right camera 20b also tilt. As a result, the left camera image captured by the left camera 20a and the right camera image captured by the right camera 20b, shown in FIG. 8, also change according to the attitude angle of the aircraft 2.
Accordingly, an error corresponding to the attitude angle of the aircraft 2 arises between the on-image coordinates of the distance measuring point calculated from the distance data, angle data, coordinate data, and image data on the assumption that the aircraft 2 flew horizontally, and the coordinates of the same distance measuring point when the aircraft 2 actually flew while tilted.
Therefore, in the present invention, the attitude angle is corrected so that the above error decreases, and the attitude angle that minimizes the error is taken as the estimated value of the attitude angle of the aircraft 2. An outline of the attitude angle estimation processing of the present invention is described below.
Here, every second, the area below the aircraft 2 is photographed by the left camera 20a and the right camera 20b, and the laser distance measuring device 21 measures the distance to the distance measuring point P0 directly below the airframe with the irradiation angle θ set to 90 degrees.
Similarly, the coordinates P0b of the distance measuring point P0 on the image 100b captured by the right camera 20b at time t=1 can be calculated from the three-dimensional coordinates of the irradiation reference point 21a at time t=1 and the attitude angle of the aircraft 2.
In FIG. 9, the aircraft 2 is assumed to fly horizontally, so the attitude angle is zero.
For example, in the case of FIG. 11, the difference ΔuL is minimized when the aircraft 2 is tilted by the pitch angle φ, so the pitch angle φ is obtained as the attitude estimation result.
The actual aircraft 2 also tilts in the rolling and yawing directions in addition to the pitching direction; in this case as well, the roll angle ω and the yaw angle κ can be estimated in the same way.
FIG. 12 is a flowchart showing the operation of the navigation device 3 according to Embodiment 1, and shows the series of processes for estimating the attitude angle of the aircraft 2 in flight.
In the following, the attitude angle of the aircraft 2 is represented by the three parameters roll angle ω, pitch angle φ, and yaw angle κ, and these angles are estimated every second.
For convenience, the attitude angles (ω, φ, κ) at time t=0 and time t=3 are assumed to be (0, 0, 0), so the unknown attitude angles (ω, φ, κ) at time t=1 and time t=2 are to be estimated. That is, a total of six unknown attitude angles are estimated.
The distance data indicates the distance l from the irradiation reference point 21a of the laser light, measured by the laser distance measuring device 21, to the distance measuring point P0, and the angle data indicates the irradiation angle θ of the laser light. The coordinate data is the three-dimensional coordinates (X0, Y0, Z0) of the irradiation reference point 21a of the laser light measured by the GNSS device 22. The image data is image data in which the shooting target photographed by the left camera 20a and the right camera 20b includes the distance measuring point P0.
By using the data accumulated in the memory card 23 during the flight of the aircraft 2 in this way, the attitude of the aircraft 2 can be estimated after the flight ends, and the survey results can also be corrected using the estimated attitude angle.
In formula (1) below, a11 to a33 are the elements of a 3×3 rotation matrix representing the tilts of the laser distance measuring device 21, the left camera 20a, and the right camera 20b according to the attitude of the aircraft 2.
The setting value of the attitude angle is (ω, φ, κ) = (0, 0, 0), assuming that the aircraft 2 is flying horizontally.
FIG. 13 shows the result of calculating the three-dimensional coordinates (X, Y, Z) of each distance measuring point P0 using the coordinate data at times t=0 to 2 shown in FIG. 6(a) and the angle data and distance data for each distance measuring point P0 at times t=0 to 2 shown in FIG. 7(a).
FIG. 14 shows the result of calculating the projection center coordinates using the coordinate data at times t=0 to 2 shown in FIG. 6(a), with the attitude angle setting (ω, φ, κ) = (0, 0, 0).
In formula (4) and formula (5) below, c is the focal length of the left camera 20a and the right camera 20b.
where,
Subsequently, the image matching unit 32 performs template matching between the left camera image at time i and the right camera image at time j, thereby searching the right camera image at time j for a point corresponding to the coordinates (xL, yL) of the distance measuring point P0 on the left camera image at time i calculated by the coordinate calculation unit 31 (step ST5a).
SCANx(xLi, yLi) is the x-coordinate value of the corresponding point obtained by performing template matching on the right camera image at time j for a small region centered on the coordinates (xLi, yLi) of the distance measuring point P0 on the left camera image at time i. Similarly, SCANy(xLi, yLi) is the y-coordinate value of the corresponding point obtained by the same template matching.
That is, if an appropriate attitude angle (ω, φ, κ) representing the attitude of the aircraft 2 in flight is set and the coordinates shown in FIG. 16 are recalculated, the coordinates (xRj, yRj) and the coordinates (SCANx(xLi, yLi), SCANy(xLi, yLi)) coincide.
Accordingly, the attitude angle that minimizes the difference between these coordinates is the estimated value of the attitude angle of the aircraft 2.
The posture estimation unit 33 calculates a correction amount for the attitude angle such that the difference between the coordinates (xRj, yRj) of the distance measuring point P0 on the right camera image at time j and the coordinates (SCANx(xLi, yLi), SCANy(xLi, yLi)) of the corresponding point searched by the image matching unit 32 becomes small (step ST6a). For example, the observation equations vx and vy shown in formula (6) below are used.
∂Fx/∂ω is the partial derivative of Fx with respect to the roll angle ω, ∂Fx/∂φ is the partial derivative of Fx with respect to the pitch angle φ, and ∂Fx/∂κ is the partial derivative of Fx with respect to the yaw angle κ. These partial derivatives are coefficients whose values are obtained by substituting the approximate solutions ~ω, ~φ, and ~κ.
Similarly, ∂Fy/∂ω is the partial derivative of Fy with respect to the roll angle ω, ∂Fy/∂φ is the partial derivative of Fy with respect to the pitch angle φ, and ∂Fy/∂κ is the partial derivative of Fy with respect to the yaw angle κ. These partial derivatives are also coefficients whose values are obtained by substituting the approximate solutions ~ω, ~φ, and ~κ.
Tilde Fx (~Fx) is the value obtained by substituting SCANx(xL, yL) and the approximate solution of xR into Fx, and tilde Fy (~Fy) is the value obtained by substituting SCANy(xL, yL) and the approximate solution of yR into Fy.
FIG. 18 shows the 24×6 design matrix composed of the partial differential coefficients calculated in this way for each observation equation.
Furthermore, the posture estimation unit 33 calculates the product of the transpose of this design matrix and the constant vector shown in FIG. 17. The result of this calculation is shown in FIG. 20.
Thereafter, the posture estimation unit 33 calculates the product of the inverse of the matrix shown in FIG. 19 and the vector shown in FIG. 20. This calculation result is the attitude angle correction amount (δω, δφ, δκ) shown in FIG. 21.
Since (0, 0, 0) is initially set as the initial value of the attitude angle (ω, φ, κ) on the assumption that the aircraft 2 is flying horizontally without tilting, the above correction amount directly serves as the approximate solution of the attitude angle.
Thereby, the coordinate calculation unit 31 performs the processing from step ST2a to step ST4a using the corrected approximate solution as the setting value of the attitude angle, and the image matching unit 32 performs the processing of step ST5a.
When the above series of processing has been executed the predetermined number of iterations (step ST8a; YES) and the correction amount that minimizes the coordinate difference has been obtained, the posture estimation unit 33 outputs the approximate solution corrected with this correction amount as the final attitude angle estimation result (step ST9a).
FIG. 22 shows the attitude angle estimation results for times t=0.00 to 3.00 obtained in this way.
Stereo image processing, which obtains three-dimensional information such as the distance and depth to an observation target by searching for corresponding points between images captured by cameras at different positions, includes a method called fixed stereo and a method called motion stereo.
In fixed stereo, two cameras are placed at an interval and capture images. The pairing of the left camera image at time i with the right camera image at time i corresponds to fixed stereo.
In motion stereo, a camera is moved and images are captured from different shooting positions. The pairing of the left camera image at time i with the right camera image at time j (= i + 1) corresponds to motion stereo.
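A sketch of the fixed-stereo and motion-stereo pairing bookkeeping, with images represented only by their (camera, time) labels; this illustrates the two schemes just described, not the patent's implementation.

```python
def stereo_pairs(left_images, right_images):
    """Return the fixed-stereo pairs (left and right image at the same
    time i) and the motion-stereo pairs (left image at time i with the
    right image at time j = i + 1)."""
    fixed = [(('L', i), ('R', i)) for i in range(len(left_images))]
    motion = [(('L', i), ('R', i + 1))
              for i in range(len(left_images) - 1)]
    return fixed, motion

fixed, motion = stereo_pairs(['l0', 'l1', 'l2'], ['r0', 'r1', 'r2'])
```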
In addition to the image data in which the distance measuring point P0 is captured, the distance l from the irradiation reference point 21a of the laser light to the distance measuring point P0, the irradiation angle θ of the laser light, and the three-dimensional coordinates (X, Y, Z) of the irradiation reference point 21a are used to estimate the attitude of the aircraft 2, so the attitude of the aircraft 2 can be estimated with high accuracy.
Even with such image data, the change of the subject on the image according to the attitude of the aircraft 2 can be used for estimating the attitude of the aircraft 2.
By using the data accumulated in the memory card 23 during the flight of the aircraft 2 in this way, the attitude of the aircraft 2 can be estimated after the flight ends, and the survey results can also be corrected using the estimated attitude angle.
FIG. 23 is a block diagram showing the configuration of a surveying system 1A according to Embodiment 2 of the present invention. The surveying system 1A surveys terrain from an aircraft 2A, and includes a left camera 20a, a right camera 20b, a laser distance measuring device 21, a GNSS device 22, and a wireless communication device 24 mounted on the aircraft 2A, as well as a navigation device 3.
The wireless communication device 24 transmits the distance data, angle data, coordinate data, and image data obtained during the flight of the aircraft 2A to the navigation device 3.
As shown in FIG. 23, the navigation device 3 is provided separately from the aircraft 2A. However, the navigation device 3 may be mounted on the aircraft 2A.
The navigation device 3 estimates the attitude of the aircraft 2A based on the data acquired in this way, by the same processing as in Embodiment 1.
By using the data wirelessly transmitted from the wireless communication device 24 in this way, the attitude can be estimated during the flight of the aircraft 2A. It is also possible to correct the survey results during the flight of the aircraft 2A using the estimated attitude angle.
Claims (6)
- A navigation device comprising: a data acquisition unit that acquires distance data indicating a distance from an irradiation reference point of laser light to a distance measuring point measured by a laser distance measuring device mounted on a moving body and angle data indicating an irradiation angle of the laser light, coordinate data indicating three-dimensional coordinates of the irradiation reference point of the laser light measured by a coordinate measuring device mounted on the moving body, and image data in which a shooting target photographed by an imaging device mounted on the moving body includes a distance measuring point;
a coordinate calculation unit that calculates coordinates of the distance measuring point on an image of the image data on the basis of the distance data, the angle data, and the coordinate data acquired by the data acquisition unit and a parameter indicating an attitude of the moving body;
an image matching unit that performs image matching of a pair of the image data respectively photographed at different shooting positions by the imaging device, and searches an image of the other image data of the pair for a point corresponding to the coordinates of the distance measuring point on the image of one image data of the pair calculated by the coordinate calculation unit; and
a posture estimation unit that estimates the attitude of the moving body by correcting a value of the parameter indicating the attitude of the moving body so that a difference between the coordinates of the distance measuring point on the image of the other image data of the pair calculated by the coordinate calculation unit and the coordinates of the corresponding point searched by the image matching unit becomes small.
- The navigation device according to claim 1, wherein the pair consists of the image data photographed by the imaging device at a time i and the image data photographed at a time j later than the time i during movement of the moving body.
- The navigation device according to claim 1, wherein the imaging device includes a first imaging unit and a second imaging unit mounted on the moving body, and
the pair consists of the image data respectively photographed at a time i by the first imaging unit and the second imaging unit, or the image data respectively photographed at the time i and at a time j later than the time i by at least one of the first imaging unit and the second imaging unit.
- A surveying system comprising: a laser distance measuring device that is mounted on a moving body and measures distance data indicating a distance from an irradiation reference point of laser light to a distance measuring point and angle data indicating an irradiation angle of the laser light;
a coordinate measuring device that is mounted on the moving body and measures coordinate data indicating three-dimensional coordinates of the irradiation reference point of the laser light;
an imaging device that is mounted on the moving body and photographs image data in which a shooting target includes a distance measuring point; and
a navigation device having a data acquisition unit that acquires the distance data, the angle data, the coordinate data, and the image data, a coordinate calculation unit that calculates coordinates of the distance measuring point on an image of the image data on the basis of the distance data, the angle data, and the coordinate data acquired by the data acquisition unit and a parameter indicating an attitude of the moving body, an image matching unit that performs image matching of a pair of the image data respectively photographed at different shooting positions by the imaging device and searches an image of the other image data of the pair for a point corresponding to the coordinates of the distance measuring point on the image of one image data of the pair calculated by the coordinate calculation unit, and a posture estimation unit that estimates the attitude of the moving body by correcting a value of the parameter indicating the attitude of the moving body so that a difference between the coordinates of the distance measuring point on the image of the other image data of the pair calculated by the coordinate calculation unit and the coordinates of the corresponding point searched by the image matching unit becomes small.
- The surveying system according to claim 4, further comprising a storage device that is mounted on the moving body and stores the distance data, the angle data, the coordinate data, and the image data, wherein the data acquisition unit reads out and acquires the distance data, the angle data, the coordinate data, and the image data stored in the storage device.
- The surveying system according to claim 4, further comprising a wireless communication device that is mounted on the moving body and transmits the distance data, the angle data, the coordinate data, and the image data, wherein the data acquisition unit receives and acquires the distance data, the angle data, the coordinate data, and the image data transmitted by the wireless communication device.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/561,882 US10222210B2 (en) | 2015-09-09 | 2015-09-09 | Navigation system and survey system |
JP2016513572A JP6029794B1 (ja) | 2015-09-09 | 2015-09-09 | 航法装置および測量システム |
PCT/JP2015/075598 WO2017042907A1 (ja) | 2015-09-09 | 2015-09-09 | 航法装置および測量システム |
EP15903572.4A EP3348963B1 (en) | 2015-09-09 | 2015-09-09 | Navigation system and survey system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/075598 WO2017042907A1 (ja) | 2015-09-09 | 2015-09-09 | 航法装置および測量システム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017042907A1 true WO2017042907A1 (ja) | 2017-03-16 |
Family
ID=57358823
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/075598 WO2017042907A1 (ja) | 2015-09-09 | 2015-09-09 | 航法装置および測量システム |
Country Status (4)
Country | Link |
---|---|
US (1) | US10222210B2 (ja) |
EP (1) | EP3348963B1 (ja) |
JP (1) | JP6029794B1 (ja) |
WO (1) | WO2017042907A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022118513A1 (ja) * | 2020-12-02 | 2022-06-09 | 三菱電機株式会社 | 位置姿勢算出装置、位置姿勢算出方法及び測量装置 |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6861592B2 (ja) * | 2017-07-14 | 2021-04-21 | 三菱電機株式会社 | データ間引き装置、測量装置、測量システム及びデータ間引き方法 |
CN107146447A (zh) * | 2017-07-16 | 2017-09-08 | 汤庆佳 | 一种基于无人机的智能车辆管理系统及其方法 |
CN107478195A (zh) * | 2017-09-15 | 2017-12-15 | 哈尔滨工程大学 | 一种基于光学的空间物体姿态测量装置及其测量方法 |
CN110268710B (zh) * | 2018-01-07 | 2022-01-04 | 深圳市大疆创新科技有限公司 | 图像数据处理方法、设备、平台及存储介质 |
JP7188687B2 (ja) * | 2018-07-17 | 2022-12-13 | エアロセンス株式会社 | 情報処理方法、プログラム、および情報処理システム |
CN109345589A (zh) * | 2018-09-11 | 2019-02-15 | 百度在线网络技术(北京)有限公司 | 基于自动驾驶车辆的位置检测方法、装置、设备及介质 |
GB201818357D0 (en) * | 2018-11-12 | 2018-12-26 | Forsberg Services Ltd | Locating system |
JP7241517B2 (ja) * | 2018-12-04 | 2023-03-17 | 三菱電機株式会社 | 航法装置、航法パラメータ計算方法およびプログラム |
CN110672117B (zh) * | 2019-11-04 | 2022-02-18 | 中国人民解放军空军工程大学 | 基于单观察哨数字望远镜的小航路捷径运动目标航迹获取方法 |
CN112099521B (zh) * | 2020-10-09 | 2022-05-17 | 北京邮电大学 | 一种无人机路径规划方法及装置 |
CN117848294A (zh) * | 2023-07-29 | 2024-04-09 | 上海钦天导航技术有限公司 | 测量方法、系统、设备及存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05118850A (ja) * | 1991-10-25 | 1993-05-14 | Shimizu Corp | 模型ヘリコプターを用いる3次元測量システム |
JP2001133256A (ja) * | 1999-11-05 | 2001-05-18 | Asia Air Survey Co Ltd | 空中写真の位置及び姿勢の計算方法 |
JP2007278844A (ja) * | 2006-04-06 | 2007-10-25 | Topcon Corp | 画像処理装置及びその処理方法 |
JP2013187862A (ja) * | 2012-03-09 | 2013-09-19 | Topcon Corp | 画像データ処理装置、画像データ処理方法および画像データ処理用のプログラム |
JP2014145762A (ja) * | 2013-01-07 | 2014-08-14 | Amuse Oneself Inc | 制御装置、測量システム、プログラム及び記録媒体並びに計測方法 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4335520A (en) * | 1980-09-22 | 1982-06-22 | The United States Of America As Represented By The Secretary Of The Navy | Survey spar system for precision offshore seafloor surveys |
JPH1151611A (ja) * | 1997-07-31 | 1999-02-26 | Tokyo Electric Power Co Inc:The | 認識対象物体の位置姿勢認識装置および位置姿勢認識方法 |
EP1934632A1 (en) * | 2005-10-13 | 2008-06-25 | Atlantic Inertial Systems Limited | Terrain mapping |
JP2007240506A (ja) * | 2006-03-06 | 2007-09-20 | Giyourin Cho | 3次元形状と3次元地形計測法 |
DK2227676T3 (en) * | 2007-12-21 | 2017-05-15 | Bae Systems Plc | APPARATUS AND METHOD OF LANDING A ROTOR AIR |
US10212687B2 (en) * | 2010-09-30 | 2019-02-19 | Echo Ridge Llc | System and method for robust navigation and geolocation using measurements of opportunity |
US9684076B1 (en) * | 2013-03-15 | 2017-06-20 | Daniel Feldkhun | Frequency multiplexed ranging |
US10643351B2 (en) * | 2013-03-20 | 2020-05-05 | Trimble Inc. | Indoor navigation via multi beam laser projection |
US9911189B1 (en) * | 2016-06-30 | 2018-03-06 | Kitty Hawk Corporation | Navigation based on downward facing sensors |
2015
- 2015-09-09 WO PCT/JP2015/075598 patent/WO2017042907A1/ja active Application Filing
- 2015-09-09 JP JP2016513572A patent/JP6029794B1/ja active Active
- 2015-09-09 EP EP15903572.4A patent/EP3348963B1/en active Active
- 2015-09-09 US US15/561,882 patent/US10222210B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3348963A4 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022118513A1 (ja) * | 2020-12-02 | 2022-06-09 | Mitsubishi Electric Corporation | Position and attitude calculation device, position and attitude calculation method, and surveying device |
Also Published As
Publication number | Publication date |
---|---|
EP3348963A1 (en) | 2018-07-18 |
JP6029794B1 (ja) | 2016-11-24 |
US10222210B2 (en) | 2019-03-05 |
EP3348963B1 (en) | 2020-05-20 |
JPWO2017042907A1 (ja) | 2017-09-07 |
US20180120107A1 (en) | 2018-05-03 |
EP3348963A4 (en) | 2019-05-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6029794B1 (ja) | Navigation device and surveying system | |
CN109461190B (zh) | Measurement data processing device and measurement data processing method | |
US10970873B2 (en) | Method and device to determine the camera position and angle | |
US11099030B2 (en) | Attitude estimation apparatus, attitude estimation method, and observation system | |
US11120560B2 (en) | System and method for real-time location tracking of a drone | |
JP5134784B2 (ja) | Aerial photogrammetry method | |
KR101614338B1 (ko) | Camera calibration method, computer-readable recording medium storing a camera calibration program, and camera calibration device | |
JP6988197B2 (ja) | Control device, flying object, and control program | |
JP6138326B1 (ja) | Mobile body, mobile body control method, program for controlling a mobile body, control system, and information processing device | |
JP2008186145A (ja) | Aerial image processing device and aerial image processing method | |
CN112005077A (zh) | Installation stand for unmanned aerial vehicle, surveying method, surveying device, surveying system, and program | |
CN115118876B (zh) | Method and device for determining imaging parameters, and computer-readable storage medium | |
JP3808833B2 (ja) | Aerial photogrammetry method | |
RU2597024C1 (ru) | Method for rapid determination of the angular elements of exterior orientation of a space scanner image | |
JP2016223934A (ja) | Position correction system, position correction method, and position correction program | |
CN113340272A (zh) | Real-time ground target positioning method based on a UAV micro-swarm | |
JP2003083745A (ja) | Aircraft-mounted imaging device and aerial imaging data processing device | |
Skaloud et al. | Mapping with MAV: experimental study on the contribution of absolute and relative aerial position control | |
JP6305501B2 (ja) | Calibration method, program, and computer | |
US20200116482A1 (en) | Data thinning device, surveying device, surveying system, and data thinning method | |
JP5409451B2 (ja) | Three-dimensional change detection device | |
JPH0524591A (ja) | Method for measuring the airframe position of a vertical take-off and landing aircraft | |
JP6974290B2 (ja) | Position estimation device, position estimation method, program, and recording medium | |
Ruzgienė | Analysis of camera orientation variation in airborne photogrammetry: images under tilt (roll-pitch-yaw) angles | |
Zoltek | Relative Measurement Accuracies within Pictometry's Individual Orthogonal and Oblique Frame Imagery | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2016513572 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15903572 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15561882 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015903572 Country of ref document: EP |