CN111615616A - Position estimation device, position estimation method, program, and recording medium - Google Patents

Position estimation device, position estimation method, program, and recording medium

Info

Publication number
CN111615616A
Authority
CN
China
Prior art keywords: flying object, error score, imaging, assumed, calculating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980009074.2A
Other languages
Chinese (zh)
Inventor
顾磊
陈斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN111615616A

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00 - Constructional aspects of UAVs
    • B64U20/80 - Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87 - Mounting of imaging devices, e.g. mounting of gimbals
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01 - Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13 - Receivers
    • G01S19/14 - Receivers specially adapted for specific applications
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 - Type of UAV
    • B64U10/10 - Rotorcrafts
    • B64U10/13 - Flying platforms
    • B64U10/14 - Flying platforms with four distinct rotor axes, e.g. quadcopters

Abstract

It is desirable to be able to measure the position of a flying object even when the flying object does not have a positioning function or when the positioning accuracy is low. The position estimation device includes a processing unit that performs processing related to estimating the position of the flying object. The processing unit acquires a plurality of captured images obtained by imaging the flying object with a plurality of imaging devices, and acquires information on the arrangement position and imaging direction of each imaging device. Based on the arrangement position of each imaging device, the imaging direction of each imaging device, each assumed position at which the flying object is assumed to exist in the actual space, and the image position at which the flying object appears in each captured image, the processing unit calculates an error score indicating the error between each assumed position and the position where the flying object actually exists, and estimates the presence position of the flying object in the actual space based on the error score.

Description

Position estimation device, position estimation method, program, and recording medium
[ technical field ]
The present disclosure relates to a position estimation device, a position estimation method, a program, and a storage medium for estimating a position of a flying object.
[ background of the invention ]
It is known that a conventional unmanned aerial vehicle receives a GPS signal transmitted from a GPS (Global Positioning System) satellite, calculates its position from the GPS signal, and flies autonomously (see patent document 1).
[ Prior art documents ]
[ patent document ]
[ patent document 1 ] Japanese patent laid-open publication No. 2018-147467
[ summary of the invention ]
[ technical problem to be solved by the invention ]
However, in an environment such as under a bridge or indoors, the unmanned aerial vehicle may not be able to receive the GPS signal, and in such a case, autonomous flight is difficult. Even when the GPS signal cannot be received, the position of the unmanned aerial vehicle can be estimated by integrating the velocity of the unmanned aerial vehicle, for example. However, when the position of the unmanned flying object is estimated by integrating the velocity of the unmanned flying object, an error (for example, an error of 2m per 10 m) is likely to occur, and the accuracy of the estimated position is not high enough. As a technique for detecting the position of an object, there is a technique of capturing the motion of the object by irradiating light to the object and capturing an image (motion capture). However, in the case of motion capture, tracking needs to be performed using a dedicated marker, and the usage scenarios are limited. In addition, there is also a technique in which a mobile object receives a beacon signal transmitted from a beacon and detects its own position. However, when a beacon signal is used, it is necessary to arrange beacons in various places, and the use scenarios are similarly limited.
[ means for solving the problems ]
In one aspect, a position estimation device is a position estimation device that estimates a position where a flying object exists, and includes a processing unit that performs processing related to estimating the position where the flying object exists, the processing unit acquiring a plurality of captured images obtained by capturing the flying object by a plurality of imaging devices, acquiring information of an arrangement position of each imaging device and an imaging direction of each imaging device, calculating an error score indicating an error between each assumed position and the position where the flying object exists based on the arrangement position of each imaging device, the imaging direction of each imaging device, each assumed position where the flying object exists in an actual space, and an image position where the flying object exists in each captured image, and estimating the position where the flying object exists in the actual space based on the error score.
The processing unit may acquire a plurality of captured images obtained by capturing the flying object at a plurality of times by a plurality of imaging devices, calculate an error score at the plurality of times, and estimate the position of the flying object in the real space based on the error score.
The processing unit may acquire a distance between the imaging device and the flying object, derive a reliability of an error score with respect to the captured image from the distance, and calculate the error score based on the reliability.
The processing unit may calculate the error score based on the difference between each projected assumed position and each image position, where each projected assumed position is obtained by projecting each assumed position at which the flying object is assumed to exist in the actual space onto each captured image captured at the arrangement position and in the imaging direction of each imaging device, and each image position is the position at which the flying object appears in each captured image captured at the arrangement position and in the imaging direction of each imaging device. The processing unit may then estimate the assumed position with the smallest error score as the presence position of the flying object.
The processing unit may acquire a first measurement position of the flight object measured by the positioning unit included in the flight object, and calculate the error score based on the first measurement position.
The Positioning unit may receive GPS signals from a plurality of GPS (Global Positioning System) satellites to acquire the first measurement position, and the processing unit may calculate the error score such that the greater the number of GPS satellites that the Positioning unit cannot receive, the greater the influence of the first measurement position on the error score.
The processing unit may calculate the error score based on positions to which the flying object can physically move.
The processing unit may derive a second measurement position of the flying object based on the acceleration measured by the acceleration measuring device included in the flying object, and calculate the error score based on the second measurement position.
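For illustration, the following is a minimal Python sketch of deriving such a second measurement position by double integration of measured acceleration at fixed time steps. The variable names, the world-frame acceleration assumption, and the simple Euler integration scheme are assumptions made only for this sketch and are not taken from the present disclosure.

```python
import numpy as np

def integrate_acceleration(accels, dt, p0, v0):
    """Derive positions by double integration of acceleration samples.

    accels: (N, 3) array of accelerations in the world frame [m/s^2]
    dt:     sampling interval [s]
    p0, v0: initial position and velocity, shape (3,)
    Returns an (N, 3) array of estimated positions (the "second measurement
    positions"); drift grows with time, which is why such a term would be
    combined with the camera-based error score rather than used alone.
    """
    positions = np.empty((len(accels), 3))
    p, v = np.array(p0, dtype=float), np.array(v0, dtype=float)
    for i, a in enumerate(accels):
        v = v + a * dt          # velocity update
        p = p + v * dt          # position update
        positions[i] = p
    return positions
```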
In one aspect, a position estimation method is a position estimation method of estimating a position where a flying object exists, including the steps of: acquiring a plurality of captured images obtained by imaging the flying object with a plurality of imaging devices; acquiring information of the arrangement position of each imaging device and the imaging direction of each imaging device; calculating an error score indicating an error between each assumed position and the position where the flying object exists, based on the arrangement position of each imaging device, the imaging direction of each imaging device, each assumed position at which the flying object is assumed to exist in the actual space, and the image position at which the flying object appears in each captured image; and estimating the presence position of the flying object in the actual space based on the error score.
The step of acquiring a plurality of captured images may include the step of acquiring a plurality of captured images obtained by capturing the flying object at a plurality of times by a plurality of imaging devices. The step of calculating an error score may comprise the step of calculating an error score at a plurality of times. The step of estimating the presence position of the flying object may include a step of estimating the presence position of the flying object in the real space based on the error score.
The step of calculating an error score may comprise the steps of: acquiring the distance between a camera device and a flying object; deriving a reliability of an error score with respect to the captured image from the distance; and calculating an error score based on the reliability.
The step of calculating an error score may include the step of calculating the error score based on the difference between each projected assumed position and each image position, where each projected assumed position is obtained by projecting each assumed position at which the flying object is assumed to exist in the actual space onto each captured image captured at the arrangement position and in the imaging direction of each imaging device, and each image position is the position at which the flying object appears in each captured image captured at the arrangement position and in the imaging direction of each imaging device. The step of estimating the presence position of the flying object may include the step of estimating the assumed position with the smallest error score as the presence position of the flying object.
The step of calculating an error score may comprise the steps of: acquiring a first measurement position of the flying object measured by a positioning unit included in the flying object; and calculating an error score based on the first measured position.
The step of acquiring a first measured position may include the step of receiving GPS signals from a plurality of GPS satellites to acquire a measured position. The step of calculating an error score may comprise the steps of: the error score is calculated such that the influence of the first measured position on the error score increases as the number of GPS satellites that the positioning unit cannot receive increases.
The step of calculating the error score may include a step of calculating the error score based on positions to which the flying object can physically move.
The step of calculating an error score may comprise the steps of: deriving a second measurement position of the flying object based on an acceleration measured by an acceleration measuring instrument included in the flying object; and an error score is calculated based on the second measured position.
In one aspect, the program is a program for causing a position estimation device that estimates a presence position of a flying object to execute the steps of: acquiring a plurality of captured images obtained by imaging the flying object with a plurality of imaging devices; acquiring information of the arrangement position of each imaging device and the imaging direction of each imaging device; calculating an error score indicating an error between each assumed position and the position where the flying object exists, based on the arrangement position of each imaging device, the imaging direction of each imaging device, each assumed position at which the flying object is assumed to exist in the actual space, and the image position at which the flying object appears in each captured image; and estimating the presence position of the flying object in the actual space based on the error score.
In one aspect, the recording medium is a computer-readable recording medium having recorded thereon a program for causing a position estimation device that estimates a presence position of a flying object to execute the steps of: acquiring a plurality of captured images obtained by imaging the flying object with a plurality of imaging devices; acquiring information of the arrangement position of each imaging device and the imaging direction of each imaging device; calculating an error score indicating an error between each assumed position and the position where the flying object exists, based on the arrangement position of each imaging device, the imaging direction of each imaging device, each assumed position at which the flying object is assumed to exist in the actual space, and the image position at which the flying object appears in each captured image; and estimating the presence position of the flying object in the actual space based on the error score.
Moreover, the summary above is not exhaustive of all features of the disclosure. Furthermore, sub-combinations of these feature sets may also constitute the invention.
[ description of the drawings ]
Fig. 1 is a diagram showing an example of an outline of an unmanned aerial vehicle system according to embodiment 1.
Fig. 2 is a diagram showing a hardware configuration of the flight control apparatus.
Fig. 3 is a diagram showing one example of a concrete appearance of the unmanned aerial vehicle.
Fig. 4 is a block diagram showing one example of a hardware configuration of the unmanned aerial vehicle.
Fig. 5 is a diagram illustrating a reprojection error.
Fig. 6 is a diagram showing a hardware configuration of the flight control device according to embodiment 2.
Fig. 7 is a diagram showing a hardware configuration of a flight control device according to embodiment 3.
[ detailed description of embodiments ]
The present disclosure will be described below with reference to embodiments of the present invention, but the following embodiments do not limit the invention according to the claims. All of the combinations of features described in the embodiments are not necessarily essential to the inventive solution.
The claims, the specification, the drawings, and the abstract of the specification contain matter that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction of these documents by anyone, as they appear in the files or records of the patent office. However, in all other cases, all copyrights are reserved.
In the following embodiments, the flying object is exemplified by an Unmanned Aerial Vehicle (UAV). Unmanned aircraft include aircraft that move in the air. In the drawings of the present specification, the unmanned aerial vehicle is also expressed as "UAV". The position estimating device is, for example, a PC (Personal Computer), a server, a terminal, various processing devices, and a control device. The position estimation method defines the operation of the position estimation device. Further, a program (for example, a program for causing the position estimating apparatus to execute various processes) is recorded in the recording medium.
(embodiment mode 1)
Fig. 1 is a diagram showing an example of an outline of an unmanned aerial vehicle system 5 according to embodiment 1. The unmanned aerial vehicle system 5 can be used even when the unmanned aircraft 100 flies autonomously in a place where it cannot receive, or has difficulty receiving, GPS signals, for example under a bridge or indoors. In addition, the unmanned aerial vehicle system 5 is also used when the unmanned aircraft 100 flies autonomously without including a GPS receiver. The unmanned aircraft 100 is one example of a flying object. The unmanned aerial vehicle system 5 includes a plurality of cameras 300, a flight control device 800, and the unmanned aircraft 100. The camera 300 is one example of an imaging device. The flight control device 800 is one example of a position estimation device.
The plurality of cameras 300 are installed on different places on the ground or on water, and photograph the unmanned aircraft 100 from various directions. The plurality of cameras 300 may be fixed cameras, and the plurality of cameras 300 may be cameras whose image capturing positions and image capturing directions can be freely changed. The plurality of cameras 300 may be installed in the same place or area, or may be installed in different places or areas. The camera 300 may have a processing section, a communication section, and a storage section.
The flight control device 800 performs a process of tracking the unmanned aircraft 100 based on the images captured by the plurality of cameras 300. In order to derive the position where the unmanned aircraft 100 is estimated to be present, the flight control device 800 performs optimization processing so as to minimize an error score indicating an error between each of the assumed positions where the unmanned aircraft 100 is assumed to be present and the actual position where the unmanned aircraft 100 is present. In fig. 1, a situation in which a plurality of cameras 300 photograph an unmanned aircraft 100 that is flying autonomously is assumed. The unmanned aerial vehicle 100 is not limited to one, and a plurality of unmanned aerial vehicles may be provided.
Fig. 2 is a diagram showing a hardware configuration of the flight control apparatus 800. Flight control device 800 includes tracking front end 400, processing back end 500, and PID (Proportional-Integral-differential) control device 700. At least one of the processing unit of the tracking front end 400 and the processing unit of the processing rear end 500 is an example of the processing unit of the position estimating apparatus. The tracking front end 400, the processing back end 500, and the PID control device 700 may be provided on one device or may be provided in a plurality of devices in a distributed manner. The flight control device 800 may be a PC, a server, a terminal, various processing devices, and a control device as a whole or in part. Each unit (tracking front end 400, processing rear end 500, PID control device 700) of flight control device 800 may be installed in the same place or area, or may be installed in different places or areas.
The tracking front end 400 acquires the pixel positions on the frame (in the captured image) of the unmanned aircraft 100 captured in the captured images captured by the plurality of cameras 300, and transmits the pixel positions to the processing rear end 500. The tracking front end 400 is able to track the movement of the unmanned aerial vehicle 100 by detecting the pixel positions where the unmanned aerial vehicle 100 is located in time series, and is able to track the movement of the unmanned aerial vehicle 100 in the photographic image. In addition, the tracking front end 400 transmits information of the posture of each camera 300 to the processing rear end 500. The posture of each camera 300 can be specified by the arrangement position and the imaging orientation of each camera 300. The tracking front end 400 includes a communication unit 405, a processing unit 410, and a storage unit 420. Note that, when the orientation of each camera 300 is fixed, the data may be transmitted to the processing backend 500 at a time, and when the orientation of each camera 300 is variable, the data may be sequentially transmitted to the processing backend 500.
The processing unit 410 acquires, as the observation position, a pixel position on the frame of the unmanned aerial vehicle 100 captured by the plurality of cameras 300 at the same time and captured in each captured image. The processing portion 410 may acquire the pixel position of the unmanned aerial vehicle 100 at a frequency of 30fps or 60 fps. The processing unit 410 acquires information on the posture of each camera 300 from each camera 300, an external server, or the like. Further, when the posture of the camera 300 is fixed, the processing unit 410 may cause the storage unit 420 to store information on the posture of the camera 300.
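For illustration, the following is a minimal Python sketch of how a tracking front end could obtain the observation position (pixel position) of the aircraft in each frame. The patent does not specify a detection method; the use of OpenCV 4 background subtraction, the function names, and the assumption that the aircraft is the dominant moving object in a fixed camera's view are all assumptions made only for this sketch.

```python
import cv2

# One background subtractor per fixed camera; MOG2 adapts to slow scene changes.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

def observe_pixel_position(frame):
    """Return the (u, v) pixel centroid of the largest moving blob, or None.

    A stand-in for the tracking front end 400: the UAV is assumed to be the
    dominant moving object in the fixed camera's field of view.
    """
    mask = subtractor.apply(frame)
    mask = cv2.medianBlur(mask, 5)  # suppress isolated noise pixels
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # blob centroid (u, v)
```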
The communication unit 405 communicates with the plurality of cameras 300 and the processing backend 500. The communication method is a dedicated line, a Local Area Network (LAN), a wireless LAN, mobile communication, or the like. The communication unit 405 receives a captured image from the plurality of cameras 300. The communication unit 405 transmits the observation position of the unmanned aircraft 100 captured in the captured images captured by the plurality of cameras 300 to the processing backend 500. Further, the communication unit 405 transmits information on the postures of the plurality of cameras 300 to the processing backend 500.
The storage section 420 may be used as a work memory of the processing section 410.
The process backend 500 performs optimization processing based on each pixel position (observation position) of the unmanned aerial vehicle 100 acquired from the tracking frontend 400. The processing backend 500 includes a communication unit 505, a processing unit 510, and a storage unit 520. The optimization process is, for example, a process for minimizing the value of the error score expressed by the equation (1) described later. Specifically, the optimization processing is processing for minimizing a difference (re-projection error) between a projected assumed position, which is a position when an assumed position where the unmanned aerial vehicle 100 is assumed to exist is projected onto the captured image, and an image position (observed position) actually captured in the captured image. The assumed position may also be arbitrarily changed in the 3-dimensional space. As a result of the optimization process, the assumed position of the unmanned aerial vehicle 100 at which the cost is the smallest can be set as the position (estimated position) at which the unmanned aerial vehicle 100 is estimated to exist. The reprojection error may be the sum of a plurality of captured images captured by the plurality of cameras 300, or may be the sum of a plurality of captured images captured in time series.
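As a reference for the projection step described above, the following is a minimal sketch of computing a projected assumed position with a pinhole camera model and the corresponding reprojection error. The intrinsic/extrinsic parameterization (K, R, t) is an assumed way of representing the camera pose and is not taken from the present disclosure.

```python
import numpy as np

def project(K, R, t, x):
    """Project a 3D point x (world frame) into pixel coordinates.

    K: 3x3 intrinsic matrix, R: 3x3 rotation, t: translation (3,),
    together standing in for one camera's arrangement position and
    imaging direction (its pose).
    """
    x_cam = R @ x + t            # world -> camera coordinates
    uvw = K @ (x_cam / x_cam[2])  # perspective division, then intrinsics
    return uvw[:2]               # projected assumed position in pixels

def reprojection_error(K, R, t, x_assumed, observed_uv):
    """Difference between the projected assumed position and the observed position."""
    return project(K, R, t, x_assumed) - np.asarray(observed_uv)
```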
The processing unit 510 may estimate the position of the unmanned aircraft 100 based on the pixel position of the unmanned aircraft 100 captured in the plurality of captured images and the postures of the plurality of cameras 300. The processing section 510 may estimate the position of the unmanned aerial vehicle 100 based on the projected assumed positions and the observed positions with respect to the plurality of captured images and the postures of the plurality of cameras 300. That is, the processing portion 510 may estimate the position of the unmanned aerial vehicle 100 by optimizing (minimizing) the re-projection error from the tracking result of the unmanned aerial vehicle 100 by the camera 300.
The communication unit 505 communicates with the tracking tip 400 and the PID control device 700. The communication method is a dedicated line, a wired LAN, a wireless LAN, mobile communication, or the like. The communication unit 505 receives information of the observation position and the posture of each camera 300 from the tracking tip 400. The communication unit 505 transmits information of the estimated position of the optimized unmanned aircraft 100 to the PID control device 700. The information on the posture of each camera 300 may not be directly acquired from each camera 300, and may be stored in the storage unit 520 in advance, or may be acquired from an external server, for example.
The storage section 520 may be used as a work memory of the processing section 510.
PID control device 700 performs PID (P: Proportional, I: Integral, D: Derivative) control for flying the unmanned aircraft 100 along the flight path based on the information of the estimated position of the unmanned aircraft 100. The flight path may be a predetermined flight path. The PID control device 700 may generate information for at least a portion of the flight parameters of the flight of the unmanned aircraft 100 and transmit it to the unmanned aircraft 100. The flight parameters include the flight position, flight altitude, flight speed, and flight acceleration (for example, acceleration in the three axial directions of front-rear, left-right, and up-down) of the unmanned aircraft 100, and the pitch angle, yaw angle, and roll angle indicating the orientation of the body. The PID control device 700 includes a communication unit 705, a processing unit 710, and a storage unit 720. The flight parameters may be data for filling the difference between a target state (for example, the target flight position, flight altitude, flight speed, flight acceleration, pitch angle, yaw angle, roll angle, etc. of the unmanned aircraft 100) and the actual state (for example, the currently estimated flight position, flight altitude, flight speed, flight acceleration, pitch angle, yaw angle, roll angle, etc. of the unmanned aircraft 100).
The processing unit 710 may perform PID control to generate flight parameters in order to approximate the estimated optimized position of the unmanned aerial vehicle 100 to a target position along the flight path.
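For illustration, the following is a minimal sketch of a discrete PID controller of the kind the PID control device 700 could apply to the difference between the target position and the estimated position. The class structure, gain values, and per-axis decomposition are assumptions made only for this sketch.

```python
class PID:
    """Discrete PID controller for one axis (e.g. the x position error in metres)."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """error = target value - estimated value; returns the control output."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: one controller per axis, fed with the optimized position estimate.
pid_x = PID(kp=0.8, ki=0.05, kd=0.2)
velocity_command_x = pid_x.update(error=1.5, dt=1 / 30)  # 1.5 m east of the target
```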
The communication unit 705 communicates with the process backend 500 and the unmanned aerial vehicle 100. The communication unit 705 transmits the flight parameters generated by the processing unit 710 to the unmanned aircraft 100. As a communication method with the process backend 500 and the unmanned aerial vehicle 100, a dedicated line, a wired LAN, a wireless LAN, mobile communication, or the like is used.
The storage section 720 may be used as a work memory of the processing section 710. The storage unit 720 may store data such as a flight path, data of a target state related to the flight of the unmanned aircraft 100, and data of an actual state. Such data may be acquired, for example, from the unmanned aerial vehicle 100 or a terminal that instructs control of the flight of the unmanned aerial vehicle 100 via the communication section 705.
Fig. 3 is a diagram showing one example of a concrete appearance of the unmanned aerial vehicle 100. In fig. 3, a perspective view of the unmanned aerial vehicle 100 is shown when flying in the moving direction STV 0. The unmanned aerial vehicle 100 is one example of a mobile body.
As shown in fig. 3, the roll axis (refer to the x-axis) is set in a direction parallel to the ground and along the moving direction STV 0. In this case, the pitch axis (see the y axis) is set in the direction parallel to the ground and perpendicular to the roll axis, and the yaw axis (see the z axis) is set in the direction perpendicular to the ground and perpendicular to the roll axis and the pitch axis.
The unmanned aerial vehicle 100 includes a UAV main body 102, a universal joint 200, an image pickup unit 220, and a plurality of image pickup units 230.
The UAV body 102 includes a plurality of rotors (propellers). UAV body 102 flies unmanned aircraft 100 by controlling the rotation of the plurality of rotors. UAV body 102 uses, for example, four rotors to fly unmanned aircraft 100. The number of rotors is not limited to four. Further, the unmanned aerial vehicle 100 may be a fixed wing aircraft without rotors.
The imaging unit 220 is an imaging camera that images an object (for example, an overhead scene as an aerial image capture target, a scene such as a mountain or a river, or a building on the ground) included in a desired imaging range.
The plurality of imaging units 230 are sensing cameras that capture images of the surroundings of the unmanned aircraft 100 in order to control the flight of the unmanned aircraft 100. The 2 cameras 230 may be provided on the nose, i.e., the front surface, of the unmanned aircraft 100. The other 2 image pickup units 230 may be provided on the bottom surface of the unmanned aircraft 100. The two image pickup portions 230 on the front side may be paired to function as a so-called stereo camera. The 2 image pickup portions 230 on the bottom surface side may also be paired to function as a stereo camera. Three-dimensional spatial data around the unmanned aerial vehicle 100 may be generated based on the images captured by the plurality of cameras 230. In addition, the number of the image pickup units 230 included in the unmanned aerial vehicle 100 is not limited to four. The unmanned aerial vehicle 100 may include at least one image pickup unit 230. The unmanned aircraft 100 may include at least one camera 230 at the nose, tail, sides, bottom, and top of the unmanned aircraft 100, respectively. The angle of view settable in the image pickup section 230 may be larger than the angle of view settable in the image pickup section 220. The image pickup section 230 may have a single focus lens or a fisheye lens.
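As a reference for the stereo camera operation mentioned above, the following sketch shows the standard depth-from-disparity relation for a rectified stereo pair. The specific formula and parameter names are common assumptions for illustration and are not stated in the present disclosure.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a point seen by a rectified stereo pair.

    focal_px:     focal length in pixels
    baseline_m:   distance between the two imaging units in metres
    disparity_px: horizontal pixel shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid depth")
    return focal_px * baseline_m / disparity_px  # depth in metres

# e.g. 700 px focal length, 10 cm baseline, 14 px disparity -> about 5 m depth
print(depth_from_disparity(700, 0.10, 14))
```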
Fig. 4 is a block diagram showing one example of the hardware configuration of the unmanned aerial vehicle 100. The unmanned aerial vehicle 100 includes a UAV control Unit 110, a communication interface 150, a memory 160, a memory 170, a universal joint 200, a rotor mechanism 210, a camera Unit 220, a camera Unit 230, a GPS receiver 240, an Inertial Measurement Unit (IMU) 250, a magnetic compass 260, an air pressure altimeter 270, an ultrasonic sensor 280, and a laser Measurement instrument 290. The GPS receiver 240 is an example of a position measurement section.
In addition, here, it is assumed that the unmanned aircraft 100 includes the GPS receiver 240, but the accuracy of the GPS signal received by the GPS receiver 240 is low. Additionally, it is also contemplated that the unmanned aircraft 100 may not include the GPS receiver 240 and may not be able to acquire GPS signals at all.
The UAV control Unit 110 is constituted by, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor). The UAV control unit 110 performs signal processing for controlling the operation of each unit of the unmanned aircraft 100 as a whole, input/output processing of data with other units, arithmetic processing of data, and storage processing of data.
The UAV controller 110 controls the flight of the unmanned aircraft 100 according to a program stored in the memory 160. The UAV control 110 may take aerial images. The UAV control 110 may obtain information of flight parameters from the PID control 700 via the communication interface 150. The UAV control 110 may control the flight of the unmanned aerial vehicle 100 based on the acquired flight parameters.
The UAV control 110 acquires position information indicating a position of the unmanned aerial vehicle 100. The UAV controller 110 may obtain, from the GPS receiver 240, location information indicating the latitude, longitude, and altitude at which the unmanned aircraft 100 is located. The UAV control unit 110 may acquire latitude and longitude information indicating the latitude and longitude where the unmanned aircraft 100 is located from the GPS receiver 240, and may acquire altitude information indicating the altitude where the unmanned aircraft 100 is located from the barometric altimeter 270 as position information. The UAV control section 110 may acquire a distance between a radiation point of the ultrasonic wave generated by the ultrasonic sensor 280 and a reflection point of the ultrasonic wave as the altitude information.
The UAV control 110 may obtain orientation information from the magnetic compass 260 that represents the orientation of the unmanned aircraft 100. The orientation information may be represented by, for example, a bearing corresponding to the orientation of the nose of the unmanned aircraft 100.
The UAV control unit 110 may acquire position information indicating a position where the unmanned aircraft 100 should exist when the imaging unit 220 performs imaging in accordance with the imaging range of the imaging. The UAV control 110 may obtain from the memory 160 location information indicating where the unmanned aircraft 100 should be. The UAV control 110 may obtain, via the communication interface 150, location information from other devices that indicates a location where the unmanned aerial vehicle 100 should be present. The UAV control unit 110 may determine a position where the unmanned aircraft 100 can exist by referring to the three-dimensional map database, and acquire the position as position information indicating a position where the unmanned aircraft 100 should exist. The UAV control 110 may send, via the communication interface 150, position information to the flight control 800 (e.g., the process backend 500) indicating a location where the unmanned aircraft 100 should be present. The position information indicating the position where the unmanned aerial vehicle 100 should exist may be included in the information of the predetermined flight path.
The UAV control unit 110 can acquire imaging range information indicating imaging ranges of the imaging unit 220 and the imaging unit 230. The UAV control unit 110 may acquire, as a parameter for determining an imaging range, angle-of-view information indicating angles of view of the imaging unit 220 and the imaging unit 230 from the imaging unit 220 and the imaging unit 230. The UAV control unit 110 may acquire information indicating the imaging directions of the imaging unit 220 and the imaging unit 230 as a parameter for determining the imaging range. The UAV control unit 110 may acquire, for example, attitude information indicating an attitude state of the imaging unit 220 from the universal joint 200 as information indicating an imaging direction of the imaging unit 220. The attitude information of the imaging unit 220 may indicate the angle at which the pitch axis and the yaw axis of the gimbal 200 are rotated from the reference rotation angle.
The UAV control unit 110 may acquire, as a parameter for determining the imaging range, estimated position information indicating an estimated position where the unmanned aerial vehicle 100 is located. The UAV control unit 110 may obtain imaging range information by defining an imaging range indicating a geographical range to be imaged by the imaging unit 220, based on the angles of view and the imaging directions of the imaging unit 220 and the imaging unit 230 and the estimated position of the unmanned aircraft 100, and generating imaging range information.
The UAV control unit 110 may acquire imaging range information from the memory 160. The UAV control section 110 may acquire imaging range information via the communication interface 150.
The UAV control unit 110 controls the universal joint 200, the rotor mechanism 210, the imaging unit 220, and the imaging unit 230. The UAV control unit 110 may control an imaging range of the imaging unit 220 by changing an imaging direction or an angle of view of the imaging unit 220. The UAV control unit 110 can control the imaging range of the imaging unit 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
The imaging range refers to a geographical range imaged by the imaging unit 220 or the imaging unit 230. The imaging range is defined by latitude, longitude, and altitude. The imaging range may be a range of three-dimensional spatial data defined by latitude, longitude, and altitude. The imaging range may be a range of two-dimensional spatial data defined by latitude and longitude. The imaging range may be specified according to the angle of view and the imaging direction of the imaging unit 220 or the imaging unit 230, and the position where the unmanned aerial vehicle 100 is located. The imaging direction of the imaging section 220 and the imaging section 230 may be defined by the azimuth and depression angle that the front of the imaging lens in which the imaging section 220 and the imaging section 230 are provided faces. The imaging direction of the imaging section 220 may be a direction determined by the orientation of the nose of the unmanned aerial vehicle 100 and the attitude state of the imaging section 220 with respect to the gimbal 200. The imaging direction of the imaging section 230 may be a direction determined by the orientation of the nose of the unmanned aerial vehicle 100 and the position where the imaging section 230 is provided. The imaging direction may coincide with the imaging orientation.
The UAV control 110 may determine the environment around the unmanned aircraft 100 by analyzing a plurality of images captured by the plurality of cameras 230. The UAV control 110 may control flight based on the environment surrounding the unmanned aircraft 100, such as avoiding obstacles.
The UAV control unit 110 can acquire stereo information (three-dimensional information) indicating a stereo shape (three-dimensional shape) of an object existing around the unmanned aircraft 100. The object may be, for example, a part of a landscape of a building, a road, a vehicle, a tree, etc. The stereo information is, for example, three-dimensional spatial data. The UAV control unit 110 may acquire the stereoscopic information by generating stereoscopic information indicating a stereoscopic shape of an object existing around the unmanned aircraft 100 from each of the images obtained from the plurality of imaging units 230. The UAV control unit 110 may acquire the stereoscopic information indicating the stereoscopic shape of the object existing around the unmanned aircraft 100 by referring to the three-dimensional map database stored in the memory 160 or the memory 170. The UAV control section 110 can acquire the stereoscopic information relating to the stereoscopic shape of the object existing around the unmanned aircraft 100 by referring to the three-dimensional map database managed by the server existing on the network.
UAV control 110 controls the flight of unmanned aircraft 100 by controlling rotor mechanism 210. That is, the UAV controller 110 controls the position including the latitude, longitude, and altitude of the unmanned aerial vehicle 100 by controlling the rotor mechanism 210. The UAV control unit 110 may control the imaging range of the imaging unit 220 by controlling the flight of the unmanned aerial vehicle 100. The UAV control unit 110 may control an angle of view of the image pickup unit 220 by controlling a zoom lens included in the image pickup unit 220. The UAV control unit 110 may control an angle of view of the image capturing unit 220 by digital zooming using a digital zoom function of the image capturing unit 220.
When the imaging unit 220 is fixed to the unmanned aircraft 100 and the imaging unit 220 cannot be moved, the UAV control unit 110 may cause the imaging unit 220 to capture an image of a desired imaging range in a desired environment by moving the unmanned aircraft 100 to a predetermined position on a predetermined date. Alternatively, even when the imaging unit 220 does not have the zoom function and the angle of view of the imaging unit 220 cannot be changed, the UAV control unit 110 may cause the imaging unit 220 to capture an image of a desired imaging range in a desired environment by moving the unmanned aerial vehicle 100 to a specific position on a specific date.
The communication interface 150 communicates with other communication devices, such as terminals or transmitters (remote controls) that indicate flight controls of the unmanned aircraft 100. The communication interface 150 may perform wireless communication by any wireless communication method. The communication interface 150 may perform wired communication by any wired communication method. The communication interface 150 can transmit the aerial image, additional information (metadata) related to the aerial image to the terminal. Communication interface 150 may communicate with at least one device included in flight control device 800 (e.g., process backend 500, PID control device 700).
The memory 160 stores programs and the like necessary for the UAV control unit 110 to control the gimbal 200, the rotor mechanism 210, the imaging unit 220, the imaging unit 230, the GPS receiver 240, the inertial measurement unit 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measurement instrument 290. The memory 160 may be a computer-readable recording medium and may include at least one of an SRAM (Static Random Access Memory), a DRAM (Dynamic Random Access Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), and a flash memory such as a USB (Universal Serial Bus) memory. The memory 160 may be detached from the unmanned aircraft 100. The memory 160 may operate as a working memory.
The memory 170 may include at least one of an HDD (Hard Disk Drive), an SSD (Solid State Drive), an SD card, a USB memory, and other memories. The memory 170 may store various information and various data. The memory 170 may be detachable from the unmanned aircraft 100. The memory 170 may record aerial images.
The memory 160 or the storage 170 may store information of the aerial position, the aerial route (flight route) generated by the terminal or the unmanned aircraft 100. The information of the aerial position and the aerial route may be set by the UAV controller 110 as one of the aerial parameters related to aerial photography predetermined by the unmanned aircraft 100 or the flight parameters related to flight predetermined by the unmanned aircraft 100. The setting information may be stored in the memory 160 or the storage 170.
The gimbal 200 may support the camera 220 rotatably about a yaw axis, a pitch axis, and a roll axis. The gimbal 200 can rotate the imaging unit 220 around at least one of the yaw axis, pitch axis, and roll axis, thereby changing the imaging direction of the imaging unit 220.
Rotor mechanism 210 has a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors. The rotary wing mechanism 210 is controlled to rotate by the UAV control unit 110, thereby flying the unmanned aerial vehicle 100.
The image pickup unit 220 picks up an image of an object in a desired image pickup range and generates data of a picked-up image. Image data (for example, an aerial image) captured by the imaging unit 220 may be stored in the memory of the imaging unit 220 or the memory 170.
The imaging unit 230 captures an image of the periphery of the unmanned aircraft 100 and generates data of a captured image. The image data of the image pickup section 230 may be stored in the memory 170.
The GPS receiver 240 receives a plurality of signals (GPS signals) indicating the times of day transmitted from a plurality of navigation satellites (i.e., GPS satellites) and the positions (coordinates) of the respective GPS satellites. The GPS receiver 240 calculates the position of the GPS receiver 240 (i.e., the position of the unmanned aircraft 100) based on the plurality of received signals. The GPS receiver 240 outputs the position information of the unmanned aerial vehicle 100 to the UAV control section 110. In addition, the UAV controller 110 may calculate the position information of the GPS receiver 240 instead of the GPS receiver 240. In this case, information indicating the time and the position of each GPS satellite included in the plurality of signals received by the GPS receiver 240 is input to the UAV control unit 110.
The inertial measurement unit 250 detects the attitude of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110. The inertial measurement unit 250 can detect the three-axis acceleration and the three-axis angular velocity of the pitch axis, the roll axis, and the yaw axis of the unmanned aerial vehicle 100 in the front-rear direction, the left-right direction, and the up-down direction, as the attitude of the unmanned aerial vehicle 100.
The magnetic compass 260 detects the orientation of the nose of the unmanned aerial vehicle 100, and outputs the detection result to the UAV control section 110.
The barometric altimeter 270 detects the flying height of the unmanned aircraft 100, and outputs the detection result to the UAV control unit 110.
The ultrasonic sensor 280 emits ultrasonic waves, detects the ultrasonic waves reflected by the ground or an object, and outputs the detection result to the UAV control unit 110. The detection result may show the distance from the unmanned aerial vehicle 100 to the ground, i.e., the altitude. The detection result may show the distance from the unmanned aerial vehicle 100 to the object (subject).
The laser measurement instrument 290 irradiates a laser beam on an object, receives reflected light reflected by the object, and measures the distance between the unmanned aircraft 100 and the object (subject) by the reflected light. As an example of the laser-based distance measuring method, a time-of-flight method may be cited.
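As a brief illustration of the time-of-flight relation mentioned above, the distance follows from the round-trip travel time of the pulse; the function below is only a sketch of that relation, not part of the present disclosure.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_seconds):
    """Distance = c * t / 2, since the pulse travels to the object and back."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

print(tof_distance(33.4e-9))  # roughly 5 m for a 33.4 ns round trip
```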
In addition, although fig. 4 illustrates a case where the unmanned aircraft 100 includes the GPS receiver 240, the unmanned aircraft 100 may not include the GPS receiver 240. Even in this case, the unmanned aerial vehicle 100 can perform flight control based on the estimated position of the unmanned aerial vehicle 100, regardless of whether the GPS receiver 240 is provided or not.
Next, the operation of assisting the autonomous flight of the unmanned aerial vehicle 100 in the unmanned aerial vehicle system 5 is shown. Here, a situation is assumed in which the unmanned aircraft 100 flies in a space where it is difficult to receive the GPS signal, for example, below a bridge or indoors. That is, the accuracy of positioning of the unmanned aircraft 100 is assumed to be low. Note that the unmanned aircraft 100 may not include the GPS receiver 240, that is, the unmanned aircraft 100 may not have a positioning function. The plurality of cameras 300 are disposed in various places such as the ground where the unmanned aerial vehicle 100 can be imaged.
Fig. 5 is a diagram illustrating the reprojection error. Fig. 5 shows the actual position of the unmanned aircraft 100, an assumed position xt of the unmanned aircraft 100, the projected assumed positions π(cj, xt) on the projection planes of the images captured by the plurality of cameras 300, and the observed positions ojt. Here, j and t are variables, and the assumed position xt is variable.
The plurality of cameras 300 each take a picture of the unmanned aircraft 100 flying, for example, in the sky. In the captured image captured by one camera, the pixel position of the unmanned aircraft 100 on the frame can be recognized in a planar manner, but it is difficult to recognize the position in the depth direction with respect to the image plane (projection plane) of the captured image. Therefore, a plurality of cameras 300 capable of photographing the unmanned aerial vehicle 100 from different angles are provided.
In fig. 5, the camera 300 with camera pose cj and the camera 300 with camera pose cj-1 are shown, together with a frame GM1 of the image captured by the camera 300 with camera pose cj and a frame GM2 of the image captured by the camera 300 with camera pose cj-1. In the frame GM1, there are an observed position ojt, which is the pixel position at which the unmanned aircraft 100 is observed, and a projected assumed position π(cj, xt) obtained by projecting the assumed position xt of the unmanned aircraft 100. The observed position ojt and the projected assumed position π(cj, xt) do not always coincide, and the difference between them is a reprojection error. Similarly, in the frame GM2, there are an observed position o(j-1)t obtained by observing the unmanned aircraft 100 and a projected assumed position π(cj-1, xt) obtained by projecting the assumed position xt of the unmanned aircraft 100. The observed position o(j-1)t may not coincide with the projected assumed position π(cj-1, xt), and a reprojection error may occur. The observed positions ojt and o(j-1)t are the positions at which the actually observed unmanned aircraft 100 is projected and are fixed, whereas the projected assumed positions π(cj, xt) and π(cj-1, xt) are variable. The processing backend 500 optimizes the assumed position so as to reduce these reprojection errors, thereby optimizing the estimated position of the unmanned aircraft 100. Here, j is an example of camera identification information, and t is an example of time.
The communication section 405 of the tracking front end 400 can receive the captured images captured by the plurality of cameras 300 and store them in the storage section 420. The communication unit 405 may transmit the captured images captured by the plurality of cameras 300, the observed positions ojt, o(j-1)t, ..., and the camera poses cj, cj-1, ... to the processing backend 500. The communication unit 505 of the processing backend 500 may receive the captured images, the observed positions of the unmanned aircraft 100, and the information on the camera poses, and store them in the storage unit 520. The communication unit 405 may also omit transmitting the captured images to the processing backend 500.
The communication unit 505 of the processing backend 500 receives the information of the observed positions ojt, o(j-1)t, ... of the unmanned aircraft 100 and the information of the camera poses cj, cj-1, .... The communication unit 505 may also receive the captured images. Further, the processing unit 510 assumes the three-dimensional position of the unmanned aircraft 100 at time t at various positions to obtain an assumed position xt, and derives (for example, calculates) the projected assumed positions π(cj, xt), π(cj-1, xt), .... For example, the processing section 510 may calculate the projected assumed position for each camera 300 based on the assumed position, which is a three-dimensional position, and the camera pose. The processing unit 510 performs the optimization processing according to formula (1) based on the observed positions ojt, o(j-1)t, ... and the projected assumed positions π(cj, xt), π(cj-1, xt), ..., optimizes the assumed position of the unmanned aircraft 100, and estimates the position of the unmanned aircraft 100.
[ mathematical formula 1 ]
argmin_{xt} Σ_{j} wjt · p(π(cj, xt) − ojt)   ......(1)
The argmin function shown in formula (1) returns, as its value, the argument xt that minimizes the function value. The function value represents an error score indicating the error between each assumed position xt and the existing position of the unmanned aircraft 100, and formula (1) expresses the minimization of the reprojection error. cj represents the camera pose. The camera pose may be determined by the imaging position and the imaging direction (orientation). j (j = 1, ..., n) may be a camera number used to identify the plurality of cameras 300. xt represents the three-dimensional position of the unmanned aircraft 100 in real space at time t and is assumed at various positions. t represents the time. On the frame (imaging plane, projection plane) of the camera 300 with camera pose cj, the projected assumed position π(cj, xt) of the unmanned aircraft 100 and the observed position ojt of the unmanned aircraft 100 captured by that camera appear. π(cj, xt) represents the projected assumed position obtained by projecting the unmanned aircraft 100, assumed to be at the position xt, onto the frame of the camera 300 with camera pose cj. ojt represents the observed position, that is, the pixel position of the unmanned aircraft 100 in the frame of the image captured at time t by the camera 300 with camera pose cj. p(π(cj, xt) − ojt) is a function applied to the reprojection error, which is the difference between the projected assumed position and the observed position. Therefore, the processing unit 510 may estimate the position of the unmanned aircraft 100 by changing the assumed position xt according to formula (1) and searching for the assumed position xt that minimizes the reprojection error.
In fig. 5, in the frame of the camera 300 with camera pose cj, the projected assumed position π(cj, xt) and the observed position ojt are shown. Likewise, in the frame of the camera 300 with camera pose cj-1, the projected assumed position π(cj-1, xt) and the observed position o(j-1)t are shown.
wjt is a coefficient indicating reliability. The reliability wjt is calculated according to formula (2). The reliability may also be omitted from formula (1). Formula (2) exemplifies the case where the reliability is determined by the distance between the camera 300 and the unmanned aircraft 100.
[ mathematical formula 2 ]
[ Formula (2): the reliability wjt expressed in terms of ωjt, the fixed value ds, and the distance d between the camera 300 and the unmanned aircraft 100 (image PCTCN2019113651-APPB-000001 not reproduced) ]
Here, ωjt is a fixed value determined by the camera number j and the time t, and may be a value for adjusting the range of values derived from formulas (1) and (2). ds is a fixed value. d represents the distance between the unmanned aircraft 100 and the camera 300. For example, the camera 300 may identify the unmanned aircraft 100 in the captured image by image recognition or the like and derive (for example, calculate) the distance to the unmanned aircraft 100 from the size of the unmanned aircraft 100 in the image. Alternatively, the camera 300 may have a ranging sensor with which the distance to the unmanned aircraft 100 is measured. The shorter the distance d, the higher the reliability wjt. When the unmanned aircraft 100 is not captured in the captured image, the reliability wjt may be set to 0, and the assumed position xt of the unmanned aircraft 100 in this case may not be used for the estimated position. The distance d may be derived by the tracking front end 400 or the processing backend 500. For example, the processing unit 510 of the processing backend 500 may derive (for example, calculate) the distance d based on the distance between the assumed position xt and each camera 300. The information of the distance d may be notified to the processing backend 500 from the camera 300 or the tracking front end 400.
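Since formula (2) is only available as an image here, the following sketch shows one plausible distance-based reliability that is consistent with the description (higher weight at shorter distance d, scale set by ds and ωjt, zero when the aircraft is not captured). The exponential decay form is an assumption for illustration, not the formula of the present disclosure.

```python
import math

def reliability(omega_jt, d, ds, drone_visible=True):
    """Illustrative distance-based reliability weight w_jt.

    omega_jt: fixed per-camera, per-time scaling value
    d:        distance between the camera and the aircraft [m]
    ds:       fixed reference distance [m]
    The description only states that w_jt rises as d falls and is 0 when
    the aircraft is not captured; the exponential decay is an assumed form.
    """
    if not drone_visible:
        return 0.0
    return omega_jt * math.exp(-d / ds)
```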
As described above, the argmin function returns, as the estimated position, the assumed position xt that minimizes the value obtained by multiplying the difference (reprojection error) between the projected assumed position π(cj, xt) and the observation position ojt of the unmanned aircraft 100 by the reliability wjt, applying the normalization function p, and summing over all the cameras 300 (the cameras with camera poses c1 to cn) and over the observation times t. n may be any value representing the total number of cameras.
Estimating the position of the unmanned aircraft 100 using the argmin function shown in equation (1) is merely an example; the position may be estimated by another method. Likewise, deriving the reliability wjt according to equation (2) is an example; the reliability may be derived by other methods.
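As a rough illustration of the optimization described above, the following Python sketch searches a set of candidate positions for the one with the smallest weighted reprojection error. It is a hypothetical sketch, not the patent's implementation: it assumes a pinhole model for π(cj, xt), uses a squared error in place of the function p, assumes a simple inverse-distance form for the reliability wjt (the exact equation (2) is not reproduced in this text), and uses a coarse candidate search instead of a proper nonlinear optimizer.

```python
import numpy as np

def project(camera_pose, x):
    """Pinhole projection pi(c_j, x): world point -> pixel (assumed camera model)."""
    R, t, K = camera_pose["R"], camera_pose["t"], camera_pose["K"]
    p_cam = R @ x + t                      # world -> camera coordinates
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]                # pixel coordinates

def reliability(camera_pose, x, omega=1.0, ds=10.0):
    """Assumed stand-in for w_jt: decreases as the camera-to-aircraft distance d grows."""
    d = np.linalg.norm(x - camera_pose["center"])
    return omega * ds / (ds + d)

def error_score(x, cameras, observations):
    """Sum over cameras j of w_jt * ||pi(c_j, x) - o_jt||^2 for one time instant t."""
    score = 0.0
    for cam, obs in zip(cameras, observations):
        if obs is None:                    # aircraft not visible: treated as w_jt = 0
            continue
        r = project(cam, x) - np.asarray(obs)   # reprojection error
        score += reliability(cam, x) * float(r @ r)
    return score

def estimate_position(candidates, cameras, observations):
    """argmin over the assumed positions x_t (here a coarse list of candidates)."""
    scores = [error_score(x, cameras, observations) for x in candidates]
    return candidates[int(np.argmin(scores))]
```

In practice a gradient-based or bundle-adjustment style solver would replace the candidate search, but the score being minimized has the same structure.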
Communication unit 505 of process backend 500 transmits the data of the estimated position of unmanned aircraft 100 optimized by processing unit 510 to PID control device 700. The communication unit 705 of the PID control device 700 receives data of the estimated position of the unmanned aircraft 100 and stores the data in the storage unit 720. The processing unit 710 of the PID control device 700 performs PID control for flying the unmanned aircraft 100 along the target flight path stored in the storage unit 720, based on the estimated position of the unmanned aircraft 100. The communication unit 705 transmits the flight parameters obtained by the PID control to the unmanned aircraft 100.
Upon receiving the flight parameters from PID control device 700, communication interface 150 of unmanned aircraft 100 stores them in memory 160. The UAV control 110 controls the rotor mechanism 210 according to the flight parameters, controls the flight of the unmanned aircraft 100, and continues autonomous flight.
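As a rough illustration of this control loop, the sketch below shows a generic per-axis PID controller that turns the difference between a target path point and the estimated position into a velocity command. The gains, time step, and command format are assumptions for illustration; they are not specified in this disclosure.

```python
import numpy as np

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = np.zeros(3)
        self.prev_error = np.zeros(3)

    def step(self, target_pos, estimated_pos):
        # error between the target flight path point and the estimated position
        error = np.asarray(target_pos, dtype=float) - np.asarray(estimated_pos, dtype=float)
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: velocity command toward the next waypoint of the target flight path.
pid = PID(kp=0.8, ki=0.05, kd=0.2, dt=0.1)
velocity_command = pid.step(target_pos=[10.0, 5.0, 3.0], estimated_pos=[9.2, 5.4, 2.7])
```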
In this way, the unmanned aerial vehicle system 5 includes a flight control device 800 (an example of a position estimation device) that estimates the presence position of the unmanned aircraft 100. The flight control device 800 includes a processing unit (for example, at least one of the processing unit 410 of the tracking front end 400 and the processing unit 510 of the processing back end 500) that performs processing related to the estimation of the presence position of the unmanned aircraft 100. The processing unit 410 of the tracking front end 400 may acquire a plurality of captured images of the unmanned aircraft 100 captured by the plurality of cameras 300. The processing unit 410 of the tracking front end 400 may acquire information of the pose of each camera 300 (the arrangement position and imaging direction of each camera 300). The processing unit 510 of the processing back end 500 may calculate an error score (for example, the value evaluated inside the argmin function) indicating the error between each assumed position xt and the existing position of the unmanned aircraft 100, based on the pose of each camera 300, each assumed position at which the unmanned aircraft 100 is assumed to exist in real space, and the observation position of the unmanned aircraft 100 in each captured image (an example of the image position where the unmanned aircraft 100 exists). The processing back end 500 may estimate the presence position of the unmanned aircraft 100 in real space based on the error score. The processing back end 500 alone may also be an example of a position estimation device.
Thus, by capturing images of the unmanned aircraft 100 with the plurality of cameras 300 in various poses (arrangement positions, imaging directions), the flight control device 800 can estimate the position of the unmanned aircraft 100 taking into account not only the position in the plane of the imaging surface (projection surface) of each captured image but also the position in the depth direction. Therefore, even if the unmanned aircraft 100 cannot receive GPS signals, its position can be estimated, and the unmanned aircraft 100 can fly autonomously based on the estimated position. The unmanned aircraft 100 can also fly autonomously when it has no positioning function based on GPS signals or when the accuracy of GPS positioning is low. Further, since the flight control device 800 does not need to detect the acceleration of the unmanned aircraft 100 and double-integrate it (integrate it twice) to estimate the position, the position of the unmanned aircraft can be estimated with high accuracy without the error caused by double integration. In addition, since the flight control device 800 does not need to perform position estimation using motion capture or beacon signals, the locations where the unmanned aircraft 100 can be used are not limited.
The processing unit 410 of the tracking front end 400 may acquire a plurality of captured images of the unmanned aircraft 100 captured by the plurality of cameras 300 at a plurality of times t. The processing back end 500 may calculate the error score at the plurality of times t. The processing unit 510 of the processing back end 500 may estimate the presence position of the unmanned aircraft 100 in real space based on the error score.
Thus, the flight control device 800 can estimate the position of the unmanned aircraft 100 at a plurality of times t (time points), and therefore can also estimate the position of the unmanned aircraft 100 in consideration of the movement of the unmanned aircraft 100. Therefore, the estimation accuracy of the position of the unmanned aerial vehicle 100 is improved.
In addition, the processing section 510 of the process backend 500 may acquire the distance d between the camera 300 and the unmanned aircraft 100. The processing unit 510 may derive the reliability wjt of the error score with respect to the captured image from the distance d. The processing portion 510 may calculate an error score based on the reliability wjt.
Thus, when the distance d from the camera 300 to the unmanned aircraft 100 is long, that is, when the unmanned aircraft 100 is far away, the accuracy of the position estimated from that camera decreases; by taking the distance d into account, the processing unit 510 can reflect this in the estimation as well. For example, the shorter the distance d, the greater the reliability wjt, and the longer the distance d, the smaller the reliability wjt. Therefore, the flight control device 800 can improve the accuracy of estimating the position of the unmanned aircraft 100 using the error score.
Further, for each captured image captured at the arrangement position and in the imaging direction (camera pose cj) of each camera 300, the processing unit 510 may calculate the error score based on the difference (reprojection error) between the projected assumed position π(cj, xt), obtained by projecting each assumed position xt at which the unmanned aircraft 100 is assumed to exist in real space onto that captured image, and the observation position ojt, which is the position where the unmanned aircraft 100 exists in that captured image. The processing unit 510 may estimate the assumed position xt of the unmanned aircraft 100 with the smallest error score as the existing position of the unmanned aircraft 100. For example, the processing unit 510 may optimize the error score according to equation (1).
Thus, the flight control device 800 can derive an error score based on the difference between the projected assumed position projected onto the captured image captured by each camera 300 and the image position (observed position). Then, for example, the flight control device 800 can obtain an error score at each time t, and estimate the assumed position xt at which the error score is minimum as the existing position of the unmanned aircraft 100. Therefore, the processing unit 510 can optimize the estimated position of the unmanned aerial vehicle 100 based on the images captured by the plurality of cameras 300, for example, according to equation (1).
(embodiment mode 2)
In embodiment 2, a case is shown in which GPS position information detected by the unmanned aerial vehicle 100 is further considered in embodiment 1 to optimize the estimated position of the unmanned aerial vehicle 100.
Fig. 6 is a diagram showing a hardware configuration of a flight control device 800A according to embodiment 2. The unmanned aerial vehicle system 5A according to embodiment 2 has substantially the same configuration as that of embodiment 1. The same reference numerals are used for the same components as those in embodiment 1, and the description thereof will be omitted.
In embodiment 2, the unmanned aircraft 100 includes the GPS receiver 240, which is not omitted. The UAV control unit 110 of the unmanned aircraft 100 acquires GPS position information of the unmanned aircraft 100 based on the GPS signals received by the GPS receiver 240. The UAV control unit 110 transmits the GPS position information to the flight control device 800A via the communication interface 150.
The communication unit 505A of the processing back end 500A receives the GPS position information from the unmanned aircraft 100. Further, as in embodiment 1, the communication unit 505A receives the captured images captured by the plurality of cameras 300, information on the observation position of the unmanned aircraft 100, and information on the camera poses from the tracking front end 400. The captured images may not be received.
The processing unit 510A of the processing backend 500A considers the GPS position information received via the communication unit 505A when performing optimization of the reprojection error. That is, processing unit 510A may optimize the estimated position of unmanned aircraft 100 according to equation (3).
The argmin function shown in equation (3), like equation (1), returns the argument x having the smallest function value, and additionally includes a term λG·G(X, Z) related to the GPS position information. λG·G(X, Z) is a score that takes the GPS signal into account.
[ mathematical formula 3 ]
argminx( ∑t ∑j wjt p(π(cj, xt) − ojt) + λG G(X, Z) )   ......(3)
λG is given by equation (4).
λG = Ca × (Cn − N)   ......(4)
Here, Cn is the total number of GPS satellites. N is the number of GPS satellites used by the GPS receiver 240 for signal reception. Thus, Cn − N represents the number of GPS satellites not used by the GPS receiver 240 for signal reception. Ca is a coefficient. According to equation (4), the larger the number of GPS satellites used by the GPS receiver 240 for signal reception, the smaller the value of λG. In addition, λG may be a value corresponding to the strength of the GPS signal. The value of λG may be decreased when the strength of the GPS signal is high, and increased when it is low. In this case, λG·G(X, Z) in equation (3) is strongly affected by the value of G(X, Z), and the GPS position information can be reflected more. That is, when the strength of the GPS signal is high, the reliability of the GPS signal is high, and therefore the value of G(X, Z) may greatly affect the value of equation (3).
G (X, Z) is a value corresponding to the sum of differences between all assumed positions xt and GPS positions of the unmanned aircraft 100, and is expressed by equation (5).
[ mathematical formula 4 ]
G(X, Z) = ∑t (xt − ζ)²   ......(5)
Here, ζ represents the GPS position (the position detected by the GPS receiver 240). The larger the difference between the assumed position xt and the GPS position ζ, the larger G(X, Z).
In this way, the assumed position xt of the unmanned aircraft 100 is optimized by the argmin function so that the value in the parentheses of the argmin function (the reprojection error plus the GPS term) becomes small. That is, the estimated position of the unmanned aircraft 100 indicated by the argmin function is the assumed position at which the value in the parentheses of the argmin function is smallest.
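A minimal sketch of how the GPS terms of equations (3) to (5) could be combined with the reprojection score is shown below, assuming the reprojection score for a candidate trajectory has already been computed as in the earlier sketch. The coefficient values and function names are illustrative placeholders.

```python
import numpy as np

def gps_weight(ca, cn_total, n_received):
    """lambda_G = Ca x (Cn - N): grows with the number of satellites not received."""
    return ca * (cn_total - n_received)

def gps_term(assumed_positions, gps_positions):
    """G(X, Z) = sum over t of (x_t - zeta)^2, summed over the three axes."""
    diff = np.asarray(assumed_positions, dtype=float) - np.asarray(gps_positions, dtype=float)
    return float(np.sum(diff ** 2))

def score_with_gps(assumed_positions, gps_positions, reprojection_score,
                   ca=0.1, cn_total=32, n_received=8):
    """Value in the parentheses of equation (3) for one candidate trajectory X."""
    lam_g = gps_weight(ca, cn_total, n_received)
    return reprojection_score + lam_g * gps_term(assumed_positions, gps_positions)
```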
Communication unit 505A transmits the estimated position of unmanned aircraft 100 optimized by processing unit 510A to PID control device 700. The PID control device 700 generates flight parameters of the unmanned aircraft 100 based on the estimated position of the optimized unmanned aircraft 100, and transmits the flight parameters to the unmanned aircraft 100. The operation of the unmanned aerial vehicle 100 is the same as that of embodiment 1.
In the unmanned aerial vehicle system 5A according to embodiment 2, the processing unit 510A of the processing backend 500A optimizes the reprojection error so that the value of the argmin function of the equation (3) is equal to or less than (e.g., minimum) the threshold value th2 in consideration of the GPS position information, and estimates the existing position of the unmanned aerial vehicle 100.
In this way, the processing unit 510A of the processing back end 500A can acquire the GPS position (an example of the first measurement position) measured by the GPS receiver 240 (an example of the positioning unit) included in the unmanned aircraft 100. The processing unit 510A may calculate the error score based on the GPS position, for example, as shown in equation (3).
Thus, flight control device 800A can calculate an error score in consideration of a GPS signal that is generally used for positioning in unmanned aircraft 100, and optimize the estimated position of unmanned aircraft 100. Therefore, even if the accuracy of the GPS receiver 240 included in the unmanned aircraft 100 is low, the flight control device 800A can estimate the position of the unmanned aircraft 100 together with the GPS signal based on the same error score as that of embodiment 1, and can improve the position estimation accuracy. Thus, even when the positioning accuracy of the GPS receiver 240 is low, the flight control device 800A can estimate the position of the unmanned aircraft 100 using the error score, thereby assisting the positioning by the GPS receiver 240.
In addition, the GPS receiver 240 may receive GPS signals from a plurality of GPS satellites to acquire the GPS position information. The processing unit 510A may calculate the error score such that the greater the number of GPS satellites the GPS receiver 240 can receive, the more the GPS position information is reflected, that is, the greater the influence of the GPS signals on the error score. For example, in λG·G(X, Z) of equation (3), as the number of GPS satellites used by the unmanned aircraft 100 for signal reception increases, the value of the coefficient λG decreases, the influence of the value of G(X, Z) increases, and the GPS position information can be reflected more.
Thus, the flight control device 800A can adjust the influence of the GPS signal value on the error score based on the reliability of the GPS signal (the intensity of the GPS signal), calculate the error score, optimize the reprojection error, and estimate the position of the unmanned aircraft 100. Therefore, the flight control device 800A estimates the position of the unmanned aircraft 100 in consideration of the reception state of the GPS, and therefore the position estimation accuracy of the unmanned aircraft 100 can be further improved. Thus, the flight control device 800A can correct the estimation result of the position of the unmanned aircraft 100 using the error score of embodiment 1, based on the reliability of the GPS signal.
(embodiment mode 3)
In embodiment 3, in addition to embodiment 2, the estimated position of the unmanned aircraft 100 is optimized by further considering a dynamic factor and an estimated position of the unmanned aircraft 100 based on its acceleration. The dynamic factor is a factor that takes a physical phenomenon into account.
Fig. 7 is a diagram showing a hardware configuration of a flight control device 800B in embodiment 3.
The flight control device 800B includes the IMU front end 600 in addition to the tracking front end 400, the processing back end 500B, and the PID control device 700 of embodiment 1.
The IMU front end 600 has a communication unit 605, a processing unit 610, and a storage unit 620. The processing unit 610 estimates the position of the unmanned aircraft 100 by, for example, double-integrating (integrating twice) the acceleration (IMU data) measured by the inertial measurement unit 250, and acquires information of the estimated position. The position estimated based on the acceleration is also referred to as the acceleration estimated position.
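For illustration, the sketch below shows this double integration under simple assumptions (a fixed sample interval and a known initial position and velocity); the sample rate and initial state are assumptions, not values taken from this disclosure. It also makes clear why small acceleration errors accumulate over time, which is the error source mentioned earlier.

```python
import numpy as np

def integrate_imu(accelerations, dt=0.01, x0=np.zeros(3), v0=np.zeros(3)):
    """Acceleration estimated position: double integration of IMU acceleration samples."""
    x = np.array(x0, dtype=float)   # position after the first sample batch
    v = np.array(v0, dtype=float)   # velocity
    for a in accelerations:
        v += np.asarray(a, dtype=float) * dt   # first integration: velocity
        x += v * dt                            # second integration: position
    return x
```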
The communication unit 605 communicates with the unmanned aircraft 100 and the processing back end 500B. As the communication method with the unmanned aircraft 100 and the processing back end 500B, a dedicated line, a wired LAN, a wireless LAN, mobile communication, or the like is used. The communication unit 605 receives, for example, information of the acceleration measured by the inertial measurement unit 250 from the unmanned aircraft 100. The communication unit 605 transmits, for example, the acceleration estimated position of the unmanned aircraft 100 to the processing back end 500B. The storage unit 620 functions as a work memory of the processing unit 610.
The IMU front end 600 acquires, for example, 100 IMU data samples (acceleration data) per second. The tracking front end 400 acquires captured images at, for example, 30 frames per second (30 fps) or 60 frames per second (60 fps) and derives the image position (observation position) where the unmanned aircraft 100 is present. The processing back end 500B uses, for example, 10 IMU data samples per second.
Here, when the processing back end 500B optimizes the reprojection error to estimate the position of the unmanned aircraft 100, it is preferable that the frequency at which the IMU front end 600 updates the acceleration estimated position of the unmanned aircraft 100, the frequency at which the tracking front end 400 acquires captured images (and derives the observation positions), and the frequency at which the processing back end 500B estimates the position of the unmanned aircraft 100 match each other. To keep the IMU front end 600, the tracking front end 400, and the processing back end 500B synchronized, the processing unit 510B may control the processing frequency of the processing back end 500B (e.g., 10 times per second) accordingly.
For example, the processing unit 510B may acquire, via the communication unit 505B and the communication unit 605, acceleration estimated positions obtained by integrating the IMU data, for example, 10 positions per second. The processing unit 510B may acquire, via the communication unit 505B and the communication unit 405, information of the image positions (observation positions) in, for example, 10 captured images per second.
The communication unit 505B of the process backend 500B receives the GPS position information of the unmanned aircraft 100 from the unmanned aircraft 100 as in embodiment 2. The UAV control 110 of the unmanned aircraft 100 may acquire GPS location information of the unmanned aircraft 100 based on the GPS signals received by the GPS receiver 240 and transmit to the flight control apparatus 800B via the communication interface 150. In the present embodiment, the processing backend 500B may not acquire the GPS position information and may not take the GPS position information into account in calculating the error score.
As in embodiments 1 and 2, the communication unit 505B of the processing back end 500B can receive the captured images captured by the plurality of cameras 300 and data relating to the poses of the cameras 300 from the tracking front end 400. The communication unit 505B may receive information of the acceleration estimated position from the IMU front end 600. The processing back end 500B may not acquire the captured images.
In addition, the storage unit 520B may store a dynamic factor. The dynamic factor may include, for example, data for keeping the estimated position of the unmanned aircraft 100 within a range in which physical movement is possible according to an equation of motion. The dynamic factor may comprise, for example, a Kalman filter. When a Kalman filter is used, the processing unit 510B may estimate the next position of the unmanned aircraft 100 based on the current position, velocity, acceleration, etc. of the unmanned aircraft 100.
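The sketch below illustrates the idea of the dynamic factor with a constant-acceleration motion model used as a simplified stand-in for the Kalman filter prediction step; the time step and the maximum speed are example values (the 20 m/s limit is mentioned later in this description), and the function names are hypothetical.

```python
import numpy as np

def predict_next_position(position, velocity, acceleration, dt=0.1):
    """Equation-of-motion prediction: x + v*dt + 0.5*a*dt^2."""
    p = np.asarray(position, dtype=float)
    v = np.asarray(velocity, dtype=float)
    a = np.asarray(acceleration, dtype=float)
    return p + v * dt + 0.5 * a * dt ** 2

def physically_plausible(assumed_next, position, dt=0.1, max_speed=20.0):
    """Reject assumed positions that would require exceeding the maximum speed."""
    implied_speed = np.linalg.norm(np.asarray(assumed_next, dtype=float)
                                   - np.asarray(position, dtype=float)) / dt
    return implied_speed <= max_speed
```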
When optimizing the reprojection error to estimate the position of the unmanned aircraft 100, the processing unit 510B of the processing back end 500B considers at least one of the acceleration estimated position received via the communication unit 505B and the physical estimated position obtained from the dynamic factor. In addition, the processing unit 510B may consider the GPS position information when optimizing the reprojection error to estimate the position of the unmanned aircraft 100. That is, the processing unit 510B can optimize the reprojection error according to equation (6).
[ mathematical formula 5 ]
argminx( ∑t ∑j wjt p(π(cj, xt) − ojt) + λR R(X, Γ) + λI I(X, Δ) + λG G(X, Z) )   ......(6)
Here, R(X, Γ) represents a score that considers the physical estimated position, that is, a score that considers the dynamic factor, as shown in equation (7).
[ mathematical formula 6 ]
(Equation (7): R(X, Γ) is defined in terms of the assumed positions xt and the physical estimated positions; the formula image is not reproduced in this text.)
λR is a fixed value. The overlined xt (a function of γt) represents the physical estimated position at time t obtained from physical laws and expressions (for example, the equation of motion). Therefore, R(X, Γ) represents the cumulative value of the difference between the assumed position xt of the unmanned aircraft 100 and the physical estimated position of the unmanned aircraft 100. The overlined xt may be a function used to derive the physical estimated position. γt may be the variables required to derive the physical estimated position, such as the position and velocity of the unmanned aircraft 100 at time t.
In this way, the processing portion 510B may calculate an error score based on the position where the unmanned aerial vehicle 100 can physically move.
When the difference between the assumed position xt of the unmanned aircraft 100 and the physical estimated position of the unmanned aircraft 100 is large, λR·R(X, Γ) in equation (6) takes a large value and strongly affects the value of equation (6). For example, when the calculated next estimated position of the unmanned aircraft 100 deviates greatly from the current assumed position and cannot physically be reached, the value of the dynamic factor becomes large so that such an assumed position xt is not estimated as the position of the unmanned aircraft 100, and the value of equation (6) is unlikely to be the minimum. Likewise, when the calculated speed of the unmanned aircraft 100 (for example, 50 m/sec) exceeds the maximum speed of the unmanned aircraft 100 (for example, 20 m/sec), the value of the dynamic factor becomes large so that an assumed position xt corresponding to such a speed is not estimated as the position of the unmanned aircraft 100, and the value of equation (6) is unlikely to be the minimum. The flight control device 800B can thus estimate the position of the unmanned aircraft 100 in consideration of, for example, the physical estimated positions to which the unmanned aircraft 100 can move, and improve the position estimation accuracy.
In addition, I(X, Δ) represents a score that considers the acceleration estimated position, that is, a score that considers the acceleration factor, as shown in equation (8).
[ mathematical formula 7 ]
(Equation (8): I(X, Δ) is defined in terms of the assumed positions xt and the acceleration estimated positions; the formula image is not reproduced in this text.)
λI is a fixed value. The xt with a dashed overline (a function of δt) indicates the position derived based on the acceleration measured by the inertial measurement unit 250 (the acceleration estimated position). Therefore, I(X, Δ) represents the cumulative value of the difference between the assumed position xt of the unmanned aircraft 100 and the acceleration estimated position of the unmanned aircraft 100. The xt with a dashed overline may be a function used to derive the acceleration estimated position. δt may be the variables necessary for deriving the acceleration estimated position, for example, the position and speed of the unmanned aircraft 100 at time t.
In this way, the processing unit 510B can derive (e.g., calculate) the acceleration estimated position (an example of the second measurement position) of the unmanned aircraft 100 based on the acceleration measured by the inertial measurement unit 250 (an example of the acceleration measuring device) included in the unmanned aircraft 100. The processing unit 510B may calculate the error score based on this acceleration estimated position.
When the difference between the assumed position xt of the unmanned aircraft 100 and the acceleration estimated position of the unmanned aircraft 100 is large, λI·I(X, Δ) in equation (6) takes a large value and strongly affects the value of equation (6). The flight control device 800B can estimate the position of the unmanned aircraft 100 in consideration of the measurement result (acceleration) of the inertial measurement unit 250 included in the unmanned aircraft 100, and can improve the position estimation accuracy.
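Putting the terms together, the following sketch evaluates a combined score in the spirit of equation (6). Since equations (7) and (8) are given only as images in this text, the quadratic forms used below for R(X, Γ) and I(X, Δ) follow the textual description and are assumptions, as are the coefficient values.

```python
import numpy as np

def quadratic_term(assumed_positions, reference_positions):
    """sum over t of (x_t - ref_t)^2, used here for R(X, Gamma), I(X, Delta), and G(X, Z)."""
    diff = np.asarray(assumed_positions, dtype=float) - np.asarray(reference_positions, dtype=float)
    return float(np.sum(diff ** 2))

def combined_score(assumed, reproj_score, physical_est, accel_est, gps_pos,
                   lam_r=1.0, lam_i=1.0, lam_g=0.1):
    """Value in the parentheses of equation (6) for one candidate trajectory X."""
    return (reproj_score
            + lam_r * quadratic_term(assumed, physical_est)   # dynamic factor term
            + lam_i * quadratic_term(assumed, accel_est)      # acceleration factor term
            + lam_g * quadratic_term(assumed, gps_pos))       # GPS factor term
```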
In the unmanned aerial vehicle system 5B according to embodiment 3, the processing unit 510B of the processing back end 500B optimizes the reprojection error so that the value of the argmin objective of equation (6) is equal to or less than a threshold value th3 (e.g., minimum), in consideration of at least one of the GPS position information, the physical estimated position, and the acceleration estimated position, and estimates the existing position of the unmanned aircraft 100.
Communication unit 505B transmits the estimated position of unmanned aircraft 100 optimized by processing unit 510B to PID control device 700. The PID control device 700 generates flight parameters of the unmanned aircraft 100 based on the estimated position of the optimized unmanned aircraft 100, and transmits the flight parameters to the unmanned aircraft 100. The operation of the unmanned aerial vehicle 100 is the same as in embodiments 1 and 2.
In addition, equation (6) exemplifies the case where both the score λR·R(X, Γ) considering the dynamic factor and the score λI·I(X, Δ) considering the acceleration factor are added to equation (3), but only either one may be added.
The flight control devices 800, 800A, and 800B in the above embodiments are configured as devices separate from the camera 300 and the unmanned aircraft 100, but at least a part of their configuration may be implemented in the camera 300, the unmanned aircraft 100, or a terminal or server other than the camera 300 and the unmanned aircraft 100. The terminal may be, for example, a terminal capable of maneuvering the unmanned aircraft. The server may be a computer that is communicatively connectable with the camera 300 and the unmanned aircraft 100 via a network. In the flight control devices 800, 800A, and 800B, for example, the tracking front end 400 may be provided in any one of the camera 300, the unmanned aircraft 100, the terminal, and the server. The processing back ends 500, 500A, and 500B may be provided in any one of the camera 300, the unmanned aircraft 100, the terminal, and the server. The PID control device 700 may be provided in any one of the camera 300, the unmanned aircraft 100, the terminal, and the server. The IMU front end 600 may be provided in any one of the camera 300, the unmanned aircraft 100, the terminal, and the server.
Each processing unit in the above embodiments realizes various functions by, for example, executing a program stored in the corresponding storage unit with a processor. The processor may include an MPU (Micro Processing Unit), a CPU (Central Processing Unit), a DSP (Digital Signal Processor), a GPU (Graphics Processing Unit), and the like. Each processing unit controls each unit in the apparatus and executes various processes.
Each storage unit in the above embodiments includes a main storage device (e.g., a RAM (Random Access Memory) or a ROM (Read Only Memory)). Each storage unit may include a secondary storage device (e.g., an HDD (Hard Disk Drive) or an SSD (Solid State Drive)) and a tertiary storage device (e.g., an optical disc or an SD card). Each storage unit may include other storage devices. Each storage unit stores various data, information, and programs.
The present disclosure has been explained above using the embodiments, but the technical scope of the present disclosure is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes and modifications can be made in the above embodiments. As is apparent from the description of the claims, the embodiments to which such changes or improvements are made are included in the technical scope of the present disclosure.
The execution order of the operations, procedures, steps, and stages of the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be implemented in any order, as long as it is not explicitly indicated by "before", "prior to", or the like, and as long as the output of a preceding process is not used in a subsequent process. Even where the operational flow in the claims, the specification, and the drawings is described using "first", "next", and the like for convenience, this does not mean that the operations must be performed in that order.
[ Description of Reference Numerals ]
5, 5A, 5B unmanned aerial vehicle system
100 unmanned aerial vehicle
102 UAV main body
110 UAV control
150 communication interface
160 memory
170 memory
200 universal joint
210 rotor mechanism
220, 230 image pickup part
240 GPS receiver
250 inertia measuring device
260 magnetic compass
270 barometric altimeter
280 ultrasonic sensor
290 laser measuring instrument
300 Camera
400 trace front end
405, 505, 505A, 505B, 605, 705 communication unit
410, 510, 510A, 510B, 610, 710 processing unit
420, 520, 520B, 620, 720 storage section
500, 500A, 500B Process backend
600 IMU front end
700 PID control device
800, 800A, 800B flight control device
GM1, GM2 frame
Direction of movement of STV0

Claims (18)

  1. A position estimation device that estimates a position where a flying object exists, the device comprising:
    a processing unit that performs processing related to estimation of a position where the flying object exists;
    the processing part
acquiring a plurality of captured images obtained by capturing the flying object by a plurality of imaging devices,
    acquiring information of arrangement positions of the respective image pickup devices and image pickup orientations of the respective image pickup devices,
    based on the arrangement position of each imaging device, the imaging direction of each imaging device, each assumed position where the flying object is assumed to exist in the actual space, and the image position where the flying object exists in each imaged image,
calculating an error score representing an error between each of the assumed positions and the existing position of the flying object,
    and estimating the presence position of the flying body in the actual space based on the error score.
  2. The position estimation device according to claim 1,
    the processing unit acquires the plurality of captured images obtained by capturing the flying object at a plurality of times by the plurality of imaging devices,
    calculating the error score at the plurality of time instants,
    and estimating the presence position of the flying body in the actual space based on the error score.
  3. The position estimation device according to claim 2,
    the processing unit acquires a distance between the imaging device and the flying object,
    deriving a reliability of the error score with respect to the captured image based on the distance,
    and calculating the error score based on the reliability.
  4. The position estimation device according to claim 3,
the processing unit calculates the error score based on a difference between each projected assumed position, obtained by projecting each assumed position at which the flying object is assumed to exist in the actual space onto each captured image captured at the arrangement position and in the imaging direction of each imaging device, and each image position, which is the position where the flying object exists in each captured image captured at the arrangement position and in the imaging direction of each imaging device, and estimates the assumed position of the flying object with the smallest error score as the existence position of the flying object.
  5. The position estimation device according to any one of claims 1 to 4,
    the processing unit acquires a first measurement position of the flying object measured by a positioning unit included in the flying object,
    and calculating the error score based on the first measured position.
  6. The position estimation device according to claim 5, wherein the Positioning unit receives GPS signals from a plurality of GPS (Global Positioning System) satellites to acquire the first measurement position;
    the processing unit calculates the error score such that the influence of the first measured position on the error score increases as the number of GPS satellites that the positioning unit cannot receive increases.
  7. The position estimation device according to any one of claims 1 to 6, wherein the processing unit calculates the error score based on a position at which the flying object is physically movable.
  8. The position estimation device according to any one of claims 1 to 6,
    the processing unit derives a second measurement position of the flying object based on an acceleration measured by an acceleration measuring device included in the flying object,
    and calculating the error score based on the second measured position.
  9. A position estimation method for estimating a position where a flying object exists, comprising:
    acquiring a plurality of captured images obtained by capturing the flying object by a plurality of imaging devices;
    acquiring information of arrangement positions of the camera devices and camera shooting directions of the camera devices;
    based on the arrangement position of each imaging device, the imaging direction of each imaging device, each assumed position where the flying object is assumed to exist in the actual space, and the image position where the flying object exists in each imaged image,
    calculating an error score indicating an error between each of the assumed positions and the existing position of the flying object;
    and estimating the presence position of the flying body in the actual space based on the error score.
  10. The position estimation method according to claim 9, wherein the step of acquiring the plurality of captured images includes a step of acquiring the plurality of captured images obtained by capturing the flying object at a plurality of times by the plurality of imaging devices;
    the step of calculating the error score includes the step of calculating the error score at the plurality of times;
    the step of estimating the presence position of the flying object includes a step of estimating the presence position of the flying object in the real space based on the error score.
  11. The position estimation method according to claim 10, wherein the step of calculating the error score includes the steps of:
    acquiring the distance between the camera device and the flying object;
    deriving a reliability of the error score with respect to the captured image from the distance;
    and calculating the error score according to the reliability.
  12. The position estimation method according to claim 11, wherein the step of calculating the error score includes the step of: calculating the error score based on a difference between each projected assumed position, obtained by projecting each assumed position at which the flying object is assumed to exist in the actual space onto each captured image captured at the arrangement position and in the imaging direction of each imaging device, and each image position where the flying object exists in each captured image captured at the arrangement position and in the imaging direction of each imaging device;
    the step of estimating the position where the flying object exists includes a step of estimating an assumed position of the flying object where the error score is smallest as the position where the flying object exists.
  13. The position estimation method according to any one of claims 9 to 12, characterized in that the step of calculating the error score includes the steps of:
    acquiring a first measurement position of the flying object measured by a positioning unit included in the flying object;
    and calculating the error score based on the first measured position.
  14. The position estimation method according to claim 13, wherein the step of acquiring the first measured position includes the step of receiving GPS signals from a plurality of GPS satellites to acquire the first measured position;
    the step of calculating the error score includes a step of calculating the error score such that the influence of the first measurement position on the error score increases as the number of GPS satellites that the positioning unit cannot receive increases.
  15. The position estimation method according to any one of claims 9 to 14, characterized in that the step of calculating the error score includes a step of calculating the error score based on a position where the flying object can physically move.
  16. The position estimation method according to any one of claims 9 to 14, characterized in that the step of calculating the error score includes the steps of:
    deriving a second measurement position of the flying object based on an acceleration measured by an acceleration measurer included in the flying object;
    and calculating the error score based on the second measured position.
  17. A program for causing a position estimation device that estimates a presence position of a flying object to execute:
    acquiring a plurality of captured images obtained by capturing the flying object by a plurality of imaging devices;
    acquiring information of arrangement positions of the camera devices and camera shooting directions of the camera devices;
    based on the arrangement position of each imaging device, the imaging direction of each imaging device, each assumed position where the flying object is assumed to exist in the actual space, and the image position where the flying object exists in each imaged image,
    calculating an error score indicating an error between each of the assumed positions and the existing position of the flying object;
    and estimating the presence position of the flying body in the actual space based on the error score.
  18. A computer-readable recording medium having recorded thereon a program for causing a position estimation device that estimates a position of a flying object to execute:
    acquiring a plurality of captured images obtained by capturing the flying object by a plurality of imaging devices;
    acquiring information of arrangement positions of the camera devices and camera shooting directions of the camera devices;
    based on the arrangement position of each imaging device, the imaging direction of each imaging device, each assumed position where the flying object is assumed to exist in the actual space, and the image position where the flying object exists in each imaged image,
    calculating an error score indicating an error between each of the assumed positions and the existing position of the flying object;
    and estimating the presence position of the flying body in the actual space based on the error score.
CN201980009074.2A 2018-10-31 2019-10-28 Position estimation device, position estimation method, program, and recording medium Pending CN111615616A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018206007A JP6974290B2 (en) 2018-10-31 2018-10-31 Position estimation device, position estimation method, program, and recording medium
JP2018-206007 2018-10-31
PCT/CN2019/113651 WO2020088397A1 (en) 2018-10-31 2019-10-28 Position estimation apparatus, position estimation method, program, and recording medium

Publications (1)

Publication Number Publication Date
CN111615616A true CN111615616A (en) 2020-09-01

Family

ID=70462059

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980009074.2A Pending CN111615616A (en) 2018-10-31 2019-10-28 Position estimation device, position estimation method, program, and recording medium

Country Status (3)

Country Link
JP (1) JP6974290B2 (en)
CN (1) CN111615616A (en)
WO (1) WO2020088397A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008281553A (en) * 2007-04-09 2008-11-20 Seiko Epson Corp Current position positioning method, program, storage medium, positioning device, and electronic device
CN101313233A (en) * 2005-11-21 2008-11-26 日本电气株式会社 Position estimating system, position estimating method, position estimating device and its program
JP2010169682A (en) * 2009-01-23 2010-08-05 Honeywell Internatl Inc System and method for determining location of aircraft using radar image
CN104854637A (en) * 2012-12-12 2015-08-19 日产自动车株式会社 Moving object location/attitude angle estimation device and moving object location/attitude angle estimation method
WO2016059930A1 (en) * 2014-10-17 2016-04-21 ソニー株式会社 Device, method, and program
WO2018142496A1 (en) * 2017-02-01 2018-08-09 株式会社日立製作所 Three-dimensional measuring device
WO2018146803A1 (en) * 2017-02-10 2018-08-16 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Position processing device, flight vehicle, position processing system, flight system, position processing method, flight control method, program, and recording medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004005380A1 (en) * 2004-02-03 2005-09-01 Isra Vision Systems Ag Method for determining the position of an object in space
JP4985166B2 (en) * 2007-07-12 2012-07-25 トヨタ自動車株式会社 Self-position estimation device
JP2014186004A (en) * 2013-03-25 2014-10-02 Toshiba Corp Measurement device, method and program
US10311739B2 (en) * 2015-01-13 2019-06-04 Guangzhou Xaircraft Technology Co., Ltd Scheduling method and system for unmanned aerial vehicle, and unmanned aerial vehicle

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101313233A (en) * 2005-11-21 2008-11-26 日本电气株式会社 Position estimating system, position estimating method, position estimating device and its program
JP2008281553A (en) * 2007-04-09 2008-11-20 Seiko Epson Corp Current position positioning method, program, storage medium, positioning device, and electronic device
JP2010169682A (en) * 2009-01-23 2010-08-05 Honeywell Internatl Inc System and method for determining location of aircraft using radar image
CN104854637A (en) * 2012-12-12 2015-08-19 日产自动车株式会社 Moving object location/attitude angle estimation device and moving object location/attitude angle estimation method
WO2016059930A1 (en) * 2014-10-17 2016-04-21 ソニー株式会社 Device, method, and program
WO2018142496A1 (en) * 2017-02-01 2018-08-09 株式会社日立製作所 Three-dimensional measuring device
WO2018146803A1 (en) * 2017-02-10 2018-08-16 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Position processing device, flight vehicle, position processing system, flight system, position processing method, flight control method, program, and recording medium

Also Published As

Publication number Publication date
JP2020071154A (en) 2020-05-07
JP6974290B2 (en) 2021-12-01
WO2020088397A1 (en) 2020-05-07

Similar Documents

Publication Publication Date Title
US10860039B2 (en) Obstacle avoidance method and apparatus and unmanned aerial vehicle
JP6138326B1 (en) MOBILE BODY, MOBILE BODY CONTROL METHOD, PROGRAM FOR CONTROLLING MOBILE BODY, CONTROL SYSTEM, AND INFORMATION PROCESSING DEVICE
JP6962775B2 (en) Information processing equipment, aerial photography route generation method, program, and recording medium
WO2018193574A1 (en) Flight path generation method, information processing device, flight path generation system, program and recording medium
KR102239562B1 (en) Fusion system between airborne and terrestrial observation data
CN111213107B (en) Information processing device, imaging control method, program, and recording medium
CN111699454B (en) Flight planning method and related equipment
CN111247389B (en) Data processing method and device for shooting equipment and image processing equipment
JP6265576B1 (en) Imaging control apparatus, shadow position specifying apparatus, imaging system, moving object, imaging control method, shadow position specifying method, and program
US20210229810A1 (en) Information processing device, flight control method, and flight control system
JP2019028560A (en) Mobile platform, image composition method, program and recording medium
CN111344650B (en) Information processing device, flight path generation method, program, and recording medium
US20200217665A1 (en) Mobile platform, image capture path generation method, program, and recording medium
WO2019189381A1 (en) Moving body, control device, and control program
JP6515423B2 (en) CONTROL DEVICE, MOBILE OBJECT, CONTROL METHOD, AND PROGRAM
JP6974290B2 (en) Position estimation device, position estimation method, program, and recording medium
CN111788457A (en) Shape estimation device, shape estimation method, program, and recording medium
CN114586335A (en) Image processing apparatus, image processing method, program, and recording medium
WO2020001629A1 (en) Information processing device, flight path generating method, program, and recording medium
CN112313942A (en) Control device for image processing and frame body control
CN111656760A (en) Image generation device, image generation method, program, and recording medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200901

WD01 Invention patent application deemed withdrawn after publication