WO2020088397A1 - Position estimation device, position estimation method, program, and recording medium - Google Patents

Position estimation device, position estimation method, program, and recording medium

Info

Publication number
WO2020088397A1
WO2020088397A1 (PCT/CN2019/113651, CN2019113651W)
Authority
WO
WIPO (PCT)
Prior art keywords
error score
assumed
flying body
imaging
camera
Prior art date
Application number
PCT/CN2019/113651
Other languages
English (en)
French (fr)
Inventor
顾磊
陈斌
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to CN201980009074.2A (published as CN111615616A)
Publication of WO2020088397A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00: Constructional aspects of UAVs
    • B64U20/80: Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87: Mounting of imaging devices, e.g. mounting of gimbals
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network
    • G01C21/28: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network, with correlation of data from several navigational instruments
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01: Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13: Receivers
    • G01S19/14: Receivers specially adapted for specific applications
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00: Type of UAV
    • B64U10/10: Rotorcrafts
    • B64U10/13: Flying platforms
    • B64U10/14: Flying platforms with four distinct rotor axes, e.g. quadcopters
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00: UAVs specially adapted for particular uses or applications
    • B64U2101/30: UAVs specially adapted for particular uses or applications for imaging, photography or videography

Definitions

  • The present disclosure relates to a position estimation device, a position estimation method, a program, and a recording medium that estimate the existence position of a flying body.
  • A conventional unmanned aerial vehicle receives a GPS signal transmitted from a GPS (Global Positioning System) satellite, calculates its position based on the GPS signal, and flies autonomously (see Patent Document 1).
  • Patent Document 1: Japanese Patent Application Publication No. 2018-147467
  • However, unmanned aerial vehicles sometimes cannot receive GPS signals, in which case autonomous flight is difficult. Even when GPS signals cannot be received, the position of an unmanned aerial vehicle can be estimated by, for example, integrating its speed. However, when the speed is integrated to estimate the position, errors accumulate easily (for example, an error of 2 m per 10 m traveled), so the accuracy of the estimated position is not high enough.
  • As a technique for detecting the position of an object, there is a technique (motion capture) that captures the motion of the object by irradiating the object with light and photographing it.
  • However, motion capture requires a dedicated marker for tracking, so the scenes in which it can be used are limited.
  • The position estimation device is a position estimation device that estimates the existence position of a flying body, and includes a processing unit that performs processing related to estimating the existence position of the flying body. The processing unit acquires a plurality of captured images obtained by photographing the flying body with a plurality of imaging devices, acquires information on the arrangement position and the imaging direction of each imaging device, calculates, based on the arrangement position of each imaging device, the imaging direction of each imaging device, each assumed position at which the flying body is assumed to exist in actual space, and the image position at which the flying body exists in each captured image, an error score indicating the error between each assumed position and the existing position of the flying body, and estimates the existing position of the flying body in actual space based on the error score.
  • The processing unit may acquire a plurality of captured images obtained by photographing the flying body at multiple times with the multiple imaging devices, calculate the error score at the multiple times, and estimate the existence position of the flying body in actual space based on the error score.
  • The processing unit may acquire the distance between an imaging device and the flying body, derive the reliability of the error score for the captured image based on the distance, and calculate the error score based on the reliability.
  • The processing unit may calculate the error score based on the difference between each projection assumed position and each image position, where each projection assumed position is obtained by projecting each assumed position of the flying body in actual space onto each captured image captured with the arrangement position and imaging direction of each imaging device, and each image position is the image position at which the flying body exists in each captured image captured with the arrangement position and imaging direction of each imaging device; the assumed position of the flying body with the smallest error score may then be estimated as the existing position of the flying body.
  • the processing unit may acquire the first measured position of the flying body measured by the positioning unit included in the flying body, and calculate an error score based on the first measured position.
  • the positioning unit can receive GPS signals from multiple GPS (Global Positioning System) satellites to obtain the first measurement position.
  • The processing unit may calculate the error score in such a way that the greater the number of GPS satellites from which the positioning unit cannot receive signals, the greater the influence of the first measured position on the error score.
  • the processing unit may calculate the error score based on the position where the flying body can physically move.
  • the processing unit may derive the second measurement position of the flying body based on the acceleration measured by the acceleration measuring device included in the flying body, and calculate the error score based on the second measurement position.
  • The position estimation method is a position estimation method for estimating the existence position of a flying body, and includes the steps of: acquiring a plurality of captured images obtained by photographing the flying body with a plurality of imaging devices; acquiring information on the arrangement position and imaging direction of each imaging device; calculating, based on the arrangement position of each imaging device, the imaging direction of each imaging device, each assumed position at which the flying body is assumed to exist in actual space, and the image position at which the flying body exists in each captured image, an error score indicating the error between each assumed position and the existing position of the flying body; and estimating the existing position of the flying body in actual space based on the error score.
  • the step of acquiring a plurality of captured images may include the step of acquiring a plurality of captured images obtained by capturing the flying object at multiple times by multiple camera devices.
  • the step of calculating the error score may include the step of calculating the error score at multiple times.
  • the step of estimating the existence position of the flying body may include the step of estimating the existence position of the flying body in the actual space based on the error score.
  • the step of calculating the error score may include the following steps: acquiring the distance between the camera device and the flying body; deriving the reliability of the error score regarding the captured image according to the distance; and calculating the error score according to the reliability.
  • The step of calculating the error score may include the step of calculating the error score based on the difference between each projection assumed position and each image position, where each projection assumed position is obtained by projecting each assumed position at which the flying body is assumed to exist in actual space onto each captured image captured with the arrangement position and imaging direction of each imaging device, and each image position is the image position at which the flying body exists in each captured image captured with the arrangement position and imaging direction of each imaging device.
  • the step of estimating the existing position of the flying body may include the step of estimating the assumed position of the flying body with the smallest error score as the existing position of the flying body.
  • the step of calculating the error score may include the steps of: acquiring the first measured position of the flying body measured by the positioning section included in the flying body; and calculating the error score based on the first measured position.
  • The step of acquiring the first measured position may include the step of receiving GPS signals from a plurality of GPS satellites to acquire the first measured position.
  • the step of calculating the error score may include the step of calculating the error score in such a way that the greater the number of GPS satellites that the positioning unit cannot receive, the greater the influence of the first measurement position on the error score.
  • the step of calculating the error score may include the step of calculating the error score based on the position where the flying body can physically move.
  • the step of calculating the error score may include the steps of: deriving the second measured position of the flying body based on the acceleration measured by the acceleration measuring device included in the flying body; and calculating the error score based on the second measured position.
  • The program is a program for causing a position estimation device that estimates the existence position of a flying body to perform the steps of: acquiring a plurality of captured images obtained by photographing the flying body with a plurality of imaging devices; acquiring information on the arrangement position and imaging direction of each imaging device; calculating, based on the arrangement position of each imaging device, the imaging direction of each imaging device, each assumed position at which the flying body is assumed to exist in actual space, and the image position at which the flying body exists in each captured image, an error score indicating the error between each assumed position and the existing position of the flying body; and estimating the existing position of the flying body in actual space based on the error score.
  • The recording medium is a computer-readable recording medium recording a program for causing a position estimation device that estimates the existence position of a flying body to perform the steps of: acquiring a plurality of captured images obtained by photographing the flying body with a plurality of imaging devices; acquiring information on the arrangement position and imaging direction of each imaging device; calculating, based on the arrangement position of each imaging device, the imaging direction of each imaging device, each assumed position at which the flying body is assumed to exist in actual space, and the image position at which the flying body exists in each captured image, an error score indicating the error between each assumed position and the existing position of the flying body; and estimating the existing position of the flying body in actual space based on the error score.
  • FIG. 1 is a diagram showing an example of the outline of an unmanned aerial vehicle system in Embodiment 1.
  • FIG. 2 is a diagram showing the hardware configuration of the flight control device.
  • FIG. 3 is a diagram showing an example of a specific appearance of an unmanned aircraft.
  • FIG. 4 is a block diagram showing an example of the hardware configuration of an unmanned aircraft.
  • FIG. 5 is a diagram illustrating reprojection errors.
  • FIG. 6 is a diagram showing the hardware configuration of the flight control device in Embodiment 2.
  • FIG. 7 is a diagram showing the hardware configuration of the flight control device in Embodiment 3.
  • In the following embodiments, an unmanned aircraft (UAV: Unmanned Aerial Vehicle) is used as an example of a flying body.
  • An unmanned aircraft includes an aircraft that moves in the air.
  • In this specification, an unmanned aerial vehicle is also written as "UAV".
  • The position estimation device is, for example, a PC (Personal Computer), a server, a terminal, another processing device, or a control device.
  • the position estimation method specifies the operation in the position estimation device.
  • a program (for example, a program that causes the position estimation device to execute various processes) is recorded in the recording medium.
  • FIG. 1 is a diagram showing an example of the outline of the unmanned aerial vehicle system 5 in the first embodiment.
  • The unmanned aerial vehicle system 5 can be used where the unmanned aerial vehicle 100 cannot receive GPS signals or has difficulty receiving them, for example, when it flies autonomously under a bridge or indoors.
  • The unmanned aerial vehicle system 5 is also used when the unmanned aerial vehicle 100 flies autonomously without including a GPS receiver.
  • Unmanned aerial vehicle 100 is an example of a flying body.
  • the unmanned aerial vehicle system 5 includes a plurality of cameras 300, a flight control device 800, and an unmanned aerial vehicle 100.
  • the camera 300 is an example of an imaging device.
  • the flight control device 800 is an example of a position estimation device.
  • A plurality of cameras 300 are installed at different places on the ground, on the water, or the like, and photograph the unmanned aerial vehicle 100 from various directions.
  • The plurality of cameras 300 may be fixed cameras, or may be cameras that can freely change their imaging position and imaging direction.
  • the plurality of cameras 300 may be installed in the same place or area, or may be installed in different places or areas.
  • the camera 300 may have a processing unit, a communication unit, and a storage unit.
  • The flight control device 800 performs a process of tracking the unmanned aircraft 100 based on the images captured by the plurality of cameras 300. In addition, in order to derive the position where the unmanned aircraft 100 is estimated to exist, the flight control device 800 performs optimization processing so that the error score, which indicates the error between each assumed position where the unmanned aircraft 100 is assumed to exist and the actual existing position of the unmanned aircraft 100, becomes smallest. In FIG. 1, it is assumed that the plurality of cameras 300 photograph the autonomously flying unmanned aircraft 100. The unmanned aircraft 100 is not limited to one; there may be more than one.
  • FIG. 2 is a diagram showing the hardware configuration of the flight control device 800.
  • The flight control device 800 includes a tracking front end 400, a processing back end 500, and a PID (Proportional-Integral-Derivative) control device 700.
  • At least one of the processing unit of the tracking front end 400 and the processing unit of the processing back end 500 is an example of the processing unit of the position estimation device.
  • the tracking front end 400, the processing back end 500, and the PID control device 700 may be installed on one device, or may be distributed on multiple devices.
  • the whole or a part of the flight control device 800 may be a PC, a server, a terminal, various processing devices, and a control device.
  • the parts of the flight control device 800 (tracking front end 400, processing back end 500, and PID control device 700) may be installed in the same place or area or in different places or areas.
  • The tracking front end 400 acquires the pixel position on the frame (in the captured image) at which the unmanned aircraft 100 is captured in each of the images captured by the multiple cameras 300, and sends it to the processing back end 500.
  • By detecting the pixel position of the unmanned aircraft 100 in time series, the tracking front end 400 can track the movement of the unmanned aircraft 100 in the captured images.
  • the tracking front end 400 transmits information on the posture of each camera 300 to the processing back end 500.
  • the posture of each camera 300 can be specified by the arrangement position and imaging direction of each camera 300.
  • the tracking front end 400 has a communication unit 405, a processing unit 410, and a storage unit 420.
  • When the posture of each camera 300 is fixed, it may be sent to the processing back end 500 once; when the posture of each camera 300 is variable, it may be sent to the processing back end 500 successively.
  • The processing unit 410 acquires, as the observation position, the pixel position on the frame at which the unmanned aerial vehicle 100 is captured in each of the images captured by the plurality of cameras 300 at the same time.
  • the processing unit 410 may acquire the pixel position of the unmanned aerial vehicle 100 at a frequency of 30 fps or 60 fps.
  • the processing unit 410 acquires information on the posture of each camera 300 from each camera 300, an external server, or the like.
  • the processing unit 410 may cause the storage unit 420 to store the information of the posture of the camera 300.
  • the communication unit 405 communicates with the plurality of cameras 300 and the processing backend 500.
  • the communication method uses dedicated lines, wired LAN (Local Area Network, local area network), wireless LAN, mobile communication, etc.
  • the communication unit 405 receives captured images from the plurality of cameras 300.
  • the communication unit 405 transmits the observation position of the unmanned aircraft 100 captured in the captured images captured by the plurality of cameras 300 to the processing backend 500.
  • the communication unit 405 transmits information on the posture of the plurality of cameras 300 to the processing backend 500.
  • the storage section 420 can be used as a working memory of the processing section 410.
  • the processing back-end 500 performs optimization processing based on each pixel position (observation position) of the unmanned aerial vehicle 100 acquired from the tracking front-end 400.
  • the processing backend 500 has a communication unit 505, a processing unit 510, and a storage unit 520.
  • the optimization process is, for example, a process for minimizing the value of the error score shown in Equation (1) described later.
  • The optimization process is a process for minimizing the difference (reprojection error) between the projection assumed position and the image position (observation position) actually captured in the captured image, where the projection assumed position is the position obtained by projecting the assumed position of the unmanned aircraft 100 onto the captured image.
  • The assumed position can be changed arbitrarily in three-dimensional space.
  • the assumed position of the unmanned aircraft 100 with the lowest cost can be set as the position where the unmanned aircraft 100 is estimated to exist (estimated position).
  • the reprojection error may be a total of a plurality of captured images captured by the plurality of cameras 300, or may be a total of a plurality of captured images captured in time series.
  • the processing unit 510 may estimate the position of the unmanned aircraft 100 based on the pixel positions of the unmanned aircraft 100 captured in the plurality of captured images and the postures of the plurality of cameras 300.
  • the processing unit 510 may estimate the position of the unmanned aircraft 100 based on the projection assumed position and observation position regarding the plurality of captured images and the posture of the plurality of cameras 300. That is, the processing unit 510 may estimate the position of the unmanned aircraft 100 by optimizing (minimizing) the reprojection error based on the tracking result of the unmanned aircraft 100 by the camera 300.
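  • As an illustration of this optimization (a minimal sketch, not the patent's actual implementation), the code below estimates the position by minimizing the summed reprojection error. It assumes pinhole cameras described by hypothetical 3x4 projection matrices P_j and uses scipy's general-purpose minimizer as a stand-in for whatever solver the processing back end 500 employs; all function and variable names are invented for illustration.

```python
# Minimal sketch: estimate the UAV's 3D position by minimizing the sum of
# squared reprojection errors over all cameras. Assumptions (not from the
# patent): each camera j is given as a 3x4 projection matrix P_j, and the
# observation o_jt is the UAV's pixel position in camera j's image.
import numpy as np
from scipy.optimize import minimize

def reprojection_cost(x, projections, observations):
    """Error score of one assumed 3D position x against all observations."""
    x_h = np.append(x, 1.0)                      # homogeneous 3D point
    cost = 0.0
    for P, o in zip(projections, observations):
        u, v, w = P @ x_h                        # project onto the image plane
        cost += np.sum((np.array([u, v]) / w - o) ** 2)
    return cost

def estimate_position(projections, observations, x0):
    """Search the assumed position x_t that minimizes the error score."""
    result = minimize(reprojection_cost, x0, args=(projections, observations))
    return result.x                              # estimated UAV position
```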
  • the communication unit 505 communicates with the tracking front end 400 and the PID control device 700.
  • the communication method uses dedicated lines, wired LAN, wireless LAN, mobile communication, etc.
  • the communication unit 505 receives information on the observation position and the posture of each camera 300 from the tracking front end 400.
  • the communication unit 505 transmits the optimized estimated position information of the unmanned aircraft 100 to the PID control device 700.
  • the posture information of each camera 300 may not be directly obtained from each camera 300, for example, it may be stored in the storage unit 520 in advance, or may be obtained from an external server.
  • the storage section 520 can be used as a working memory of the processing section 510.
  • the PID control device 700 performs PID (P: Proportional I: Integral D: Differential) control for flying the unmanned aircraft 100 along the flight path based on the information of the estimated position of the unmanned aircraft 100.
  • the flight path may be a predetermined flight path.
  • the PID control device 700 may generate information of at least a part of flight parameters for flying the unmanned aircraft 100 and send it to the unmanned aircraft 100.
  • Flight parameters include the flight position, flight altitude, flight speed, and flight acceleration of the unmanned aerial vehicle 100 (for example, accelerations in the three axial directions of front-back, left-right, and up-down), and the pitch angle, yaw angle, roll angle, and the like indicating the orientation of the airframe.
  • the PID control device 700 has a communication unit 705, a processing unit 710, and a storage unit 720.
  • The flight parameters may be data of the difference between the target state of the unmanned aircraft 100 (e.g., target flight position, flight altitude, flight speed, flight acceleration, pitch angle, yaw angle, roll angle) and its actual state (e.g., estimated flight position, flight altitude, flight speed, flight acceleration, pitch angle, yaw angle, roll angle).
  • the processing unit 710 may perform PID control and generate flight parameters.
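  • A minimal sketch of one PID loop of the kind described above (an assumed form, not the patent's implementation; the gains kp, ki, kd and time step dt are hypothetical, and one such controller would be run per controlled quantity, e.g. per axis of the flight position):

```python
# Minimal PID sketch: the control output is computed from the difference
# between the target state and the actual (estimated) state, as described
# above. Gains and time step are hypothetical.
class PID:
    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0        # accumulated error (I term)
        self.prev_error = None     # previous error (for the D term)

    def update(self, target: float, actual: float, dt: float) -> float:
        error = target - actual
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```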
  • the communication unit 705 communicates with the processing backend 500 and the unmanned aircraft 100.
  • the communication unit 705 transmits the flight parameters generated by the processing unit 710 to the unmanned aircraft 100.
  • a dedicated line, wired LAN, wireless LAN, mobile communication, etc. are used as a communication method with the processing back end 500 and the unmanned aircraft 100.
  • the storage section 720 can be used as a working memory of the processing section 710.
  • the storage unit 720 may store data such as a flight path, data of a target state related to the flight of the unmanned aircraft 100, and data of an actual state. These data can be acquired via the communication section 705 from, for example, the unmanned aircraft 100 or a terminal instructing the control of the flight of the unmanned aircraft 100.
  • FIG. 3 is a diagram showing an example of a specific appearance of the unmanned aerial vehicle 100.
  • FIG. 3 shows a perspective view of the unmanned aircraft 100 flying in the moving direction STV0.
  • Unmanned aerial vehicle 100 is an example of a mobile body.
  • The roll axis (see the x-axis) is set in a direction parallel to the ground and along the moving direction STV0.
  • The pitch axis (see the y-axis) is set in a direction parallel to the ground and perpendicular to the roll axis.
  • The yaw axis (see the z-axis) is set in a direction perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.
  • The unmanned aerial vehicle 100 includes a UAV main body 102, a gimbal 200, an imaging unit 220, and a plurality of imaging units 230.
  • the UAV body 102 includes a plurality of rotors (propellers).
  • the UAV main body 102 makes the unmanned aircraft 100 fly by controlling the rotation of a plurality of rotors.
  • the UAV main body 102 uses, for example, four rotors to fly the unmanned aerial vehicle 100.
  • the number of rotors is not limited to four.
  • the unmanned aerial vehicle 100 may be a fixed-wing aircraft without a rotor.
  • The imaging unit 220 is an imaging camera that captures an object included in an intended imaging range (for example, a scene above an aerial photography target, a landscape such as mountains or rivers, or a building on the ground).
  • the plurality of imaging units 230 are sensing cameras that capture the surroundings of the unmanned aircraft 100 in order to control the flight of the unmanned aircraft 100.
  • the two imaging units 230 may be installed on the front of the nose of the unmanned aircraft 100.
  • the other two imaging units 230 may be installed on the bottom surface of the unmanned aerial vehicle 100.
  • the two imaging units 230 on the front side may be paired and function as a so-called stereo camera.
  • the two imaging units 230 on the bottom surface side may also be paired to function as a stereo camera.
  • the three-dimensional space data around the unmanned aerial vehicle 100 can be generated based on the images captured by the plurality of imaging units 230.
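  • The patent does not spell out the stereo geometry behind this; for reference, the standard relation for a rectified stereo pair (an assumption here, not text from the source), with focal length f in pixels, baseline B between the paired imaging units, and pixel disparity d_px of a matched point, recovers its depth Z as:

```latex
Z = \frac{f \cdot B}{d_{\text{px}}}
```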
  • the number of imaging units 230 included in the unmanned aerial vehicle 100 is not limited to four.
  • The unmanned aerial vehicle 100 only needs to include at least one imaging unit 230.
  • The unmanned aircraft 100 may include at least one imaging unit 230 on each of the nose, tail, sides, bottom, and top of the unmanned aircraft 100.
  • the angle of view that can be set in the imaging unit 230 can be larger than the angle of view that can be set in the imaging unit 220.
  • the imaging unit 230 may have a single focus lens or a fisheye lens.
  • FIG. 4 is a block diagram showing an example of the hardware configuration of the unmanned aerial vehicle 100.
  • The unmanned aerial vehicle 100 includes a UAV control unit 110, a communication interface 150, a memory 160, a memory 170, a gimbal 200, a rotor mechanism 210, an imaging unit 220, imaging units 230, a GPS receiver 240, an inertial measurement unit (IMU: Inertial Measurement Unit) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring instrument 290.
  • the GPS receiver 240 is an example of a positioning unit.
  • It is conceivable that the unmanned aerial vehicle 100 includes the GPS receiver 240 but the accuracy of the GPS signals received by the GPS receiver 240 is low. It is also conceivable that the unmanned aerial vehicle 100 does not include the GPS receiver 240 and cannot acquire GPS signals at all.
  • The UAV control unit 110 is composed of, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
  • the UAV control unit 110 performs signal processing for overall control of the operation of each unit of the unmanned aerial vehicle 100, data input / output processing with other units, data calculation processing, and data storage processing.
  • the UAV control unit 110 controls the flight of the unmanned aircraft 100 according to a program stored in the memory 160.
  • the UAV control unit 110 can take aerial images.
  • the UAV control unit 110 may acquire the flight parameter information from the PID control device 700 via the communication interface 150.
  • the UAV control part 110 may control the flight of the unmanned aircraft 100 based on the acquired flight parameters.
  • the UAV control unit 110 acquires position information indicating the position of the unmanned aircraft 100.
  • the UAV control unit 110 may obtain position information indicating the latitude, longitude, and altitude where the unmanned aircraft 100 is located from the GPS receiver 240.
  • the UAV control unit 110 may obtain latitude and longitude information indicating the latitude and longitude of the unmanned aircraft 100 from the GPS receiver 240 and obtain altitude information indicating the altitude of the unmanned aircraft 100 from the barometric altimeter 270 as position information .
  • the UAV control unit 110 may acquire the distance between the radiation point of the ultrasonic wave generated by the ultrasonic sensor 280 and the reflection point of the ultrasonic wave as height information.
  • the UAV control unit 110 may obtain orientation information indicating the orientation of the unmanned aerial vehicle 100 from the magnetic compass 260.
  • the orientation information can be expressed by, for example, an orientation corresponding to the orientation of the nose of the unmanned aircraft 100.
  • the UAV control unit 110 may acquire position information indicating the position where the unmanned aircraft 100 should exist when the imaging unit 220 shoots the imaging range to be photographed.
  • the UAV control unit 110 may acquire position information indicating the position where the unmanned aircraft 100 should exist from the memory 160.
  • the UAV control unit 110 may acquire position information indicating the position where the unmanned aircraft 100 should exist from other devices via the communication interface 150.
  • the UAV control unit 110 may refer to the three-dimensional map database to determine the position where the unmanned aircraft 100 can exist, and acquire the position as position information indicating the position where the unmanned aircraft 100 should exist.
  • the UAV control unit 110 may transmit position information indicating the position where the unmanned aircraft 100 should exist to the flight control device 800 (eg, the processing backend 500) via the communication interface 150.
  • the position information indicating the position where the unmanned aircraft 100 should exist may be included in the information of the predetermined flight path.
  • the UAV control unit 110 can acquire imaging range information indicating the respective imaging ranges of the imaging unit 220 and the imaging unit 230.
  • the UAV control unit 110 may acquire the angle of view information indicating the angles of view of the imaging unit 220 and the imaging unit 230 from the imaging unit 220 and the imaging unit 230 as a parameter for specifying the imaging range.
  • the UAV control unit 110 may acquire information indicating the imaging directions of the imaging unit 220 and the imaging unit 230 as parameters for specifying the imaging range.
  • The UAV control unit 110 may acquire posture information indicating the posture state of the imaging unit 220 from the gimbal 200 as information indicating the imaging direction of the imaging unit 220, for example.
  • the posture information of the imaging unit 220 may indicate the angle at which the pitch axis and yaw axis of the gimbal 200 rotate from the reference rotation angle.
  • the UAV control unit 110 may acquire estimated position information indicating the estimated position where the unmanned aircraft 100 is located as a parameter for determining the imaging range.
  • The UAV control unit 110 may delimit the imaging range representing the geographical range captured by the imaging unit 220 based on the angles of view and imaging directions of the imaging unit 220 and the imaging unit 230 and on the estimated position of the unmanned aircraft 100, and generate imaging range information, thereby acquiring the imaging range information.
  • the UAV control unit 110 can obtain the imaging range information from the memory 160.
  • the UAV control unit 110 can acquire the imaging range information via the communication interface 150.
  • The UAV control unit 110 controls the gimbal 200, the rotor mechanism 210, the imaging unit 220, and the imaging unit 230.
  • the UAV control section 110 may control the imaging range of the imaging section 220 by changing the imaging direction or angle of view of the imaging section 220.
  • The UAV control unit 110 can control the imaging range of the imaging unit 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
  • the imaging range refers to the geographic range captured by the imaging unit 220 or the imaging unit 230.
  • The imaging range is defined by latitude, longitude, and altitude.
  • the imaging range may be the range of three-dimensional spatial data defined by latitude, longitude, and altitude.
  • the imaging range may be the range of two-dimensional spatial data defined by latitude and longitude.
  • the imaging range can be specified according to the angle of view and the imaging direction of the imaging unit 220 or the imaging unit 230, and the location where the unmanned aircraft 100 is located.
  • The imaging directions of the imaging unit 220 and the imaging unit 230 can be defined by the azimuth and the depression angle of the front face of the imaging lens provided in each of the imaging unit 220 and the imaging unit 230.
  • The imaging direction of the imaging unit 220 may be a direction determined by the orientation of the nose of the unmanned aerial vehicle 100 and the posture state of the imaging unit 220 with respect to the gimbal 200.
  • the imaging direction of the imaging unit 230 may be a direction determined by the orientation of the nose of the unmanned aircraft 100 and the position where the imaging unit 230 is provided.
  • The imaging direction may be the same as the imaging orientation.
  • the UAV control unit 110 may determine the surrounding environment of the unmanned aircraft 100 by analyzing a plurality of images captured by the plurality of imaging units 230.
  • the UAV control unit 110 may control the flight according to the surrounding environment of the unmanned aircraft 100, for example, avoiding obstacles.
  • The UAV control unit 110 may acquire stereoscopic information (three-dimensional information) indicating the stereoscopic shape (three-dimensional shape) of an object existing around the unmanned aircraft 100.
  • the object may be part of a landscape such as buildings, roads, vehicles, trees, etc., for example.
  • the three-dimensional information is, for example, three-dimensional spatial data.
  • the UAV control unit 110 may generate three-dimensional information representing the three-dimensional shape of an object existing around the unmanned aircraft 100 from each image obtained from the plurality of imaging units 230, thereby acquiring three-dimensional information.
  • the UAV control unit 110 may acquire the three-dimensional information indicating the three-dimensional shape of the object existing around the unmanned aircraft 100 by referring to the three-dimensional map database stored in the memory 160 or the memory 170.
  • the UAV control unit 110 may acquire three-dimensional information related to the three-dimensional shape of the object existing around the unmanned aircraft 100 by referring to the three-dimensional map database managed by a server existing on the network.
  • the UAV control unit 110 controls the flight of the unmanned aircraft 100 by controlling the rotor mechanism 210. That is, the UAV control unit 110 controls the rotor mechanism 210 to control the position including the latitude, longitude, and altitude of the unmanned aircraft 100.
  • the UAV control unit 110 may control the imaging range of the imaging unit 220 by controlling the flight of the unmanned aircraft 100.
  • the UAV control section 110 may control the angle of view of the imaging section 220 by controlling the zoom lens included in the imaging section 220.
  • the UAV control unit 110 may use the digital zoom function of the imaging unit 220 to control the angle of view of the imaging unit 220 through digital zoom.
  • The UAV control unit 110 can move the unmanned aircraft 100 to a specific position on a specific date and time, so that the imaging unit 220 can shoot the desired imaging range under the desired environment.
  • the communication interface 150 communicates with other communication devices (for example, a terminal or a transmitter (remote control) instructing flight control of the unmanned aircraft 100).
  • the communication interface 150 can perform wireless communication by any wireless communication method.
  • the communication interface 150 can perform wired communication by any wired communication method.
  • the communication interface 150 may send an aerial image and additional information (metadata) related to the aerial image to the terminal.
  • the communication interface 150 may communicate with at least one device included in the flight control device 800 (eg, the processing backend 500, the PID control device 700).
  • The memory 160 stores programs and the like necessary for the UAV control unit 110 to control the gimbal 200, the rotor mechanism 210, the imaging unit 220, the imaging unit 230, the GPS receiver 240, the inertial measurement unit 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring instrument 290.
  • The memory 160 may be a computer-readable recording medium, and may include at least one of flash memories such as an SRAM (Static Random Access Memory), a DRAM (Dynamic Random Access Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), and a USB (Universal Serial Bus) memory.
  • The memory 170 may include at least one of an HDD (Hard Disk Drive), an SSD (Solid State Drive), an SD card, a USB memory, and other storage.
  • the memory 170 can store various information and various data.
  • the memory 170 can be detached from the unmanned aerial vehicle 100.
  • the memory 170 may record aerial images.
  • the memory 160 or the memory 170 may store information on the aerial photography position and aerial photography path (flight path) generated by the terminal or the unmanned aerial vehicle 100.
  • the UAV control unit 110 may set the aerial photography position and aerial photography path information as one of the aerial photography parameters related to the aerial photography scheduled by the unmanned aircraft 100 or the flight parameters related to the flight scheduled by the unmanned aircraft 100.
  • the setting information may be stored in the memory 160 or the memory 170.
  • the gimbal 200 can rotatably support the imaging unit 220 about the yaw axis, the pitch axis, and the roll axis.
  • the gimbal 200 can rotate the imaging unit 220 around at least one of the yaw axis, the pitch axis, and the roll axis, thereby changing the imaging direction of the imaging unit 220.
  • the rotor mechanism 210 has a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the rotor mechanism 210 controls the rotation by the UAV control unit 110, thereby causing the unmanned aircraft 100 to fly.
  • the imaging unit 220 captures an object within a desired imaging range and generates data of a captured image.
  • The image data (for example, aerial images) captured by the imaging unit 220 may be stored in the internal memory of the imaging unit 220 or in the memory 170.
  • The imaging unit 230 captures the surroundings of the unmanned aircraft 100 and generates data of captured images.
  • the image data of the imaging unit 230 can be stored in the memory 170.
  • the GPS receiver 240 receives a plurality of signals (GPS signals) indicating the time sent from a plurality of navigation satellites (that is, GPS satellites) and the position (coordinates) of each GPS satellite.
  • the GPS receiver 240 calculates the position of the GPS receiver 240 (that is, the position of the unmanned aircraft 100) based on the received multiple signals.
  • the GPS receiver 240 outputs the position information of the unmanned aircraft 100 to the UAV control unit 110.
  • the UAV control unit 110 may calculate the position information of the GPS receiver 240 instead of the GPS receiver 240. In this case, information indicating the time and the position of each GPS satellite included in the multiple signals received by the GPS receiver 240 is input to the UAV control unit 110.
  • the inertial measurement device 250 detects the posture of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • The inertial measurement device 250 can detect, as the attitude of the unmanned aircraft 100, the accelerations in the three axial directions of front-back, left-right, and up-down, and the angular velocities about the three axes.
  • the magnetic compass 260 detects the orientation of the nose of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the barometric altimeter 270 detects the flying altitude of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
  • the ultrasonic sensor 280 emits ultrasonic waves, detects ultrasonic waves reflected on the ground and objects, and outputs the detection results to the UAV control unit 110.
  • the detection result may show the distance from the unmanned aircraft 100 to the ground, that is, the height.
  • the detection result may show the distance from the unmanned aircraft 100 to the object (subject).
  • the laser measuring instrument 290 irradiates the object with laser light, receives the reflected light reflected by the object, and measures the distance between the unmanned aircraft 100 and the object (subject) by the reflected light.
  • a time-of-flight method may be used.
  • FIG. 4 illustrates the case where the unmanned aircraft 100 includes the GPS receiver 240, but the unmanned aircraft 100 may not include the GPS receiver 240. Even in this case, regardless of whether or not the GPS receiver 240 is provided, the unmanned aircraft 100 can perform flight control based on the estimated position of the unmanned aircraft 100.
  • It is assumed that the unmanned aerial vehicle 100 flies in a space where it is difficult to receive GPS signals, for example, under a bridge or indoors. That is, it can be assumed that the positioning accuracy of the unmanned aerial vehicle 100 is low.
  • the unmanned aircraft 100 may not include the GPS receiver 240, that is, it can be assumed that the unmanned aircraft 100 does not have a positioning function.
  • The plurality of cameras 300 are arranged at various places, such as on the ground, from which the unmanned aircraft 100 can be photographed.
  • FIG. 5 is a diagram for explaining reprojection errors.
  • FIG. 5 shows the actual existing position of the unmanned aerial vehicle 100, the assumed position xt (also written x_t) of the unmanned aircraft 100, the projection assumed positions π(cj, xt) obtained by projecting the assumed position onto the projection planes of the images captured by the plurality of cameras 300, and the observation position ojt.
  • Here, j and t are variables, and the assumed position xt is variable.
  • the plurality of cameras 300 respectively photograph the unmanned aircraft 100 flying above, for example.
  • In a single captured image, the pixel position of the unmanned aircraft 100 on the frame can be recognized in the plane, but it is difficult to recognize its position in the depth direction with respect to the image plane (projection plane) of the captured image. Therefore, a plurality of cameras 300 capable of photographing the unmanned aerial vehicle 100 from different angles are provided.
  • In FIG. 5, the frame GM1 of the captured image captured by the camera 300 with camera posture cj (also written c_j) and the frame GM2 of the captured image captured by the camera 300 with camera posture cj-1 are shown.
  • In the frame GM1, there are an observation position ojt (also written o_jt), which is the pixel position obtained by observing the unmanned aircraft 100, and a projection assumed position π(cj, xt) obtained by projecting the assumed position xt of the unmanned aircraft 100.
  • The observation position ojt and the projection assumed position π(cj, xt) sometimes do not match, and a reprojection error occurs as the difference between them.
  • Similarly, in the frame GM2, there are an observation position o(j-1)t obtained by observing the unmanned aircraft 100 and a projection assumed position π(cj-1, xt) obtained by projecting the assumed position xt of the unmanned aircraft 100.
  • The observation position o(j-1)t and the projection assumed position π(cj-1, xt) sometimes do not coincide, resulting in a reprojection error.
  • The observation positions ojt and o(j-1)t are projections of actually observed positions and are fixed, whereas the projection assumed positions π(cj, xt) and π(cj-1, xt) are variable.
  • The processing back end 500 optimizes the estimated position of the unmanned aerial vehicle 100 in such a way that these reprojection errors become smaller.
  • j is an example of camera identification information
  • t is an example of time.
  • the communication section 405 of the tracking front end 400 can receive the captured images captured by the plurality of cameras 300 and store them in the storage section 420.
  • The communication unit 405 can send, to the processing back end 500, the information of the captured images captured by the plurality of cameras 300, the observation positions ojt, o(j-1)t, ... of the unmanned aircraft 100, and the camera postures cj, cj-1, ....
  • the communication unit 505 of the processing back-end 500 can receive the captured image, information on the observation position of the unmanned aerial vehicle 100, and the camera posture, and store it in the storage unit 520.
  • the communication unit 405 may not send the captured image to the processing backend 500.
  • the communication unit 505 of the processing back end 500 receives the information of the observation positions ojt, o (j-1) t, ... of the unmanned aircraft 100, and the information of the camera postures cj, cj-1, ....
  • the communication unit 505 may receive the captured image.
  • The processing unit 510 assumes the three-dimensional position of the unmanned aircraft 100 at time t at various positions to obtain the assumed position xt, and derives (for example, calculates) the projection assumed positions π(cj, xt), π(cj-1, xt), ... obtained by projecting the assumed position xt onto the image planes (projection planes) of the cameras with camera postures cj, cj-1, ....
  • the processing unit 510 may calculate the projection assumed position of the camera 300 based on the assumed position as the three-dimensional position and the camera posture.
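  • A minimal sketch of such a projection follows, assuming a standard pinhole camera model, which the patent does not specify: the camera posture cj is represented here by an extrinsic rotation R and translation t together with an intrinsic matrix K, all hypothetical names.

```python
# Minimal pinhole-projection sketch for pi(c_j, x_t): map an assumed 3D
# position x (world coordinates) to a pixel position on camera j's frame.
# The pinhole parameterization (K, R, t) is an assumption, not the patent's.
import numpy as np

def project(K, R, t, x):
    """K: 3x3 intrinsics, (R, t): world-to-camera pose, x: 3D point."""
    x_cam = R @ x + t                 # world -> camera coordinates
    u, v, w = K @ x_cam               # camera -> homogeneous pixel coords
    return np.array([u / w, v / w])   # projection assumed position
```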
  • The processing unit 510 performs optimization processing according to equation (1) based on the observation positions ojt, o(j-1)t, ... and the projection assumed positions π(cj, xt), π(cj-1, xt), ..., thereby optimizing the assumed position of the unmanned aircraft 100 and estimating the position of the unmanned aircraft 100.
  • The argmin function shown in equation (1) returns the argument x that gives the smallest function value.
  • The function value of the argmin function represents the error score indicating the error between each assumed position xt and the existing position of the unmanned aircraft 100, and the function serves to minimize the reprojection error.
  • cj represents the camera pose.
  • the camera pose can be determined by the shooting position and shooting direction (orientation).
  • xt represents the three-dimensional position of the unmanned aerial vehicle 100 in real space at time t, and is assumed to be various positions.
  • t represents time.
  • ⁇ (cj, xt) represents a projection assumed position obtained by projecting the unmanned aircraft 100 assumed to be the position xt onto the frame of the camera 300 with the camera posture cj.
  • ojt represents the observation position, which is the pixel position of the unmanned aircraft 100 in the frame of the captured image captured at time t by the camera 300 with camera posture cj.
  • ρ(π(cj, xt) - ojt) is a function that aggregates the reprojection error, that is, the difference between the projection assumed position and the observation position. Therefore, the processing unit 510 may search for the assumed position xt that minimizes the reprojection error while changing the assumed position xt according to equation (1), thereby estimating the position of the unmanned aircraft 100.
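  • The formula image for equation (1) is not reproduced here. A reconstruction consistent with the symbol definitions above (the sum over cameras j = 1..n and times t, the reliability weight wjt, and the function ρ applied to the reprojection error) would take the form below; the exact notation in the patent may differ:

```latex
\hat{x} = \operatorname*{argmin}_{x} \sum_{j=1}^{n} \sum_{t} w_{jt}\, \rho\bigl(\pi(c_j, x_t) - o_{jt}\bigr) \tag{1}
```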
  • wjt (also written w_jt) is a coefficient indicating the reliability.
  • the reliability wjt is calculated according to equation (2).
  • reliability can also be omitted from equation (1).
  • Equation (2) illustrates the case where the reliability is determined by the distance between the camera 300 and the unmanned aircraft 100.
  • λjt (also written λ_jt) is a fixed value determined by the camera number j and the time t, and may be a value for adjusting the range of the values derived from equations (1) and (2).
  • ds is a fixed value.
  • d represents the distance between the unmanned aerial vehicle 100 and the camera 300.
  • The camera 300 may identify the unmanned aircraft 100 captured in the captured image through image recognition or the like, and derive (for example, calculate) the distance to the unmanned aircraft 100 from the size of the unmanned aircraft 100 in the image.
  • Alternatively, the camera 300 may include a distance measuring sensor and measure the distance to the unmanned aircraft 100 with it. The shorter the distance d, the higher the reliability wjt.
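  • The formula image for equation (2) is likewise not reproduced. The text specifies only that wjt is determined by the distance d, increases as d shrinks, and involves the fixed values λjt and ds; one plausible form with these properties (an assumption, not the patent's exact formula) is:

```latex
w_{jt} = \lambda_{jt} \exp\!\left(-\frac{d}{d_s}\right) \tag{2}
```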
  • The reliability wjt may be set to a value of 0, and the assumed position xt of the unmanned aircraft 100 in this case may not be used for the estimated position.
  • the derivation of the distance d may be implemented by the tracking front end 400 and the processing back end 500.
  • The processing unit 510 of the processing back end 500 may derive (for example, calculate) the distance d based on the distance between the assumed position xt and each camera 300. The information of the distance d can be notified to the processing back end 500 from the camera 300 or the tracking front end 400.
  • The argmin function aggregates, for all cameras 300 (cameras with camera postures c1 to cn) and over the entire observation time t, the differences (reprojection errors) between the projection assumed positions π(cj, xt) of the unmanned aircraft 100 and the observation positions ojt.
  • Here, n can be any value representing the total number of cameras.
  • the estimation of the position of the unmanned aircraft 100 using the argmin function shown in equation (1) is an example of the position estimation of the unmanned aircraft 100, and the position estimation may be performed by other methods.
  • Derivation of reliability wjt according to equation (2) is an example of reliability derivation, and reliability can also be derived by other methods.
  • the communication unit 505 of the processing back-end 500 transmits the data of the estimated position of the unmanned aircraft 100 optimized by the processing unit 510 to the PID control device 700.
  • the communication unit 705 of the PID control device 700 receives the data of the estimated position of the unmanned aircraft 100 and stores it in the storage unit 720.
  • the processing unit 710 of the PID control device 700 performs PID control based on the estimated position of the unmanned aircraft 100 to fly the unmanned aircraft 100 along the target flight path stored in the storage unit 720.
  • the communication unit 705 transmits the flight parameters obtained by PID control to the unmanned aircraft 100.
  • The communication interface 150 of the unmanned aircraft 100 receives the flight parameters and stores them in the memory 160.
  • the UAV control unit 110 controls the rotor mechanism 210 according to the flight parameters, controls the flight of the unmanned aircraft 100, and continues autonomous flight.
  • the unmanned aerial vehicle system 5 includes the flight control device 800 (one example of the position estimation device) that estimates the existence position of the unmanned aircraft 100.
  • the flight control device 800 includes a processing unit (for example, at least one of the processing unit 410 of the tracking front end 400 and the processing unit 510 of the processing back end 500) that performs processing related to estimating the existence position of the unmanned aircraft 100.
  • the processing unit 410 of the tracking front end 400 can acquire a plurality of captured images of the unmanned aircraft 100 captured by the plurality of cameras 300.
  • the processing unit 410 of the tracking front end 400 can acquire information on the posture (arrangement position and imaging direction of each camera 300) of each camera 300.
  • The processing unit 510 of the processing back end 500 may calculate an error score (for example, the value of the argmin function) indicating the error between each assumed position xt and the existing position of the unmanned aircraft 100, based on the postures of the cameras 300, the assumed positions of the unmanned aircraft 100 in actual space, and the observation positions of the unmanned aircraft 100 in each captured image (an example of the image position where the unmanned aircraft 100 exists).
  • the processing back-end 500 may estimate the existence position of the unmanned aircraft 100 in the actual space based on the error score.
  • the processing back end 500 alone may also be an example of a position estimation device.
  • by photographing the unmanned aerial vehicle 100 with a plurality of cameras 300 in various postures (arrangement positions, imaging orientations), the flight control device 800 can estimate the position of the unmanned aircraft 100 considering not only the position in the plane of the imaging plane (projection plane) of a captured image but also the position in the depth direction.
  • therefore, even if the unmanned aircraft 100 cannot receive GPS signals, the position of the unmanned aircraft 100 can be estimated, and the unmanned aircraft 100 can fly autonomously based on its estimated position.
  • even when the unmanned aircraft has no GPS-based positioning function or the accuracy of GPS-based positioning is low, the unmanned aircraft 100 can fly autonomously.
  • the flight control device 800 does not require the unmanned aerial vehicle 100 to detect its acceleration and double-integrate it (integrate twice) to estimate its position, so no error due to double integration occurs, and the position of the unmanned aircraft can be estimated with high accuracy.
  • the flight control device 800 does not need to use motion capture or beacon signals for position estimation, and therefore, it is possible to prevent the place where the unmanned aircraft 100 can be used from being restricted.
  • the processing unit 410 of the tracking front end 400 can acquire a plurality of captured images of the unmanned aircraft 100 photographed by the plurality of cameras 300 at a plurality of times t.
  • the processing backend 500 may calculate the error score at multiple times t.
  • the processing unit 510 of the processing backend 500 may estimate the existence position of the unmanned aircraft 100 in the actual space based on the error score.
  • the flight control device 800 can estimate the position of the unmanned aircraft 100 at a plurality of times t (time point), and therefore can also estimate the position of the unmanned aircraft 100 in consideration of the motion of the unmanned aircraft 100. Therefore, the estimation accuracy of the position of the unmanned aircraft 100 is improved.
  • the processing unit 510 of the processing back-end 500 may acquire the distance d between the camera 300 and the unmanned aircraft 100.
  • the processing unit 510 may derive the reliability wjt of the error score regarding the captured image based on the distance d.
  • the processing unit 510 may calculate an error score based on the reliability wjt.
  • the flight control device 800 can improve the estimation accuracy of the position of the unmanned aircraft 100 using the error score.
  • the processing unit 510 may calculate the error score based on the difference (reprojection error) between each projected assumed position π(cj, xt) and each observation position ojt for the images captured at the arrangement position and imaging orientation (camera pose cj) of each camera 300, where the projected assumed position π(cj, xt) is the position obtained by projecting each assumed position xt, at which the unmanned aircraft 100 is assumed to exist in real space, onto each image captured at the arrangement position and imaging orientation (camera pose cj) of each camera 300, and the observation position is the position at which the unmanned aerial vehicle 100 exists in each image captured at the arrangement position of each camera 300 and the imaging orientation of each imaging device.
  • the processing unit 510 may estimate the assumed position xt of the unmanned aircraft 100 with the smallest error score as the existing position of the unmanned aircraft 100. For example, the processing unit 510 may optimize the error score according to equation (1).
  • the flight control device 800 can derive an error score based on the difference between the projection assumed position projected on the captured image captured by each camera 300 and the image position (observation position). Then, for example, the flight control device 800 can obtain an error score at each time t, and can estimate the assumed position xt with the smallest error score as the existence position of the unmanned aircraft 100. Therefore, the processing unit 510 can optimize the estimated position of the unmanned aircraft 100 based on the images captured by the plurality of cameras 300, for example, according to equation (1).
  • the second embodiment shows a case where, in addition to the first embodiment, the GPS position information detected by the unmanned aircraft 100 is further considered to optimize the estimated position of the unmanned aircraft 100.
  • FIG. 6 is a diagram showing the hardware configuration of the flight control device 800A in the second embodiment.
  • the unmanned aerial vehicle system 5A of the second embodiment has substantially the same configuration as the first embodiment.
  • for components identical to those of the first embodiment, the same reference numerals are used and their description is omitted.
  • in the second embodiment, the unmanned aerial vehicle 100 includes the GPS receiver 240 (it is not omitted).
  • the UAV control unit 110 of the unmanned aircraft 100 acquires GPS position information of the unmanned aircraft 100 based on the GPS signal received by the GPS receiver 240.
  • the UAV control unit 110 transmits GPS position information to the flight control device 800A via the communication interface 150.
  • the communication unit 505A of the processing back end 500A receives the GPS position information from the unmanned aerial vehicle 100.
  • the communication unit 505A receives the captured images of the plurality of cameras 300, the observation position of the unmanned aircraft 100, and the camera posture information from the tracking front end 400 as in the first embodiment.
  • the captured image may not be received.
  • when optimizing the reprojection error, the processing unit 510A of the processing back end 500A considers the GPS position information received via the communication unit 505A. That is, the processing unit 510A can optimize the estimated position of the unmanned aerial vehicle 100 according to equation (3).
  • the argmin function shown in equation (3) is, like equation (1), a function that returns the argument x minimizing the function value, and includes the term λGG(X, Z) related to the GPS position information.
  • ⁇ GG (X, Z) is the score considering the GPS signal.
  • λG (also written λ_G) is given by equation (4).
  • Cn is the total number of GPS satellites.
  • N is the number of GPS satellites used by the GPS receiver 240 in signal reception. Therefore, Cn-N represents the number of GPS satellites that are not used by the GPS receiver 240 in signal reception.
  • Ca is a coefficient. According to equation (4), the more GPS satellites the GPS receiver 240 uses for signal reception, the smaller the value of λG.
  • λG may be a value corresponding to the strength of the GPS signal: when the GPS signal strength is high, the value of λG can be reduced, and when the GPS signal strength is low, the value of λG can be increased.
  • G(X, Z) is a value corresponding to the sum of the differences between all assumed positions xt of the unmanned aircraft 100 and the GPS position, as shown in equation (5).
  • ζ represents the GPS position (the position detected by the GPS receiver 240). The greater the difference between an assumed position xt and the GPS position ζ, the greater G(X, Z).
  • the argmin function optimizes the assumed position xt of the unmanned aerial vehicle 100 so that the value inside the parentheses of the argmin function (inside the () of argmin_x()) — the reprojection error — becomes small. That is, the estimated position of the unmanned aerial vehicle 100 given by the argmin function is the assumed position for which the value inside the parentheses of the argmin function is smallest.
  • the communication unit 505A transmits the estimated position of the unmanned aircraft 100 optimized by the processing unit 510A to the PID control device 700.
  • the PID control device 700 generates flight parameters of the unmanned aircraft 100 based on the optimized estimated position of the unmanned aircraft 100 and sends them to the unmanned aircraft 100.
  • the operation of the unmanned aerial vehicle 100 is the same as in the first embodiment.
  • the processing unit 510A of the processing back end 500A considers the GPS position information and optimizes the reprojection error so that, for example, the value of the argmin function of equation (3) is at or below the threshold th2 (for example, minimized), and estimates the position of the unmanned aerial vehicle 100.
  • in this way, the processing unit 510A of the processing back end 500A can acquire the GPS position (an example of the first measurement position) measured by the GPS receiver 240 (an example of the positioning unit) included in the unmanned aircraft 100.
  • the processing unit 510A may calculate the error score shown in equation (3) based on the GPS position.
  • thereby, the flight control device 800A can take into account the GPS signals ordinarily used for positioning in the unmanned aircraft 100 to calculate the error score and optimize the estimated position of the unmanned aircraft 100. Therefore, even if the accuracy of the GPS receiver 240 included in the unmanned aerial vehicle 100 is low, the flight control device 800A can estimate the position of the unmanned aerial vehicle 100 based on the same error score as in the first embodiment together with the GPS signals, and can improve the position estimation accuracy. Accordingly, even when the positioning accuracy of the GPS receiver 240 is low, the flight control device 800A can estimate the position of the unmanned aircraft 100 using the error score and assist the positioning by the GPS receiver 240.
  • the GPS receiver 240 may receive GPS signals from a plurality of GPS satellites to obtain information on GPS positions.
  • the processing unit 510A can calculate the error score in such a manner that the more GPS satellites the GPS receiver 240 can receive, the more the GPS position information is reflected, that is, the greater the influence of the GPS signal on the error score.
  • for example, in λG·G(X, Z) of equation (3), the more GPS satellites the unmanned aircraft 100 uses for signal reception, the smaller the value of the coefficient λG becomes, and the term is strongly influenced by the value of G(X, Z), so that the GPS position information can be reflected more.
  • the flight control device 800A can adjust the influence of the GPS signal on the error score according to the reliability of the GPS signal (the strength of the GPS signal), calculate the error score, optimize the reprojection error, and estimate the position of the unmanned aircraft 100. Since the flight control device 800A estimates the position of the unmanned aircraft 100 in consideration of the GPS reception state, the position estimation accuracy of the unmanned aircraft 100 can be further improved. Thereby, the flight control device 800A can correct, based on the reliability of the GPS signal, the result of estimating the position of the unmanned aircraft 100 using the error score of Embodiment 1.
  • Embodiment 3 shows a case where, in addition to Embodiment 2, the estimated position of the unmanned aircraft 100 is optimized by further considering estimates of the position of the unmanned aircraft 100 based on a dynamic factor and on the acceleration of the unmanned aircraft 100.
  • the dynamic factor is a factor that takes physical phenomena into account.
  • FIG. 7 is a diagram showing the hardware configuration of the flight control device 800B in Embodiment 3.
  • the flight control device 800B includes, in addition to the tracking front end 400, the processing back end 500B, and the PID control device 700 as in the first embodiment, the IMU front end 600.
  • the IMU front end 600 has a communication unit 605, a processing unit 610, and a storage unit 620.
  • the processing unit 610 estimates the position of the unmanned aircraft 100 and obtains information on the estimated position by, for example, double-integrating (integrating twice) the acceleration (IMU data) measured by the inertial measurement device 250.
  • the position estimated based on the acceleration is also referred to as the acceleration-estimated position.
  • the communication unit 605 communicates with the unmanned aircraft 100 and the processing backend 500B.
  • a dedicated line, wired LAN, wireless LAN, mobile communication, etc. are used.
  • the communication unit 605 receives information on the acceleration measured by the inertial measurement device 250 from the unmanned aerial vehicle 100, for example.
  • the communication unit 605 transmits, for example, the estimated acceleration position of the unmanned aircraft 100 based on acceleration to the processing backend 500B.
  • the storage unit 620 serves as working memory for the processing unit 610.
  • in the IMU front end 600, for example, 100 IMU data samples (acceleration data) are acquired per second.
  • in the tracking front end 400, captured images are acquired at a frequency of, for example, 30 frames (30 fps) or 60 frames (60 fps) per second, and the image position (observation position) where the unmanned aircraft 100 exists is derived.
  • in the processing back end 500B, the processing rate is such that, for example, 10 IMU samples are used per second.
  • the processing unit 510B may control so as to match the processing frequency of the processing back-end 500B (for example, 10 times per second).
  • the processing unit 510B may integrate the IMU data via the communication unit 505 and the communication unit 605 to obtain 10 estimated acceleration positions in one second, for example.
  • the processing unit 510B may acquire the information of the image position (observation position) of 10 captured images within one second via the communication unit 505 and the communication unit 405, for example.
  • the communication unit 505B of the processing back end 500B receives the GPS position information of the unmanned aircraft 100 from the unmanned aircraft 100 as in the second embodiment.
  • the UAV control unit 110 of the unmanned aircraft 100 may acquire the GPS position information of the unmanned aircraft 100 based on the GPS signal received by the GPS receiver 240 and send it to the flight control device 800B via the communication interface 150.
  • the processing backend 500B may not acquire GPS position information, and the GPS position information may not be considered in calculating the error score.
  • the communication unit 505B of the processing backend 500B can receive the captured images captured by the plurality of cameras 300 and the data related to the posture of the camera 300 from the tracking frontend 400.
  • the communication unit 505B can receive the information of the estimated acceleration position from the IMU front end 600.
  • the processing back end 500B may also not acquire the captured images.
  • the storage unit 520B may store dynamic factors.
  • the dynamic factor may include, for example, data for bringing the estimated position of the unmanned aerial vehicle 100 within a physically movable range according to the equation of motion.
  • the dynamic factor may include a Kalman filter, for example. When a Kalman filter is used, the processing unit 510B may estimate the next position of the unmanned aircraft 100 based on the current position, speed, acceleration, etc. of the unmanned aircraft 100.
  • when optimizing the reprojection error to estimate the position of the unmanned aircraft 100, the processing unit 510B of the processing back end 500B considers at least one of the acceleration-estimated position received via the communication unit 505B and the physical estimated position obtained by the dynamic factor.
  • the processing unit 510B may consider GPS position information when optimizing the reprojection error to estimate the position of the unmanned aircraft 100. That is, the processing unit 510B can optimize the reprojection error according to equation (6).
  • R (X, ⁇ ) represents the score considering the physical estimated position, that is, the score considering the dynamic factor, as shown in equation (7).
  • λR (also written λ_R) is a fixed value.
  • the overline on xt (γt) represents the physical estimated position at time t obtained from various physical laws and formulas (such as the equations of motion). Therefore, R(X, Γ) represents the cumulative value of the differences between the assumed positions xt of the unmanned aircraft 100 and the physical estimated positions of the unmanned aircraft 100.
  • the overline on xt may be a function for deriving the physical estimated position.
  • γt (also written γ_t) may be a variable required to derive the physical estimated position, for example the position and velocity of the unmanned aerial vehicle 100 at time t.
  • the processing unit 510B can calculate the error score based on positions to which the unmanned aircraft 100 can physically move.
  • the flight control device 800B can estimate the position of the unmanned aircraft 100 in consideration of, for example, a physically estimated position where the unmanned aircraft 100 can move, and improve the position estimation accuracy.
  • I (X, ⁇ ) represents the score considering the acceleration estimated position, that is, the score considering the acceleration factor, as shown in equation (8).
  • λI (also written λ_I) is a fixed value.
  • the overline on xt (δt) represents the position derived from the acceleration measured by the inertial measurement device 250 (the acceleration-estimated position). Therefore, I(X, Δ) represents the cumulative value of the differences between the assumed positions xt of the unmanned aircraft 100 and the acceleration-estimated positions of the unmanned aircraft 100.
  • the overline on xt may be a function for deriving the acceleration-estimated position.
  • δt (also written δ_t) may be a variable required to derive the acceleration-estimated position, for example the position and velocity of the unmanned aerial vehicle 100 at time t.
  • the processing unit 510B can derive (for example, calculate) the position of the unmanned aircraft 100 (the acceleration-estimated position) (an example of the second measurement position) based on the acceleration measured by the inertial measurement device 250 (an example of the acceleration measuring device) included in the unmanned aircraft 100.
  • the processing unit 510B may calculate an error score based on the acceleration measurement position.
  • the flight control device 800B can estimate the position of the unmanned aircraft 100 in consideration of the measurement result (acceleration) of the inertial measurement device 250 included in the unmanned aircraft 100, and can improve the position estimation accuracy.
  • the processing unit 510B of the processing back end 500B considers at least one of the GPS position information, the physical estimated position, and the acceleration-estimated position to optimize the reprojection error so that, for example, the value of the argmin function of equation (6) is at or below the threshold th3 (for example, minimized), and estimates the position of the unmanned aircraft 100.
  • the communication unit 505B transmits the estimated position of the unmanned aircraft 100 optimized by the processing unit 510B to the PID control device 700.
  • the PID control device 700 generates flight parameters of the unmanned aircraft 100 based on the optimized estimated position of the unmanned aircraft 100 and sends them to the unmanned aircraft 100.
  • the operation of the unmanned aerial vehicle 100 is the same as in Embodiments 1 and 2.
  • the flight control devices 800, 800A, and 800B in the above-described embodiments are configured as devices separate from the cameras 300 and the unmanned aircraft 100, but at least part of their configuration may be constituted by the camera 300, the unmanned aircraft 100, or a terminal or server other than the camera 300 and the unmanned aircraft 100.
  • the terminal may be, for example, a terminal capable of operating an unmanned aircraft.
  • the server may be a computer connected to a network so as to be able to communicate with the cameras 300 and the unmanned aerial vehicle 100.
  • the tracking front end 400 may be provided in any one of the camera 300, the unmanned aircraft 100, the terminal, and the server.
  • the processing backends 500, 500A, 500B may be provided in any of the camera 300, the unmanned aerial vehicle 100, the terminal, and the server.
  • the PID control device 700 may be provided in any one of the camera 300, the unmanned aerial vehicle 100, the terminal, and the server.
  • the IMU front end 600 may be provided in any one of the camera 300, the unmanned aerial vehicle 100, the terminal, and the server.
  • Each processing unit in the above-described embodiments realizes various functions by, for example, a processor executing a program stored in each storage unit.
  • the processor may include an MPU (Micro Processing Unit), a CPU (Central Processing Unit), a DSP (Digital Signal Processor), a GPU (Graphical Processing Unit), etc.
  • Each processing unit controls each unit in the device.
  • Each processing unit executes various processes.
  • Each storage unit in each of the above-described embodiments includes a main storage device (for example, RAM (Random Access Memory) or ROM (Read Only Memory)).
  • Each storage unit may include a secondary storage device (for example, an HDD (Hard Disk Drive) or SSD (Solid State Drive)) and a tertiary storage device (for example, an optical disc or SD card).
  • Each storage unit may include other storage devices.
  • Each storage unit stores various data, information, and programs.


Abstract

It is desired that the position of a flying body can be measured even when the flying body has no positioning function or when the positioning accuracy is low. The position estimation device includes a processing unit that performs processing related to estimating the position of the flying body. The processing unit acquires a plurality of captured images obtained by imaging the flying body with a plurality of imaging devices, acquires information on the arrangement position and the imaging orientation of each imaging device, calculates an error score representing the error between each assumed position and the position of the flying body based on the arrangement position of each imaging device, the imaging orientation of each imaging device, the assumed positions at which the flying body is assumed to exist in real space, and the image position at which the flying body exists in each captured image, and estimates the position of the flying body in real space based on the error score.

Description

Position estimation device, position estimation method, program, and recording medium 【Technical Field】
The present disclosure relates to a position estimation device, a position estimation method, a program, and a recording medium for estimating the position where a flying body exists.
【Background Art】
A conventional unmanned aerial vehicle is known that receives GPS (Global Positioning System) signals transmitted from GPS satellites, calculates its position from the GPS signals, and flies autonomously (see Patent Document 1).
【Prior Art Documents】
【Patent Documents】
【Patent Document 1】Japanese Patent Application Laid-Open No. 2018-147467
【Summary of the Invention】
【Technical Problem to Be Solved by the Invention】
However, in environments such as under bridges or indoors, an unmanned aerial vehicle may be unable to receive GPS signals, and autonomous flight is then difficult. Even when GPS signals cannot be received, the position can be estimated, for example, by integrating the velocity of the unmanned aerial vehicle. However, when the position is estimated by integrating the velocity, errors arise easily (for example, an error of 2 m per 10 m), and the accuracy of the estimated position is not high enough. As a technique for detecting the position of an object, there is also a technique of irradiating an object with light and imaging it to capture its motion (motion capture). However, motion capture requires tracking with dedicated markers, so the scenes in which it can be used are limited. There is also a technique in which a moving body receives a beacon signal emitted by a beacon and detects its own position. However, when beacon signals are used, beacons must be arranged at each site, and likewise the scenes in which they can be used are limited.
【Technical Means for Solving the Problem】
In one aspect, a position estimation device is a position estimation device that estimates the position where a flying body exists, and includes a processing unit that performs processing related to estimating the position of the flying body. The processing unit acquires a plurality of captured images obtained by imaging the flying body with a plurality of imaging devices, acquires information on the arrangement position and the imaging orientation of each imaging device, calculates an error score representing the error between each assumed position and the position of the flying body based on the arrangement position of each imaging device, the imaging orientation of each imaging device, the assumed positions at which the flying body is assumed to exist in real space, and the image position at which the flying body exists in each captured image, and estimates the position of the flying body in real space based on the error score.
The processing unit may acquire a plurality of captured images obtained by imaging the flying body with the plurality of imaging devices at a plurality of times, calculate the error score at the plurality of times, and estimate the position of the flying body in real space based on the error score.
The processing unit may acquire the distance between an imaging device and the flying body, derive a reliability of the error score for the captured image from the distance, and calculate the error score based on the reliability.
The processing unit may calculate the error score based on the difference between each projected assumed position and each image position, where each projected assumed position is obtained by projecting each assumed position at which the flying body is assumed to exist in real space onto each captured image taken at the arrangement position and imaging orientation of each imaging device, and each image position is the position at which the flying body exists in each captured image taken at the arrangement position and imaging orientation of each imaging device, and may estimate the assumed position of the flying body with the smallest error score as the position of the flying body.
The processing unit may acquire a first measurement position of the flying body measured by a positioning unit included in the flying body, and calculate the error score based on the first measurement position.
The positioning unit may receive GPS (Global Positioning System) signals from a plurality of GPS satellites to acquire the first measurement position, and the processing unit may calculate the error score in such a manner that the larger the number of GPS satellites the positioning unit cannot receive, the greater the influence of the first measurement position on the error score.
The processing unit may calculate the error score based on positions to which the flying body can physically move.
The processing unit may derive a second measurement position of the flying body based on an acceleration measured by an acceleration measuring device included in the flying body, and calculate the error score based on the second measurement position.
In one aspect, a position estimation method is a position estimation method for estimating the position where a flying body exists, and includes the steps of: acquiring a plurality of captured images obtained by imaging the flying body with a plurality of imaging devices; acquiring information on the arrangement position and the imaging orientation of each imaging device; calculating an error score representing the error between each assumed position and the position of the flying body based on the arrangement position of each imaging device, the imaging orientation of each imaging device, the assumed positions at which the flying body is assumed to exist in real space, and the image position at which the flying body exists in each captured image; and estimating the position of the flying body in real space based on the error score.
The step of acquiring the plurality of captured images may include a step of acquiring a plurality of captured images obtained by imaging the flying body with the plurality of imaging devices at a plurality of times. The step of calculating the error score may include a step of calculating the error score at the plurality of times. The step of estimating the position of the flying body may include a step of estimating the position of the flying body in real space based on the error score.
The step of calculating the error score may include the steps of: acquiring the distance between an imaging device and the flying body; deriving a reliability of the error score for the captured image from the distance; and calculating the error score based on the reliability.
The step of calculating the error score may include a step of calculating the error score based on the difference between each projected assumed position and each image position, where each projected assumed position is obtained by projecting each assumed position at which the flying body is assumed to exist in real space onto each captured image taken at the arrangement position and imaging orientation of each imaging device, and each image position is the position at which the flying body exists in each captured image taken at the arrangement position and imaging orientation of each imaging device. The step of estimating the position of the flying body may include a step of estimating the assumed position of the flying body with the smallest error score as the position of the flying body.
The step of calculating the error score may include the steps of: acquiring a first measurement position of the flying body measured by a positioning unit included in the flying body; and calculating the error score based on the first measurement position.
The step of acquiring the first measurement position may include a step of receiving GPS signals from a plurality of GPS satellites to acquire the measurement position. The step of calculating the error score may include a step of calculating the error score in such a manner that the larger the number of GPS satellites the positioning unit cannot receive, the greater the influence of the first measurement position on the error score.
The step of calculating the error score may include a step of calculating the error score based on positions to which the flying body can physically move.
The step of calculating the error score may include the steps of: deriving a second measurement position of the flying body based on an acceleration measured by an acceleration measuring device included in the flying body; and calculating the error score based on the second measurement position.
In one aspect, a program is a program for causing a position estimation device that estimates the position where a flying body exists to execute the steps of: acquiring a plurality of captured images obtained by imaging the flying body with a plurality of imaging devices; acquiring information on the arrangement position and the imaging orientation of each imaging device; calculating an error score representing the error between each assumed position and the position of the flying body based on the arrangement position of each imaging device, the imaging orientation of each imaging device, the assumed positions at which the flying body is assumed to exist in real space, and the image position at which the flying body exists in each captured image; and estimating the position of the flying body in real space based on the error score.
In one aspect, a recording medium is a computer-readable recording medium recording a program for causing a position estimation device that estimates the position where a flying body exists to execute the steps of: acquiring a plurality of captured images obtained by imaging the flying body with a plurality of imaging devices; acquiring information on the arrangement position and the imaging orientation of each imaging device; calculating an error score representing the error between each assumed position and the position of the flying body based on the arrangement position of each imaging device, the imaging orientation of each imaging device, the assumed positions at which the flying body is assumed to exist in real space, and the image position at which the flying body exists in each captured image; and estimating the position of the flying body in real space based on the error score.
The above summary does not enumerate all features of the present disclosure. Sub-combinations of these feature groups may also constitute inventions.
【Brief Description of the Drawings】
FIG. 1 is a diagram showing an example of the outline of the unmanned aerial vehicle system in Embodiment 1.
FIG. 2 is a diagram showing the hardware configuration of a flight control device.
FIG. 3 is a diagram showing an example of the specific appearance of an unmanned aircraft.
FIG. 4 is a block diagram showing an example of the hardware configuration of the unmanned aircraft.
FIG. 5 is a diagram explaining the reprojection error.
FIG. 6 is a diagram showing the hardware configuration of the flight control device in Embodiment 2.
FIG. 7 is a diagram showing the hardware configuration of the flight control device in Embodiment 3.
【Detailed Description】
Hereinafter, the present disclosure will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Not all combinations of the features described in the embodiments are necessarily essential to the solution of the invention.
The claims, the description, the drawings, and the abstract contain matter subject to copyright protection. The copyright owner will not object to reproduction of these documents as they appear in the files or records of the Patent Office. Otherwise, all copyrights are reserved.
In the following embodiments, an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle) is taken as an example of the flying body. An unmanned aerial vehicle includes an aircraft that moves in the air. In the drawings of this specification, the unmanned aerial vehicle is also denoted as "UAV". The position estimation device is, for example, a PC (Personal Computer), a server, a terminal, or various processing or control devices. The position estimation method defines the operations of the position estimation device. A recording medium records a program (for example, a program that causes the position estimation device to execute various processes).
(Embodiment 1)
FIG. 1 is a diagram showing an example of the outline of the unmanned aerial vehicle system 5 in Embodiment 1. The unmanned aerial vehicle system 5 can be used when the unmanned aircraft 100 flies autonomously in places where it cannot receive, or has difficulty receiving, GPS signals, for example under a bridge or indoors. The unmanned aerial vehicle system 5 is also used when the unmanned aircraft 100 flies autonomously without including a GPS receiver. The unmanned aircraft 100 is an example of the flying body. The unmanned aerial vehicle system 5 includes a plurality of cameras 300, a flight control device 800, and the unmanned aircraft 100. The camera 300 is an example of the imaging device. The flight control device 800 is an example of the position estimation device.
The plurality of cameras 300 are installed at different places on the ground or on the water and photograph the unmanned aircraft 100 from various directions. The cameras 300 may be fixed cameras, or cameras whose imaging position and imaging direction can be changed freely. The cameras 300 may be installed at the same place or area, or at different places or areas. Each camera 300 may have a processing unit, a communication unit, and a storage unit.
The flight control device 800 performs processing for tracking the unmanned aircraft 100 based on the images captured by the plurality of cameras 300. In addition, to derive the position at which the unmanned aircraft 100 is estimated to exist, the flight control device 800 performs optimization processing so that the error score representing the error between each assumed position at which the unmanned aircraft 100 is assumed to exist and the actual position of the unmanned aircraft 100 is minimized. FIG. 1 assumes a situation in which the plurality of cameras 300 photograph the unmanned aircraft 100 during autonomous flight. The unmanned aircraft 100 is not limited to one; there may be more than one.
FIG. 2 is a diagram showing the hardware configuration of the flight control device 800. The flight control device 800 includes a tracking front end 400, a processing back end 500, and a PID (Proportional-Integral-Differential) control device 700. At least one of the processing unit of the tracking front end 400 and the processing unit of the processing back end 500 is an example of the processing unit of the position estimation device. The tracking front end 400, the processing back end 500, and the PID control device 700 may be provided in one device or distributed over a plurality of devices. The whole or a part of the flight control device 800 may be a PC, a server, a terminal, or various processing or control devices. The units of the flight control device 800 (the tracking front end 400, the processing back end 500, and the PID control device 700) may be installed at the same place or area, or at different places or areas.
The tracking front end 400 acquires the pixel position on the frame (in the captured image) of the unmanned aircraft 100 appearing in each image captured by the plurality of cameras 300, and sends it to the processing back end 500. By detecting the pixel position of the unmanned aircraft 100 in time series, the tracking front end 400 can track the motion of the unmanned aircraft 100 in the captured images. The tracking front end 400 also sends information on the posture of each camera 300 to the processing back end 500. The posture of each camera 300 can be defined by its arrangement position and imaging orientation. The tracking front end 400 has a communication unit 405, a processing unit 410, and a storage unit 420. When the posture of each camera 300 is fixed, it may be sent to the processing back end 500 once; when the posture is variable, it may be sent successively.
The processing unit 410 acquires, as observation positions, the pixel positions on the frame of the unmanned aircraft 100 appearing in the images captured by the plurality of cameras 300 at the same point in time. The processing unit 410 may acquire the pixel position of the unmanned aircraft 100 at a frequency of 30 fps or 60 fps. The processing unit 410 also acquires information on the posture of each camera 300 from each camera 300, an external server, or the like. When the posture of a camera 300 is fixed, the processing unit 410 may store the posture information of the camera 300 in the storage unit 420.
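For illustration, the following is a minimal sketch (in Python) of how an observation position might be derived from one frame. The patent does not specify a detection algorithm; the brightness-threshold blob detector and the function name observe_pixel_position are illustrative assumptions only.
【Reference code (Python, illustrative sketch)】
    import numpy as np

    def observe_pixel_position(frame):
        """Derive an observation position o_jt from one grayscale frame.

        Illustrative stand-in detector: threshold the brightest pixels and
        return their centroid, or None when the unmanned aircraft is not
        visible in the frame.
        """
        mask = frame > frame.mean() + 3 * frame.std()   # crude blob detection
        ys, xs = np.nonzero(mask)
        if len(xs) == 0:
            return None                                 # aircraft not in frame
        return np.array([xs.mean(), ys.mean()])         # pixel position (u, v)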
The communication unit 405 communicates with the plurality of cameras 300 and the processing back end 500. A dedicated line, wired LAN (Local Area Network), wireless LAN, mobile communication, or the like is used as the communication method. The communication unit 405 receives captured images from the plurality of cameras 300. The communication unit 405 sends the observation positions of the unmanned aircraft 100 appearing in the images captured by the plurality of cameras 300 to the processing back end 500. The communication unit 405 also sends the posture information of the plurality of cameras 300 to the processing back end 500.
The storage unit 420 can serve as working memory for the processing unit 410.
The processing back end 500 performs optimization processing based on the pixel positions (observation positions) of the unmanned aircraft 100 acquired from the tracking front end 400. The processing back end 500 has a communication unit 505, a processing unit 510, and a storage unit 520. The optimization processing is, for example, processing for minimizing the value of the error score shown in equation (1) described later. Specifically, it is processing for minimizing the difference (reprojection error) between the projected assumed position, which is the position obtained by projecting an assumed position at which the unmanned aircraft 100 is assumed to exist onto a captured image, and the image position (observation position) at which the unmanned aircraft 100 actually appears in the captured image. The assumed position can be varied arbitrarily in three-dimensional space. As a result of the optimization processing, the assumed position of the unmanned aircraft 100 with the smallest cost can be taken as the position at which the unmanned aircraft 100 is estimated to exist (the estimated position). The reprojection error may be totaled over the images captured by the plurality of cameras 300, or over images captured in time series.
The processing unit 510 can estimate the position of the unmanned aircraft 100 based on the pixel positions of the unmanned aircraft 100 in the plurality of captured images and the postures of the plurality of cameras 300. The processing unit 510 can estimate the position of the unmanned aircraft 100 based on the projected assumed positions and observation positions for the plurality of captured images and the postures of the plurality of cameras 300. That is, the processing unit 510 can estimate the position of the unmanned aircraft 100 by optimizing (minimizing) the reprojection error according to the results of tracking the unmanned aircraft 100 with the cameras 300.
The communication unit 505 communicates with the tracking front end 400 and the PID control device 700. A dedicated line, wired LAN, wireless LAN, mobile communication, or the like is used. The communication unit 505 receives the observation positions and the posture information of each camera 300 from the tracking front end 400. The communication unit 505 sends the information of the optimized estimated position of the unmanned aircraft 100 to the PID control device 700. The posture information of each camera 300 need not be acquired directly from each camera 300; for example, it may be stored in advance in the storage unit 520 or acquired from an external server.
The storage unit 520 can serve as working memory for the processing unit 510.
The PID control device 700 performs PID (P: Proportional, I: Integral, D: Differential) control for flying the unmanned aircraft 100 along a flight path based on the information of the estimated position of the unmanned aircraft 100. The flight path may be a predetermined flight path. The PID control device 700 may generate at least part of the flight parameter information for the flight of the unmanned aircraft 100 and send it to the unmanned aircraft 100. Examples of flight parameters include the flight position, flight altitude, flight speed, flight acceleration (for example, acceleration in the three axial directions of front-back, left-right, and up-down) of the unmanned aircraft 100, and the pitch angle, yaw angle, and roll angle representing the orientation of the airframe. The PID control device 700 has a communication unit 705, a processing unit 710, and a storage unit 720. The flight parameters may be data for filling the difference between the target state (for example, the target flight position, flight altitude, flight speed, flight acceleration, pitch angle, yaw angle, and roll angle of the unmanned aircraft 100) and the actual state (for example, the currently estimated flight position, flight altitude, flight speed, flight acceleration, pitch angle, yaw angle, and roll angle of the unmanned aircraft 100).
In order to bring the optimized estimated position of the unmanned aircraft 100 close to the target position along the flight path, the processing unit 710 can perform PID control and generate the flight parameters.
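For illustration, the following is a minimal sketch of the position loop the PID control device 700 might run: a per-axis PID over the difference between the target position on the flight path and the optimized estimated position. The gains, the 0.1 s cycle, and the velocity-command output are illustrative assumptions; the patent does not fix the form of the flight parameters.
【Reference code (Python, illustrative sketch)】
    import numpy as np

    class AxisPID:
        """Per-axis PID controller; the gains here are assumed values."""
        def __init__(self, kp=1.0, ki=0.1, kd=0.5):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_err = 0.0

        def step(self, err, dt):
            self.integral += err * dt
            deriv = (err - self.prev_err) / dt
            self.prev_err = err
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    def flight_parameters(target_pos, estimated_pos, pids, dt=0.1):
        """One control cycle: turn the difference between the target position
        and the optimized estimated position into a per-axis velocity command
        (one possible form of the flight parameters)."""
        err = np.asarray(target_pos) - np.asarray(estimated_pos)
        return np.array([pid.step(e, dt) for pid, e in zip(pids, err)])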
The communication unit 705 communicates with the processing back end 500 and the unmanned aircraft 100. The communication unit 705 sends the flight parameters generated by the processing unit 710 to the unmanned aircraft 100. A dedicated line, wired LAN, wireless LAN, mobile communication, or the like is used as the communication method with the processing back end 500 and the unmanned aircraft 100.
The storage unit 720 can serve as working memory for the processing unit 710. The storage unit 720 can store data such as the flight path, data on the target state related to the flight of the unmanned aircraft 100, and data on the actual state. These data can be acquired via the communication unit 705, for example from the unmanned aircraft 100 or from a terminal that directs the flight control of the unmanned aircraft 100.
FIG. 3 is a diagram showing an example of the specific appearance of the unmanned aircraft 100. FIG. 3 shows a perspective view of the unmanned aircraft 100 flying in the movement direction STV0. The unmanned aircraft 100 is an example of a moving body.
As shown in FIG. 3, a roll axis (see the x-axis) is set in the direction parallel to the ground and along the movement direction STV0. In this case, a pitch axis (see the y-axis) is set in the direction parallel to the ground and perpendicular to the roll axis, and a yaw axis (see the z-axis) is set in the direction perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.
The unmanned aircraft 100 includes a UAV main body 102, a gimbal 200, an imaging unit 220, and a plurality of imaging units 230.
The UAV main body 102 includes a plurality of rotors (propellers). The UAV main body 102 makes the unmanned aircraft 100 fly by controlling the rotation of the rotors. The UAV main body 102 uses, for example, four rotors to fly the unmanned aircraft 100. The number of rotors is not limited to four. The unmanned aircraft 100 may also be a fixed-wing aircraft without rotors.
The imaging unit 220 is an imaging camera that photographs subjects within an intended imaging range (for example, aerial scenes, landscapes such as mountains and rivers, and buildings on the ground).
The plurality of imaging units 230 are sensing cameras that photograph the surroundings of the unmanned aircraft 100 in order to control its flight. Two imaging units 230 may be provided on the nose, that is, the front, of the unmanned aircraft 100, and two other imaging units 230 may be provided on the bottom surface. The two front imaging units 230 may be paired to function as a so-called stereo camera, and the two bottom imaging units 230 may also be paired to function as a stereo camera. Three-dimensional spatial data around the unmanned aircraft 100 can be generated based on the images captured by the imaging units 230. The number of imaging units 230 included in the unmanned aircraft 100 is not limited to four; the unmanned aircraft 100 only needs to include at least one imaging unit 230. The unmanned aircraft 100 may include at least one imaging unit 230 on each of the nose, tail, sides, bottom, and top. The angle of view settable in the imaging units 230 may be wider than that settable in the imaging unit 220. The imaging units 230 may have single-focus lenses or fisheye lenses.
FIG. 4 is a block diagram showing an example of the hardware configuration of the unmanned aircraft 100. The unmanned aircraft 100 includes a UAV control unit 110, a communication interface 150, a memory 160, a storage 170, a gimbal 200, a rotor mechanism 210, an imaging unit 220, imaging units 230, a GPS receiver 240, an inertial measurement unit (IMU) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser rangefinder 290. The GPS receiver 240 is an example of the positioning unit.
Here, it is assumed that the unmanned aircraft 100 includes the GPS receiver 240 but the accuracy of the GPS signals received by the GPS receiver 240 is low. Alternatively, it can be assumed that the unmanned aircraft 100 does not include the GPS receiver 240 and cannot acquire GPS signals at all.
The UAV control unit 110 is constituted by, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor). The UAV control unit 110 executes signal processing for overall control of the operations of the units of the unmanned aircraft 100, input/output processing of data with the other units, data arithmetic processing, and data storage processing.
The UAV control unit 110 controls the flight of the unmanned aircraft 100 according to a program stored in the memory 160. The UAV control unit 110 can capture aerial images. The UAV control unit 110 can acquire flight parameter information from the PID control device 700 via the communication interface 150, and can control the flight of the unmanned aircraft 100 based on the acquired flight parameters.
The UAV control unit 110 acquires position information indicating the position of the unmanned aircraft 100. It can acquire, from the GPS receiver 240, position information indicating the latitude, longitude, and altitude at which the unmanned aircraft 100 is located. It can also acquire, as position information, latitude-longitude information from the GPS receiver 240 together with altitude information indicating the altitude of the unmanned aircraft 100 from the barometric altimeter 270. It can acquire the distance between the emission point of the ultrasonic waves generated by the ultrasonic sensor 280 and their reflection point as altitude information.
The UAV control unit 110 can acquire orientation information indicating the orientation of the unmanned aircraft 100 from the magnetic compass 260. The orientation information can be represented, for example, by a bearing corresponding to the orientation of the nose of the unmanned aircraft 100.
The UAV control unit 110 can acquire position information indicating the position where the unmanned aircraft 100 should be when the imaging unit 220 photographs the intended imaging range. It can acquire this position information from the memory 160, or from another device via the communication interface 150. It can refer to a three-dimensional map database to identify positions where the unmanned aircraft 100 can exist, and acquire such a position as position information indicating where the unmanned aircraft 100 should be. The UAV control unit 110 can send position information indicating where the unmanned aircraft 100 should be to the flight control device 800 (for example, the processing back end 500) via the communication interface 150. This position information may be included in the information of the predetermined flight path.
The UAV control unit 110 can acquire imaging range information indicating the respective imaging ranges of the imaging unit 220 and the imaging units 230. It can acquire, as a parameter for determining an imaging range, angle-of-view information indicating the angles of view of the imaging unit 220 and the imaging units 230 from those units. It can acquire, as a parameter for determining an imaging range, information indicating the imaging directions of the imaging unit 220 and the imaging units 230. It can acquire, for example, posture information of the imaging unit 220 from the gimbal 200 as information indicating the imaging direction of the imaging unit 220. The posture information of the imaging unit 220 can represent the angles by which the pitch axis and the yaw axis of the gimbal 200 are rotated from reference rotation angles.
The UAV control unit 110 can acquire estimated position information indicating the estimated position of the unmanned aircraft 100 as a parameter for determining the imaging range. Based on the angles of view and imaging directions of the imaging unit 220 and the imaging units 230 and the estimated position of the unmanned aircraft 100, the UAV control unit 110 can delimit the imaging range representing the geographic range photographed by the imaging unit 220, generate imaging range information, and thereby acquire the imaging range information.
The UAV control unit 110 can acquire the imaging range information from the memory 160, or via the communication interface 150.
The UAV control unit 110 controls the gimbal 200, the rotor mechanism 210, the imaging unit 220, and the imaging units 230. It can control the imaging range of the imaging unit 220 by changing the imaging direction or angle of view of the imaging unit 220. It can control the imaging range of the imaging unit 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
The imaging range refers to the geographic range photographed by the imaging unit 220 or an imaging unit 230. The imaging range is defined by latitude, longitude, and altitude. It may be a range of three-dimensional spatial data defined by latitude, longitude, and altitude, or a range of two-dimensional spatial data defined by latitude and longitude. The imaging range can be specified based on the angle of view and imaging direction of the imaging unit 220 or imaging unit 230 and the position of the unmanned aircraft 100. The imaging directions of the imaging units 220 and 230 can be defined by the bearing and depression angle faced by the front of the imaging lens. The imaging direction of the imaging unit 220 can be a direction determined by the bearing of the nose of the unmanned aircraft 100 and the posture state of the imaging unit 220 with respect to the gimbal 200. The imaging direction of an imaging unit 230 can be a direction determined by the bearing of the nose and the position where that imaging unit 230 is provided. The imaging direction can coincide with the imaging orientation.
The UAV control unit 110 can identify the environment around the unmanned aircraft 100 by analyzing the images captured by the imaging units 230. It can control flight according to the surrounding environment, for example by avoiding obstacles.
The UAV control unit 110 can acquire stereoscopic information (three-dimensional information) indicating the stereoscopic shapes (three-dimensional shapes) of objects existing around the unmanned aircraft 100. The objects can be, for example, parts of the landscape such as buildings, roads, vehicles, and trees. The stereoscopic information is, for example, three-dimensional spatial data. The UAV control unit 110 can acquire the stereoscopic information by generating it from the images obtained from the imaging units 230, by referring to a three-dimensional map database stored in the memory 160 or the storage 170, or by referring to a three-dimensional map database managed by a server on a network.
The UAV control unit 110 controls the flight of the unmanned aircraft 100 by controlling the rotor mechanism 210. That is, it controls the position, including the latitude, longitude, and altitude, of the unmanned aircraft 100 by controlling the rotor mechanism 210. It can control the imaging range of the imaging unit 220 by controlling the flight of the unmanned aircraft 100. It can control the angle of view of the imaging unit 220 by controlling the zoom lens included in the imaging unit 220, or by digital zoom using the digital zoom function of the imaging unit 220.
When the imaging unit 220 is fixed to the unmanned aircraft 100 and cannot be moved, the UAV control unit 110 can make the imaging unit 220 photograph the desired imaging range in the desired environment by moving the unmanned aircraft 100 to a specific position on a specific date. Alternatively, even when the imaging unit 220 has no zoom function and its angle of view cannot be changed, the same can be achieved by moving the unmanned aircraft 100 to a specific position on a specific date.
The communication interface 150 communicates with other communication devices (for example, a terminal or a transmitter (remote controller) that directs the flight control of the unmanned aircraft 100). The communication interface 150 can perform wireless communication by any wireless communication method, or wired communication by any wired communication method. It can send aerial images and additional information (metadata) related to them to the terminal. It can communicate with at least one device included in the flight control device 800 (for example, the processing back end 500 or the PID control device 700).
The memory 160 stores the programs and the like necessary for the UAV control unit 110 to control the gimbal 200, the rotor mechanism 210, the imaging unit 220, the imaging units 230, the GPS receiver 240, the inertial measurement unit 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser rangefinder 290. The memory 160 may be a computer-readable recording medium and may include at least one flash memory such as SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), EPROM (Erasable Programmable Read Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), or USB (Universal Serial Bus) memory. The memory 160 can be detached from the unmanned aircraft 100. It can operate as working memory.
The storage 170 may include at least one of an HDD (Hard Disk Drive), SSD (Solid State Drive), SD card, USB memory, and other storage. The storage 170 can store various information and data. It can be detached from the unmanned aircraft 100. It can record aerial images.
The memory 160 or the storage 170 can store information on aerial photography positions and aerial photography paths (flight paths) generated by the terminal or the unmanned aircraft 100. This information can be set by the UAV control unit 110 as one of the aerial photography parameters for planned aerial photography by the unmanned aircraft 100 or the flight parameters for planned flight of the unmanned aircraft 100. The setting information may be stored in the memory 160 or the storage 170.
The gimbal 200 can rotatably support the imaging unit 220 about the yaw axis, the pitch axis, and the roll axis. The gimbal 200 can change the imaging direction of the imaging unit 220 by rotating it about at least one of the yaw axis, the pitch axis, and the roll axis.
The rotor mechanism 210 has a plurality of rotors and a plurality of drive motors that rotate them. The rotor mechanism 210 makes the unmanned aircraft 100 fly under the rotation control of the UAV control unit 110.
The imaging unit 220 photographs subjects within the desired imaging range and generates captured-image data. The image data captured by the imaging unit 220 (for example aerial images) can be stored in the memory of the imaging unit 220 or in the storage 170.
The imaging units 230 photograph the surroundings of the unmanned aircraft 100 and generate captured-image data. Their image data can be stored in the storage 170.
The GPS receiver 240 receives a plurality of signals (GPS signals) indicating the times transmitted from a plurality of navigation satellites (GPS satellites) and the positions (coordinates) of those GPS satellites. The GPS receiver 240 calculates its own position (that is, the position of the unmanned aircraft 100) from the received signals. The GPS receiver 240 outputs the position information of the unmanned aircraft 100 to the UAV control unit 110. The calculation of the position information of the GPS receiver 240 may also be performed by the UAV control unit 110 instead of the GPS receiver 240; in that case, the information indicating the times and the positions of the GPS satellites contained in the signals received by the GPS receiver 240 is input to the UAV control unit 110.
The inertial measurement unit 250 detects the posture of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110. As the posture of the unmanned aircraft 100, it can detect the accelerations in the three axial directions of front-back, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw.
The magnetic compass 260 detects the bearing of the nose of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
The barometric altimeter 270 detects the flight altitude of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
The ultrasonic sensor 280 emits ultrasonic waves, detects the ultrasonic waves reflected from the ground or objects, and outputs the detection results to the UAV control unit 110. A detection result can indicate the distance from the unmanned aircraft 100 to the ground, that is, the altitude, or the distance from the unmanned aircraft 100 to an object (subject).
The laser rangefinder 290 irradiates objects with laser light, receives the light reflected from the objects, and measures the distance between the unmanned aircraft 100 and the object (subject) using the reflected light. An example of a laser-based distance measurement method is the time-of-flight method.
FIG. 4 illustrates the case where the unmanned aircraft 100 includes the GPS receiver 240, but the unmanned aircraft 100 may not include the GPS receiver 240. Even in that case, regardless of whether the GPS receiver 240 is present, the unmanned aircraft 100 can perform flight control based on its estimated position.
Next, the operation of assisting the autonomous flight of the unmanned aircraft 100 in the unmanned aerial vehicle system 5 is shown. Here, a situation is assumed in which the unmanned aircraft 100 flies in a space where it is difficult to receive GPS signals, for example under a bridge or indoors. That is, the positioning accuracy of the unmanned aircraft 100 can be assumed to be low. Alternatively, the unmanned aircraft 100 may not include the GPS receiver 240; that is, the unmanned aircraft 100 may be assumed to have no positioning function. The plurality of cameras 300 are arranged at various places, such as on the ground, from which the unmanned aircraft 100 can be photographed.
FIG. 5 is a diagram explaining the reprojection error. FIG. 5 shows the actual position of the unmanned aircraft 100, the assumed position xt (also written x_t) of the unmanned aircraft 100, the projected assumed positions π(cj, xt) on the projection planes of the images captured by the plurality of cameras 300, and the observation positions ojt. Here j and t are variables, and the assumed position xt is variable.
The plurality of cameras 300 each photograph the unmanned aircraft 100, for example while it flies overhead. In the image captured by a single camera, the pixel position of the unmanned aircraft 100 on the frame can be recognized in the plane, but it is difficult to recognize the position in the depth direction relative to the image plane (projection plane) of the captured image. Therefore, a plurality of cameras 300 capable of photographing the unmanned aircraft 100 from different angles are provided.
FIG. 5 shows the frame GM1 of the image captured by the camera 300 with camera pose cj (also written c_j) and the frame GM2 of the image captured by the camera 300 with camera pose cj-1. In the frame GM1, there are the observation position ojt (also written o_jt), which is the pixel position obtained by observing the unmanned aircraft 100, and the projected assumed position π(cj, xt), which is obtained by projecting the assumed position xt of the unmanned aircraft 100. The observation position ojt and the projected assumed position π(cj, xt) do not always coincide, and the difference between them constitutes a reprojection error. Similarly, in the frame GM2 there are the observation position o(j-1)t of the unmanned aircraft 100 and the projected assumed position π(cj-1, xt) obtained by projecting the assumed position xt. The observation position o(j-1)t and the projected assumed position π(cj-1, xt) do not always coincide, producing a reprojection error. The observation positions ojt and o(j-1)t are the projected positions of actual observations and are fixed, while the projected assumed positions π(cj, xt) and π(cj-1, xt) are variable. The processing back end 500 optimizes the reprojection error so that these reprojection errors become small, thereby optimizing the estimated position of the unmanned aircraft 100. Here j is an example of camera identification information, and t is an example of a time.
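For illustration, the following is a minimal sketch of the projection function π(cj, xt). The pinhole model with an intrinsic matrix K and a world-to-camera rotation/translation (R, t) is an illustrative assumption; the patent does not fix a camera model.
【Reference code (Python, illustrative sketch)】
    import numpy as np

    def project(K, R, t, x):
        """pi(c_j, x_t): project a 3D assumed position onto a camera frame.

        The camera pose c_j is modeled (by assumption) as a world-to-camera
        rotation R (3x3) and translation t (3,), with intrinsic matrix K.
        """
        p_cam = R @ np.asarray(x) + t       # world -> camera coordinates
        u = K @ p_cam                       # homogeneous pixel coordinates
        return u[:2] / u[2]                 # perspective division -> (u, v)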
The communication unit 405 of the tracking front end 400 can receive the images captured by the plurality of cameras 300 and store them in the storage unit 420. The communication unit 405 can send the images captured by the plurality of cameras 300, the observation positions ojt, o(j-1)t, ... of the unmanned aircraft 100, and the information of the camera poses cj, cj-1, ... to the processing back end 500. The communication unit 505 of the processing back end 500 can receive the captured images, the observation positions of the unmanned aircraft 100, and the camera pose information, and store them in the storage unit 520. The communication unit 405 may also not send the captured images to the processing back end 500.
The communication unit 505 of the processing back end 500 receives the information of the observation positions ojt, o(j-1)t, ... of the unmanned aircraft 100 and of the camera poses cj, cj-1, .... The communication unit 505 may also receive the captured images. The processing unit 510 assumes the three-dimensional position of the unmanned aircraft 100 at time t at various positions to obtain assumed positions xt, and derives (for example, calculates) the projected assumed positions π(cj, xt), π(cj-1, xt), ... obtained by projecting an assumed position xt onto the image planes (projection planes) of the camera poses cj, cj-1, .... For example, the processing unit 510 can calculate the projected assumed position for a camera 300 based on the assumed position, as a three-dimensional position, and the camera pose. Using the observation positions ojt, o(j-1)t, ... and the projected assumed positions π(cj, xt), π(cj-1, xt), ..., the processing unit 510 performs optimization according to equation (1), optimizes the assumed position of the unmanned aircraft 100, and estimates the position of the unmanned aircraft 100.
【Equation 1】
$\operatorname{argmin}_{x} \sum_{t}\sum_{j} w_{jt}\, p\big(\pi(c_j, x_t) - o_{jt}\big)$ ......(1)
The argmin function shown in equation (1) is a function that returns the argument x for which the function value is smallest. The value of the function inside argmin represents the error score expressing the error between each assumed position xt and the position of the unmanned aircraft 100, and the function serves to minimize the reprojection error. cj denotes the camera pose. The camera pose can be determined by the imaging position and the imaging direction (orientation). j (j = 1, ..., n) can be the camera number identifying the plurality of cameras 300. xt denotes the three-dimensional position of the unmanned aircraft 100 in real space at time t, assumed at various positions. t denotes time. On the frame (imaging plane, projection plane) of the camera 300 with camera pose cj, the projected assumed position π(cj, xt) of the unmanned aircraft 100 and the observation position ojt of the unmanned aircraft 100 photographed by the camera 300 with camera pose cj are projected. π(cj, xt) denotes the projected assumed position obtained by projecting the unmanned aircraft 100, assumed at position xt, onto the frame of the camera 300 with camera pose cj. ojt denotes the observation position, that is, the pixel position of the unmanned aircraft 100 in the frame of the image captured at time t by the camera 300 with camera pose cj. p(π(cj, xt) - ojt) is a function that accumulates the reprojection error, which is the difference between the projected assumed position and the observation position. Therefore, according to equation (1), the processing unit 510 can search, while varying the assumed position xt, for the assumed position x that minimizes the reprojection error, and thereby estimate the position of the unmanned aircraft 100.
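For illustration, the following is a minimal sketch of the optimization in equation (1), reusing the project() sketch above. The squared-norm choice for p(·), the observation dictionary keyed by (j, t), and the use of scipy's Nelder-Mead minimizer are all illustrative assumptions, not details fixed by the patent.
【Reference code (Python, illustrative sketch)】
    import numpy as np
    from scipy.optimize import minimize

    def error_score(X_flat, cams, obs, w, T):
        """Equation (1): sum over t and j of w_jt * p(pi(c_j, x_t) - o_jt),
        with p taken here as the squared norm (an assumed choice)."""
        X = X_flat.reshape(T, 3)             # assumed positions x_t
        score = 0.0
        for t in range(T):
            for j, (K, R, tr) in enumerate(cams):
                o_jt = obs.get((j, t))
                if o_jt is None:
                    continue                 # aircraft not observed: skip
                r = project(K, R, tr, X[t]) - o_jt
                score += w.get((j, t), 1.0) * float(r @ r)
        return score

    def estimate_positions(x0, cams, obs, w):
        """argmin over the assumed positions x_t (x0: (T, 3) initial guess)."""
        T = x0.shape[0]
        res = minimize(error_score, x0.ravel(), args=(cams, obs, w, T),
                       method="Nelder-Mead")
        return res.x.reshape(T, 3)           # estimated positions of the UAV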
In FIG. 5, the projected assumed position π(cj, xt) and the observation position ojt are shown in the frame of the camera 300 with camera pose cj. Similarly, the projected assumed position π(cj-1, xt) and the observation position o(j-1)t are shown in the frame of the camera 300 with camera pose cj-1.
wjt (also written w_jt) is a coefficient representing reliability. The reliability wjt is calculated according to equation (2). The reliability may also be omitted from equation (1). Equation (2) illustrates the case where the reliability is determined by the distance between the camera 300 and the unmanned aircraft 100.
【Equation 2】
$w_{jt} = f(\omega_{jt}, d_s, d)$ ......(2)
(The exact formula is rendered only as an image in the source; per the surrounding text, it determines the reliability w_jt from the fixed values ω_jt and d_s and the camera-to-aircraft distance d, with w_jt increasing as d decreases.)
Here, ωjt (also written ω_jt) is a fixed value determined by the camera number j and the time t, and can be a value for adjusting the range of the values derived by equations (1) and (2). ds is a fixed value. d denotes the distance between the unmanned aircraft 100 and the camera 300. For example, the camera 300 can identify the unmanned aircraft 100 appearing in a captured image by image recognition or the like, and derive (for example, calculate) the distance to the unmanned aircraft 100 from its size. Alternatively, the camera 300 may have a ranging sensor and measure the distance to the unmanned aircraft 100 with it. The shorter the distance d, the higher the reliability wjt. When the unmanned aircraft 100 does not appear in a captured image, the reliability wjt can be set to the value 0, so that the assumed position xt of the unmanned aircraft 100 in that case is not used for position estimation. The derivation of the distance d may also be performed by the tracking front end 400 or the processing back end 500. For example, the processing unit 510 of the processing back end 500 can derive (for example, calculate) the distance d based on the distance between the assumed position xt and each camera 300. The information of the distance d can be notified to the processing back end 500 from the camera 300 or the tracking front end 400.
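For illustration only: since equation (2) is shown only as an image in the source, the weight function below is a guessed stand-in that merely satisfies the stated behavior (fixed values ω_jt and d_s, reliability rising as d shrinks, and 0 when the aircraft is not captured); it is not the patent's formula.
【Reference code (Python, illustrative sketch)】
    def reliability(omega_jt, d_s, d, visible=True):
        """Hypothetical stand-in for equation (2): higher weight at short
        range, using the fixed values omega_jt and d_s; 0 when the aircraft
        is not visible in the image."""
        if not visible or d is None:
            return 0.0
        return omega_jt * d_s / (d_s + d)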
In this way, the argmin function is a function that finds, as the estimated position, the assumed position xt that minimizes the value obtained by summing, over all cameras 300 (cameras with camera poses c1 to cn) and over the entire observation time t, the difference (reprojection error) between the projected assumed position π(cj, xt) of the unmanned aircraft 100 and the observation position ojt, multiplied by the reliability wjt and the normalization function p. n can be any value representing the total number of cameras.
Estimating the position of the unmanned aircraft 100 using the argmin function shown in equation (1) is one example of position estimation for the unmanned aircraft 100; the position may also be estimated by other methods. Deriving the reliability wjt according to equation (2) is one example of reliability derivation; the reliability may also be derived by other methods.
The communication unit 505 of the processing back end 500 sends the data of the estimated position of the unmanned aircraft 100 optimized by the processing unit 510 to the PID control device 700. The communication unit 705 of the PID control device 700 receives the data of the estimated position of the unmanned aircraft 100 and stores it in the storage unit 720. The processing unit 710 of the PID control device 700 performs PID control based on the estimated position of the unmanned aircraft 100 to fly the unmanned aircraft 100 along the target flight path stored in the storage unit 720. The communication unit 705 sends the flight parameters obtained by the PID control to the unmanned aircraft 100.
Upon receiving the flight parameters from the PID control device 700, the communication interface 150 of the unmanned aircraft 100 stores them in the memory 160. The UAV control unit 110 controls the rotor mechanism 210 according to the flight parameters, controls the flight of the unmanned aircraft 100, and continues autonomous flight.
In this way, the unmanned aerial vehicle system 5 includes the flight control device 800 (an example of the position estimation device) that estimates the position of the unmanned aircraft 100. The flight control device 800 includes a processing unit (for example, at least one of the processing unit 410 of the tracking front end 400 and the processing unit 510 of the processing back end 500) that performs processing related to estimating the position of the unmanned aircraft 100. The processing unit 410 of the tracking front end 400 can acquire a plurality of captured images obtained by photographing the unmanned aircraft 100 with the plurality of cameras 300. The processing unit 410 of the tracking front end 400 can acquire the information of the posture of each camera 300 (the arrangement position and the imaging orientation of each camera 300). The processing unit 510 of the processing back end 500 can calculate an error score (for example, the value derived by the argmin function) representing the error between each assumed position xt and the position of the unmanned aircraft 100, based on the postures of the cameras 300, the assumed positions at which the unmanned aircraft 100 is assumed to exist in real space, and the observation position of the unmanned aircraft 100 projected in each captured image (an example of the image position where the unmanned aircraft 100 exists). The processing back end 500 can estimate the position of the unmanned aircraft 100 in real space based on the error score. The processing back end 500 alone may also be an example of the position estimation device.
Thereby, by photographing the unmanned aircraft 100 with the plurality of cameras 300 in various postures (arrangement positions, imaging orientations), the flight control device 800 can estimate the position of the unmanned aircraft 100 considering not only the position in the plane of the imaging plane (projection plane) of a captured image but also the position in the depth direction. Therefore, even if the unmanned aircraft 100 cannot receive GPS signals, the position of the unmanned aircraft 100 can be estimated. Accordingly, even when the unmanned aircraft 100 cannot receive GPS signals, the unmanned aircraft 100 can fly autonomously based on its estimated position. Moreover, even when the unmanned aircraft has no GPS-based positioning function or the accuracy of GPS-based positioning is low, the unmanned aircraft 100 can fly autonomously. In addition, the flight control device 800 does not require the unmanned aircraft 100 to detect its acceleration and double-integrate it (integrate twice) to estimate its position, so no error due to double integration occurs, and the position of the unmanned aircraft can be estimated with high accuracy. Furthermore, the flight control device 800 does not need to use motion capture or beacon signals for position estimation, so the places where the unmanned aircraft 100 can be used are prevented from being restricted.
The processing unit 410 of the tracking front end 400 can acquire a plurality of captured images obtained by photographing the unmanned aircraft 100 with the plurality of cameras 300 at a plurality of times t. The processing back end 500 can calculate the error score at the plurality of times t. The processing unit 510 of the processing back end 500 can estimate the position of the unmanned aircraft 100 in real space based on this error score.
Thereby, the flight control device 800 can estimate the position of the unmanned aircraft 100 at a plurality of times t (time points), and can therefore also estimate the position of the unmanned aircraft 100 in consideration of the motion of the unmanned aircraft 100. Accordingly, the accuracy of estimating the position of the unmanned aircraft 100 is improved.
The processing unit 510 of the processing back end 500 can acquire the distance d between a camera 300 and the unmanned aircraft 100. The processing unit 510 can derive, from the distance d, the reliability wjt of the error score for the captured image. The processing unit 510 can calculate the error score based on the reliability wjt.
Thereby, when the distance d from the camera 300 to the unmanned aircraft 100 is long, that is, when the unmanned aircraft 100 is far away, the accuracy of the position estimation of the unmanned aircraft 100 by the processing unit 510 decreases; taking this into account, the processing unit 510 can estimate the position of the unmanned aircraft 100 in consideration of the distance d to the unmanned aircraft 100. For example, the shorter the distance d, the larger the reliability wjt, and the longer the distance d, the smaller the reliability wjt. Therefore, the flight control device 800 can improve the accuracy of estimating the position of the unmanned aircraft 100 using the error score.
In addition, for the images captured at the arrangement position and imaging orientation (camera pose cj) of each camera 300, the processing unit 510 can calculate the error score based on the difference (reprojection error) between each projected assumed position π(cj, xt) and each observation position ojt, where the projected assumed position π(cj, xt) is the position obtained by projecting each assumed position xt, at which the unmanned aircraft 100 is assumed to exist in real space, onto each image captured at the arrangement position and imaging orientation (camera pose cj) of each camera 300, and the observation position is the position at which the unmanned aircraft 100 exists in each image captured at the arrangement position of each camera 300 and the imaging orientation of each imaging device. The processing unit 510 can estimate the assumed position xt of the unmanned aircraft 100 with the smallest error score as the position of the unmanned aircraft 100. For example, the processing unit 510 can optimize the error score according to equation (1).
Thereby, the flight control device 800 can derive the error score based on the difference between the projected assumed position projected onto the image captured by each camera 300 and the image position (observation position). Then, for example, the flight control device 800 can obtain the error score at each time t and estimate the assumed position xt with the smallest error score as the position of the unmanned aircraft 100. Therefore, the processing unit 510 can optimize the estimated position of the unmanned aircraft 100 based on the images captured by the plurality of cameras 300, for example according to equation (1).
(Embodiment 2)
Embodiment 2 shows a case where, in addition to Embodiment 1, the GPS position information detected by the unmanned aircraft 100 is further considered to optimize the estimated position of the unmanned aircraft 100.
FIG. 6 is a diagram showing the hardware configuration of the flight control device 800A in Embodiment 2. The unmanned aerial vehicle system 5A of Embodiment 2 has substantially the same configuration as Embodiment 1. For components identical to those of Embodiment 1, the same reference numerals are used and their description is omitted.
In Embodiment 2, the unmanned aircraft 100 includes the GPS receiver 240 (it is not omitted). The UAV control unit 110 of the unmanned aircraft 100 acquires GPS position information of the unmanned aircraft 100 based on the GPS signals received by the GPS receiver 240. The UAV control unit 110 sends the GPS position information to the flight control device 800A via the communication interface 150.
The communication unit 505A of the processing back end 500A receives the GPS position information from the unmanned aircraft 100. As in Embodiment 1, the communication unit 505A also receives from the tracking front end 400 the images captured by the plurality of cameras 300, the observation positions of the unmanned aircraft 100, and the camera pose information. The captured images may also not be received.
When optimizing the reprojection error, the processing unit 510A of the processing back end 500A considers the GPS position information received via the communication unit 505A. That is, the processing unit 510A can optimize the estimated position of the unmanned aircraft 100 according to equation (3).
The argmin function shown in equation (3) is, like equation (1), a function that returns the argument x minimizing the function value, and includes the term λGG(X, Z) related to the GPS position information. λGG(X, Z) is the score that takes the GPS signals into account.
【Equation 3】
$\operatorname{argmin}_{x}\Big(\sum_{t}\sum_{j} w_{jt}\, p\big(\pi(c_j, x_t) - o_{jt}\big) + \lambda_{G}\, G(X, Z)\Big)$ ......(3)
λG (also written λ_G) is given by equation (4).
$\lambda_G = C_a \times (C_n - N)$ ......(4)
Here, Cn is the total number of GPS satellites. N is the number of GPS satellites used by the GPS receiver 240 for signal reception. Therefore, Cn − N represents the number of GPS satellites not used by the GPS receiver 240 for signal reception. Ca is a coefficient. According to equation (4), the more GPS satellites the GPS receiver 240 uses for signal reception, the smaller the value of λG. λG may also be a value corresponding to the strength of the GPS signal: when the GPS signal strength is high, the value of λG can be reduced, and when the GPS signal strength is low, the value of λG can be increased. In that case, λGG(X, Z) in equation (3) is strongly influenced by the value of G(X, Z), so that the GPS position information can be reflected more. That is, when the GPS signal strength is high, the reliability of the GPS signal is high, so the value of G(X, Z) can strongly influence the value of equation (3).
G(X, Z) is a value corresponding to the sum of the differences between all assumed positions xt of the unmanned aircraft 100 and the GPS position, as shown in equation (5).
【Equation 4】
$G(X, Z) = \sum_{t}\,(x_t - \zeta)^2$ ......(5)
Here, ζ denotes the GPS position (the position detected by the GPS receiver 240). The greater the difference between an assumed position xt and the GPS position ζ, the greater G(X, Z).
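For illustration, the following is a small sketch of the GPS term added to the error score in equation (3), following equations (4) and (5) directly; treating ζ as a 3-vector is an assumption, since the source does not state its dimensionality.
【Reference code (Python, illustrative sketch)】
    import numpy as np

    def gps_term(X, zeta, C_n, N, C_a):
        """lambda_G * G(X, Z) of equation (3).

        lambda_G = C_a * (C_n - N)        ... equation (4)
        G(X, Z)  = sum_t (x_t - zeta)^2   ... equation (5)
        """
        lam_g = C_a * (C_n - N)           # fewer used satellites -> larger weight
        zeta = np.asarray(zeta)
        G = sum(float((np.asarray(x) - zeta) @ (np.asarray(x) - zeta)) for x in X)
        return lam_g * G                  # added to the reprojection score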
In this way, the argmin function optimizes the assumed position xt of the unmanned aircraft 100 so that the value inside the parentheses of the argmin function (inside the () of argmin_x()) — the reprojection error — becomes small. That is, the estimated position of the unmanned aircraft 100 given by the argmin function is the assumed position for which the value inside the parentheses of the argmin function is smallest.
The communication unit 505A sends the estimated position of the unmanned aircraft 100 optimized by the processing unit 510A to the PID control device 700. The PID control device 700 generates the flight parameters of the unmanned aircraft 100 based on the optimized estimated position of the unmanned aircraft 100 and sends them to the unmanned aircraft 100. The operation of the unmanned aircraft 100 is the same as in Embodiment 1.
In the unmanned aerial vehicle system 5A of Embodiment 2, the processing unit 510A of the processing back end 500A considers the GPS position information and optimizes the reprojection error so that, for example, the value of the argmin function of equation (3) is at or below the threshold th2 (for example, minimized), and estimates the position of the unmanned aircraft 100.
In this way, the processing unit 510A of the processing back end 500A can acquire the GPS position (an example of the first measurement position) measured by the GPS receiver 240 (an example of the positioning unit) included in the unmanned aircraft 100. The processing unit 510A can calculate, for example, the error score shown in equation (3) based on the GPS position.
Thereby, the flight control device 800A can take into account the GPS signals ordinarily used for positioning in the unmanned aircraft 100 to calculate the error score and optimize the estimated position of the unmanned aircraft 100. Therefore, even if the accuracy of the GPS receiver 240 included in the unmanned aircraft 100 is low, the flight control device 800A can estimate the position of the unmanned aircraft 100 together with the GPS signals based on the same error score as in Embodiment 1, and can improve the position estimation accuracy. Thus, even when the positioning accuracy of the GPS receiver 240 is low, the flight control device 800A can estimate the position of the unmanned aircraft 100 using the error score and assist the positioning by the GPS receiver 240.
The GPS receiver 240 can receive GPS signals from a plurality of GPS satellites to acquire the information of the GPS position. The processing unit 510A can calculate the error score in such a manner that the more GPS satellites the GPS receiver 240 can receive, the more the GPS position information is reflected, that is, the greater the influence of the GPS signal on the error score. For example, in λG·G(X, Z) of equation (3), the more GPS satellites the unmanned aircraft 100 uses for signal reception, the smaller the value of the coefficient λG becomes, and the term is strongly influenced by the value of G(X, Z), so that the GPS position information can be reflected more.
Thereby, the flight control device 800A can adjust the influence of the GPS signal on the error score according to the reliability of the GPS signal (the strength of the GPS signal), calculate the error score, optimize the reprojection error, and estimate the position of the unmanned aircraft 100. Since the flight control device 800A estimates the position of the unmanned aircraft 100 in consideration of the GPS reception state, the position estimation accuracy of the unmanned aircraft 100 can be further improved. Thereby, the flight control device 800A can correct, based on the reliability of the GPS signal, the result of estimating the position of the unmanned aircraft 100 using the error score of Embodiment 1.
(Embodiment 3)
Embodiment 3 shows a case where, in addition to Embodiment 2, the estimated position of the unmanned aircraft 100 is optimized by further considering estimates of the position of the unmanned aircraft 100 based on a dynamic factor and on the acceleration of the unmanned aircraft 100. The dynamic factor is a factor that takes physical phenomena into account.
FIG. 7 is a diagram showing the hardware configuration of the flight control device 800B in Embodiment 3.
The flight control device 800B includes, in addition to the tracking front end 400, the processing back end 500B, and the PID control device 700 as in Embodiment 1, an IMU front end 600.
The IMU front end 600 has a communication unit 605, a processing unit 610, and a storage unit 620. The processing unit 610 estimates the position of the unmanned aircraft 100, and obtains information on the estimated position, by, for example, double-integrating (integrating twice) the acceleration (IMU data) measured by the inertial measurement unit 250. The position estimated based on the acceleration is also referred to as the acceleration-estimated position.
The communication unit 605 communicates with the unmanned aircraft 100 and the processing back end 500B. A dedicated line, wired LAN, wireless LAN, mobile communication, or the like is used as the communication method with the unmanned aircraft 100 and the processing back end 500B. The communication unit 605 receives, for example, the information of the acceleration measured by the inertial measurement unit 250 from the unmanned aircraft 100. The communication unit 605 sends, for example, the acceleration-estimated position of the unmanned aircraft 100 based on the acceleration to the processing back end 500B. The storage unit 620 serves as working memory for the processing unit 610.
In the IMU front end 600, for example, 100 IMU data samples (acceleration data) are acquired per second. In the tracking front end 400, captured images are acquired at a frequency of, for example, 30 frames (30 fps) or 60 frames (60 fps) per second, and the image position (observation position) where the unmanned aircraft 100 exists is derived. In the processing back end 500B, the processing rate is such that, for example, 10 IMU samples are used per second.
Here, when the processing back end 500B optimizes the reprojection error to estimate the position of the unmanned aircraft 100, it is preferable to match the frequency at which the IMU front end 600 updates the acceleration-estimated position of the unmanned aircraft 100, the frequency at which the tracking front end 400 acquires captured images (the derivation frequency of observation positions), and the frequency at which the processing back end 500B processes the estimated position of the unmanned aircraft 100. In order to process the IMU front end 600, the tracking front end 400, and the processing back end 500B at the same point in time, the processing unit 510B can perform control so as to match the processing frequency of the processing back end 500B (for example, 10 times per second).
For example, the processing unit 510B can, via the communication unit 505 and the communication unit 605, integrate the IMU data to obtain, for example, 10 acceleration-estimated positions per second. The processing unit 510B can, via the communication unit 505 and the communication unit 405, acquire the information of the image positions (observation positions) in, for example, 10 captured images per second.
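For illustration, the following is a minimal sketch of how 100 Hz IMU accelerations might be double-integrated and downsampled to the back end's 10 Hz cycle. The semi-implicit Euler integration and the initial conditions x0, v0 are illustrative assumptions; the patent specifies only double integration.
【Reference code (Python, illustrative sketch)】
    import numpy as np

    def acceleration_positions(acc_100hz, x0, v0, dt=0.01, out_rate=10):
        """Double-integrate 100 Hz accelerations and keep 10 positions/s.

        acc_100hz: (N, 3) accelerations; x0, v0: initial position/velocity
        (numpy arrays). Returns the acceleration-estimated positions.
        """
        x, v = x0.copy(), v0.copy()
        out, step = [], int(100 / out_rate)   # keep every 10th sample
        for i, a in enumerate(acc_100hz):
            v = v + a * dt                    # first integration: velocity
            x = x + v * dt                    # second integration: position
            if (i + 1) % step == 0:
                out.append(x.copy())
        return np.array(out)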
The communication unit 505B of the processing back end 500B receives the GPS position information of the unmanned aircraft 100 from the unmanned aircraft 100 as in Embodiment 2. The UAV control unit 110 of the unmanned aircraft 100 can acquire the GPS position information of the unmanned aircraft 100 based on the GPS signals received by the GPS receiver 240 and send it to the flight control device 800B via the communication interface 150. In this embodiment, the processing back end 500B may also not acquire the GPS position information, and the GPS position information may not be considered in the calculation of the error score.
As in Embodiments 1 and 2, the communication unit 505B of the processing back end 500B can receive from the tracking front end 400 the images captured by the plurality of cameras 300 and the data related to the postures of the cameras 300. The communication unit 505B can receive the information of the acceleration-estimated position from the IMU front end 600. The processing back end 500B may also not acquire the captured images.
The storage unit 520B can store dynamic factors. A dynamic factor can include, for example, data for keeping the estimated position of the unmanned aircraft 100 within the physically movable range according to the equations of motion. A dynamic factor can include, for example, a Kalman filter. When a Kalman filter is used, the processing unit 510B can estimate the next position of the unmanned aircraft 100 based on the current position, velocity, acceleration, and so on of the unmanned aircraft 100.
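For illustration, the following is a minimal sketch of the prediction step of a Kalman filter, one way the dynamic factor could yield the physical estimated position. The constant-velocity state model and the simplistic process-noise term are illustrative assumptions; the patent only says a Kalman filter may be used.
【Reference code (Python, illustrative sketch)】
    import numpy as np

    def kalman_predict(x, P, dt, q=1.0):
        """Prediction step of a constant-velocity Kalman filter.

        x: state [px, py, pz, vx, vy, vz]; P: 6x6 covariance; q: assumed
        process noise. The predicted position x_pred[:3] can serve as the
        physical estimated position (the overline-x_t of equation (7)).
        """
        F = np.eye(6)
        F[:3, 3:] = dt * np.eye(3)            # position += velocity * dt
        Q = q * np.eye(6)                     # simplistic noise model
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        return x_pred, P_pred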
When optimizing the reprojection error to estimate the position of the unmanned aircraft 100, the processing unit 510B of the processing back end 500B considers at least one of the acceleration-estimated position received via the communication unit 505B and the physical estimated position obtained by the dynamic factor. In addition, the processing unit 510B may consider the GPS position information when optimizing the reprojection error to estimate the position of the unmanned aircraft 100. That is, the processing unit 510B can optimize the reprojection error according to equation (6).
【Equation 5】
$\operatorname{argmin}_{x}\Big(\sum_{t}\sum_{j} w_{jt}\, p\big(\pi(c_j, x_t) - o_{jt}\big) + \lambda_{R} R(X, \Gamma) + \lambda_{I} I(X, \Delta) + \lambda_{G} G(X, Z)\Big)$ ......(6)
Here, R(X, Γ) denotes the score that takes the physical estimated position into account, that is, the score that takes the dynamic factor into account, as shown in equation (7).
【Equation 6】
$R(X, \Gamma) = \sum_{t}\big(x_t - \bar{x}_t(\gamma_t)\big)^2$ ......(7)
(The formula is rendered only as an image in the source; it is reconstructed here from the surrounding text, which defines R(X, Γ) as the accumulated difference between the assumed positions x_t and the physical estimated positions, in the same form as equation (5).)
λR (also written λ_R) is a fixed value. The overline on xt (γt) denotes the physical estimated position at time t obtained from various physical laws and formulas (for example, the equations of motion). Therefore, R(X, Γ) denotes the accumulated value of the differences between the assumed positions xt of the unmanned aircraft 100 and the physical estimated positions of the unmanned aircraft 100. The overline on xt may be a function for deriving the physical estimated position. γt (also written γ_t) may be a variable required to derive the physical estimated position, for example the position and velocity of the unmanned aircraft 100 at time t.
In this way, the processing unit 510B can calculate the error score based on positions to which the unmanned aircraft 100 can physically move.
When the difference between the assumed position xt of the unmanned aircraft 100 and the physical estimated position of the unmanned aircraft 100 is large, λRR(X, Γ) in equation (6) is a large value, and its influence on the value of equation (6) becomes large. For example, when a calculated next estimated position of the unmanned aircraft 100 deviates greatly from the current assumed position of the unmanned aircraft 100 and cannot be reached by movement, the value of the dynamic factor becomes large so that such an assumed position xt is not estimated as the position of the unmanned aircraft 100, and the value of equation (6) is unlikely to become the minimum. Likewise, when the calculated speed of the unmanned aircraft 100 (for example, 50 m/s) exceeds the maximum speed of the unmanned aircraft 100 (for example, 20 m/s), the value of the dynamic factor becomes large so that the assumed position xt corresponding to such a speed is not estimated as the position of the unmanned aircraft 100, and the value of equation (6) is unlikely to become the minimum. The flight control device 800B can estimate the position of the unmanned aircraft 100 in consideration of, for example, the physical estimated positions to which the unmanned aircraft 100 can move, and improve the position estimation accuracy.
I(X, Δ) denotes the score that takes the acceleration-estimated position into account, that is, the score that takes the acceleration factor into account, as shown in equation (8).
【Equation 7】
$I(X, \Delta) = \sum_{t}\big(x_t - \bar{x}_t(\delta_t)\big)^2$ ......(8)
(Rendered only as an image in the source; reconstructed from the surrounding text, which defines I(X, Δ) as the accumulated difference between the assumed positions x_t and the acceleration-estimated positions.)
λI (also written λ_I) is a fixed value. The overline on xt (δt) denotes the position derived from the acceleration measured by the inertial measurement unit 250 (the acceleration-estimated position). Therefore, I(X, Δ) denotes the accumulated value of the differences between the assumed positions xt of the unmanned aircraft 100 and the acceleration-estimated positions of the unmanned aircraft 100. The overline on xt may be a function for deriving the acceleration-estimated position. δt (also written δ_t) may be a variable required to derive the acceleration-estimated position, for example the position and velocity of the unmanned aircraft 100 at time t.
In this way, the processing unit 510B can derive (for example, calculate) the position of the unmanned aircraft 100 (the acceleration-estimated position) (an example of the second measurement position) based on the acceleration measured by the inertial measurement unit 250 (an example of the acceleration measuring device) included in the unmanned aircraft 100. The processing unit 510B can calculate the error score based on the acceleration measurement position.
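For illustration, the following is a small sketch of the two dynamics terms of equation (6), using the squared-sum reconstruction of equations (7) and (8) given above (itself an assumption).
【Reference code (Python, illustrative sketch)】
    import numpy as np

    def dynamics_terms(X, X_phys, X_acc, lam_r, lam_i):
        """lambda_R * R(X, Gamma) + lambda_I * I(X, Delta) of equation (6).

        X: assumed positions x_t; X_phys: physical estimated positions (for
        example Kalman predictions); X_acc: acceleration-estimated positions
        (double-integrated IMU data). Both terms penalize assumed positions
        that stray from the corresponding estimates.
        """
        X, X_phys, X_acc = map(np.asarray, (X, X_phys, X_acc))
        R = sum(float((x - xp) @ (x - xp)) for x, xp in zip(X, X_phys))
        I = sum(float((x - xa) @ (x - xa)) for x, xa in zip(X, X_acc))
        return lam_r * R + lam_i * I          # added to the score in (6)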
When the difference between the assumed position xt of the unmanned aircraft 100 and the acceleration-estimated position of the unmanned aircraft 100 is large, λII(X, Δ) in equation (6) is a large value, and its influence on the value becomes large. The flight control device 800B can estimate the position of the unmanned aircraft 100 in consideration of the measurement results (acceleration) of the inertial measurement unit 250 included in the unmanned aircraft 100, and can improve the position estimation accuracy.
In the unmanned aerial vehicle system 5B of Embodiment 3, the processing unit 510B of the processing back end 500B considers at least one of the GPS position information, the physical estimated position, and the acceleration-estimated position, optimizes the reprojection error so that, for example, the value of the argmin function of equation (6) is at or below the threshold th3 (for example, minimized), and estimates the position of the unmanned aircraft 100.
The communication unit 505B sends the estimated position of the unmanned aircraft 100 optimized by the processing unit 510B to the PID control device 700. The PID control device 700 generates the flight parameters of the unmanned aircraft 100 based on the optimized estimated position of the unmanned aircraft 100 and sends them to the unmanned aircraft 100. The operation of the unmanned aircraft 100 is the same as in Embodiments 1 and 2.
In equation (6), the case is illustrated in which both the score λRR(X, Γ) taking the dynamic factor into account and the score λII(X, Δ) taking the acceleration factor into account are added to equation (3), but only one of them may be added.
The flight control devices 800, 800A, and 800B in the above embodiments are configured as devices separate from the cameras 300 and the unmanned aircraft 100, but at least part of their configuration may be constituted by the camera 300, the unmanned aircraft 100, or a terminal or server other than the camera 300 and the unmanned aircraft 100. The terminal may be, for example, a terminal capable of operating an unmanned aircraft. The server may be a computer connected to a network so as to be able to communicate with the cameras 300 and the unmanned aircraft 100. In the flight control devices 800, 800A, and 800B, for example, the tracking front end 400 may be provided in any one of the camera 300, the unmanned aircraft 100, the terminal, and the server. The processing back ends 500, 500A, and 500B may be provided in any one of the camera 300, the unmanned aircraft 100, the terminal, and the server. The PID control device 700 may be provided in any one of the camera 300, the unmanned aircraft 100, the terminal, and the server. The IMU front end 600 may be provided in any one of the camera 300, the unmanned aircraft 100, the terminal, and the server.
Each processing unit in the above embodiments realizes various functions by, for example, a processor executing a program stored in the corresponding storage unit. The processor may include an MPU (Micro Processing Unit), a CPU (Central Processing Unit), a DSP (Digital Signal Processor), a GPU (Graphical Processing Unit), etc. Each processing unit controls the units in its device. Each processing unit executes various processes.
Each storage unit in the above embodiments includes a main storage device (for example, RAM (Random Access Memory) or ROM (Read Only Memory)). Each storage unit may include a secondary storage device (for example, an HDD (Hard Disk Drive) or SSD (Solid State Drive)) and a tertiary storage device (for example, an optical disc or SD card). Each storage unit may include other storage devices. Each storage unit stores various data, information, and programs.
The present disclosure has been described above using embodiments, but the technical scope of the present disclosure is not limited to the scope described in the above embodiments. It is apparent to those skilled in the art that various changes and improvements can be made to the above embodiments. It is apparent from the claims that forms to which such changes or improvements are made can also be included in the technical scope of the present disclosure.
The execution order of the operations, procedures, steps, stages, and the like in the devices, systems, programs, and methods shown in the claims, the description, and the drawings can be realized in any order, as long as "before", "prior to", and the like are not explicitly indicated and the output of a preceding process is not used in a subsequent process. Even where the operation flows in the claims, the description, and the drawings are described using "first", "next", and the like for convenience, this does not mean that they must be carried out in that order.
【Description of Reference Numerals】
5, 5A, 5B Unmanned aerial vehicle system
100 Unmanned aircraft
102 UAV main body
110 UAV control unit
150 Communication interface
160 Memory
170 Storage
200 Gimbal
210 Rotor mechanism
220, 230 Imaging unit
240 GPS receiver
250 Inertial measurement unit
260 Magnetic compass
270 Barometric altimeter
280 Ultrasonic sensor
290 Laser rangefinder
300 Camera
400 Tracking front end
405, 505, 505A, 505B, 605, 705 Communication unit
410, 510, 510A, 510B, 610, 710 Processing unit
420, 520, 520B, 620, 720 Storage unit
500, 500A, 500B Processing back end
600 IMU front end
700 PID control device
800, 800A, 800B Flight control device
GM1, GM2 Frame
STV0 Movement direction

Claims (18)

  1. A position estimation device for estimating a position where a flying body exists, characterized by comprising:
    a processing unit that performs processing related to estimating the position of the flying body;
    wherein the processing unit
    acquires a plurality of captured images obtained by imaging the flying body with a plurality of imaging devices,
    acquires information on an arrangement position of each imaging device and an imaging orientation of each imaging device,
    based on the arrangement position of each imaging device, the imaging orientation of each imaging device, assumed positions at which the flying body is assumed to exist in real space, and an image position at which the flying body exists in each captured image,
    calculates an error score representing an error between each assumed position and the position of the flying body,
    and estimates the position of the flying body in real space based on the error score.
  2. The position estimation device according to claim 1, characterized in that
    the processing unit acquires the plurality of captured images obtained by imaging the flying body with the plurality of imaging devices at a plurality of times,
    calculates the error score at the plurality of times,
    and estimates the position of the flying body in real space based on the error score.
  3. The position estimation device according to claim 2, characterized in that
    the processing unit acquires a distance between an imaging device and the flying body,
    derives, from the distance, a reliability of the error score for the captured image,
    and calculates the error score based on the reliability.
  4. The position estimation device according to claim 3, characterized in that
    the processing unit calculates the error score based on a difference between each projected assumed position and each image position, each projected assumed position being obtained by projecting each assumed position at which the flying body is assumed to exist in real space onto each captured image taken at the arrangement position and imaging orientation of each imaging device, and each image position being the position at which the flying body exists in each captured image taken at the arrangement position and imaging orientation of each imaging device, and estimates the assumed position of the flying body with the smallest error score as the position of the flying body.
  5. The position estimation device according to any one of claims 1 to 4, characterized in that
    the processing unit acquires a first measurement position of the flying body measured by a positioning unit included in the flying body,
    and calculates the error score based on the first measurement position.
  6. The position estimation device according to claim 5, characterized in that the positioning unit receives GPS (Global Positioning System) signals from a plurality of GPS satellites to acquire the first measurement position;
    and the processing unit calculates the error score in such a manner that the larger the number of the GPS satellites the positioning unit cannot receive, the greater the influence of the first measurement position on the error score.
  7. The position estimation device according to any one of claims 1 to 6, characterized in that the processing unit calculates the error score based on positions to which the flying body can physically move.
  8. The position estimation device according to any one of claims 1 to 6, characterized in that
    the processing unit derives a second measurement position of the flying body based on an acceleration measured by an acceleration measuring device included in the flying body,
    and calculates the error score based on the second measurement position.
  9. A position estimation method for estimating a position where a flying body exists, characterized by comprising the steps of:
    acquiring a plurality of captured images obtained by imaging the flying body with a plurality of imaging devices;
    acquiring information on an arrangement position of each imaging device and an imaging orientation of each imaging device;
    based on the arrangement position of each imaging device, the imaging orientation of each imaging device, assumed positions at which the flying body is assumed to exist in real space, and an image position at which the flying body exists in each captured image,
    calculating an error score representing an error between each assumed position and the position of the flying body;
    and estimating the position of the flying body in real space based on the error score.
  10. The position estimation method according to claim 9, characterized in that the step of acquiring the plurality of captured images includes a step of acquiring the plurality of captured images obtained by imaging the flying body with the plurality of imaging devices at a plurality of times;
    the step of calculating the error score includes a step of calculating the error score at the plurality of times;
    and the step of estimating the position of the flying body includes a step of estimating the position of the flying body in real space based on the error score.
  11. The position estimation method according to claim 10, characterized in that the step of calculating the error score includes the steps of:
    acquiring a distance between an imaging device and the flying body;
    deriving, from the distance, a reliability of the error score for the captured image;
    and calculating the error score based on the reliability.
  12. The position estimation method according to claim 11, characterized in that the step of calculating the error score includes a step of calculating the error score based on a difference between each projected assumed position and each image position, each projected assumed position being obtained by projecting each assumed position at which the flying body is assumed to exist in real space onto each captured image taken at the arrangement position and imaging orientation of each imaging device, and each image position being the position at which the flying body exists in each captured image taken at the arrangement position and imaging orientation of each imaging device;
    and the step of estimating the position of the flying body includes a step of estimating the assumed position of the flying body with the smallest error score as the position of the flying body.
  13. The position estimation method according to any one of claims 9 to 12, characterized in that the step of calculating the error score includes the steps of:
    acquiring a first measurement position of the flying body measured by a positioning unit included in the flying body;
    and calculating the error score based on the first measurement position.
  14. The position estimation method according to claim 13, characterized in that the step of acquiring the first measurement position includes a step of receiving GPS signals from a plurality of GPS satellites to acquire the first measurement position;
    and the step of calculating the error score includes a step of calculating the error score in such a manner that the larger the number of the GPS satellites the positioning unit cannot receive, the greater the influence of the first measurement position on the error score.
  15. The position estimation method according to any one of claims 9 to 14, characterized in that the step of calculating the error score includes a step of calculating the error score based on positions to which the flying body can physically move.
  16. The position estimation method according to any one of claims 9 to 14, characterized in that the step of calculating the error score includes the steps of:
    deriving a second measurement position of the flying body based on an acceleration measured by an acceleration measuring device included in the flying body;
    and calculating the error score based on the second measurement position.
  17. A program, characterized in that it causes a position estimation device that estimates a position where a flying body exists to execute the steps of:
    acquiring a plurality of captured images obtained by imaging the flying body with a plurality of imaging devices;
    acquiring information on an arrangement position of each imaging device and an imaging orientation of each imaging device;
    based on the arrangement position of each imaging device, the imaging orientation of each imaging device, assumed positions at which the flying body is assumed to exist in real space, and an image position at which the flying body exists in each captured image,
    calculating an error score representing an error between each assumed position and the position of the flying body;
    and estimating the position of the flying body in real space based on the error score.
  18. A computer-readable recording medium, characterized in that it records a program for causing a position estimation device that estimates a position where a flying body exists to execute the steps of:
    acquiring a plurality of captured images obtained by imaging the flying body with a plurality of imaging devices;
    acquiring information on an arrangement position of each imaging device and an imaging orientation of each imaging device;
    based on the arrangement position of each imaging device, the imaging orientation of each imaging device, assumed positions at which the flying body is assumed to exist in real space, and an image position at which the flying body exists in each captured image,
    calculating an error score representing an error between each assumed position and the position of the flying body;
    and estimating the position of the flying body in real space based on the error score.
PCT/CN2019/113651 2018-10-31 2019-10-28 位置推定装置、位置推定方法、程序以及记录介质 WO2020088397A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980009074.2A CN111615616A (zh) 2018-10-31 2019-10-28 位置推定装置、位置推定方法、程序以及记录介质

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-206007 2018-10-31
JP2018206007A JP6974290B2 (ja) 2018-10-31 2018-10-31 位置推定装置、位置推定方法、プログラム、及び記録媒体

Publications (1)

Publication Number Publication Date
WO2020088397A1 true WO2020088397A1 (zh) 2020-05-07

Family

ID=70462059

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/113651 WO2020088397A1 (zh) 2018-10-31 2019-10-28 位置推定装置、位置推定方法、程序以及记录介质

Country Status (3)

Country Link
JP (1) JP6974290B2 (zh)
CN (1) CN111615616A (zh)
WO (1) WO2020088397A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101313233A (zh) * 2005-11-21 2008-11-26 日本电气株式会社 位置推定系统、位置推定方法、以及位置推定用程序
JP2010169682A (ja) * 2009-01-23 2010-08-05 Honeywell Internatl Inc レーダー画像を使用して航空機の位置を求めるシステム及び方法
CN104854637A (zh) * 2012-12-12 2015-08-19 日产自动车株式会社 移动物体位置姿态角推定装置及移动物体位置姿态角推定方法
WO2018146803A1 (ja) * 2017-02-10 2018-08-16 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド 位置処理装置、飛行体、位置処理システム、飛行システム、位置処理方法、飛行制御方法、プログラム、及び記録媒体

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004005380A1 (de) * 2004-02-03 2005-09-01 Isra Vision Systems Ag Verfahren zur Bestimmung der Lage eines Objekts im Raum
JP5050904B2 (ja) * 2007-04-09 2012-10-17 セイコーエプソン株式会社 現在位置測位方法及び測位装置
JP4985166B2 (ja) * 2007-07-12 2012-07-25 トヨタ自動車株式会社 自己位置推定装置
JP2014186004A (ja) * 2013-03-25 2014-10-02 Toshiba Corp 計測装置、方法及びプログラム
WO2016059930A1 (ja) * 2014-10-17 2016-04-21 ソニー株式会社 装置、方法及びプログラム
US10311739B2 (en) * 2015-01-13 2019-06-04 Guangzhou Xaircraft Technology Co., Ltd Scheduling method and system for unmanned aerial vehicle, and unmanned aerial vehicle
JP6734940B2 (ja) * 2017-02-01 2020-08-05 株式会社日立製作所 三次元計測装置


Also Published As

Publication number Publication date
CN111615616A (zh) 2020-09-01
JP2020071154A (ja) 2020-05-07
JP6974290B2 (ja) 2021-12-01

Similar Documents

Publication Publication Date Title
US20230236611A1 (en) Unmanned Aerial Vehicle Sensor Activation and Correlation System
WO2018209898A1 (zh) 信息处理装置、航拍路径生成方法、航拍路径生成系统、程序以及记录介质
JP6765512B2 (ja) 飛行経路生成方法、情報処理装置、飛行経路生成システム、プログラム及び記録媒体
JP6138326B1 (ja) 移動体、移動体の制御方法、移動体を制御するプログラム、制御システム、及び情報処理装置
JP6962775B2 (ja) 情報処理装置、空撮経路生成方法、プログラム、及び記録媒体
WO2018120350A1 (zh) 对无人机进行定位的方法及装置
US20200064133A1 (en) Information processing device, aerial photography route generation method, aerial photography route generation system, program, and storage medium
WO2019230604A1 (ja) 検査システム
CN111699454B (zh) 一种飞行规划方法及相关设备
US20210229810A1 (en) Information processing device, flight control method, and flight control system
WO2020198963A1 (zh) 关于拍摄设备的数据处理方法、装置及图像处理设备
CN111344650B (zh) 信息处理装置、飞行路径生成方法、程序以及记录介质
CN111213107B (zh) 信息处理装置、拍摄控制方法、程序以及记录介质
JP6265576B1 (ja) 撮像制御装置、影位置特定装置、撮像システム、移動体、撮像制御方法、影位置特定方法、及びプログラム
JP2019028560A (ja) モバイルプラットフォーム、画像合成方法、プログラム、及び記録媒体
JP6515423B2 (ja) 制御装置、移動体、制御方法、及びプログラム
WO2020088397A1 (zh) 位置推定装置、位置推定方法、程序以及记录介质
WO2021115192A1 (zh) 图像处理装置、图像处理方法、程序及记录介质
WO2020119572A1 (zh) 形状推断装置、形状推断方法、程序以及记录介质
JP2019096965A (ja) 決定装置、制御装置、撮像システム、飛行体、決定方法、及びプログラム
WO2020001629A1 (zh) 信息处理装置、飞行路径生成方法、程序以及记录介质
KR102439142B1 (ko) 기간 시설의 이미지를 획득하는 방법 및 장치
JP2020052255A (ja) 移動体、制御方法、及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19878483

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19878483

Country of ref document: EP

Kind code of ref document: A1