CN111344650B - Information processing device, flight path generation method, program, and recording medium - Google Patents


Info

Publication number
CN111344650B
Authority
CN
China
Prior art keywords
imaging
angle
flight
flight path
terrain
Legal status
Active
Application number
CN201980005546.7A
Other languages
Chinese (zh)
Other versions
CN111344650A (en)
Inventor
顾磊
沈思杰
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN111344650A publication Critical patent/CN111344650A/en
Application granted granted Critical
Publication of CN111344650B publication Critical patent/CN111344650B/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00 Constructional aspects of UAVs
    • B64U20/80 Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87 Mounting of imaging devices, e.g. mounting of gimbals
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C13/00 Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02 Initiating means
    • B64C13/16 Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/18 Initiating means actuated automatically, e.g. responsive to gust detectors using automatic pilot
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • B64U10/13 Flying platforms
    • B64U10/14 Flying platforms with four distinct rotor axes, e.g. quadcopters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U50/00 Propulsion; Power supply
    • B64U50/10 Propulsion
    • B64U50/19 Propulsion using electrically powered motors

Abstract

It is desirable to enable the flying body to acquire more terrain front information while suppressing a decrease in the imaging efficiency of the flying body. An information processing device for generating a flight path along which a flying body flies includes a processing section. The processing section acquires terrain information of the flight range in which the flying body flies, generates a flight path including imaging positions in three-dimensional space for imaging the terrain of the flight range based on the terrain information, and derives an imaging angle for each imaging position in the flight path based on the terrain information of the flight range and the flight path.

Description

Information processing device, flight path generation method, program, and recording medium
Technical Field
The present disclosure relates to an information processing apparatus that generates a flight path along which a flying body flies, a flight path generation method, a program, and a recording medium.
Background
A platform (unmanned aerial vehicle) that performs imaging while passing along a preset fixed path is known (see Patent Document 1). The platform receives an imaging instruction from a ground base and captures an image of an imaging object. When the platform images the object, it flies along the fixed path while the imaging device of the platform is tilted according to the positional relationship between the platform and the object.
Prior art literature
Patent literature
Patent Document 1: Japanese Patent Application Laid-Open No. 2010-61216
Disclosure of Invention
[Problems to be solved by the invention]
In Patent Document 1, the imaging angle at which the platform images the imaging object is determined so that the imaging object falls within the imaging range. However, the imaging angle is not determined so as to image the front of the terrain, so there may be places where the front of the terrain is difficult to capture, which may reduce the amount of information obtained at each point of the terrain. On the other hand, if the terrain is imaged at many different imaging angles in order to obtain a sufficient amount of information at each point, images at unnecessary imaging angles may be included and the imaging efficiency may decrease. It is therefore desirable that the flying body can acquire more terrain front information while suppressing a decrease in its imaging efficiency.
[Means for solving the problems]
In one aspect, an information processing apparatus for generating a flight path for a flying object to fly includes a processing section that acquires terrain information of a flight range in which the flying object flies, generates a flight path including imaging positions in a three-dimensional space for imaging terrain of the flight range based on the terrain information of the flight range, and derives an imaging angle for each imaging position in the flight path based on the terrain information of the flight range and the flight path.
The processing unit may acquire candidate angles, which are candidates of an imaging angle for imaging a terrain of a flight range, calculate, for each candidate angle, a first imaging cost, which is an imaging cost when imaging at the imaging position at the candidate angle, and determine, as the imaging angle at the imaging position, the candidate angle at which the first imaging cost at the imaging position is greater than or equal to a first threshold.
The processing unit may sample the terrain of the flight range to acquire a plurality of sampling positions imaged by the flight body, calculate, for each sampling position, a second imaging cost that is an imaging cost when the sampling positions are imaged at candidate angles at the imaging positions, and calculate the first imaging cost by adding the second imaging costs at the respective sampling positions.
The shorter the distance between the imaging position and the sampling position, the greater the second imaging cost may be.
The second imaging cost may be increased as the inner product value between the normal vector with respect to the ground at the sampling position and the imaging vector, which is the vector along the imaging direction indicated by the candidate angle, is increased.
The processing unit may exclude the second imaging cost having the negative inner product value from the calculation target of the first imaging cost.
The processing section may acquire a region of interest included in the flight range and including the imaging object position, and derive an imaging angle for each imaging position in the flight path based on the terrain information of the region of interest and the flight path.
The processing section may acquire candidate angles, which are candidates of imaging angles for imaging the terrain of the flight range, calculate, for each candidate angle, a third imaging cost, which is an imaging cost when imaging the region of interest at the imaging position at the candidate angle, and determine, as the imaging angle at the imaging position, the candidate angle at which the third imaging cost at the imaging position is greater than or equal to the second threshold.
When the third imaging cost when imaging at the imaging angle at a first imaging position of the plurality of imaging positions is equal to or less than the third threshold value, the processing section may exclude the first imaging position from the plurality of imaging positions to generate the flight path.
The processing section may classify a plurality of imaging positions for each imaging object imaged from the imaging positions to generate a plurality of imaging position groups, and connect the plurality of imaging position groups to generate the flight path.
The information processing device is a terminal including a communication unit, and the processing unit can transmit information of the imaging position, the flight path, and the imaging angle to the flying body via the communication unit.
The information processing device is a flying body including an imaging section, and the processing section can control the flight according to the flight path and capture an image via the imaging section at the imaging angle at each imaging position of the flight path.
In one aspect, a flight path generation method of an information processing apparatus for generating a flight path for a flight of a flying body includes the steps of: the method comprises the steps of obtaining terrain information of a flight range of a flight body; generating a flight path including an imaging position in a three-dimensional space for imaging a terrain of a flight range based on the terrain information of the flight range; based on the terrain information of the flight range and the flight path, an imaging angle for imaging the terrain of the flight range is derived for each imaging position in the flight path.
The step of deriving the imaging angle may include the steps of: acquiring a candidate angle which is a candidate of an imaging angle for imaging the terrain of the flight range; calculating, for each candidate angle, a first imaging cost, which is an imaging cost when imaging at the candidate angle at the imaging position; a candidate angle at which the first imaging cost at the imaging position is greater than or equal to the first threshold value is determined as an imaging angle at the imaging position.
The step of calculating the first imaging cost may include the steps of: sampling the topography of the flight range to obtain a plurality of sampling positions photographed by the flying body; calculating, for each sampling position, a second imaging cost that is an imaging cost when the sampling position is imaged at the imaging position with a candidate angle; the first imaging cost is calculated by adding the second imaging costs at the respective sampling positions.
The shorter the distance between the imaging position and the sampling position, the greater the second imaging cost may be.
The second imaging cost may be increased as the inner product value between the normal vector with respect to the ground at the sampling position and the imaging vector, which is the vector along the imaging direction indicated by the candidate angle, is increased.
The step of calculating the first imaging cost may include the steps of: the second imaging cost having the inner product value of a negative value is excluded from the calculation targets of the first imaging cost.
The step of deriving the imaging angle may include the steps of: acquiring a region of interest included in a flight range and including a camera object position; an imaging angle is derived for each imaging position in the flight path based on the terrain information of the region of interest and the flight path.
The step of deriving the imaging angle may include the steps of: acquiring a candidate angle which is a candidate of an imaging angle for imaging the terrain of the flight range; calculating, for each candidate angle, a third imaging cost, which is an imaging cost when the region of interest is imaged at the imaging position with the candidate angle; and determining a candidate angle at which the third imaging cost at the imaging position is greater than or equal to the second threshold as an imaging angle at the imaging position.
The step of generating a flight path may comprise the steps of: when a third imaging cost at the time of imaging at an imaging angle at a first imaging position of the plurality of imaging positions is equal to or smaller than a third threshold value, the first imaging position is excluded from the plurality of imaging positions to generate a flight path.
The step of generating a flight path may comprise the steps of: a plurality of imaging positions are classified for each imaging object imaged from the imaging positions to generate a plurality of imaging position groups, and the plurality of imaging position groups are connected to generate a flight path.
The information processing apparatus is a terminal, and may further include the steps of: information of the imaging position, the flight path, and the imaging angle is transmitted to the flying body.
The information processing device is a flying body, and may further include the steps of: controlling the flight according to the flight path; an image is taken at an imaging angle at an imaging position of the flight path.
In one aspect, a program causes an information processing apparatus that generates a flight path for a flight of a flying body to execute the steps of: the method comprises the steps of obtaining terrain information of a flight range of a flight body; generating a flight path including an imaging position in a three-dimensional space for imaging a terrain of a flight range based on the terrain information of the flight range; based on the terrain information of the flight range and the flight path, an imaging angle for imaging the terrain of the flight range is derived for each imaging position in the flight path.
In one aspect, a recording medium is a computer-readable medium and recorded with a program for causing an information processing apparatus that generates a flight path for a flying body to execute the steps of: the method comprises the steps of obtaining terrain information of a flight range of a flight body; generating a flight path including an imaging position in a three-dimensional space for imaging a terrain of a flight range based on the terrain information of the flight range; based on the terrain information of the flight range and the flight path, an imaging angle for imaging the terrain of the flight range is derived for each imaging position in the flight path.
Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding summary of the disclosure. Further, sub-combinations of these feature sets may also constitute the invention.
Drawings
Fig. 1 is a schematic diagram showing a first configuration example of the flying body system in the first embodiment.
Fig. 2 is a schematic diagram showing a second configuration example of the flying body system in the first embodiment.
Fig. 3 is a diagram showing one example of a specific appearance of the unmanned aerial vehicle.
Fig. 4 is a block diagram showing one example of the hardware configuration of the unmanned aerial vehicle.
Fig. 5 is a block diagram showing one example of the hardware configuration of the terminal.
Fig. 6 is a diagram showing one example of a flight range, a flight path, and a camera position.
Fig. 7 is a diagram showing one example of a table representing candidates of imaging angles.
Fig. 8 is a diagram showing one example of sampling positions provided on a rough ground.
Fig. 9 is a diagram illustrating an example of calculating the imaging cost corresponding to the candidate angle.
Fig. 10 is a sequence diagram showing one example of a flight path generation process in the flying body system in the first embodiment.
Fig. 11 is a sequence diagram showing one example of a flight path generation process in the flying body system in the second embodiment.
Fig. 12 is a diagram showing one example of designating a region of interest.
Fig. 13 is a diagram illustrating a process of regenerating a flight path for photographing a region of interest.
Fig. 14 is a sequence diagram showing one example of a flight path generation process in the flying body system in the third embodiment.
Fig. 15 is a sequence diagram showing one example of a flight path generation process in the flying body system in the fourth embodiment.
Fig. 16 is a view showing a case where the unmanned aerial vehicle shoots the terrain immediately below while flying along the terrain.
Fig. 17 is a view showing a case where the unmanned aerial vehicle shoots a terrain at a certain angle while flying along the terrain.
Fig. 18 is a view showing a case where the unmanned aerial vehicle shoots the terrain at various angles while flying along the terrain.
[Description of symbols]
10 Flying body system
80 Terminal
81 Terminal control unit
83 Operation unit
85 Communication unit
87 Memory
88 Display unit
89 Storage
100, 100R Unmanned aerial vehicle (UAV)
110 UAV control unit
150 Communication interface
160 Memory
170 Storage
200 Gimbal
210 Rotor mechanism
220, 230 Imaging unit
240 GPS receiver
250 Inertial measurement unit
260 Magnetic compass
270 Barometric altimeter
280 Ultrasonic sensor
290 Laser measuring instrument
501, 502 Building
AR Flight range
gp Initial imaging point
gph, gpl Imaging point
hm Ground
k Sampling position
ms Terrain
ms1 Hillside
ms2 Back side of mountain
rt Flight path
sg1, sg2 Imaging position group
wp Imaging position
Detailed Description
The present disclosure will be described below with reference to embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Not all combinations of features described in the embodiments are essential to the solution of the invention.
The claims, the specification, the drawings, and the abstract of the specification contain matters subject to copyright protection. The copyright owner does not object to reproduction of these documents by anyone as they appear in the files or records of the patent office, but otherwise reserves all copyright rights.
When imaging a terrain (ground surface), it is conceivable to capture images at a predetermined angle with respect to the direction immediately below the unmanned aerial vehicle. In that case, there may be places in rugged terrain whose front is difficult to capture, so the amount of information at various points of the terrain may decrease (see Fig. 16 and Fig. 17). Fig. 16 shows the unmanned aerial vehicle 100R, as a flying body, imaging the terrain ms immediately below while flying along the terrain ms. If the terrain ms immediately below is a hillside ms1, front information of the ground may not be sufficiently captured. Fig. 17 shows the unmanned aerial vehicle 100R imaging the terrain ms at a fixed angle while flying along the terrain ms. When the unmanned aerial vehicle 100R images at a fixed angle, it may be difficult to capture the back side ms2 of a mountain. That is, it is difficult to obtain terrain front information.
It is also conceivable to image the terrain at a plurality of angles at each imaging position (waypoint) (see Fig. 18). Fig. 18 shows the unmanned aerial vehicle 100R imaging the terrain ms at various angles while flying along the terrain ms. In this case, images unnecessary for terrain restoration (three-dimensional restoration) and for generation of the orthographic image may also be captured. That is, the imaging efficiency when imaging the terrain shape decreases.
Further, in Patent Document 1, the imaging angle at which the platform images the imaging object is determined so that the imaging object falls within the imaging range. That is, since the imaging angle is not determined so as to image the front of the terrain, there may be places where the front of the terrain is difficult to capture, and the amount of information at each point of the terrain may decrease.
Accordingly, it is desirable that the unmanned aerial vehicle can acquire more terrain front information while suppressing a decrease in its imaging efficiency.
In the following embodiments, an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle) is taken as an example of a flying body. Unmanned aerial vehicles include aircraft that move in the air. In the drawings of this specification, the unmanned aerial vehicle is also written as "UAV". The information processing device may be, for example, a terminal, but may also be another device (for example, a transmitter, a PC (Personal Computer), the unmanned aerial vehicle itself, or another information processing device). The flight path generation method specifies operations in the information processing device. The recording medium records a program (for example, a program that causes the information processing device to execute various processes).
(Embodiment 1)
Fig. 1 is a diagram showing a first configuration example of an aircraft system 10 in the first embodiment. The aircraft system 10 includes an unmanned aerial vehicle 100 and a terminal 80. The unmanned aerial vehicle 100 and the terminal 80 may communicate with each other via wired communication or wireless communication (e.g., wireless LAN (Local Area Network)). In fig. 1, the terminal 80 is illustrated as a portable terminal (e.g., a smart phone, tablet terminal). The terminal 80 is one example of an information processing apparatus.
The aircraft system 10 may include the unmanned aerial vehicle 100, a transmitter (proportional controller), and the terminal 80. When including a transmitter, a user can instruct control of the flight of the unmanned aerial vehicle using left and right control sticks disposed in front of the transmitter. In this case, the unmanned aerial vehicle 100, the transmitter, and the terminal 80 can communicate with each other through wired communication or wireless communication.
Fig. 2 is a schematic diagram showing a second configuration example of the flying body system 10 in the first embodiment. In fig. 2, the terminal 80 is illustrated as a PC. In either of fig. 1 and 2, the terminal 80 may have the same function.
Fig. 3 is a diagram showing one example of a specific appearance of the unmanned aerial vehicle 100. In fig. 3, a perspective view of the unmanned aerial vehicle 100 is shown when flying in the direction of movement STV 0. The unmanned aerial vehicle 100 is one example of a mobile body.
As shown in fig. 3, a rolling shaft (refer to the x-axis) may be provided in a direction parallel to the ground and along the moving direction STV 0. In this case, a pitch axis (see y axis) may be provided in a direction parallel to the ground and perpendicular to the roll axis, and a yaw axis (see z axis) may be provided in a direction perpendicular to the ground and perpendicular to the roll axis and the pitch axis.
The unmanned aerial vehicle 100 includes a UAV main body 102, a gimbal 200, an imaging unit 220, and a plurality of imaging units 230.
The UAV body 102 contains a plurality of rotors (propellers). The UAV body 102 flies the unmanned aerial vehicle 100 by controlling the rotation of the plurality of rotors. UAV body 102 flies unmanned aerial vehicle 100 using, for example, four rotors. The number of rotors is not limited to four. Furthermore, unmanned aircraft 100 may be a fixed wing aircraft without rotors.
The imaging unit 220 is a camera for imaging objects included in a desired imaging range (for example, the situation above an aerial photography target, scenery such as mountains and rivers, or buildings on the ground).
The plurality of imaging units 230 are sensing cameras that capture images of the surroundings of the unmanned aerial vehicle 100 in order to control the flight of the unmanned aerial vehicle 100. The two imaging units 230 may be provided on the nose, i.e., the front surface of the unmanned aerial vehicle 100. The other two imaging units 230 may be provided on the bottom surface of the unmanned aerial vehicle 100. The two image pickup sections 230 on the front side may be paired to function as a so-called stereo camera. The two imaging units 230 on the bottom surface side may be paired to function as a stereoscopic camera. Three-dimensional space data around the unmanned aerial vehicle 100 may be generated based on images captured by the plurality of image capturing sections 230. The number of imaging units 230 included in the unmanned aerial vehicle 100 is not limited to four. The unmanned aerial vehicle 100 may include at least one image pickup unit 230. The unmanned aerial vehicle 100 may include at least one image pickup unit 230 on a nose, a tail, side surfaces, a bottom surface, and a top surface of the unmanned aerial vehicle 100, respectively. The settable angle of view in the image pickup section 230 may be larger than the settable angle of view in the image pickup section 220. The image pickup section 230 may have a single focus lens or a fisheye lens.
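As a hedged illustration of how a stereo pair can yield three-dimensional data, the standard pinhole stereo relation gives depth from disparity; the focal length, baseline, and disparity below are assumed example values, not values from the patent.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Standard pinhole stereo relation: depth Z = f * B / d.

    focal_px: focal length in pixels; baseline_m: distance between the two
    cameras; disparity_px: horizontal pixel offset of the same point in the
    left and right images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Assumed values: f = 1200 px, baseline = 0.1 m, disparity = 24 px
# -> the point lies about 5.0 m in front of the stereo pair.
print(depth_from_disparity(1200.0, 0.1, 24.0))
```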
Fig. 4 is a block diagram showing one example of the hardware configuration of the unmanned aerial vehicle 100. The unmanned aerial vehicle 100 includes a UAV control unit 110, a communication interface 150, a memory 160, a memory 170, a gimbal 200, a rotor mechanism 210, an imaging unit 220, an imaging unit 230, a GPS receiver 240, an inertial measurement unit (IMU: inertial Measurement Unit) 250, a magnetic compass 260, an barometric altimeter 270, an ultrasonic sensor 280, and a laser measurement instrument 290.
The UAV control section 110 is constituted by, for example, a CPU (Central Processing Unit: central processing unit), an MPU (Micro Processing Unit: microprocessor), or a DSP (Digital Signal Processor: digital signal processor). The UAV control section 110 performs signal processing for overall controlling operations of each section of the unmanned aerial vehicle 100, input/output processing of data with other sections, data arithmetic processing, and data storage processing.
The UAV control unit 110 controls the flight of the unmanned aerial vehicle 100 according to a program stored in the memory 160. The UAV control section 110 may control flight. The UAV control section 110 may take an image (e.g., aerial photograph).
The UAV control unit 110 acquires position information indicating the position of the unmanned aerial vehicle 100. The UAV control section 110 may acquire position information indicating the latitude, longitude, and altitude at which the unmanned aerial vehicle 100 is located from the GPS receiver 240. The UAV control unit 110 may acquire latitude and longitude information indicating the latitude and longitude of the unmanned aerial vehicle 100 from the GPS receiver 240, and may acquire altitude information indicating the altitude of the unmanned aerial vehicle 100 from the barometer altimeter 270 as the position information. The UAV control unit 110 may acquire, as the height information, a distance between a radiation point of the ultrasonic wave generated by the ultrasonic sensor 280 and a reflection point of the ultrasonic wave.
The UAV control section 110 may acquire orientation information indicating the orientation of the unmanned aerial vehicle 100 from the magnetic compass 260. The orientation information may be represented, for example, by an orientation corresponding to the orientation of the nose of the unmanned aerial vehicle 100.
The UAV control unit 110 may acquire position information indicating a position where the unmanned aerial vehicle 100 should exist when the imaging unit 220 performs imaging in accordance with the imaging range of the imaging. The UAV control unit 110 may acquire position information indicating a position where the unmanned aerial vehicle 100 should exist from the memory 160. The UAV control unit 110 may acquire position information indicating a position where the unmanned aerial vehicle 100 should exist from another device via the communication interface 150. The UAV control unit 110 may refer to the three-dimensional map database, specify a position where the unmanned aerial vehicle 100 can exist, and acquire the position as position information indicating the position where the unmanned aerial vehicle 100 should exist.
The UAV control unit 110 may acquire imaging range information indicating the respective imaging ranges of the imaging unit 220 and the imaging unit 230. The UAV control unit 110 may acquire, from the imaging units 220 and 230, angle-of-view information indicating angles of view of the imaging units 220 and 230 as parameters for specifying the imaging range. The UAV control unit 110 may acquire information indicating the imaging directions of the imaging unit 220 and the imaging unit 230 as parameters for specifying the imaging range. The UAV control unit 110 may acquire, for example, posture information indicating the posture state of the image capturing unit 220 from the gimbal 200 as information indicating the image capturing direction of the image capturing unit 220. The posture information of the image pickup unit 220 may indicate the angle at which the pitch axis and the yaw axis of the gimbal 200 are rotated from the reference rotation angle.
The UAV control unit 110 may acquire position information indicating the position where the unmanned aerial vehicle 100 is located as a parameter for specifying the imaging range. The UAV control unit 110 may acquire imaging range information by defining an imaging range indicating a geographical range imaged by the imaging unit 220 and generating imaging range information based on the angles and imaging directions of the imaging unit 220 and the imaging unit 230 and the position where the unmanned aerial vehicle 100 is located.
The UAV control unit 110 may acquire imaging range information from the memory 160. The UAV control section 110 may acquire the imaging range information via the communication interface 150.
The UAV control unit 110 controls the gimbal 200, the rotor mechanism 210, the imaging unit 220, and the imaging unit 230. The UAV control unit 110 can control the imaging range of the imaging unit 220 by changing the imaging direction or the viewing angle of the imaging unit 220. The UAV control unit 110 can control the imaging range of the imaging unit 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
The imaging range is a geographical range imaged by the imaging unit 220 or the imaging unit 230. The imaging range is defined by latitude, longitude, and altitude. The imaging range may be a range of three-dimensional space data defined by latitude, longitude, and altitude. The imaging range may be a range of two-dimensional space data defined by latitude and longitude. The imaging range may be specified according to the angle of view and imaging direction of the imaging unit 220 or the imaging unit 230 and the position where the unmanned aerial vehicle 100 is located. The imaging directions of the imaging unit 220 and the imaging unit 230 may be defined by the azimuth and the depression angle toward which the front of the imaging lens of the imaging unit 220 or the imaging unit 230 faces. The imaging direction of the imaging unit 220 may be a direction specified by the orientation of the nose of the unmanned aerial vehicle 100 and the attitude of the imaging unit 220 with respect to the gimbal 200. The imaging direction of the imaging unit 230 may be a direction specified by the orientation of the nose of the unmanned aerial vehicle 100 and the position where the imaging unit 230 is provided.
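As a simple illustration of how an imaging range can follow from the angle of view and the position of the unmanned aerial vehicle, the ground footprint of a camera pointing straight down can be computed as below; the altitude and angle of view are assumed example values, and this is a rough sketch rather than the patent's own calculation.

```python
import math

def nadir_footprint_width(altitude_m: float, view_angle_deg: float) -> float:
    """Ground width covered by a camera pointing straight down.

    For a camera at altitude h with angle of view alpha across one image
    axis, the covered ground width is 2 * h * tan(alpha / 2).
    """
    half_angle = math.radians(view_angle_deg) / 2.0
    return 2.0 * altitude_m * math.tan(half_angle)

# Assumed values: altitude 100 m, angle of view 60 degrees
# -> a footprint of roughly 115.5 m across.
print(nadir_footprint_width(100.0, 60.0))
```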
The UAV control unit 110 can specify the environment around the unmanned aerial vehicle 100 by analyzing a plurality of images captured by the plurality of imaging units 230. The UAV control section 110 may control flight according to the surrounding environment of the unmanned aerial vehicle 100, for example, avoiding obstacles.
The UAV control section 110 may acquire stereoscopic information (three-dimensional information) indicating the stereoscopic shape (three-dimensional shape) of objects existing around the unmanned aerial vehicle 100. The object may be, for example, part of a landscape such as a building, a road, a vehicle, or a tree. The stereoscopic information is, for example, three-dimensional space data. The UAV control unit 110 may acquire stereoscopic information by generating stereoscopic information indicating the stereoscopic shape of objects existing around the unmanned aerial vehicle 100 from the images obtained from the plurality of imaging units 230. The UAV control unit 110 may acquire stereoscopic information indicating the stereoscopic shape of objects existing around the unmanned aerial vehicle 100 by referring to the three-dimensional map database stored in the memory 160 or the storage 170. The UAV control unit 110 may acquire stereoscopic information about the stereoscopic shape of objects existing around the unmanned aerial vehicle 100 by referring to a three-dimensional map database managed by a server on the network.
The UAV control section 110 controls the rotor mechanism 210 to control the flight of the unmanned aerial vehicle 100. That is, the UAV control unit 110 controls the rotor mechanism 210 to control the position including the latitude, longitude, and altitude of the unmanned aerial vehicle 100. The UAV control section 110 can control the imaging range of the imaging section 220 by controlling the flight of the unmanned aerial vehicle 100. The UAV control unit 110 can control the angle of view of the image capturing unit 220 by controlling the zoom lens included in the image capturing unit 220. The UAV control section 110 may control the angle of view of the image capturing section 220 by digital zooming using the digital zooming function of the image capturing section 220.
When the imaging unit 220 is fixed to the unmanned aerial vehicle 100 and the imaging unit 220 cannot be moved, the UAV control unit 110 can cause the imaging unit 220 to capture a desired imaging range in a desired environment by moving the unmanned aerial vehicle 100 to a particular position on a particular date. Alternatively, even when the imaging unit 220 does not have a zoom function and the angle of view of the imaging unit 220 cannot be changed, the UAV control unit 110 may cause the imaging unit 220 to capture a desired imaging range in a desired environment by moving the unmanned aerial vehicle 100 to a particular position on a particular date.
The communication interface 150 communicates with the terminal 80. The communication interface 150 may perform wireless communication by any wireless communication method. The communication interface 150 may perform wired communication by any wired communication means. The communication interface 150 may transmit the photographed image, additional information (metadata) related to the photographed image to the terminal 80.
The memory 160 stores programs and the like necessary for the UAV control unit 110 to control the gimbal 200, the rotor mechanism 210, the imaging unit 220, the imaging unit 230, the GPS receiver 240, the inertial measurement unit 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring instrument 290. The memory 160 may be a computer-readable recording medium and may include at least one of SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), EPROM (Erasable Programmable Read Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), and flash memory such as USB (Universal Serial Bus) memory. The memory 160 may be removable from the unmanned aerial vehicle 100. The memory 160 may operate as a working memory.
The memory 170 may include at least one of an HDD (Hard Disk Drive), an SSD (Solid State Drive), an SD card, a USB memory, and other storage. The memory 170 may store various information and various data. The memory 170 may be removable from the unmanned aerial vehicle 100. The memory 170 may record captured images.
The memory 160 or the memory 170 may hold information on the imaging positions and the imaging path (flight path) generated by the terminal 80 or the unmanned aerial vehicle 100. The UAV control unit 110 may set the information on the imaging positions and the imaging path as one of the imaging parameters related to imaging scheduled by the unmanned aerial vehicle 100 and the flight parameters related to flight scheduled by the unmanned aerial vehicle 100. The setting information may be stored in the memory 160 or the memory 170. Further, the imaging parameters may include information on the imaging angle of the imaging unit 220.
The gimbal 200 may rotatably support the imaging unit 220 about the yaw axis, the pitch axis, and the roll axis. The gimbal 200 may rotate the imaging unit 220 about at least one of the yaw axis, the pitch axis, and the roll axis, thereby changing the imaging direction of the imaging unit 220.
The rotor mechanism 210 has a plurality of rotors and a plurality of drive motors that rotate the rotors. The rotation of the rotor mechanism 210 is controlled by the UAV control unit 110, thereby causing the unmanned aerial vehicle 100 to fly. The number of rotors may be, for example, four, or another number. The unmanned aerial vehicle 100 may also be a fixed-wing aircraft without rotors.
The imaging unit 220 captures an object within a desired imaging range and generates data of an imaged image. Image data (for example, an aerial image) obtained by imaging by the imaging unit 220 may be stored in a memory included in the imaging unit 220 or in the memory 170.
The imaging unit 230 captures the surroundings of the unmanned aerial vehicle 100 and generates data of a captured image. Image data of the image pickup section 230 may be stored in the memory 170.
The GPS receiver 240 receives a plurality of signals transmitted from a plurality of navigation satellites (i.e., GPS satellites), each indicating the time and the position (coordinates) of the GPS satellite. The GPS receiver 240 calculates the position of the GPS receiver 240 (i.e., the position of the unmanned aerial vehicle 100) from the plurality of received signals. The GPS receiver 240 outputs the position information of the unmanned aerial vehicle 100 to the UAV control section 110. The UAV control unit 110 may calculate the position information of the GPS receiver 240 instead of the GPS receiver 240. In this case, the information indicating the time and the position of each GPS satellite included in the plurality of signals received by the GPS receiver 240 is input to the UAV control unit 110.
The inertial measurement unit 250 detects the attitude of the unmanned aerial vehicle 100, and outputs the detection result to the UAV control unit 110. The inertial measurement unit 250 can detect acceleration in the three-axis directions of the front-rear, left-right, up-down, and pitch, roll, and yaw axes of the unmanned aerial vehicle 100 as the attitude of the unmanned aerial vehicle 100.
The magnetic compass 260 detects the orientation of the nose of the unmanned aerial vehicle 100, and outputs the detection result to the UAV control section 110.
The barometric altimeter 270 detects the flight altitude of the unmanned aerial vehicle 100 and outputs the detection result to the UAV control section 110.
The ultrasonic sensor 280 emits ultrasonic waves, detects ultrasonic waves reflected by the ground or an object, and outputs the detection result to the UAV control section 110. The detection result may show the distance from the unmanned aerial vehicle 100 to the ground, i.e. the altitude. The detection result may show the distance from the unmanned aerial vehicle 100 to an object (subject).
The laser measuring instrument 290 irradiates a laser to an object, receives reflected light reflected by the object, and measures a distance between the unmanned aerial vehicle 100 and the object (subject) by the reflected light. As an example of a laser-based distance measurement method, a time-of-flight method may be used.
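As a hedged illustration of the time-of-flight principle mentioned above (not code from the patent), the distance follows from half the round-trip time of the laser pulse multiplied by the speed of light:

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def time_of_flight_distance(round_trip_time_s: float) -> float:
    """Distance from the round-trip time of a laser pulse: d = c * t / 2."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A round trip of about 667 nanoseconds corresponds to roughly 100 m.
print(time_of_flight_distance(667e-9))
```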
Fig. 5 is a block diagram showing one example of the hardware configuration of the terminal 80. The terminal 80 includes a terminal control unit 81, an operation unit 83, a communication unit 85, a memory 87, a display unit 88, and a storage 89. The terminal 80 may be held by a user desiring to control the flight of the unmanned aerial vehicle 100.
The terminal control unit 81 is configured by, for example, a CPU, MPU, or DSP. The terminal control unit 81 performs signal processing for controlling the operation of each unit of the terminal 80 as a whole, input/output processing of data with other units, calculation processing of data, and storage processing of data.
The terminal control unit 81 may acquire data and information (various measurement data, captured images, additional information thereof, and the like) from the unmanned aerial vehicle 100 via the communication unit 85. The terminal control unit 81 can acquire data and information (for example, various parameters) input via the operation unit 83. The terminal control unit 81 can acquire data and information stored in the memory 87. The terminal control unit 81 may transmit data and information (for example, information of a shooting position, a shooting angle, and a flight path) to the unmanned aerial vehicle 100 via the communication unit 85. The terminal control unit 81 may transmit data and information to the display unit 88, and display information based on the data and information may be displayed on the display unit 88.
The terminal control unit 81 may execute an application program for performing flight control for the unmanned aerial vehicle 100. The terminal control unit 81 can generate various data used in the application program.
The operation unit 83 receives and acquires data and information input by the user of the terminal 80. The operation unit 83 may include input devices such as buttons, keys, a touch display, and a microphone. Here, the operation unit 83 and the display unit 88 are mainly described as being constituted by a touch panel. In this case, the operation unit 83 may receive a touch operation, a tap operation, a drag operation, and the like. The operation unit 83 may receive information on various parameters. The information input via the operation unit 83 may be transmitted to the unmanned aerial vehicle 100. The various parameters may include parameters related to flight control.
The communication unit 85 performs wireless communication with the unmanned aerial vehicle 100 by various wireless communication methods. The wireless communication method of this wireless communication may include, for example, wireless LAN, bluetooth (registered trademark), or communication via a public wireless network. The communication unit 85 can perform wired communication by any wired communication method.
The memory 87 may include, for example, a program for defining the operation of the terminal 80, a ROM for storing data of a set value, and a RAM for temporarily storing various information and data used when the terminal control unit 81 performs processing. Memory 87 may include memory other than ROM and RAM. The memory 87 may be provided inside the terminal 80. The memory 87 may be provided to be detachable from the terminal 80. The program may include an application program.
The display unit 88 is configured by, for example, an LCD (Liquid Crystal Display ), and displays various information and data outputted from the terminal control unit 81. The display unit 88 may display various data and information related to execution of the application program.
The storage 89 stores and retains various data and information. The storage 89 may be an HDD, an SSD, an SD card, a USB memory, or the like. The storage 89 may be provided inside the terminal 80. The storage 89 may be detachably provided on the terminal 80. The storage 89 may store captured images and additional information acquired from the unmanned aerial vehicle 100. The additional information may also be stored in the memory 87.
In addition, when the aircraft system 10 includes a transmitter (proportional controller), the processing performed by the terminal 80 may also be performed by the transmitter. Since the transmitter has the same constituent parts as the terminal 80, it will not be described in detail. The transmitter includes a control unit, an operation unit, a communication unit, a display unit, a memory, and the like. When the aircraft system 10 has a transmitter, the terminal 80 may not be provided.
Fig. 6 is a diagram showing the flight range AR, the flight path rt, and the imaging position wp. The flight range AR represents the range in which the unmanned aerial vehicle 100 flies. The flight range AR may coincide with an imaging range imaged by the imaging section 220 of the unmanned aerial vehicle 100. The flight path rt represents a path along which the unmanned aerial vehicle 100 is flown. The imaging position wp is a position when the image is captured by the imaging unit 220 of the unmanned aerial vehicle 100. The flight path rt is generated through the imaging position wp. The terminal control unit 81 acquires the flight range AR, generates the flight path rt, and determines the imaging position wp.
The unmanned aerial vehicle 100 performs aerial photography at the imaging position wp while flying along the flight path rt within the flight range AR. In fig. 6, the flight path rt is a route that is set to enter from the lower left corner, move in a square wave shape, and exit from the upper right corner. The flight path rt in this case is a flight path according to a scanning method for uniformly photographing the flight range AR. The flight path rt may be a zigzag or spiral-shaped path, or may be a flight path of another shape, in addition to the square-wave-shaped path.
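The square-wave scan described above can be sketched in a few lines of Python. The function name, the spacing values, and the constant flight altitude below are assumptions for illustration, not parameters taken from the patent.

```python
def serpentine_waypoints(x_min, x_max, y_min, y_max,
                         line_spacing, point_spacing, altitude):
    """Generate (x, y, z) imaging positions that sweep a rectangular flight
    range in a square-wave (serpentine) pattern, entering at (x_min, y_min)."""
    waypoints = []
    y = y_min
    reverse = False
    while y <= y_max:
        xs = []
        x = x_min
        while x <= x_max:
            xs.append(x)
            x += point_spacing
        if reverse:
            xs.reverse()
        waypoints.extend((x, y, altitude) for x in xs)
        y += line_spacing
        reverse = not reverse
    return waypoints

# Assumed spacing: scan lines 30 m apart, imaging positions 20 m apart, 80 m altitude.
path = serpentine_waypoints(0, 100, 0, 90, 30, 20, 80)
print(len(path), path[:3])
```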
Fig. 7 is a diagram showing a table indicating candidates of imaging angles (also referred to as candidate angles). The imaging angle is an imaging angle at which the imaging unit 220 of the unmanned aerial vehicle 100 images the terrain of the flight range AR. The candidates of the imaging angle are candidates of imaging angles employed at the time of actual shooting among various imaging angles. The table representing the candidate angles may be registered in the memory 87 or the storage 89 of the terminal 80. The table indicating the candidate angles may be stored in the external server.
The imaging angle may be defined by the pitch angle and yaw angle of the gimbal 200 supporting the imaging unit 220. Therefore, a candidate angle may also be defined by the pitch angle and yaw angle of the gimbal 200 supporting the imaging unit 220. The table shown in fig. 7 contains nine combinations (points) of pitch angle and yaw angle as candidate angles. For example, the nine points include a point with a pitch angle of 0° and a yaw angle of 0°, a point with a pitch angle of 0° and a yaw angle of 270°, and a point with a pitch angle of -45° and a yaw angle of 270°.
In addition, fig. 7 is merely an example, and combinations of pitch angle and yaw angle may be defined in more detail. In fig. 7, the pitch angles and yaw angles of the candidate angles are defined at uniform intervals, but they may also be defined at non-uniform intervals. For example, more candidate angles may be defined in an angle range that is likely to be selected (likely to be assumed) as the imaging angle of the imaging section 220, and fewer candidate angles may be defined in an angle range that is unlikely to be selected.
In fig. 7, it is assumed that the unmanned aerial vehicle 100 performs aerial imaging from above, and the pitch angle is set to a negative value, that is, the imaging direction points downward from the horizontal direction. The pitch angle is 0° when facing the horizontal direction and -90° when facing directly below. When the unmanned aerial vehicle 100 flies at a low altitude and the imaging unit 220 is turned upward to image from below, the pitch angle may be set to a positive value. In this way, imaging appropriate to the situation of the subject can be performed.
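A candidate-angle table can be represented, for example, as the Cartesian product of a few pitch values and a few yaw values. The sketch below is a minimal Python illustration with assumed pitch and yaw values; it is not claimed to reproduce the nine points of fig. 7.

```python
from itertools import product

# Assumed candidate grid (not necessarily the nine points of fig. 7):
# negative pitch angles point the camera below the horizon, 0 is horizontal.
PITCH_CANDIDATES_DEG = (-90.0, -45.0, 0.0)
YAW_CANDIDATES_DEG = (0.0, 90.0, 270.0)

# Each candidate angle j is a (pitch, yaw) pair for the gimbal.
CANDIDATE_ANGLES = tuple(product(PITCH_CANDIDATES_DEG, YAW_CANDIDATES_DEG))

print(len(CANDIDATE_ANGLES))  # 9 candidate angles in this assumed table
```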
The terminal control unit 81 may acquire a candidate angle for capturing a view of the topography of the flight range AR. The terminal control section 81 may acquire the candidate angle from the memory 87 or the storage 89. The terminal control section 81 may acquire the candidate angle from the external server via the communication section 85.
Fig. 8 is a diagram showing an example of sampling positions k provided on rugged ground hm. A sampling position k is a position obtained by sampling the terrain of the flight range AR and is imaged by the unmanned aerial vehicle 100. A sampling position may be specified by a three-dimensional position (latitude, longitude, altitude). The terrain information may be defined based on a plurality of sampling positions.
In fig. 8, the sampling positions k are provided along one direction within the flight range AR, but the sampling positions k may be provided in two-dimensional directions within the flight range AR. That is, the sampling positions k may be set in a grid (lattice) pattern at predetermined intervals within the flight range AR. The sampling positions k may also be arranged at unequal intervals instead of equal intervals.
Further, the arrow pointing from a sampling position k on the ground hm toward the imaging unit 220 of the unmanned aerial vehicle 100 indicates the normal vector (normal direction) to the ground hm. When the imaging unit 220 is located on the normal vector at a sampling position k and images that sampling position, the image captures the sampling position k from the front, so a large amount of information around the sampling position k can be obtained. On the other hand, the imaging range of the imaging unit 220 may include not only the sampling position k viewed from the front but also other sampling positions k that are not viewed from the front. It is therefore desirable to image from a direction as close as possible to the front of all the sampling positions k included in the imaging range. How suitable the imaging unit 220 is for imaging each sampling position k can be quantified as an imaging cost.
In fig. 8, the vertical axis of the graph represents the height in three-dimensional space (e.g., the height of the imaging unit 220 of the unmanned aerial vehicle 100, the ground hm), and the horizontal axis represents the position (latitude, longitude) in three-dimensional space (e.g., the sampling position k, the position of the unmanned aerial vehicle 100 including the imaging unit 220).
The information on the sampling positions may be stored in the memory 87 or the storage 89, or in an external server. The terminal control unit 81 may acquire the information on the sampling positions k from the memory 87 or the storage 89. The terminal control unit 81 may acquire the information on the sampling positions k via the communication unit 85. The terminal control unit 81 may itself sample the terrain information to determine the sampling positions k.
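As a rough Python sketch of the two ideas above, sampling positions k can be taken on a lattice over a height map, and the ground normal at each position can be estimated, for example, by finite differences. The grid step, the synthetic hillside function, and the finite-difference approximation are assumptions for illustration, not the patent's own formulation.

```python
import math

def grid_sampling_positions(x_min, x_max, y_min, y_max, step, height_of):
    """Sample the terrain on a lattice: returns (x, y, z) sampling positions k."""
    positions = []
    y = y_min
    while y <= y_max:
        x = x_min
        while x <= x_max:
            positions.append((x, y, height_of(x, y)))
            x += step
        y += step
    return positions

def surface_normal(x, y, height_of, eps=1.0):
    """Upward unit normal of the ground at (x, y), estimated by central
    differences of the height map (a common approximation)."""
    dzdx = (height_of(x + eps, y) - height_of(x - eps, y)) / (2 * eps)
    dzdy = (height_of(x, y + eps) - height_of(x, y - eps)) / (2 * eps)
    nx, ny, nz = -dzdx, -dzdy, 1.0
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / norm, ny / norm, nz / norm)

# Assumed synthetic hillside for illustration: height rises along x.
hill = lambda x, y: 0.3 * x
samples = grid_sampling_positions(0, 40, 0, 40, 10, hill)
print(len(samples), surface_normal(20, 20, hill))
```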
Fig. 9 is a diagram illustrating an example of calculating the imaging cost corresponding to a candidate angle. The imaging cost quantifies how suitable the imaging unit 220 of the unmanned aerial vehicle 100 is for imaging. For example, the imaging cost is calculated for each candidate angle.
The terminal control unit 81 performs the processing (for example, the calculation of the expression) of the following specific example, and calculates the imaging cost corresponding to the candidate angle. The terminal control section 81 determines the imaging angle θ based on the imaging cost. Since the imaging angle θ is calculated for each imaging position i, it is also called an imaging angle θi.
For example, a position on the ground hm to be imaged is represented by a sampling position k (k = 1, 2, ..., K). An imaging position wp at which the imaging unit 220 of the unmanned aerial vehicle 100 captures an image is represented by an imaging position i (i = 1, 2, ..., I). A candidate angle of the imaging unit 220 is represented by a candidate angle j (j = 1, 2, ..., J).
In this case, the terminal control section 81 may calculate the imaging cost Cij of the candidate angle j at the imaging position i according to equation (1).
[Math 1]
Cij = Σ(k=1..K) Cijk ……(1)
Further, for example, a distance from the imaging unit 220 to the ground hm, that is, a distance between the imaging position i and the sampling position k is represented by a distance d. The imaging direction corresponding to the candidate angle j of the imaging unit 220 is represented by an imaging vector n. The normal direction of the ground hm at the sampling position k is denoted by a normal vector l. The terminal control unit 81 can calculate the imaging cost Cijk for the sampling position k of the imaging target according to equation (2).
[Formula 2]
Cijk = d^(-0.5) · max(n·(-l), 0) ……(2)
According to equation (2), the shorter the distance d, the larger the value of the imaging cost Cijk for the sampling position. Here, n·(-l) denotes the inner product (inner product value) of the imaging vector n and the inverted normal vector -l. According to equation (2), the larger the inner product n·(-l), the larger the value of the imaging cost Cijk for the sampling position.
In addition, max(n·(-l), 0) denotes the larger of the inner product n·(-l) and the value 0. This means that the terminal control unit 81 excludes from the calculation of the imaging cost Cij at the imaging position i the imaging cost Cijk of any sampling position whose inner product n·(-l) is negative.
As shown in equations (1) and (2), the terminal control section 81 obtains the imaging cost Cij by adding the imaging costs Cijk at the respective sampling positions k. The candidate angle j when the imaging cost Cij at the imaging position i is maximum is the optimal imaging angle θim. The terminal control unit 81 may calculate the optimum imaging angle θim according to equation (3).
[Formula 3]
θim = argmax_j (Cij) ……(3)
Here, argmax_j(Cij) is the candidate angle j that maximizes the imaging cost Cij, and this angle is the optimal imaging angle θim.
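A minimal Python sketch of equations (1) to (3) is given below. It assumes that each candidate angle is represented directly by its unit imaging vector n and that a unit ground normal l is available for each sampling position; the toy data and helper names are assumptions for illustration, not the patent's implementation.

```python
import math

def imaging_cost_cijk(imaging_pos, sampling_pos, imaging_vec, normal_vec):
    """Per-sample cost of equation (2): Cijk = d**-0.5 * max(n . (-l), 0),
    where d is the distance between imaging position i and sampling position k,
    n the unit imaging vector of candidate angle j, and l the unit ground normal."""
    d = math.dist(imaging_pos, sampling_pos)
    dot = -sum(a * b for a, b in zip(imaging_vec, normal_vec))
    return d ** -0.5 * max(dot, 0.0)

def imaging_cost_cij(imaging_pos, imaging_vec, samples_with_normals):
    """Equation (1): sum the per-sample costs over all sampling positions k."""
    return sum(imaging_cost_cijk(imaging_pos, s, imaging_vec, n)
               for s, n in samples_with_normals)

def best_imaging_angle(imaging_pos, candidate_vectors, samples_with_normals):
    """Equation (3): pick the candidate angle j that maximizes Cij."""
    return max(candidate_vectors,
               key=lambda v: imaging_cost_cij(imaging_pos, v, samples_with_normals))

# Assumed toy data: one imaging position 50 m above flat ground.
samples = [((x, 0.0, 0.0), (0.0, 0.0, 1.0)) for x in range(-20, 21, 10)]
candidates = [(0.0, 0.0, -1.0),                 # straight down
              (0.70710678, 0.0, -0.70710678)]   # oblique
pos = (0.0, 0.0, 50.0)
print(best_imaging_angle(pos, candidates, samples))  # straight down wins over flat ground
```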
In addition, the optimal imaging angle θim is one example of the imaging angle θi. That is, the imaging angle θi is not limited to the angle at which the imaging cost Cij is maximum, and may be any angle satisfying a predetermined criterion. For example, the imaging angle θi determined (selected) from the candidate angles may be the angle with the second or third largest imaging cost Cij among the angles whose cost is greater than or equal to the threshold th1. The imaging angle θi may also be an imaging angle whose imaging cost Cij is greater than or equal to the average of the imaging costs Cij.
Further, when all the imaging costs Cij at an imaging position i are smaller than the threshold th1, no imaging angle θi may be determined from the candidate angles j, and imaging at that imaging position may be omitted. In this case, for example, an imaging position i at which the image quality of the captured image would be at or below a predetermined standard regardless of the imaging angle θi is not used as an imaging position, so the terminal 80 can omit unnecessary imaging and improve the imaging efficiency. The terminal 80 can also fly the unmanned aerial vehicle 100 along a flight path rt that does not pass through that imaging position i.
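A minimal sketch of this filtering step, assuming the best candidate-angle cost for each imaging position has already been computed and an arbitrary threshold th1 = 0.5 (both assumptions for illustration):

```python
def filter_imaging_positions(imaging_positions, cost_of_best_angle, th1):
    """Keep only imaging positions whose best candidate-angle cost reaches the
    threshold th1; the others are skipped when the flight path is built."""
    return [p for p in imaging_positions if cost_of_best_angle(p) >= th1]

# Assumed toy example: precomputed best costs per imaging position, th1 = 0.5.
costs = {(0, 0, 80): 0.9, (20, 0, 80): 0.4, (40, 0, 80): 0.7}
kept = filter_imaging_positions(list(costs), costs.get, 0.5)
print(kept)  # the position with cost 0.4 is dropped
```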
In this way, the terminal control unit 81 can sample the terrain of the flight range AR and acquire a plurality of sampling positions k imaged by the unmanned aerial vehicle 100. The terminal control section 81 may calculate an imaging cost Cijk (an example of the second imaging cost) when the sampling position k is imaged at the candidate angle j at the imaging position i for each sampling position k. The terminal control section 81 may calculate the imaging cost Cij (an example of the first imaging cost) at the imaging position i by adding the imaging costs Cijk at the respective sampling positions k.
Thus, the terminal 80 can determine the imaging angle θi at each imaging position i by taking into account the imaging cost Cijk at each sampling position k. For example, even if the imaging cost Cijk at one sampling position k is small, when the imaging cost Cijk at another sampling position k is large, the overall imaging cost Cij over the plurality of sampling positions k becomes large, and the terminal 80 can use the candidate angle j in this case as the imaging angle θi. Therefore, the terminal 80 can determine the imaging angle θi in consideration of how well each of the plurality of sampling positions k can be captured.
Further, as shown in equation (2), the shorter the distance d, the larger the value of the imaging cost Cijk at the sampling position can be.
Thus, the imaging cost Cijk at the sampling position k tends to increase as the distance d between the imaging position i and the sampling position k becomes shorter, and the imaging cost Cij at the imaging position i tends to increase as the sampling position k is closer to the unmanned aerial vehicle 100. Therefore, when determining the imaging angle θi, the terminal 80 can increase the influence of the imaging cost Cijk at sampling positions k near the unmanned aerial vehicle 100. Further, when the sampling position k is close to the unmanned aerial vehicle 100, the imaging range (the range included in the captured image) becomes narrow, and the image information at the sampling position k within the imaging range relatively increases. Therefore, the terminal 80 can improve the generation accuracy of the orthographic image and the three-dimensional restoration accuracy owing to the increase in ground information available for the orthographic image and the three-dimensional restoration.
Further, as shown in equation (2), the larger the inner product n·(−l), the larger the value of the imaging cost Cijk at the sampling position k can be.
Thus, the larger the inner product value between the normal vector l of the ground hm at the sampling position k and the imaging vector n, which is the vector in the imaging direction indicated by the candidate angle, the larger the imaging cost Cijk at the sampling position. Therefore, the smaller the angle formed by the normal vector l and the imaging vector n, the larger the imaging cost Cijk at the sampling position k, and the larger the imaging cost Cij at the imaging position i tends to become. Therefore, when determining the imaging angle θi, the terminal 80 can increase the influence of imaging vectors n that form a smaller angle with the normal vector l. Further, when the angle formed by the normal vector l and the imaging vector n is small, the sampling position k can be photographed nearly head-on, and the image information at the sampling position k can be increased. Therefore, the terminal 80 can improve the generation accuracy of the orthographic image and the three-dimensional restoration accuracy owing to the increase in ground information available for the orthographic image and the three-dimensional restoration.
The terminal control unit 81 may exclude, from the calculation of the imaging cost Cij at the imaging position i, the imaging cost Cijk at any sampling position k where the inner product n·(−l) (inner product value) is negative.
Thus, for example, by setting the imaging cost Cijk of a sampling position whose inner product n·(−l) is negative to 0, the terminal 80 can prevent a single extreme value with a negative inner product from having a large influence on the imaging cost Cij at the imaging position i.
An example of the operation of the aircraft system 10 is described below.
Fig. 10 is a sequence diagram showing one example of a flight path generation process in the flight body system 10. In fig. 10, the process of generating a flight path is illustrated as being performed mainly by the terminal 80.
The terminal control unit 81 acquires information on the flight range AR (T1). The terminal control section 81 may receive a user input via the operation section 83 and acquire the flight range AR. In this case, the terminal control section 81 may acquire map information from an external server via the communication section 85. For example, when the flight range AR is set to a rectangular range, the user can provide the information of the flight range AR by inputting the positions (latitude, longitude) of the four corners of the rectangle on the map information. When the flight range AR is set to a circular range, the user can provide the information of the flight range AR by inputting the radius of a circle centered on a flight position. Further, the user can provide the information of the flight range AR based on the map information by inputting an area, a specific place name (e.g., Tokyo), or the like. The terminal control unit 81 may also acquire a flight range AR stored in the memory 87 or the storage 89, or acquire the flight range AR from an external server via the communication section 85.
The terminal control unit 81 acquires various parameters (T2). The parameters may be parameters related to the flight of the unmanned aerial vehicle 100 and to imaging by the imaging section 220. The parameters may include, for example, the imaging position, imaging date and time, the distance to the subject, the imaging angle of view, imaging conditions, and camera parameters (shutter speed, exposure value, imaging mode, and the like). The terminal control section 81 may acquire parameters input by the user via the operation section 83. The terminal control unit 81 may acquire various parameters stored in the memory 87 or the storage 89, or acquire them from the unmanned aerial vehicle 100 or an external server via the communication unit 85.
The terminal control unit 81 acquires the terrain information based on the information of the flight range AR (T3). For example, the terminal control unit 81 may acquire the terrain information of the flight range AR in cooperation with a map server on a network connected via the communication unit 85. The terrain information may include position information (latitude, longitude, altitude) at various positions within the flight range AR. By gathering the position information at the respective positions, the three-dimensional shape of the flight range AR can be represented. The terrain information may also include information on the shapes of objects on the ground, such as buildings, mountains, forests, and towers.
The terminal control unit 81 calculates the flight altitude based on the terrain information of the flight range AR and information such as the distance to the subject included in the acquired parameters (T4). For example, the terminal control unit 81 may calculate the flight altitude of the unmanned aerial vehicle 100 so as to follow the undulation of the ground hm indicated by the terrain information and thereby maintain the distance to the subject.
The terminal control unit 81 generates a flight path rt (T5). In this case, the terminal control section 81 may generate the flight path rt based on the flight range AR, the terrain information, and the flight height. The generated flight path rt maintains the derived flight heights at the respective positions within the flight range AR, and passes through the imaging position wp in the three-dimensional space for imaging the terrain within the flight range AR. The terminal control unit 81 may determine which position on the two-dimensional plane (latitude, longitude) within the flight range AR the flight path is to be passed through, and which sampling position k (imaging position wp) the flight path is to be passed through, according to a known method.
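As a rough, non-authoritative sketch of processes T4 and T5, the following assumes the flight range AR has already been discretized into a regular grid of (latitude, longitude) points with known ground elevations; the serpentine visiting order, the grid representation, and the function name are illustrative assumptions.

```python
import numpy as np

def generate_flight_path(grid_latlon, ground_elevation, subject_distance):
    """grid_latlon: (rows, cols, 2) array of (latitude, longitude);
    ground_elevation: (rows, cols) elevation of the ground hm at each grid point.
    Returns imaging positions wp as (lat, lon, altitude) in a serpentine order,
    keeping roughly the requested distance to the subject (processes T4 and T5)."""
    altitude = np.asarray(ground_elevation) + subject_distance   # T4: terrain-following flight altitude
    rows, cols = altitude.shape
    path = []
    for r in range(rows):
        col_order = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in col_order:
            lat, lon = grid_latlon[r][c]
            path.append((lat, lon, altitude[r, c]))               # T5: imaging position wp on the path rt
    return path
```

The imaging angle θi for each of these imaging positions would then be derived in process T6, for example with the cost-based selection sketched after equation (3).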
The terminal control unit 81 derives (e.g., calculates) an imaging angle θi for each imaging position i along the flight path rt based on the terrain information and the flight altitude (T6). When deriving the imaging angle θi, the terminal control unit 81 calculates the imaging cost Cij at the imaging position i for each candidate angle j. The terminal control section 81 determines a candidate angle (e.g., an optimal imaging angle θim) at which the imaging cost Cij at the imaging position i is greater than or equal to the threshold th1 (e.g., maximum) as the imaging angle θi at the imaging position i. When deriving the imaging angle θi, the optimum imaging angle θim can be calculated based on the topographic information and the information on the flight path rt.
The terminal control unit 81 transmits notification parameters including the imaging position wp, the flight path rt, and the imaging angle θi to the unmanned aerial vehicle 100 via the communication unit 85 (T7). The notification parameter may include an imaging parameter related to a camera (imaging unit 220) at the time of shooting, and a flight parameter related to a flight at the time of shooting.
In the unmanned aerial vehicle 100, the UAV control section 110 receives notification parameters from the terminal 80 via the communication interface 150 (T8). The UAV control unit 110 sets each parameter used by the unmanned aerial vehicle 100 by saving the received notification parameter in the memory 160 (T9). The UAV control unit 110 drives the imaging unit 220 while flying along the flight path rt based on the set parameters, and performs aerial photography at the imaging angle θi (T10).
In this way, the terminal 80 may generate a flight path rt for the unmanned aerial vehicle 100 to fly. The terminal control unit 81 can acquire the terrain information of the flight range AR in which the unmanned aerial vehicle 100 flies. The terminal control unit 81 may generate the flight path rt including the imaging position wp in the three-dimensional space for imaging the topography of the flight range AR, based on the topography information of the flight range AR. The terminal control unit 81 may derive (e.g., calculate) an imaging angle θi for each imaging position of the flight path rt based on the terrain information of the flight range AR and the flight path rt.
Thus, since the terminal 80 determines the imaging angle θi in consideration of the undulation of the terrain, the areas of the ground hm that are difficult to image due to the undulation of the terrain are reduced. Therefore, at each imaging position i, the unmanned aerial vehicle 100 can image each point of the ground hm from the front as much as possible. Therefore, the terminal 80 can improve the generation accuracy of the orthographic image and the three-dimensional restoration accuracy (three-dimensional shape estimation accuracy) by using the captured images taken at the determined imaging angles θi. In addition, to improve the generation accuracy of the orthographic image and the three-dimensional restoration accuracy, the terminal 80 does not need to have the unmanned aerial vehicle 100 capture images at various angles at each imaging position i, which improves the imaging efficiency of the unmanned aerial vehicle 100. Therefore, the terminal 80 can suppress a decrease in the imaging efficiency of the unmanned aerial vehicle 100 while having it acquire as much information as possible at each point of the undulating terrain. The process of generating the flight path, performed mainly by the terminal 80, may be performed during the flight of the unmanned aerial vehicle 100 or before the start of the flight.
The terminal control unit 81 may acquire a candidate angle j, which is a candidate of the imaging angle θi for imaging the terrain of the flight range AR. The terminal control section 81 may calculate an imaging cost Cij at the imaging position i (an example of the imaging cost at the imaging position when imaging is performed at the candidate angle j, that is, the first imaging cost) for each candidate angle j. The terminal control section 81 may determine the imaging angle θi at the imaging position based on the imaging cost Cij at the imaging position i. In this case, the terminal control section 81 may determine the candidate angle j at which the imaging cost Cij at the imaging position i is greater than or equal to the threshold th1 as the imaging angle θi at the imaging position.
Thus, the terminal 80 can quantify suitability for imaging as an imaging cost, and can easily determine how suitable imaging at the candidate angle j at the imaging position i is. Here, the candidate angle j with the nth largest (e.g., largest) imaging cost Cij among the imaging costs Cij at the imaging position i that are greater than or equal to the threshold th1 may be determined as the imaging angle θi.
Further, the terminal control section 81 may transmit information of the imaging position i, the flight path rt, and the imaging angle θi to the unmanned aerial vehicle 100 via the communication section 85.
As a result, the various processes for generating the flight path rt can be performed on the terminal 80 side, which reduces the processing load of the unmanned aerial vehicle 100, and the terminal 80 can have the unmanned aerial vehicle 100 acquire more information on the front surface of the terrain while suppressing a decrease in its imaging efficiency.
(second embodiment)
In the second embodiment, a case is illustrated in which the processing of generating the flight path is mainly performed by the unmanned aerial vehicle 100. The aircraft system in the second embodiment has substantially the same structure as the first embodiment. The same components as those of the first embodiment are denoted by the same reference numerals, and the description thereof is omitted or simplified.
Fig. 11 is a sequence diagram showing a flight path generation process in the flying body system 10 in the second embodiment. In fig. 11, a case is illustrated in which the processing of generating the flight path is performed mainly by the unmanned aerial vehicle 100.
The processing of the processes T21 to T23 is the same as the processes T1 to T3 of the first embodiment. The terminal control section 81 transmits the flight range AR, the parameters, and the terrain information acquired in the processes T21 to T23 to the unmanned aerial vehicle 100 via the communication section 85 (T24).
In the unmanned aerial vehicle 100, the UAV control section 110 receives the flight range AR, parameters, and terrain information via the communication interface 150 (T25). The UAV control section 110 stores the received flight range AR, parameters, and terrain information in the memory 160.
The UAV control unit 110 calculates the flight level (T26). The fly height calculation method may be the same as T4. The UAV control section 110 generates a flight path rt (T27). The method of generating the flight path rt may be the same as T5. The UAV control unit 110 derives an imaging angle θi for each imaging position along the flight path rt (T28). The method of deriving the imaging angle θi may be the same as T6. The UAV control section 110 holds and sets parameters including the imaging position, the flight path rt, and the imaging angle θi in the memory 160 (T29). The UAV control unit 110 drives the imaging unit 220 at the imaging position while flying along the flight path rt based on the set parameters, and performs aerial photography at the imaging angle θi (T30).
In addition, the processing of T23 may be performed in the unmanned aerial vehicle 100. In this case, the terminal control section 81 may transmit the flight range AR and the parameters acquired in the processes T21, T22 to the unmanned aerial vehicle 100 via the communication section 85. The UAV control section 110 may receive the flight range AR and the parameters and calculate terrain information and the flight altitude.
In this way, the unmanned aerial vehicle 100 may generate a flight path rt for the unmanned aerial vehicle 100 to fly. The UAV control section 110 may acquire terrain information of the flight range AR in which the unmanned aerial vehicle 100 flies. The UAV control unit 110 may generate a flight path rt including an imaging position wp in a three-dimensional space for imaging the terrain of the flight range AR, based on the terrain information of the flight range AR. The UAV control unit 110 may derive (e.g., calculate) an imaging angle θi for each imaging position of the flight path rt based on the terrain information of the flight range AR and the flight path rt.
In this way, since the unmanned aerial vehicle 100 determines the imaging angle θi in consideration of the undulation of the terrain, it can reduce the areas of the ground hm that are difficult to image due to the undulation of the terrain. Therefore, the unmanned aerial vehicle 100 can image each point of the ground hm from the front as much as possible at each imaging position. Therefore, the unmanned aerial vehicle 100 can improve the generation accuracy of the orthographic image and the three-dimensional restoration accuracy (three-dimensional shape estimation accuracy) by using the captured images taken at the determined imaging angles θi. In addition, to improve the generation accuracy of the orthographic image and the three-dimensional restoration accuracy, the unmanned aerial vehicle 100 does not need to capture images at various angles at each imaging position, which improves the imaging efficiency. Therefore, the unmanned aerial vehicle 100 can acquire as much information as possible at each point of the undulating terrain while suppressing a decrease in imaging efficiency. The process of generating the flight path, performed mainly by the unmanned aerial vehicle 100, may be performed during the flight of the unmanned aerial vehicle 100 or before the start of the flight.
Further, the UAV control section 110 may control the flight in accordance with the flight path rt and aerially photograph the terrain surface (one example of a captured image) via the imaging section 220 at the imaging angle θi at each imaging position i of the flight path rt. Accordingly, the various processes for generating the flight path rt can be performed on the unmanned aerial vehicle 100 side, which reduces the processing load of the terminal 80, and the unmanned aerial vehicle 100 can acquire more information on the front surface of the terrain while suppressing a decrease in the imaging efficiency of the imaging unit 220. Further, the unmanned aerial vehicle 100 can carry out everything in one place, from generating a flight path rt that acquires more information on the front surface of the terrain while suppressing a decrease in the imaging efficiency of the imaging unit 220, to performing imaging along the generated flight path rt.
(third embodiment)
In the first and second embodiments, the case of aerially photographing the entire flight range AR was described. In the third embodiment, a case is described in which mainly a region of interest RI within the flight range AR is aerially photographed. The region of interest RI may be a region that the user is interested in, or a region including the location of an object that the user is interested in. For example, the region of interest RI may be set by an input operation, via the operation section 83, on a region or object of interest to the user.
The flying body system 10 in the third embodiment has substantially the same structure as the first and second embodiments. The same components as those of the first and second embodiments are denoted by the same reference numerals, and the description thereof is omitted or simplified.
Fig. 12 is a diagram showing one example of specifying the region of interest RI. Here, a case is shown where the region of interest RI is a region including a building.
For example, the terminal control unit 81 displays the acquired flight range AR on the display unit 88. For example, the display portion 88 and the operation portion 83 may be constituted by control panels. When the user touches the buildings 501 and 502 displayed in the flight range AR on the display unit 88 with a finger, the terminal control unit 81 receives the positions of the buildings 501 and 502 via the operation unit 83. The terminal control section 81 sets an area including the positions of the two buildings 501, 502 as the region of interest RI.
Further, the terminal control section 81 sets a plurality of initial image points gp in the region of interest RI. The initial image point gp refers to an image capturing position wp set as an initial setting. The terminal control section 81 may acquire information of the initial image point gp from the memory 87 or the storage 89, for example, or may acquire the information by a user operation via the operation section 83, or acquire the information from an external processor via the communication section 85. In fig. 12, the initial image pickup points gp are arranged in a two-dimensional plane in a lattice shape. Further, adjacent initial image points gp are arranged at equal intervals. The initial image points gp may be arranged in a shape other than a lattice shape, or adjacent initial image points gp may not be arranged at equal intervals.
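A minimal sketch, assuming the region of interest RI can be bounded by a rectangle in latitude and longitude, of how initial imaging points gp might be laid out on an equally spaced lattice; the bounding-box inputs and the spacing parameter are assumptions for illustration.

```python
import numpy as np

def initial_image_points(lat_min, lat_max, lon_min, lon_max, spacing):
    """Initial imaging points gp on an equally spaced lattice covering the region of interest RI."""
    lats = np.arange(lat_min, lat_max + spacing, spacing)
    lons = np.arange(lon_min, lon_max + spacing, spacing)
    return [(float(lat), float(lon)) for lat in lats for lon in lons]
```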
Fig. 13 is a diagram illustrating a procedure for regenerating a flight path rt along which the unmanned aerial vehicle 100 photographs the region of interest RI. In this case, as in the first embodiment, the terminal control unit 81 derives the imaging cost Cij at the candidate angle j for each of the plurality of initial imaging points gp. The terminal control section 81 deletes from the imaging positions wp those initial imaging points whose imaging cost Cij at the candidate angle j is less than or equal to the threshold th3, that is, the imaging points gpl with a lower imaging cost Cij. In other words, the terminal control section 81 does not image at an imaging position (imaging point) where the imaging cost Cij is less than or equal to the threshold th3. Further, the terminal control section 81 keeps as imaging positions wp those initial imaging points whose imaging cost Cij at the candidate angle j exceeds the threshold th3, that is, the imaging points gph with a higher imaging cost Cij.
The terminal control unit 81 may cluster (classify) a plurality of image points gph having a high imaging cost Cij at the imaging angle θi for each region of interest RI (here, each building as an object of interest). For clustering, known methods such as K-means (K-means), DBSCAN (Density-based spatial clustering of applications with noise, density-based spatial clustering with noise application) and the like can be used.
In fig. 13, the terminal control section 81 obtains, by clustering, the imaging position groups sg1 and sg2, each including four imaging points gph whose imaging cost Cij exceeds the threshold th3. The terminal control unit 81 connects the imaging points gph included in the imaging position groups sg1 and sg2 as the imaging positions wp, and regenerates the flight path rtn.
In this way, the terminal control unit 81 classifies a plurality of image pickup points gph (image pickup positions) having a high image pickup cost Cij at the image pickup angle θi for each region of interest RI photographed from the image pickup point gph, and generates a plurality of image pickup position groups sg1 and sg2. The terminal control unit 81 can link (connect) the plurality of imaging position groups sg1 and sg2 to regenerate the flight path rtn (one example of generating the flight path).
Thus, the unmanned aerial vehicle 100 can collectively capture each of the plurality of imaging points gph (imaging positions) included in the imaging position groups sg1, sg2. Accordingly, since the flying distance of the unmanned aerial vehicle 100 for photographing each region of interest RI is shortened, the unmanned aerial vehicle 100 can improve the photographing efficiency.
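The clustering and reconnection of Fig. 13 could be sketched as follows, here using K-means from scikit-learn as one of the clustering methods mentioned above; the fixed number of groups, the thresholding with th3, and the simple group-by-group concatenation are illustrative assumptions rather than the concrete procedure of the embodiment.

```python
import numpy as np
from sklearn.cluster import KMeans

def regenerate_path(points, costs, th3, n_groups=2):
    """Keep imaging points gph whose cost exceeds th3, cluster them into imaging
    position groups (sg1, sg2, ...), and chain the groups into a new path rtn."""
    points = np.asarray(points, dtype=float)
    keep = np.asarray(costs) > th3
    kept = points[keep]                                        # imaging points gpl are dropped here
    labels = KMeans(n_clusters=n_groups, n_init=10).fit_predict(kept)
    path = []
    for g in range(n_groups):                                  # visit one imaging position group at a time
        path.extend(tuple(p) for p in kept[labels == g])
    return path
```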
An example of the operation of the aircraft system 10 is described below.
Fig. 14 is a sequence diagram showing one example of a flight path generation process in the flying body system 10 of the third embodiment. In fig. 14, a case is illustrated in which the terminal 80 mainly performs the process of generating the flight path.
The processes T41 to T43 are the same as the processes T1 to T3 of the first embodiment.
The terminal control section 81 acquires information of the region of interest RI. The terminal control section 81 may receive a user input via the operation section 83 and acquire the region of interest RI (T44). For example, the user may specify the region of interest RI directly by inputting a place name or by circling a part of the region in the map information via the operation section 83. In this case, the terminal control section 81 may acquire the map information via the communication section 85. Further, the terminal control section 81 may receive a user input via the operation section 83 and acquire type information of an imaging object. The terminal control section 81 may also detect a region of interest RI (for example, a region including a building) corresponding to the type of the imaging object included in the flight range AR using an image processing technique based on the type of the imaging object.
The processes T45, T46 are the same as the processes T4, T5 in the first embodiment. The terminal control unit 81 derives (e.g., calculates) an imaging angle θi for imaging the region of interest RI for each initial imaging point gp (imaging position) along the flight path rt based on the topographic information and the flight altitude of the region of interest RI (T47).
When deriving the imaging angle θi for imaging the region of interest RI, the terminal control unit 81 may calculate the imaging cost Cij at the candidate angle j according to equations (4), (5), and (6). The imaging cost Cij at the candidate angle j is calculated for each candidate angle j.
[Math 4]
Cij = Σk Cijk ……(4)
[Math 5]
Cijk = (Pk in ROI) · d^(-0.5) · max(n·(−l), 0) ……(5)
[Math 6]
θim = argmax_j (Cij) ……(6)
The factor (Pk in ROI) in equation (5) means that the sampling positions k used in the calculation of the imaging cost Cijk are limited to positions included in the region of interest RI. That is, the terminal control unit 81 may calculate, according to equation (5), the imaging cost Cijk when a sampling position k in the region of interest RI is imaged at the candidate angle j from the initial imaging point gp (imaging position). In other respects, equations (4), (5), and (6) may be the same as equations (1), (2), and (3).
For example, the terminal control unit 81 calculates an imaging angle θi for imaging the region of interest based on the imaging cost Cij calculated according to the formulas (4), (5), and (6). The imaging angle θi may be an imaging angle θim for imaging the region of interest.
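Limiting the cost of equation (5) to sampling positions inside the region of interest amounts to adding an indicator to the earlier Cijk sketch; the in_roi predicate below is a hypothetical placeholder for whatever membership test is actually used.

```python
import numpy as np

def imaging_cost_cij_roi(imaging_pos, imaging_vec, sample_positions, sample_normals, in_roi):
    """Equations (4) and (5): only sampling positions k inside the region of interest contribute."""
    cost = 0.0
    for p_k, l_k in zip(sample_positions, sample_normals):
        if not in_roi(p_k):                                    # the indicator (Pk in ROI) of equation (5)
            continue
        d = np.linalg.norm(np.asarray(imaging_pos) - np.asarray(p_k))
        cost += d ** -0.5 * max(float(np.dot(imaging_vec, -np.asarray(l_k))), 0.0)
    return cost
```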
The terminal control section 81 deletes those initial imaging points gp whose imaging cost Cij at the imaging angle θi is less than or equal to the threshold th3, that is, the imaging points gpl with a low imaging cost Cij (T48). On the other hand, the terminal control section 81 does not delete the initial imaging points gp whose imaging cost Cij at the imaging angle θi exceeds the threshold th3, that is, the imaging points gph with a high imaging cost Cij, but keeps them as imaging positions wp.
The terminal control unit 81 clusters (groups) the plurality of imaging points gph having a high imaging cost Cij, and obtains the imaging position groups sg1 and sg2 (T49). The terminal control unit 81 excludes the imaging positions wp whose imaging cost Cij at the imaging angle is less than or equal to the threshold th3, and connects the imaging points gph included in the imaging position groups sg1 and sg2 as imaging positions to regenerate the flight path rtn (T50).
Generation of the imaging position groups is not indispensable, and it is also not necessary to pass through all the imaging positions wp included in the imaging position group sg1 before passing through all the imaging positions wp included in the imaging position group sg2. The flight path rtn may be regenerated so that, for example, the unmanned aerial vehicle flies through an imaging position included in the imaging position group sg1, then an imaging position included in the imaging position group sg2, and then another imaging position included in the imaging position group sg1, in this order; that is, the imaging positions wp for imaging the different regions of interest RI may be interleaved.
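Since the visiting order may interleave imaging positions belonging to different regions of interest, one possible way to order the retained points when regenerating rtn is a greedy nearest-neighbour pass over all of them; this is only a sketch under that assumption and ignores battery and no-fly constraints.

```python
import numpy as np

def nearest_neighbour_order(points, start_index=0):
    """Greedy ordering of imaging positions wp by distance, regardless of which
    imaging position group (sg1, sg2, ...) each point belongs to."""
    pts = np.asarray(points, dtype=float)
    remaining = list(range(len(pts)))
    order = [remaining.pop(start_index)]
    while remaining:
        last = pts[order[-1]]
        nxt = min(remaining, key=lambda i: float(np.linalg.norm(pts[i] - last)))
        remaining.remove(nxt)
        order.append(nxt)
    return [tuple(pts[i]) for i in order]
```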
The terminal control unit 81 transmits notification parameters including the imaging position wp (corresponding to the plurality of imaging points gph having the high imaging cost Cij), the reproduced flight path rtn, and the imaging angle θi to the unmanned aerial vehicle 100 via the communication unit 85 (T51).
In the unmanned aerial vehicle 100, the UAV control section 110 receives the notification parameters via the communication interface 150 (T52). The UAV control unit 110 sets each parameter used by the unmanned aerial vehicle 100 by saving the received notification parameters in the memory 160 (T53). The UAV control unit 110 drives the imaging unit 220 while flying along the flight path rtn based on the set parameters, and aerially photographs the region of interest RI at the imaging angle θi (T54).
In this way, the terminal control section 81 can acquire the region of interest RI including, for example, the positions of the two buildings 501, 502 included in the flight range AR. The terminal control unit 81 may derive the imaging angle θi for each imaging position based on the topographic information of the region of interest RI and the flight path rt.
Thus, when the unmanned aerial vehicle 100 aerial photographs the region of interest RI, the terminal 80 can suppress a decrease in the image capturing efficiency and acquire as much information as possible for each point of the rugged topography. Further, the terminal 80 can derive an imaging angle θi that can improve the generation accuracy and the three-dimensional restoration accuracy of the orthographic image of the region of interest RI.
The terminal control unit 81 may acquire a candidate angle j, which is a candidate of the imaging angle θi for imaging the terrain of the flight range AR. The terminal control section 81 may calculate an imaging cost Cij (one example of a third imaging cost) when the region of interest RI is imaged at the candidate angle j at the imaging position i for each candidate angle j. The terminal control section 81 may determine the imaging angle θi at the imaging position i based on the imaging cost Cij at the imaging position i. In this case, the terminal control section 81 may determine the candidate angle j at which the imaging cost Cij at the imaging position i is greater than or equal to the threshold th2 (one example of the second threshold) as the imaging angle θi at the imaging position i.
Thus, the terminal 80 can quantify suitability for imaging the region of interest RI as an imaging cost, and can easily determine how suitable imaging at the candidate angle j is. Here, the candidate angle with the nth largest (e.g., largest) imaging cost Cij among the imaging costs Cij at the imaging position i that are greater than or equal to the threshold th2 may be determined as the optimal imaging angle θim.
Further, the terminal control section 81 may exclude, from the plurality of imaging positions wp, an imaging point gpl (one example of the first imaging position) whose imaging cost Cij when imaged at the imaging angle θi is less than or equal to the threshold th3, in order to regenerate the flight path rtn. That is, the terminal control unit 81 may exclude the imaging point gpl from the generated flight path to regenerate the flight path rtn.
Thus, since imaging positions that contribute little to the imaging cost Cij at the imaging angle θi are not included when the flight path rtn is generated (regenerated), the terminal 80 can improve the imaging efficiency when the unmanned aerial vehicle 100 photographs the region of interest RI while suppressing a decrease in the generation accuracy of the orthographic image of the region of interest RI and in the three-dimensional restoration accuracy.
(fourth embodiment)
In the fourth embodiment, a case is illustrated in which the processing of generating the flight path in consideration of the region of interest is performed mainly by the unmanned aerial vehicle 100. The aircraft system of the fourth embodiment has substantially the same structure as the first to third embodiments. The same reference numerals are used for the same constituent elements as those of the first to third embodiments, and the description thereof is omitted or simplified.
Fig. 15 is a sequence diagram showing one example of the flight path generation process of the flying body system 10 in the fourth embodiment. Fig. 15 illustrates a case where the process of generating the flight path is performed mainly by the unmanned aerial vehicle 100.
The processing of the processes T61 to T64 is the same as the processes T41 to T44 of the third embodiment. The terminal control section 81 transmits notification parameters including the flight range AR, parameters, terrain information, and region of interest RI acquired in the processes T61 to T64 to the unmanned aerial vehicle 100 via the communication section 85 (T65).
In the unmanned aerial vehicle 100, the UAV control section 110 receives the notification parameter via the communication interface 150 (T66). The UAV control unit 110 calculates the flight level (T67). The fly height calculation method may be the same as the processes T4, T45. The UAV control unit 110 generates a flight path (T68). The method of generating the flight path rt may be the same as the processes T5, T46.
The UAV control unit 110 derives an imaging angle θi for imaging the region of interest RI (T69). The method of deriving the imaging angle θi may be the same as the procedure T47. The UAV control unit 110 deletes unnecessary image points gpl (T70). The deletion method of the unnecessary image point gpl may be the same as the procedure T48. The UAV control unit 110 performs clustering, and calculates imaging position groups sg1 and sg2 (T71). The clustering method and the calculation method of the imaging position groups sg1, sg2 may be the same as the procedure T49. The UAV control section 110 regenerates the flight path rtn (T72). The regeneration method of the flight path rtn can be the same as the process T50.
The UAV control unit 110 stores parameters including the imaging position wp (corresponding to the imaging point gph having the high imaging cost Cij), the reproduced flight path rtn, and the imaging angle θi in the memory 160, thereby setting the respective parameters used by the unmanned aerial vehicle 100 (T73). The UAV control unit 110 drives the imaging unit 220 while flying along the flight path rtn based on the set parameters, and aerial photographs the region of interest RI at the imaging angle θi (T74).
Thus, the unmanned aerial vehicle 100 can perform various processes for generating the flight path rtn in consideration of the region of interest RI on the unmanned aerial vehicle 100 side, and reduce the processing load of the terminal 80.
The present disclosure has been described above using the embodiments, but the technical scope of the present disclosure is not limited to the scope described in the above embodiments. Various alterations and modifications to the above-described embodiments will be apparent to those skilled in the art. Such alterations and modifications are intended to be included within the scope of the present disclosure as defined in the appended claims.
The order of execution of the operations, procedures, steps, and stages of the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be any order, as long as terms such as "before" or "prior to" are not explicitly indicated and as long as the output of a preceding process is not used in a subsequent process. Even where "first", "next", and the like are used for convenience with regard to the operational flows in the claims, specification, and drawings, they do not necessarily mean that the operations must be performed in this order.

Claims (26)

1. An information processing apparatus for generating a flight path for a flight of a flight body, characterized in that,
the device comprises a processing part and an imaging part;
the processing part acquires the terrain information of the flight range of the flying body;
Generating a flight path including an imaging position in a three-dimensional space for imaging a terrain of a flight range based on the terrain information of the flight range;
and deriving an imaging angle for each imaging position of the flight path based on the terrain information of the flight range and the flight path, wherein the imaging angle is an imaging angle for the imaging section to image the terrain of the flight range, and the imaging angle is specified by a pitch angle and a yaw angle of a gimbal supporting the imaging section.
2. The information processing apparatus according to claim 1, wherein,
the processing unit acquires a candidate angle which is a candidate of the imaging angle for imaging the terrain of the flight range,
calculating a first imaging cost which is an imaging cost when imaging at the candidate angle at the imaging position for each of the candidate angles,
and determining a candidate angle at the imaging position at which the first imaging cost is greater than or equal to a first threshold as the imaging angle at the imaging position.
3. The information processing apparatus according to claim 2, wherein,
the processing part samples the terrain of the flight range to acquire a plurality of sampling positions photographed by the flight body,
Calculating a second imaging cost which is an imaging cost when the sampling position is imaged at the candidate angle at the imaging position for each of the sampling positions,
and calculates the first imaging cost by adding the second imaging costs at the respective sampling positions.
4. An information processing apparatus according to claim 3, wherein the shorter the distance between the imaging position and the sampling position is, the greater the second imaging cost is.
5. An information processing apparatus according to claim 3, wherein the second imaging cost is greater as an inner product value between a normal vector with respect to the ground at the sampling position and an imaging vector which is a vector along an imaging direction indicated by the candidate angle is greater.
6. The information processing apparatus according to claim 5, wherein the processing section excludes the second imaging cost, the inner product value of which is negative, from the calculation target of the first imaging cost.
7. The information processing apparatus according to any one of claims 1 to 6, wherein,
the processing section acquires a region of interest included in the flight range and including a position of an imaging subject,
And deriving an imaging angle for each imaging position in the flight path based on the terrain information of the region of interest and the flight path.
8. The information processing apparatus according to claim 7, wherein,
the processing unit acquires a candidate angle which is a candidate of the imaging angle for imaging the terrain of the flight range,
calculating a third imaging cost which is an imaging cost when the region of interest is imaged at the candidate angle at the imaging position for each of the candidate angles,
and determining a candidate angle at the imaging position at which the third imaging cost is greater than or equal to a second threshold as the imaging angle at the imaging position.
9. The information processing apparatus according to claim 8, wherein the processing portion excludes a first imaging position from a plurality of imaging positions to generate the flight path when the third imaging cost when imaged at the imaging angle at the first imaging position is less than or equal to a third threshold.
10. The information processing apparatus according to claim 7, wherein,
The processing section classifies a plurality of imaging positions for each imaging object imaged from the imaging positions to generate a plurality of imaging position groups,
and connecting a plurality of the imaging position groups to generate the flight path.
11. The information processing apparatus according to any one of claims 1 to 6, wherein the information processing apparatus is a terminal including a communication section,
the processing unit transmits information of the imaging position, the flight path, and the imaging angle to the flying body via the communication unit.
12. The information processing apparatus according to any one of claims 1 to 6, wherein the information processing apparatus is a flying body including an image pickup section,
the processing part controls the flight according to the flight path,
and capturing an image at the imaging angle at the imaging position of the flight path via the imaging section.
13. A flight path generation method in an information processing apparatus for generating a flight path for a flight of a flight body, comprising the steps of:
the terrain information of the flight range of the flying body is acquired;
generating a flight path including an imaging position in a three-dimensional space for imaging a terrain of a flight range based on the terrain information of the flight range;
And deriving an imaging angle for imaging the terrain of the flight range for each imaging position in the flight path based on the terrain information of the flight range and the flight path, wherein the imaging angle is an imaging angle for imaging the terrain of the flight range by an imaging section, and the imaging angle is specified by a pitch angle and a yaw angle of a gimbal supporting the imaging section.
14. The flight path generation method according to claim 13, wherein the step of deriving an imaging angle for imaging the terrain of the flight range for each of the imaging positions in the flight path includes the steps of:
acquiring a candidate angle which is a candidate of the shooting angle for shooting the terrain of the flight range;
calculating, for each of the candidate angles, a first imaging cost that is an imaging cost when imaging at the candidate angle at the imaging position;
a candidate angle at the imaging position at which the first imaging cost is greater than or equal to a first threshold is determined as the imaging angle at the imaging position.
15. The flight path generation method according to claim 14, wherein the step of calculating, for each of the candidate angles, an imaging cost at the imaging position when imaging at the candidate angle, that is, a first imaging cost, includes the steps of:
Sampling the topography of the flight range to obtain a plurality of sampling positions photographed by the flight body;
calculating, for each of the sampling positions, a second imaging cost that is an imaging cost when the sampling position is imaged at the candidate angle at the imaging position;
the first imaging cost is calculated by adding the second imaging costs at the respective sampling positions.
16. The flight path generation method according to claim 15, wherein the shorter the distance between the imaging position and the sampling position is, the greater the second imaging cost is.
17. The flight path generation method according to claim 15, wherein the second imaging cost is greater as an inner product value between a normal vector with respect to the ground at the sampling position and a vector in an imaging direction indicated by the candidate angle, that is, an imaging vector is greater.
18. The flight path generation method according to claim 17, wherein the step of calculating, for each of the candidate angles, an imaging cost at the imaging position when imaging at the candidate angle, that is, a first imaging cost, includes the steps of:
And excluding the second imaging cost having the inner product value of a negative value from the calculation target of the first imaging cost.
19. The flight path generation method according to any one of claims 13 to 18, characterized in that the step of deriving an imaging angle for imaging the terrain of the flight range for each of the imaging positions in the flight path includes the steps of:
acquiring a region of interest included in the flight range and including a camera object position;
and deriving the imaging angle for each imaging position in the flight path based on the terrain information of the region of interest and the flight path.
20. The flight path generation method according to claim 19, wherein the step of deriving an imaging angle for imaging the terrain of the flight range for each of the imaging positions in the flight path includes the steps of:
acquiring a candidate angle which is a candidate of the shooting angle for shooting the terrain of the flight range;
calculating, for each of the candidate angles, a third imaging cost, which is an imaging cost when the region of interest is imaged at the candidate angle at the imaging position;
And determining a candidate angle at the imaging position at which the third imaging cost is greater than or equal to a second threshold as the imaging angle at the imaging position.
21. The flight path generation method according to claim 20, wherein the step of generating the flight path including the imaging position in the three-dimensional space for imaging the terrain of the flight range based on the terrain information of the flight range includes the steps of:
when the third imaging cost when imaging at the imaging angle at a first imaging position of the plurality of imaging positions is less than or equal to a third threshold,
the first imaging position is excluded from a plurality of imaging positions to generate the flight path.
22. The flight path generation method according to claim 19, wherein the step of generating the flight path including the imaging position in the three-dimensional space for imaging the terrain of the flight range based on the terrain information of the flight range includes:
classifying a plurality of the imaging positions for each imaging object imaged from the imaging positions to generate a plurality of imaging position groups;
and connecting a plurality of the camera position groups to generate the flight path.
23. The flight path generation method according to any one of claims 13 to 18, wherein the information processing apparatus is a terminal, further comprising the steps of:
and sending information of the shooting position, the flight path and the shooting angle to the flying body.
24. The flight path generation method according to any one of claims 13 to 18, wherein the information processing apparatus is a flight body, further comprising the steps of:
controlling the flight according to the flight path;
an image is taken at the imaging angle at the imaging position of the flight path.
25. A computer program product, characterized in that it causes an information processing device that generates a flight path for a flight of a flying body to perform the steps of:
the terrain information of the flight range of the flying body is acquired;
generating a flight path including an imaging position in a three-dimensional space for imaging a terrain of a flight range based on the terrain information of the flight range;
deriving an imaging angle for imaging the terrain of the flight range for each imaging position in the flight path based on the terrain information of the flight range and the flight path, wherein the imaging angle is an imaging angle for imaging the terrain of the flight range by an imaging section, and the imaging angle is specified by a pitch angle and a yaw angle of a gimbal supporting the imaging section.
26. A computer-readable storage medium storing a program for causing an information processing apparatus that generates a flight path for a flight of a flying body to execute the steps of:
the terrain information of the flight range of the flying body is acquired;
generating a flight path including an imaging position in a three-dimensional space for imaging a terrain of a flight range based on the terrain information of the flight range;
and deriving an imaging angle for imaging the terrain of the flight range for each imaging position in the flight path based on the terrain information of the flight range and the flight path, wherein the imaging angle is an imaging angle for imaging the terrain of the flight range by an imaging section, and the imaging angle is specified by a pitch angle and a yaw angle of a gimbal supporting the imaging section.
CN201980005546.7A 2018-09-13 2019-09-10 Information processing device, flight path generation method, program, and recording medium Active CN111344650B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018171672A JP7017998B2 (en) 2018-09-13 2018-09-13 Information processing equipment, flight path generation methods, programs, and recording media
JP2018-171672 2018-09-13
PCT/CN2019/105125 WO2020052549A1 (en) 2018-09-13 2019-09-10 Information processing apparatus, flight path generation method, and program and recording medium

Publications (2)

Publication Number Publication Date
CN111344650A CN111344650A (en) 2020-06-26
CN111344650B true CN111344650B (en) 2024-04-16

Family

ID=69777310

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980005546.7A Active CN111344650B (en) 2018-09-13 2019-09-10 Information processing device, flight path generation method, program, and recording medium

Country Status (3)

Country Link
JP (1) JP7017998B2 (en)
CN (1) CN111344650B (en)
WO (1) WO2020052549A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6902763B1 (en) * 2020-06-16 2021-07-14 九州電力株式会社 Drone flight planning system and program
WO2023182089A1 (en) * 2022-03-24 2023-09-28 ソニーセミコンダクタソリューションズ株式会社 Control device and control method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110056089A (en) * 2009-11-20 2011-05-26 삼성전자주식회사 Method for receiving a location information in digital photographing apparatus
CN102591353A (en) * 2011-01-04 2012-07-18 株式会社拓普康 Flight control system for flying object
CN106133629A (en) * 2014-04-25 2016-11-16 索尼公司 Information processor, information processing method, program and imaging system
WO2018073879A1 (en) * 2016-10-17 2018-04-26 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Flight route generation method, flight route generation system, flight vehicle, program, and recording medium
WO2018073878A1 (en) * 2016-10-17 2018-04-26 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Three-dimensional-shape estimation method, three-dimensional-shape estimation system, flying body, program, and recording medium
CN108417041A (en) * 2018-05-15 2018-08-17 江苏大学 A kind of backroad monitoring system and method based on quadrotor and Cloud Server

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10156855B2 (en) * 2014-05-30 2018-12-18 SZ DJI Technology Co., Ltd. Heading generation method and system of unmanned aerial vehicle
JP6621063B2 (en) * 2015-04-29 2019-12-18 パナソニックIpマネジメント株式会社 Camera selection method and video distribution system
CN106200693B (en) * 2016-08-12 2019-05-21 东南大学 The holder real-time control system and control method of land investigation small drone
US10402675B2 (en) * 2016-08-30 2019-09-03 The Boeing Company 2D vehicle localizing using geoarcs
CN106547276B (en) * 2016-10-19 2019-07-26 上海圣尧智能科技有限公司 Automatic spraying rectangular-ambulatory-plane paths planning method and fog machine spraying operation method
KR20180051996A (en) * 2016-11-09 2018-05-17 삼성전자주식회사 An unmanned aerialvehicles and method for pthotographing a subject using the same
CN107450573B (en) * 2016-11-17 2020-09-04 广州亿航智能技术有限公司 Flight shooting control system and method, intelligent mobile communication terminal and aircraft
CN110366670B (en) * 2017-03-02 2021-10-26 深圳市大疆创新科技有限公司 Three-dimensional shape estimation method, flight vehicle, mobile platform, program, and recording medium
CN107504957B (en) * 2017-07-12 2020-04-03 天津大学 Method for rapidly constructing three-dimensional terrain model by using unmanned aerial vehicle multi-view camera shooting
CN109871027B (en) * 2017-12-05 2022-07-01 深圳市九天创新科技有限责任公司 Oblique photography method and system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110056089A (en) * 2009-11-20 2011-05-26 삼성전자주식회사 Method for receiving a location information in digital photographing apparatus
CN102591353A (en) * 2011-01-04 2012-07-18 株式会社拓普康 Flight control system for flying object
CN106133629A (en) * 2014-04-25 2016-11-16 索尼公司 Information processor, information processing method, program and imaging system
WO2018073879A1 (en) * 2016-10-17 2018-04-26 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Flight route generation method, flight route generation system, flight vehicle, program, and recording medium
WO2018073878A1 (en) * 2016-10-17 2018-04-26 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Three-dimensional-shape estimation method, three-dimensional-shape estimation system, flying body, program, and recording medium
CN108417041A (en) * 2018-05-15 2018-08-17 江苏大学 A kind of backroad monitoring system and method based on quadrotor and Cloud Server

Also Published As

Publication number Publication date
WO2020052549A1 (en) 2020-03-19
JP2020043543A (en) 2020-03-19
CN111344650A (en) 2020-06-26
JP7017998B2 (en) 2022-02-09

Similar Documents

Publication Publication Date Title
US11649052B2 (en) System and method for providing autonomous photography and videography
JP6765512B2 (en) Flight path generation method, information processing device, flight path generation system, program and recording medium
JP6962775B2 (en) Information processing equipment, aerial photography route generation method, program, and recording medium
JP6803800B2 (en) Information processing device, aerial photography route generation method, aerial photography route generation system, program, and recording medium
JP6962812B2 (en) Information processing equipment, flight control instruction method, program, and recording medium
JP6675537B1 (en) Flight path generation device, flight path generation method and program, and structure inspection method
US20210185235A1 (en) Information processing device, imaging control method, program and recording medium
CN111344650B (en) Information processing device, flight path generation method, program, and recording medium
JP2019028560A (en) Mobile platform, image composition method, program and recording medium
CN109891188B (en) Mobile platform, imaging path generation method, program, and recording medium
JP6875269B2 (en) Information processing equipment, flight control instruction method, program, and recording medium
WO2021052217A1 (en) Control device for performing image processing and frame body control
JP2019096965A (en) Determination device, control arrangement, imaging system, flying object, determination method, program
WO2021115192A1 (en) Image processing device, image processing method, program and recording medium
WO2020119572A1 (en) Shape inferring device, shape inferring method, program, and recording medium
WO2020001629A1 (en) Information processing device, flight path generating method, program, and recording medium
CN112313942A (en) Control device for image processing and frame body control
WO2020108290A1 (en) Image generating device, image generating method, program and recording medium
JP6974290B2 (en) Position estimation device, position estimation method, program, and recording medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant