WO2019061859A1 - Mobile platform, imaging path generation method, program, and recording medium - Google Patents

Mobile platform, imaging path generation method, program, and recording medium

Info

Publication number
WO2019061859A1
Authority
WO
WIPO (PCT)
Prior art keywords: imaging, range, path, aerial, repetition
Application number: PCT/CN2017/116542
Other languages: English (en), French (fr)
Inventor: 顾磊, 陈喆君
Original Assignee: 深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201780065313.7A (CN109891188B)
Publication of WO2019061859A1
Priority to US16/818,617 (US20200217665A1)

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/13Flying platforms
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/003Flight plan management
    • G08G5/0034Assembly of a flight plan
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B64U2101/32UAVs specially adapted for particular uses or applications for imaging, photography or videography for cartography or topography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/10UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/10UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • B64U2201/104UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] using satellite radio beacon positioning systems, e.g. GPS
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/20Remote controls

Definitions

  • the present disclosure relates to a mobile platform, an imaging path generation method, a program, and a recording medium that generate an imaging path for imaging with a moving body.
  • A platform (an unmanned aerial vehicle) that performs imaging while passing through a predetermined fixed path is known.
  • The platform receives an imaging instruction from a ground base station and images the imaging target.
  • The platform flies along the fixed path while imaging the imaging target, and the imaging device of the platform is tilted according to the positional relationship between the platform and the imaging target to perform imaging.
  • Patent Document 1: Japanese Patent Laid-Open Publication No. 2010-61216
  • When such a platform images the ground while flying, captured images of the ground can be obtained sequentially.
  • The images can be captured such that the geographical ranges contained in these captured images overlap, for example at a certain repetition rate.
  • In some parts of the area, however, there is a tendency for the repetition rate to be low. Therefore, even if the ground is imaged at equal intervals in a predetermined area, the repetition rate of the obtained captured images is not constant and depends on the position within the area.
  • As a result, the image quality when one composite image is generated from the plurality of obtained captured images may be degraded. Likewise, the image quality when a stereoscopic image is generated from the plurality of obtained captured images may be degraded.
  • Alternatively, unnecessary images may be captured in order to secure the repetition rate. In this case, wasteful imaging occurs and imaging efficiency is lowered.
  • In one aspect, a mobile platform is a mobile platform that generates an imaging path for imaging by a moving body, and includes a processing unit that performs processing related to generating the imaging path.
  • The processing unit acquires information of an imaging range, generates a first imaging path passing through a first imaging position for imaging the imaging range, and calculates, for each position included in the imaging range, a first repetition degree, which is the degree of repetition of the image ranges of the captured images when imaging is performed at the first imaging position.
  • When there is a position where the first repetition degree is equal to or smaller than a threshold value, the processing unit generates a second imaging position that complements imaging of the imaging range, and generates a second imaging path that passes through the first imaging position and the second imaging position.
  • the processing unit may extract an insufficient area in which the first repetition degree is equal to or smaller than the threshold value, and generate a second imaging position based on the position of the insufficient area.
  • the first imaging path may include a plurality of imaging routes.
  • The processing unit can generate the second imaging position at a position outside the first imaging position located at the end portion, on the insufficient-region side, of an imaging route that passes through the insufficient region, among the plurality of imaging routes.
  • The processing unit may calculate, for each position included in the imaging range, a second repetition degree, which is the degree of repetition of the image ranges of the captured images when imaging is performed at the first imaging position and the second imaging position, and may additionally generate a second imaging position when there is a position where the second repetition degree is equal to or smaller than the threshold value.
  • the processing unit calculates the first degree of repetition based on the first imaging position and the movement parameter and the imaging parameter of the moving object at the time of imaging at the first imaging position.
  • the mobile platform can be a terminal.
  • the processing unit can transmit information of the first imaging position, the second imaging position, and the second imaging path to the mobile object.
  • the mobile platform can be a terminal.
  • the processing unit can generate an image indicating the distribution of the first repetition degree of each position included in the imaging range, and display the image.
  • the mobile platform can be a mobile body.
  • the processing unit can set the first imaging position, the second imaging position, and the second imaging path.
  • the moving body may include a flying body.
  • Imaging can include aerial photography.
  • In one aspect, an imaging path generation method is an imaging path generation method in a mobile platform that generates an imaging path for imaging by a moving body, and includes the steps of: acquiring information of an imaging range; generating a first imaging path passing through a first imaging position for imaging the imaging range; calculating, for each position included in the imaging range, a first repetition degree, which is the degree of repetition of the image ranges of the captured images when imaging is performed at the first imaging position; and, when there is a position where the first repetition degree is equal to or smaller than a threshold value, generating a second imaging position that complements imaging of the imaging range and generating a second imaging path that passes through the first imaging position and the second imaging position.
  • the step of generating the second imaging position may include the steps of: extracting an insufficient area in which the first repetition degree is equal to or less than a threshold value; and generating a second imaging position based on the position of the insufficient area.
  • the first imaging path may include a plurality of imaging routes.
  • The step of generating the second imaging position may include the step of generating the second imaging position at a position outside the first imaging position located at the end portion, on the insufficient-region side, of an imaging route that passes through the insufficient region, among the plurality of imaging routes.
  • the imaging path generation method may further include a step of calculating, for each position included in the imaging range, a second repetition degree which is a repetition degree of an image range of the captured image when imaging is performed at the first imaging position and the second imaging position.
  • the step of generating the second imaging position may include the step of additionally generating the second imaging position when there is a position where the second repetition degree is equal to or less than the threshold value.
  • The step of calculating the first repetition degree may include the step of calculating the first repetition degree based on the first imaging position and on the movement parameter and the imaging parameter of the moving body at the time of imaging at the first imaging position.
  • the mobile platform can be a terminal.
  • the imaging path generation method may further include the step of transmitting information of the first imaging position, the second imaging position, and the second imaging path to the mobile object.
  • the mobile platform can be a terminal.
  • the imaging path generation method may further include the step of generating an image indicating a distribution of the first repetition degree of each position included in the imaging range.
  • the mobile platform can be a mobile body.
  • the imaging path generation method may further include the steps of setting the first imaging position, the second imaging position, and the second imaging path.
  • the moving body may include a flying body.
  • Imaging can include aerial photography.
  • In one aspect, a program causes a mobile platform that generates an imaging path for imaging by a moving body to execute the steps of: acquiring information of an imaging range; generating a first imaging path passing through a first imaging position for imaging the imaging range; calculating, for each position included in the imaging range, a first repetition degree, which is the degree of repetition of the image ranges of the captured images when imaging is performed at the first imaging position; and, when there is a position where the first repetition degree is equal to or smaller than the threshold value, generating a second imaging position that complements imaging of the imaging range and generating a second imaging path that passes through the first imaging position and the second imaging position.
  • In one aspect, a recording medium is a computer-readable recording medium on which is recorded a program for causing a mobile platform that generates an imaging path for imaging by a moving body to execute the steps of: acquiring information of an imaging range; generating a first imaging path passing through a first imaging position for imaging the imaging range; calculating, for each position included in the imaging range, a first repetition degree, which is the degree of repetition of the image ranges of the captured images when imaging is performed at the first imaging position; and, when there is a position where the first repetition degree is equal to or smaller than the threshold value, generating a second imaging position that complements imaging of the imaging range and generating a second imaging path that passes through the first imaging position and the second imaging position.
  • FIG. 1 is a schematic diagram showing a first configuration example of the aerial photography path generation system in the first embodiment.
  • FIG. 2 is a schematic diagram showing a second configuration example of the aerial photography path generation system in the first embodiment.
  • FIG. 3 is a block diagram showing an example of a hardware configuration of an unmanned aerial vehicle.
  • FIG. 4 is a block diagram showing an example of a hardware configuration of a terminal.
  • FIG. 5 is a view showing an example of an aerial photographing range.
  • FIG. 6 is a view showing an example of the aerial photographing path AP12 that has passed through the aerial photographing position AP11.
  • Fig. 7 is a view for explaining the degree of repetition at an arbitrary position in the aerial photographing range.
  • FIG. 8 is a view showing an example of the degree of repetition of each position in the aerial photographing range.
  • FIG. 9 is a view showing an example of an insufficient area in the aerial photographing range.
  • FIG. 10 is a view showing an example of the arrangement of the aerial photographing position AP21.
  • FIG. 11 is a view showing an example of the aerial photographing path AP22 that has passed through the aerial photographing positions AP11 and AP21.
  • FIG. 12 is a flowchart showing an example of the operation of the terminal when the terminal generates an aerial photographing path.
  • Fig. 13 is a flow chart showing an operation example of the unmanned aerial vehicle when the UAV generates an aerial photographing path.
  • In the following embodiments, an unmanned aerial vehicle is mainly exemplified as the moving body.
  • An unmanned aerial vehicle is an example of a flying body, including an aircraft that moves in the air.
  • the flying body is an example of a moving body.
  • The unmanned aerial vehicle is also referred to as a "UAV".
  • the mobile platform can also be a device other than an unmanned aerial vehicle, such as a terminal, a personal computer (PC), or other device.
  • The imaging path generation method specifies operations in the mobile platform.
  • a program (for example, a program for causing the mobile platform to perform various processes) is recorded in the recording medium.
  • FIG. 1 is a schematic diagram showing a first configuration example of the aerial photographing path generation system 10 in the first embodiment.
  • the aerial photography path generation system 10 includes an unmanned aerial vehicle 100 and a terminal 80.
  • the UAV 100 and the terminal 80 can communicate with each other by wired communication or wireless communication such as a wireless LAN (Local Area Network).
  • the terminal 80 is a portable terminal (for example, a smart phone or a tablet terminal).
  • FIG. 2 is a schematic diagram showing a second configuration example of the aerial photographing path generation system 10 in the first embodiment.
  • the terminal 80 is a PC.
  • In both the first configuration example and the second configuration example, the terminals 80 can have the same functions.
  • FIG. 3 is a block diagram showing an example of a hardware configuration of the unmanned aerial vehicle 100.
  • The unmanned aerial vehicle 100 is configured to include a UAV control unit 110, a communication interface 150, a memory 160, a memory 170, a pan/tilt head 200, a rotor mechanism 210, an imaging unit 220, imaging units 230, a GPS receiver 240, an inertial measurement unit (IMU) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring instrument 290.
  • the UAV control unit 110 is configured by, for example, a central processing unit (CPU), a micro processing unit (MPU), or a digital signal processor (DSP).
  • the UAV control unit 110 performs signal processing for controlling the operation of each unit of the UAV 100, input/output processing of data with other units, calculation processing of data, and storage processing of data.
  • the UAV control unit 110 controls the flight of the UAV 100 in accordance with a program stored in the memory 160.
  • the UAV control section 110 can control the flight in accordance with the aerial photography path generated by the terminal 80 or the unmanned aerial vehicle 100.
  • the UAV control unit 110 can perform aerial photography of the image in accordance with the aerial photographing position generated by the terminal 80 or the UAV 100.
  • aerial photography is an example of imaging.
  • the UAV control unit 110 acquires position information indicating the position of the unmanned aerial vehicle 100.
  • the UAV control unit 110 can acquire position information indicating the latitude, longitude, and altitude in which the unmanned aerial vehicle 100 is located from the GPS receiver 240.
  • The UAV control unit 110 may acquire latitude and longitude information indicating the latitude and longitude at which the unmanned aerial vehicle 100 is located from the GPS receiver 240, and may acquire height information indicating the altitude of the unmanned aerial vehicle 100 from the barometric altimeter 270, as the position information.
  • the UAV control unit 110 can acquire the distance between the radiation point of the ultrasonic wave obtained by the ultrasonic sensor 280 and the reflection point of the ultrasonic wave as the height information.
  • the UAV control unit 110 can acquire orientation information indicating the orientation of the unmanned aerial vehicle 100 from the magnetic compass 260.
  • the orientation information may be represented, for example, by an orientation corresponding to the orientation of the nose of the UAV 100.
  • the UAV control unit 110 can acquire position information indicating the position where the UAV 100 should be located when the imaging unit 220 performs imaging on the imaging range to be imaged.
  • the UAV control unit 110 can acquire location information indicating the location where the UAV 100 should be located from the memory 160.
  • the UAV control section 110 can acquire location information indicating the location where the UAV 100 should be located from other devices through the communication interface 150.
  • the UAV control unit 110 can recognize the position where the UAV 100 may exist with reference to the three-dimensional map database, and acquire the position as position information indicating the position where the UAV 100 should be.
  • the UAV control unit 110 can acquire imaging range information indicating an imaging range of each of the imaging unit 220 and the imaging unit 230.
  • the UAV control unit 110 can acquire the angle of view information indicating the angle of view of the imaging unit 220 and the imaging unit 230 from the imaging unit 220 and the imaging unit 230 as parameters for specifying the imaging range.
  • the UAV control unit 110 can acquire information indicating the imaging direction of the imaging unit 220 and the imaging unit 230 as parameters for specifying the imaging range.
  • the UAV control unit 110 can acquire the posture information indicating the posture state of the imaging unit 220 from the pan/tilt head 200 as, for example, information indicating the imaging direction of the imaging unit 220.
  • the posture information of the imaging unit 220 may indicate an angle at which the pitch axis and the yaw axis of the pan-tilt 200 rotate from the reference rotation angle.
  • the UAV control section 110 can acquire position information indicating the position where the unmanned aerial vehicle 100 is located as a parameter for determining the imaging range.
  • The UAV control unit 110 can determine the imaging range indicating the geographical range imaged by the imaging unit 220 based on the angles of view and imaging directions of the imaging unit 220 and the imaging unit 230 and on the position of the unmanned aerial vehicle 100, and can generate imaging range information to thereby acquire the imaging range information.
  • the UAV control unit 110 can acquire imaging range information from the memory 160.
  • the UAV control section 110 can acquire imaging range information through the communication interface 150.
  • the UAV control unit 110 controls the pan/tilt head 200, the rotor mechanism 210, the imaging unit 220, and the imaging unit 230.
  • The UAV control unit 110 can control the imaging range of the imaging unit 220 by changing the imaging direction or the angle of view of the imaging unit 220.
  • the UAV control unit 110 can control the imaging range of the imaging unit 220 supported by the pan/tilt 200 by controlling the rotation mechanism of the pan-tilt head 200.
  • the imaging range refers to a geographical range in which imaging is performed by the imaging unit 220 or the imaging unit 230.
  • The imaging range is defined by latitude, longitude, and altitude.
  • the imaging range can be a range of three-dimensional spatial data defined in latitude, longitude, and altitude.
  • the imaging range can also be a range of two-dimensional spatial data defined by latitude and longitude.
  • the imaging range can be determined based on the angle of view of the imaging unit 220 or the imaging unit 230, the imaging direction, and the position of the UAV 100.
  • The imaging directions of the imaging unit 220 and the imaging unit 230 can be defined by the azimuth in which the front of the imaging lens of the imaging unit 220 or the imaging unit 230 faces, and by the depression angle.
  • The imaging direction of the imaging unit 220 may be a direction determined from the orientation of the nose of the unmanned aerial vehicle 100 and from the posture state of the imaging unit 220 with respect to the pan/tilt head 200.
  • The imaging direction of the imaging unit 230 may be a direction determined from the orientation of the nose of the unmanned aerial vehicle 100 and from the position at which the imaging unit 230 is provided.
  • the UAV control unit 110 can recognize the environment around the unmanned aerial vehicle 100 by analyzing a plurality of images captured by the plurality of imaging units 230.
  • the UAV control unit 110 can control the flight according to the environment around the UAV 100, for example, avoiding obstacles.
  • the UAV control unit 110 can acquire stereoscopic information (three-dimensional information) indicating a three-dimensional shape (three-dimensional shape) of an object existing around the unmanned aerial vehicle 100.
  • the object may be part of a landscape such as a building, a road, a car, a tree, or the like.
  • the stereoscopic information is, for example, three-dimensional spatial data.
  • The UAV control unit 110 can acquire the stereoscopic information by generating, from the images obtained by the plurality of imaging units 230, stereoscopic information indicating the three-dimensional shape of objects existing around the unmanned aerial vehicle 100.
  • the UAV control unit 110 can acquire stereoscopic information indicating a three-dimensional shape of an object existing around the unmanned aerial vehicle 100 by referring to the three-dimensional map database stored in the memory 160 or the memory 170.
  • the UAV control unit 110 can acquire stereoscopic information related to the stereoscopic shape of the object existing around the unmanned aerial vehicle 100 by referring to the three-dimensional map database managed by the server existing on the network.
  • the UAV control unit 110 controls the flight of the UAV 100 by controlling the rotor mechanism 210. That is, the UAV control unit 110 controls the position of the UAV 100 including the latitude, longitude, and altitude by controlling the rotor mechanism 210.
  • the UAV control unit 110 can control the imaging range of the imaging unit 220 by controlling the flight of the unmanned aerial vehicle 100.
  • The UAV control unit 110 can control the angle of view of the imaging unit 220 by controlling the zoom lens provided in the imaging unit 220.
  • The UAV control unit 110 can also control the angle of view of the imaging unit 220 by using the digital zoom function of the imaging unit 220 to perform digital zooming.
  • By moving the unmanned aerial vehicle 100 to a specific position at a specific time, the UAV control unit 110 can cause the imaging unit 220 to capture a desired imaging range under a desired environment.
  • Communication interface 150 is in communication with terminal 80.
  • the communication interface 150 can perform wireless communication by any wireless communication method.
  • the communication interface 150 can perform wired communication by any wired communication method.
  • the communication interface 150 may transmit an aerial photography image or additional information (metadata) related to the aerial photography image to the terminal 80.
  • The memory 160 stores programs and the like required for the UAV control unit 110 to control the pan/tilt head 200, the rotor mechanism 210, the imaging unit 220, the imaging unit 230, the GPS receiver 240, the inertial measurement device 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring instrument 290.
  • The memory 160 may be a computer-readable recording medium and may include at least one of a static random access memory (SRAM), a dynamic random access memory (DRAM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), and a flash memory such as a USB (Universal Serial Bus) memory.
  • the memory 160 can also be detached from the unmanned aerial vehicle 100.
  • the memory 160 can operate as a working memory.
  • the memory 170 may include at least one of a Hard Disk Drive (HDD), a Solid State Drive (SSD), an SD card, a USB memory, and other memories.
  • the memory 170 can hold various information and various data.
  • the memory 170 can also be detached from the unmanned aerial vehicle 100.
  • the memory 170 can record aerial photography images.
  • the memory 160 or the memory 170 may hold information of an aerial photography location or an aerial photography path generated by the terminal 80 or the unmanned aerial vehicle 100.
  • The information of the aerial photography position or the aerial photography path can be set by the UAV control unit 110 as one of the aerial photography parameters for aerial photography preset in the unmanned aerial vehicle 100, or as one of the flight parameters for a flight set in the unmanned aerial vehicle 100. This setting information can be saved in the memory 160 or the memory 170.
  • the flight parameter is an example of a movement parameter.
  • the pan/tilt head 200 can support the imaging unit 220 so as to be rotatable about the yaw axis, the pitch axis, and the roll axis.
  • the pan/tilt head 200 can change the imaging direction of the imaging unit 220 by rotating the imaging unit 220 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • The yaw axis, the pitch axis, and the roll axis can be determined as follows: the roll axis is defined as a horizontal direction (parallel to the ground); the pitch axis is defined as a direction parallel to the ground and perpendicular to the roll axis; and the yaw axis (see the z-axis) is defined as a direction perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.
  • the rotor mechanism 210 has a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the rotor mechanism 210 causes the UAV 100 to fly by controlling the rotation by the UAV control unit 110.
  • the number of the rotors 211 may be, for example, four or other numbers.
  • the UAV 100 can also be a rotorless fixed wing aircraft.
  • The imaging unit 220 is an imaging camera that images a subject included in a desired imaging range (for example, the sky above to be aerial photographed, scenery such as mountains and rivers, or buildings on the ground).
  • the imaging unit 220 images the subject in the desired imaging range to generate data of the captured image.
  • the image data (for example, an aerial image) obtained by the imaging by the imaging unit 220 can be stored in the memory of the imaging unit 220 or the memory 170.
  • the imaging unit 230 may be a sensing camera that images the surroundings of the UAV 100 to control the flight of the UAV 100.
  • the two imaging units 230 may be provided on the front side of the nose of the unmanned aerial vehicle 100. Further, the other two imaging units 230 may be provided on the bottom surface of the UAV 100.
  • the two imaging units 230 on the front side can be paired and function as a so-called stereo camera.
  • the two imaging units 230 on the bottom side may be paired and function as a stereo camera.
  • the three-dimensional spatial data (three-dimensional shape data) around the UAV 100 can be generated based on the images taken by the plurality of imaging units 230. Further, the number of imaging units 230 included in the unmanned aerial vehicle 100 is not limited to four.
  • the unmanned aerial vehicle 100 may include at least one imaging unit 230.
  • The unmanned aerial vehicle 100 can include at least one imaging unit 230 on each of the nose, tail, side, bottom, and top surfaces of the unmanned aerial vehicle 100.
  • the angle of view that can be set by the imaging unit 230 can be larger than the angle of view that can be set by the imaging unit 220.
  • the imaging unit 230 may have a fixed focus lens or a fisheye lens.
  • the imaging unit 230 images the surroundings of the UAV 100 to generate data of the captured image.
  • the image data of the imaging unit 230 can be stored in the memory 170.
  • the GPS receiver 240 receives a plurality of signals indicating the time and the position (coordinates) of each GPS satellite transmitted from a plurality of navigation satellites (ie, GPS satellites).
  • the GPS receiver 240 calculates the position of the GPS receiver 240 (i.e., the position of the UAV 100) based on the received plurality of signals.
  • the GPS receiver 240 outputs the position information of the UAV 100 to the UAV control unit 110. Further, the calculation of the position information of the GPS receiver 240 can be performed by the UAV control unit 110 instead of the GPS receiver 240. In this case, information indicating the time and the position of each GPS satellite included in the plurality of signals received by the GPS receiver 240 is input to the UAV control unit 110.
  • the inertial measurement device 250 detects the posture of the unmanned aerial vehicle 100 and outputs the detection result to the UAV control unit 110.
  • the inertial measurement device 250 can detect the acceleration in the three-axis direction of the front, rear, left and right, and up and down, and the angular velocity in the three-axis directions of the pitch axis, the roll axis, and the yaw axis as the posture of the unmanned aerial vehicle 100.
  • the magnetic compass 260 detects the orientation of the nose of the unmanned aerial vehicle 100, and outputs the detection result to the UAV control unit 110.
  • the barometric altimeter 270 detects the flying height of the unmanned aerial vehicle 100 and outputs the detection result to the UAV control section 110.
  • the ultrasonic sensor 280 emits ultrasonic waves, detects ultrasonic waves reflected by the ground or objects, and outputs the detection results to the UAV control unit 110.
  • the detection result may indicate the distance from the unmanned aerial vehicle 100 to the ground, that is, the height.
  • the detection result may indicate the distance from the UAV 100 to the object (subject).
  • the laser measuring instrument 290 irradiates the object with a laser beam, receives the reflected light reflected by the object, and measures the distance between the UAV 100 and the object (subject) based on the reflected light.
  • the method of measuring the distance by the laser beam may be, for example, a time of flight method.
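  • For reference, the time-of-flight principle relates the measured distance d to the round-trip time Δt of the emitted light, with c being the speed of light:

```latex
d = \frac{c\,\Delta t}{2}
```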
  • FIG. 4 is a block diagram showing an example of a hardware configuration of the terminal 80.
  • The terminal 80 can include a terminal control unit 81, an operation unit 83, a communication unit 85, a memory 87, a display unit 88, and a memory 89.
  • Terminal 80 can be held by a user who wishes to generate an aerial photography path.
  • the terminal control unit 81 is configured using, for example, a CPU, an MPU, or a DSP.
  • the terminal control unit 81 performs signal processing for controlling the operation of each unit of the terminal 80, input/output processing of data with other units, calculation processing of data, and storage processing of data.
  • the terminal control unit 81 can acquire data, aerial photography images, or information from the unmanned aerial vehicle 100 via the communication unit 85.
  • the terminal control unit 81 can acquire data or information (for example, various parameters) input through the operation unit 83.
  • the terminal control unit 81 can acquire data, aerial photography images, or information stored in the memory 87.
  • the terminal control unit 81 can transmit data or information (for example, information of the generated aerial photographing position and aerial photographing path) to the unmanned aerial vehicle 100 via the communication unit 85.
  • the terminal control unit 81 can transmit data, information, or aerial photography images to the display unit 88, and display display information based on the data, information, or aerial photography images on the display unit 88.
  • the terminal control section 81 can execute an application for generating an aerial photography path or an application for helping to generate an aerial photography path.
  • the terminal control unit 81 can generate various data used in the application.
  • the operation unit 83 accepts and acquires data or information input by the user of the terminal 80.
  • The operation unit 83 may include buttons, keys, a touch panel, a microphone, and the like.
  • the operation unit 83 can accept a touch operation, a click operation, a drag operation, and the like.
  • the operation unit 83 can receive information of various parameters.
  • the information input by the operation unit 83 can be transmitted to the UAV 100.
  • The various parameters may include parameters related to the generation of the aerial photography path (for example, a threshold value th of the repetition degree, and at least one of the flight parameters and the imaging parameters of the unmanned aerial vehicle 100 during aerial photography along the aerial photography path).
  • the communication unit 85 performs wireless communication with the UAV 100 by various wireless communication methods.
  • the wireless communication method of this wireless communication may include communication realized by, for example, a wireless LAN, Bluetooth (registered trademark), or a public wireless line.
  • the communication unit 85 can also perform wired communication by any wired communication method.
  • the memory 87 may have, for example, a ROM that stores data of a program or a set value for specifying the operation of the terminal 80, and a RAM that temporarily stores various kinds of information or data used when the terminal control unit 81 performs processing.
  • The memory 87 may include memory other than the ROM and the RAM. The memory 87 can be provided inside the terminal 80.
  • the memory 87 can be configured to be detachable from the terminal 80.
  • the program can include an application.
  • the display unit 88 is configured by, for example, a liquid crystal display (LCD), and displays various kinds of information, data, or aerial photography images output from the terminal control unit 81.
  • the display unit 88 can display various data or information involved in executing the application.
  • the memory 89 can be an HDD, an SSD, an SD card, a USB memory, or the like.
  • the memory 89 can be provided inside the terminal 80.
  • the memory 89 can be configured to be detachable from the terminal 80.
  • the memory 89 can hold aerial photography images or additional information acquired from the unmanned aerial vehicle 100. Additional information can also be saved in memory 87.
  • the terminal control unit 81 is an example of a processing unit.
  • the terminal control unit 81 performs processing related to generation of an aerial photographing path.
  • the terminal control unit 81 acquires the aerial photographing range A1.
  • The aerial photography range A1 includes the range that is to be aerial photographed by the unmanned aerial vehicle 100.
  • The target is that the degree of repetition OV of the image ranges GH of the aerial photography images be equal to or greater than the threshold value th at each position included in the aerial photographing range A1; that is, that a certain degree of repetition OV be maintained throughout the aerial photographing range A1.
  • The degree of repetition OV corresponds to the repetition rate, which indicates the ratio of overlap among a plurality of image ranges GH. For example, if the degree of repetition OV is equal to or greater than the threshold value th, it can be said that the repetition rate is also equal to or greater than a predetermined value.
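  • As a rough illustration of this correspondence, assuming equally spaced aerial photographing positions with interval d along one route and image ranges GH of ground width w (a simplification made here for illustration):

```latex
n \approx \frac{w}{d}, \qquad or = 1 - \frac{d}{w}
```

Here n is the approximate number of images covering a ground point along one route (the degree of repetition OV) and or is the repetition rate of adjacent image ranges, so requiring n at or above a threshold corresponds to requiring or at or above a corresponding value.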
  • FIG. 5 is a view showing an example of the aerial photographing range A1.
  • the terminal control unit 81 can acquire the aerial photographing range A1 from the memory 87.
  • the terminal control unit 81 can acquire the aerial photographing range A1 from the memory 87 or an external server.
  • the terminal control unit 81 can acquire the aerial photographing range A1 via the operation unit 83.
  • The operation unit 83 can accept, as the aerial photographing range A1, a user input of a desired range to be aerial photographed on map information acquired from a map database or the like. Further, the operation unit 83 can accept input of the name of a place desired to be aerial photographed (a place name or the like), a building, or other information capable of identifying the place.
  • In this case, the terminal control unit 81 can acquire the range indicated by the place name or the like as the aerial photographing range A1, or can acquire a predetermined range around that place (for example, a range within a radius of 100 m centered on the position indicated by the place name) as the aerial photographing range A1.
  • The terminal control unit 81 generates an aerial photographing path AP12 that passes through aerial photographing positions AP11 in the aerial photographing range A1.
  • the aerial photography path AP12 can be generated using a well-known method.
  • the aerial photographing position AP11 can be generated by a known method.
  • An aerial photographing position AP11 disposed at an equally spaced position on the aerial photographing path AP12 can be generated. Further, the plurality of aerial photographing positions AP11 may be arranged not at equal intervals but at different intervals.
  • the aerial photographing position AP11 is an example of the first imaging position.
  • the aerial photography path AP12 is an example of the first imaging path.
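  • One well-known way to generate such a path is a back-and-forth ("serpentine") layout of equally spaced positions over a rectangular range. A minimal Python sketch, using hypothetical names and a flat local x/y frame (both assumptions made here for illustration):

```python
def first_positions_serpentine(xmin, xmax, ymin, ymax, spacing):
    # Equally spaced aerial photographing positions AP11 along parallel,
    # back-and-forth imaging routes; visiting the routes in order gives
    # the first imaging path AP12.
    routes = []
    n_cols = int((xmax - xmin) // spacing) + 1
    y, leftward = ymin, False
    while y <= ymax:
        xs = [xmin + i * spacing for i in range(n_cols)]
        if leftward:
            xs.reverse()
        routes.append([(x, y) for x in xs])
        y += spacing
        leftward = not leftward
    return routes

# Example: four routes of four positions each over a 90 m x 90 m range.
routes = first_positions_serpentine(0.0, 90.0, 0.0, 90.0, spacing=30.0)
```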
  • FIG. 6 is a view showing an example of the aerial photographing path AP12 that has passed through the aerial photographing position AP11.
  • the aerial photography path AP12 has four linear aerial photography routes c1, c2, c3, and c4.
  • the aerial photographing position AP11 is disposed on each of the aerial photographing routes c1 to c4.
  • the aerial photographing position AP11 on each of the aerial photographing routes c1 to c4 may differ depending on the shape of the aerial photographing range A1.
  • the aerial photography routes c1 to c4 are sequentially connected to form an aerial photography path AP12.
  • On some aerial photographing routes, fewer aerial photographing positions AP11 are arranged than on the aerial photographing routes c3 and c4.
  • the aerial photographing route c4 is formed in a straight line in the left-right direction of FIG. 6, but may be formed in another direction (for example, the up-and-down direction in FIG. 6).
  • The terminal control unit 81 calculates, for each position included in the aerial photographing range A1, the extent to which the image ranges GH of the aerial photography images overlap (the degree of repetition OV) when the imaging unit 220 or the imaging unit 230 of the unmanned aerial vehicle 100 performs aerial photography at the aerial photographing positions AP11.
  • The degree of repetition OV can be expressed by, for example, the number of aerial photography images (the number of overlapping images) whose image ranges GH include each position in the aerial photographing range A1.
  • the terminal control unit 81 can map the repetition degree OV of each position on a two-dimensional plane to generate a repetition degree map OM.
  • the terminal control unit 81 can display the repetition degree map OM on the display unit 88 so that the repetition degree OV of each position is visible. By displaying the repeatability map OM, the terminal 80 can easily visually grasp the distribution of the degree of repetition OV at each position in the aerial photographing range A1.
  • The image range GH corresponds to the geographical extent captured in an aerial photography image taken in the air by the unmanned aerial vehicle 100.
  • The image ranges GH of a plurality of aerial photography images can overlap with one another.
  • At a position in the aerial photographing range where two image ranges GH overlap, the number of overlapping images is two; that is, the position is captured in two aerial photography images.
  • At a position where three or more image ranges GH overlap, the number of overlapping images is three or more; that is, the position is captured in three or more aerial photography images.
  • the number of repetitions of the image range of the aerial photography image is an example of the repetition degree OV of the aerial photography.
  • The image range GH can be determined based on the flight parameters of a future flight of the unmanned aerial vehicle 100 and on the imaging parameters of the imaging unit 220 or the imaging unit 230 provided in the unmanned aerial vehicle 100 for aerial photography.
  • the flight parameters may include at least one of aerial photography position information, aerial photography path information, aerial photography time information, and other information.
  • the imaging parameters may include at least one of aerial photography perspective information, aerial photography direction information, aerial photography posture information, imaging range information, subject distance information, and other information (for example, resolution, image range, and repetition rate information).
  • the aerial photography path information indicates a preset path (air photography path) in which aerial photography images are taken in the air.
  • the aerial photography path information is information of a path in which the UAV 100 is flying during aerial photography, and may be an aerial photography path AP12.
  • the aerial photographing position information is a preset position (for example, a three-dimensional position (latitude, longitude, altitude)) in which an aerial photographing image is taken in the air, and may be an aerial photographing position AP11.
  • the aerial photography time information indicates a preset time (air photography time) at which an aerial photography image is taken in the air.
  • the aerial photography viewing angle information indicates information of the viewing angle FOV (Field of View) of the imaging unit 220 or the imaging unit 230 when the aerial photography image is captured in the air.
  • the aerial photography direction information indicates the imaging direction of the imaging unit 220 or the imaging unit 230 when the aerial photography image is captured in the air (the aerial imaging direction).
  • the aerial photography posture information indicates the posture of the imaging unit 220 or the imaging unit 230 when the aerial photography image is captured in the air.
  • the imaging range information indicates the imaging range of the imaging unit 220 or the imaging unit 230 when the aerial photography image is captured in the air, and can be based on, for example, the rotation angle of the pan/tilt head 200.
  • the subject distance information indicates information of the distance from the imaging unit 220 or the imaging unit 230 to the subject when the aerial photography image is captured in the air.
  • The flight parameters and the imaging parameters are not parameters from previous aerial photography; they are parameters for future, preset aerial photography.
  • However, the parameters for future preset aerial photography may be the same as the parameters used in previous aerial photography.
  • the terminal control unit 81 can determine the image range GH of the plurality of aerial photography images based on at least one of the imaging parameters and the flight parameters. For example, the terminal control unit 81 can calculate the image range GH based on at least one of the angle of view FOV, the aerial photographing direction, the posture of the imaging unit 220, and the aerial photographing position (latitude, longitude, altitude).
  • The aerial photographing position interval d, the aerial photographing distance L, the angle of view FOV of the imaging unit 220 or the imaging unit 230 that captures the aerial photography images in the air, and the repetition rate or of the image ranges GH of the aerial photography images may have the relationship of the following formula (1).
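  • Formula (1) as such does not appear in the text above; a standard photogrammetric relation linking these four quantities, given here only as a plausible reading of formula (1), is:

```latex
d = 2L\,\tan\!\left(\frac{\mathrm{FOV}}{2}\right)(1 - or)
```

Here 2L·tan(FOV/2) is the ground width covered by a single image taken at the aerial photographing distance L, and (1 − or) is the non-overlapping fraction of that width.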
  • The aerial photographing position interval d may be, for example, the interval between configured aerial photographing positions (for example, the interval between two adjacent aerial photographing positions AP11).
  • the aerial photographing distance L may be, for example, the distance between the unmanned aerial vehicle 100 and the object (for example, the ground) at the time of aerial photography, that is, the flying height.
  • The repetition rate or may represent the ratio of overlap between the image ranges GH of two aerial photography images whose image ranges are adjacent to each other.
  • The aerial photographing position interval d, the repetition rate or of the image ranges GH of the aerial photography images, the width w of the image range GH of an aerial photography image, and the resolution r of the aerial photography image OG may have the relationship of the following formula (2).
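  • Formula (2) as such likewise does not appear in the text above; a relation consistent with the quantities listed, offered here only as an assumption, is:

```latex
d = w\,(1 - or)
```

with the width w of the image range GH obtainable from the resolution r of the aerial photography image (for example, as the number of pixels across the image multiplied by the ground distance represented by one pixel); this interpretation of r is an assumption.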
  • the operation unit 83 of the terminal 80 can accept user operation and input at least one of imaging parameters and flight parameters.
  • the operation unit 83 can input at least a part of the parameters included in the equations (1) and (2).
  • The terminal control unit 81 can calculate the width w of the image range GH (for example, the length of one side of a rectangle) from the parameters of formulas (1) and (2). Further, the terminal control unit 81 can acquire the two-dimensional position (latitude, longitude) of each aerial photographing position AP11. Therefore, based on the width w of the image range GH and the two-dimensional position of the aerial photographing position AP11, the terminal control unit 81 can specifically determine the geographical range enclosed by the image range GH when the imaging unit 220 or the imaging unit 230 of the unmanned aerial vehicle 100 images the ground direction. Consequently, the degree of repetition OV of the image ranges GH of the aerial photography images can be calculated for each position in the aerial photographing range A1.
  • In this way, the terminal 80 can derive the degree of repetition OV by calculation, based on the plurality of aerial photographing positions AP11 and on the flight parameters and imaging parameters assumed when aerial photography is performed at those positions, without actually causing the unmanned aerial vehicle 100 to fly or causing the imaging unit 220 or the imaging unit 230 to perform aerial photography. Therefore, the degree of repetition OV can easily be obtained on one device by calculation using the flight parameters and the imaging parameters.
  • the terminal control unit 81 can obtain the image range GH based on the flight parameters and the imaging parameters, and calculate the degree of repetition OV based on the positional relationship of the plurality of image ranges GH.
  • the positional relationship of the plurality of image ranges GH can be determined based on the positional relationship of the plurality of aerial photographing positions AP11 that image the aerial photographing images.
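  • A minimal Python sketch of this calculation, assuming a flat local x/y frame in metres, square image ranges GH, and hypothetical function names (none of which are taken from the disclosure):

```python
import math

def footprint_width(distance_l, fov_deg):
    # Ground width w of one image range GH at aerial photographing distance L,
    # using the relation w = 2 * L * tan(FOV / 2) assumed above.
    return 2.0 * distance_l * math.tan(math.radians(fov_deg) / 2.0)

def repetition_map(positions, distance_l, fov_deg, grid_step, extent):
    # Count, for each grid position in the imaging range, how many image ranges GH
    # (square footprints centred on the aerial photographing positions) contain it.
    half = footprint_width(distance_l, fov_deg) / 2.0
    (xmin, xmax), (ymin, ymax) = extent
    ov = {}
    y = ymin
    while y <= ymax:
        x = xmin
        while x <= xmax:
            ov[(x, y)] = sum(1 for (px, py) in positions
                             if abs(x - px) <= half and abs(y - py) <= half)
            x += grid_step
        y += grid_step
    return ov
```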
  • FIG. 7 is a diagram for explaining the degree of repetition OV at the position p1 in the aerial photographing range A1.
  • the position p1 is included in the three image ranges GH (GH1, GH2, GH3), so that the number of repeated sheets at the position p1 is three.
  • the degree of repetition OV is expressed as the number of repeated sheets, but an arbitrary processing (for example, weighting) may be applied to the number of repeated sheets to generate the degree of repetition OV.
  • the degree of repetition OV at the position p1 is exemplified, but the degree of repetition OV at a position other than the position p1 in the aerial photographing range A1 can also be derived.
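  • For example, feeding three positions spaced 20 m apart into the repetition_map() sketch above (illustrative numbers only) reproduces the same kind of counting as in FIG. 7:

```python
positions = [(0.0, 0.0), (20.0, 0.0), (40.0, 0.0)]   # three aerial photographing positions
ov = repetition_map(positions, distance_l=50.0, fov_deg=53.13,
                    grid_step=20.0, extent=((0.0, 40.0), (0.0, 0.0)))
print(ov[(20.0, 0.0)])   # 3 -- this position lies inside all three image ranges GH
```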
  • FIG. 8 is a view showing an example of the degree of repetition OV of each position in the aerial photographing range A1, and is a diagram showing an example of the repeatability map OM.
  • In FIG. 8, different patterns are used to distinguish the different degrees of repetition.
  • As the degree of repetition OV, the number of overlapping images at each position (1, 2, 3, 4, 5, 6, 7, 8, or 9 images) is illustrated.
  • The number of overlapping images may also be nine or more. Referring to FIG. 8, when the vicinity of the peripheral end portion of the aerial photographing range A1 is compared with the portion near the center of the aerial photographing range A1, it can be seen that the degree of repetition OV tends to be smaller near the peripheral end portion.
  • the repeatability map OM can be used to assist the user in determining the configuration of the aerial photography location AP21.
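  • One simple way to render such a map, assuming the ov dictionary produced by the repetition_map() sketch above and using matplotlib for display:

```python
import matplotlib.pyplot as plt

def show_repetition_map(ov):
    # Arrange the per-position repetition counts on a 2-D grid and render them
    # as an image, i.e. a repetition degree map OM.
    xs = sorted({x for x, _ in ov})
    ys = sorted({y for _, y in ov})
    grid = [[ov[(x, y)] for x in xs] for y in ys]
    plt.imshow(grid, origin="lower", cmap="viridis")
    plt.colorbar(label="repetition degree OV (number of overlapping images)")
    plt.title("Repetition degree map OM")
    plt.show()
```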
  • the terminal control unit 81 extracts the shortage area LA.
  • The insufficient area LA is a region, included in the aerial photographing range A1, consisting of one or more positions where the degree of repetition OV (for example, the number of overlapping images) is equal to or less than the threshold value th (for example, four images). That is, each position in the insufficient area LA is a position where the degree of repetition OV is relatively lower than in the other areas of the aerial photographing range A1.
  • The insufficient area LA is more likely to appear at the peripheral end portion of the aerial photographing range A1 than at its center portion. However, depending on the aerial photographing path AP12 or the aerial photographing positions AP11, the insufficient area LA may also appear in the center portion of the aerial photographing range A1.
  • FIG. 9 is a view showing an example of the shortage area LA.
  • the insufficient area LA appears at three locations on the circumferential end of the aerial photographing range A1.
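  • Continuing the sketch above, extracting the insufficient area LA amounts to thresholding the per-position counts (grouping these positions into connected regions is omitted here for brevity):

```python
def insufficient_positions(ov, threshold):
    # Grid positions whose repetition degree OV is at or below the threshold th;
    # together these positions form the insufficient area LA.
    return {pos for pos, count in ov.items() if count <= threshold}
```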
  • the terminal control unit 81 can generate and arrange the aerial photographing position AP21.
  • the aerial photographing position AP21 becomes an aerial photographing position for supplementing the aerial photographing of the aerial photographing range A1.
  • the terminal control unit 81 can generate and arrange the aerial photographing position AP21 based on the position of the insufficient area LA.
  • the aerial photographing position AP21 may be disposed at the same interval as the interval between the other aerial photographing positions (for example, the interval between the plurality of aerial photographing positions AP11), or may be disposed at an interval different from the interval of the other aerial photographing positions.
  • the aerial photographing position AP21 is an example of the second imaging position.
  • FIG. 10 is a view showing an example of the arrangement of the aerial photographing position AP21.
  • the aerial photographing position AP21 is disposed inside or in the vicinity of the insufficient area LA.
  • In FIG. 10, the aerial photographing positions AP11 located at both ends of the aerial photographing route c1 lie inside the insufficient area LA; therefore, two aerial photographing positions AP21 can be arranged outside the two aerial photographing positions AP11 at both ends of the aerial photographing route c1. Similarly, the aerial photographing positions AP11 located at both ends of the aerial photographing route c2 lie inside the insufficient area LA, so two aerial photographing positions AP21 can be arranged outside the two aerial photographing positions AP11 at both ends of the aerial photographing route c2.
  • the repetition degree OV (OV1) of the image range GH of the aerial photographing image which can be aerial photographed on the aerial photographing routes c1 and c2 is increased, so that the terminal 80 can improve the repeatability OV1.
  • the degree of repetition OV1 of the threshold value th or more can be obtained, and the degree of repetition OV1 desired by the user can be obtained on the aerial photographing routes c1 and c2.
  • the degree of repetition OV1 is an example of the first repetition degree.
  • On the aerial photographing route c3, one end of the route (the right end in FIG. 10) is located inside the insufficient area LA, so two aerial photographing positions AP21 can be disposed outside the single aerial photographing position AP11 at that end of the route.
  • In this case, the degree of repetition OV1 of the image ranges GH of the aerial images captured along the aerial photographing route c3 also increases, so the terminal 80 can improve the degree of repetition OV1.
  • A plurality of aerial photographing positions AP21 may be disposed outside the one aerial photographing position AP11 at that end of the aerial photographing route c3.
  • In this way, for example, a degree of repetition OV1 equal to or greater than the threshold value th can be obtained, and the degree of repetition OV1 desired by the user can be obtained on the aerial photographing route c3.
  • On the aerial photographing route c4, one aerial photographing position AP11 included in the route is located inside the insufficient area LA, so two aerial photographing positions AP21 can be disposed, one on each side of that aerial photographing position AP11. Further, on the aerial photographing route c4, arranging one aerial photographing position AP21 on at least one side of that aerial photographing position AP11 increases the degree of repetition OV1 of the image ranges GH of the aerial images captured along the aerial photographing route c4; in this case, too, the terminal 80 can improve the degree of repetition OV1.
  • By disposing aerial photographing positions AP21 on both sides of that aerial photographing position AP11, a degree of repetition OV1 equal to or greater than the threshold value th can be obtained, and the degree of repetition OV1 desired by the user can be obtained on the aerial photographing route c4.
  • The terminal control unit 81 can determine the arrangement positions of the aerial photographing positions AP21 by, for example, the following processing. The terminal control unit 81 can extract the aerial photographing routes that pass through the insufficient area LA. Here, each of the aerial photographing routes c1 to c4 passes through a part of the insufficient area LA. In this case, the terminal control unit 81 can generate and arrange one aerial photographing position AP21 next to the aerial photographing position AP11 that lies inside or near the insufficient area LA. In this way, the terminal 80 can improve the degree of repetition OV1 of each of the aerial photographing routes c1 to c4, and can provide the degree of repetition OV1 desired by the user on the aerial photographing routes c1 and c2.
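One way to realize the placement rule just described — for a capture route that touches the insufficient area LA, add supplementary positions beyond the AP11 at the route end on the LA side, at the spacing already used on that route — is sketched below. The route representation (an ordered list of AP11 coordinates), the `in_la` predicate, and the function name are assumptions made for illustration only; the interior case of route c4 is deliberately left out to keep the sketch short.

```python
def supplement_route(route, in_la, spacing=None):
    """route: ordered list of (x, y) AP11 positions along one capture line.
    in_la(p): predicate telling whether position p lies inside or near the insufficient area LA.
    Returns the AP21 positions to add beyond the ends of the route that touch LA."""
    added = []
    if len(route) < 2:
        return added
    if spacing is None:                                   # reuse the existing AP11 spacing by default
        (x0, y0), (x1, y1) = route[0], route[1]
        spacing = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    (xs, ys), (xe, ye) = route[0], route[-1]              # route direction as a unit vector
    length = ((xe - xs) ** 2 + (ye - ys) ** 2) ** 0.5 or 1.0
    ux, uy = (xe - xs) / length, (ye - ys) / length
    if in_la(route[0]):                                   # extend beyond the start-side AP11
        added.append((xs - ux * spacing, ys - uy * spacing))
    if in_la(route[-1]):                                  # extend beyond the end-side AP11
        added.append((xe + ux * spacing, ye + uy * spacing))
    return added
```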
  • Then, the terminal control unit 81 can again calculate the degree of repetition OV (OV2) at each position in the aerial photographing range A1.
  • In this case, the terminal control unit 81 calculates the degree of repetition OV2 obtained when the imaging unit 220 or the imaging unit 230 of the UAV 100 performs aerial photography at the aerial photographing positions AP11 and AP21. Accordingly, when aerial photography at the positions AP11 and AP21 is assumed, the number of positions where the degree of repetition OV2 is equal to or below the threshold value th decreases compared with the case where aerial photography only at the positions AP11 is assumed; that is, the insufficient areas LA decrease in number or shrink in size.
  • The degree of repetition OV2 is an example of the second repetition degree.
  • When, even assuming aerial photography at the positions AP11 and AP21, a position where the degree of repetition OV2 is equal to or below the threshold value th or an insufficient area LA still remains, the terminal control unit 81 may additionally generate and arrange aerial photographing positions AP21.
  • The terminal control unit 81 can additionally arrange the aerial photographing positions AP21 according to the positions of the remaining insufficient areas LA. For example, on an aerial photographing route that passes through a re-extracted insufficient area LA, an aerial photographing position AP21 may be additionally disposed outside the aerial photographing position AP21 already located inside or near that insufficient area LA.
  • Here, aerial photographing positions AP21 can be additionally arranged on the aerial photographing routes c1 and c2.
  • Then, the terminal control unit 81 can again calculate the degree of repetition OV2 at each position in the aerial photographing range A1.
  • In this case, the terminal control unit 81 calculates the degree of repetition OV2 obtained when the imaging unit 220 or the imaging unit 230 of the unmanned aerial vehicle 100 performs aerial photography at the aerial photographing positions AP11 and AP21 (including the additionally arranged positions).
  • Likewise, while a position where the degree of repetition OV2 is equal to or below the threshold value th or an insufficient area LA remains, the terminal control unit 81 can repeat the additional arrangement of aerial photographing positions AP21, the recalculation of the degree of repetition OV2, and the check for remaining insufficient areas LA until no insufficient area LA remains.
  • As a result, no aerial photographing route is left with a degree of repetition OV (OV1, OV2) at or below the threshold value th, and the terminal 80 can ensure the certain degree of repetition OV desired by the user; that is, no insufficient area LA remains anywhere in the aerial photographing range A1.
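The add-recompute-check cycle described above can be written as a short loop: recompute the overlap map assuming capture at the AP11 plus every AP21 added so far, and push the route ends that still touch LA one more spacing step outward, until no cell stays at or below the threshold or an iteration cap is reached. The sketch reuses `repetition_map` and `insufficient_cells` from the earlier snippets; the crude cell-membership test, the iteration cap, and all names are illustrative assumptions, not the disclosed procedure itself.

```python
def refine_until_covered(area_size, routes, footprint_w, th=4, max_rounds=10):
    """Iteratively add supplementary positions AP21 beyond the route ends that still touch
    the insufficient area LA until no grid cell stays at or below th (or max_rounds is hit)."""
    ap21, ext = [], {}                        # ext[(route index, end)] = how far that end was pushed out
    for _ in range(max_rounds):
        positions = [p for r in routes for p in r] + ap21          # AP11 plus the AP21 so far
        ov = repetition_map(area_size, positions, footprint_w)
        lack = {(c, r) for r, c in insufficient_cells(ov, th)}     # LA as (col, row) cells (cell = 1.0)
        if not lack:                                               # no insufficient area LA remains
            break
        for ri, route in enumerate(routes):
            if len(route) < 2:
                continue
            (xs, ys), (xe, ye) = route[0], route[-1]
            d = ((xe - xs) ** 2 + (ye - ys) ** 2) ** 0.5
            ux, uy = (xe - xs) / d, (ye - ys) / d
            spacing = d / (len(route) - 1)                         # existing AP11 spacing on this route
            for end, (px, py), sx, sy in ((0, route[0], -ux, -uy), (1, route[-1], ux, uy)):
                if (int(px), int(py)) in lack:                     # this route end still lies in LA
                    k = ext[(ri, end)] = ext.get((ri, end), 0) + 1
                    ap21.append((px + sx * spacing * k, py + sy * spacing * k))
    return ap21
```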
  • In this way, by generating the aerial photographing positions AP21 based on the position of the insufficient area LA, the terminal 80 can arrange the aerial photographing positions AP21, for example, in the vicinity of the insufficient area LA. The terminal 80 can therefore remedy the shortage of the degree of repetition OV in the insufficient area LA.
  • Further, on an aerial photographing route that passes through the insufficient area LA, the terminal 80 generates an aerial photographing position AP21 outside the aerial photographing position AP11 at the end of that route on the insufficient area LA side, and can thereby improve the degree of repetition OV from the peripheral end side of the aerial photographing range A1. It is therefore possible to preferentially improve the degree of repetition OV on the peripheral end side of the aerial photographing range A1, where the degree of repetition OV tends to be insufficient.
  • Moreover, the terminal 80 does not need to perform aerial photography for improving the degree of repetition OV at positions where a sufficient degree of repetition OV has already been obtained, so the number of aerial photographs can be kept small and the degree of repetition OV can be improved efficiently.
  • Further, when a position where the degree of repetition OV2 is equal to or below the threshold value th remains, the terminal 80 additionally generates aerial photographing positions AP21; even if the initial arrangement of aerial photographing positions AP21 is not enough to improve the degree of repetition OV1, further improvement of the degree of repetition OV2 can thus be expected. Therefore, if the terminal 80 keeps adding aerial photographing positions AP21 until, for example, the degree of repetition OV2 desired by the user is reached, the insufficient areas LA in which the degree of repetition OV2 falls short can be eliminated.
  • The terminal control unit 81 generates an aerial photographing path AP22 that passes through the aerial photographing positions AP11 and the aerial photographing positions AP21 generated and arranged as described above.
  • For example, the aerial photographing routes, each including aerial photographing positions AP11 or AP21, may be connected in sequence to generate the aerial photographing path AP22.
  • For example, the aerial photographing positions AP11 or AP21 located at the ends of adjacent aerial photographing routes may be connected to generate the aerial photographing path AP22.
  • The method of generating the aerial photographing path AP22 is not limited to this; any of the aerial photographing positions AP11 and AP21 may be linked to generate the aerial photographing path AP22.
  • In this case, the aerial photographing path AP22 only needs to ensure a degree of repetition OV equal to or greater than the threshold value th over the entire aerial photographing range A1; it does not have to be the path that connects the aerial photographing positions AP11 and AP21 in the shortest way.
  • the aerial photography path AP22 is an example of the second imaging path.
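The serpentine connection shown in FIG. 11 — traverse one route, then the next route in the opposite direction, and so on — is easy to express once each route's positions (AP11 plus any AP21 appended at its ends) are held as an ordered list. A minimal sketch; the data layout and function name are assumptions for illustration:

```python
def build_path(routes_with_ap21):
    """routes_with_ap21: list of per-route position lists (AP11 plus any AP21 at the ends),
    each ordered along its own line. Returns one ordered waypoint list (a path such as AP22)
    that visits the routes in a serpentine (boustrophedon) pattern."""
    path = []
    for i, route in enumerate(routes_with_ap21):
        ordered = route if i % 2 == 0 else list(reversed(route))   # alternate traversal direction
        path.extend(ordered)
    return path
```

With the route lists ordered accordingly, `build_path([c4, c3, c2, c1])` produces a traversal of the same serpentine kind as the one described for FIG. 11 (start at one end of route c4, finish at the far end of route c1).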
  • FIG. 11 is a view showing an example of the aerial photographing path AP22 passing through the aerial photographing positions AP11 and AP21.
  • In FIG. 11, the aerial photographing path AP22 is generated so as to start at the right end of the aerial photographing route c4, connect from the left end of the aerial photographing route c4 to the left end of the aerial photographing route c3, connect from the right end of the aerial photographing route c3 to the right end of the aerial photographing route c2, connect from the left end of the aerial photographing route c2 to the left end of the aerial photographing route c1, and end at the right end of the aerial photographing route c1.
  • FIG. 12 is a flowchart showing an operation example of the terminal 80.
  • First, the terminal control unit 81 acquires the aerial photographing range A1 (S11).
  • The terminal control unit 81 generates an aerial photographing path AP12 that passes through the aerial photographing positions AP11 for aerial photographing of the aerial photographing range A1.
  • The terminal control unit 81 calculates, for each position in the aerial photographing range A1, the degree of repetition OV obtained when the imaging unit 220 or the imaging unit 230 of the UAV 100 performs aerial photography at the aerial photographing positions AP11. That is, the terminal control unit 81 calculates the distribution of the degree of repetition at each position in the aerial photographing range A1 (S13).
  • The terminal control unit 81 extracts the insufficient area LA based on the degree of repetition OV at each position in the aerial photographing range A1 (S14).
  • The terminal control unit 81 generates and arranges the aerial photographing positions AP21 based on the insufficient area LA (S15). With the aerial photographing positions AP21, the degree of repetition OV that would be insufficient with aerial photography only at the aerial photographing positions AP11 can be improved.
  • The terminal control unit 81 adds the aerial photographing positions AP21 to the aerial photographing path AP12 and generates the aerial photographing path AP22 (S16). That is, the terminal control unit 81 generates the aerial photographing path AP22 passing through the aerial photographing positions AP11 and AP21.
  • The terminal control unit 81 outputs the information of the aerial photographing positions AP11, AP21 and the aerial photographing path AP22 (S17). For example, the terminal control unit 81 can transmit the information of the aerial photographing positions AP11, AP21 and the aerial photographing path AP22 to the unmanned aerial vehicle 100 via the communication unit 85. The terminal control unit 81 can also write and record the information of the aerial photographing positions AP11, AP21 and the aerial photographing path AP22 into an external recording device (for example, an SD card) serving as the memory 89.
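Read as a pipeline, S11–S17 chain the pieces sketched so far: obtain the range and the first set of positions, compute the overlap distribution, extract LA, add AP21, rebuild the path, and hand the result over. The end-to-end sketch below reuses the illustrative helpers from the earlier snippets; the `send` callback merely stands in for the output step S17 (transmission to the vehicle or writing to an SD card) and is not an API of any real product.

```python
def generate_capture_plan(area_size, routes, footprint_w, th, send):
    """S11-S17 in miniature. `routes` holds the AP11 positions of the first path AP12,
    one ordered list per capture line; `send` is a caller-supplied stand-in for S17."""
    ap11 = [p for r in routes for p in r]
    ov = repetition_map(area_size, ap11, footprint_w)                  # S13: overlap distribution
    la = insufficient_cells(ov, th)                                    # S14: insufficient area LA
    ap21 = refine_until_covered(area_size, routes, footprint_w, th) if la else []   # S15
    # S16: AP22 must visit AP11 and AP21 but need not be the shortest connection,
    # so here the AP21 are simply appended after the serpentine pass over the routes.
    path = build_path([list(r) for r in routes]) + ap21
    send({"ap11": ap11, "ap21": ap21, "path": path})                   # S17: output the plan
    return path
```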
  • In the unmanned aerial vehicle 100, the UAV control unit 110 acquires the information of the aerial photographing positions AP11, AP21 and the aerial photographing path AP22 output from the terminal 80.
  • For example, the UAV control unit 110 can receive the information of the aerial photographing positions AP11, AP21 and the aerial photographing path AP22 through the communication interface 150.
  • The UAV control unit 110 can also acquire the information of the aerial photographing positions AP11, AP21 and the aerial photographing path AP22 via the external recording device. The UAV control unit 110 then sets the acquired aerial photographing positions AP11, AP21 and the aerial photographing path AP22.
  • In this case, the UAV control unit 110 can store the information of the aerial photographing positions AP11, AP21 and the aerial photographing path AP22 in the memory 160 and put it into a state in which it can be used in flight control by the UAV control unit 110.
  • As a result, the unmanned aerial vehicle 100 can fly along the aerial photographing path AP22 generated by the terminal 80 and capture images in the air at the aerial photographing positions AP11 and AP21.
  • The aerial images can be used, for example, to generate a composite image or a stereoscopic image of the aerial photographing range A1.
  • According to this operation example, when there is a position in the aerial photographing range A1 where the degree of repetition OV is insufficient, the terminal 80 can compensate for the shortage of the degree of repetition OV by arranging aerial photographing positions AP21. The terminal 80 can therefore increase the number of overlapping images among the multiple image ranges GH and ensure a degree of repetition OV at or above a certain standard. In particular, although the degree of repetition OV tends to be insufficient at the peripheral end portion of the aerial photographing range A1, the terminal 80 can remedy that shortage. The terminal 80 can thus suppress the deterioration of image quality when a composite image or a stereoscopic image is generated from the obtained plurality of aerial images.
  • Further, the terminal 80 does not need to determine in advance a uniform range larger than the aerial photographing range A1 as the range in which the aerial photographing positions and the aerial photographing path are generated; it can flexibly arrange the aerial photographing positions AP21 according to where the degree of repetition OV falls short. Compared with determining such a larger range uniformly in advance, the terminal 80 is therefore less likely to place useless aerial photographing positions AP21, and it can both improve the imaging efficiency and ensure the degree of repetition OV.
  • Moreover, by transmitting the information of the aerial photographing positions AP11, AP21 and the aerial photographing path AP22 to the unmanned aerial vehicle 100, the terminal 80 can set the aerial photographing positions AP11, AP21 and the aerial photographing path AP22 on the unmanned aerial vehicle 100. The unmanned aerial vehicle 100 can therefore fly along the aerial photographing path AP22 generated by the terminal 80 and capture images in the air at the aerial photographing positions AP11 and AP21.
  • The aerial photographing path generation of the present embodiment can also be performed by the unmanned aerial vehicle 100.
  • In that case, the UAV control unit 110 of the UAV 100 has the same functions related to the generation of the aerial photographing path as the terminal control unit 81 of the terminal 80.
  • The UAV control unit 110 is an example of a processing unit.
  • The UAV control unit 110 performs processing related to the generation of an aerial photographing path. In the description of this processing, explanation of steps identical to those performed by the terminal control unit 81 is omitted or simplified.
  • FIG. 13 is a flowchart showing an operation example of the UAV 100.
  • the UAV control unit 110 acquires the aerial photographing range A1 (S21).
  • the UAV control unit 110 generates an aerial photographing path AP12 that passes through the aerial photographing position AP11 for aerial photographing in the aerial photographing range A1 (S22).
  • the UAV control unit 110 calculates the degree of repetition OV when the imaging unit 220 of the UAV 100 or the imaging unit 230 performs aerial photography at the aerial photographing position AP11 for each position in the aerial photographing range A1. In other words, the UAV control unit 110 calculates the degree of repetition distribution at each position in the aerial imaging range A1 (S23).
  • the UAV control unit 110 extracts the insufficient area LA based on the degree of repetition OV at each position in the aerial photographing range A1 (S24).
  • The UAV control unit 110 generates and arranges the aerial photographing positions AP21 based on the insufficient area LA (S25). With the aerial photographing positions AP21, the degree of repetition OV that would be insufficient with aerial photography only at the aerial photographing positions AP11 can be improved.
  • the UAV control unit 110 adds the aerial photographing position AP21 to the aerial photographing path AP12, and generates an aerial photographing path AP22 (S26). That is, the UAV control unit 110 generates the aerial photographing path AP22 that has passed through the aerial photographing positions AP11 and AP21.
  • the UAV control unit 110 sets the generated aerial photographing positions AP11 and AP21 and the aerial photographing path AP22 (S27).
  • In this case, the UAV control unit 110 can store the information of the generated aerial photographing positions AP11, AP21 and the aerial photographing path AP22 in the memory 160 and put it into a state in which it can be used in flight control by the UAV control unit 110.
  • the UAV 100 can fly along the aerial photographing path AP22 generated by the UAV 100, and can photograph images in the air at the aerial photographing positions AP11, AP21.
  • the aerial photography image can be used, for example, to generate a composite image or a stereoscopic image within the aerial photography range A1.
  • According to this operation example, when there is a position in the aerial photographing range A1 where the degree of repetition OV is insufficient, the UAV 100 can compensate for the shortage of the degree of repetition OV by arranging aerial photographing positions AP21 and can secure a degree of repetition OV at or above a certain standard.
  • In particular, although the degree of repetition OV tends to be insufficient at the peripheral end portion of the aerial photographing range A1, the UAV 100 can remedy that shortage. The unmanned aerial vehicle 100 can therefore suppress the deterioration of image quality when a composite image or a stereoscopic image is generated from the obtained plurality of aerial images.
  • Further, the UAV 100 does not need to determine in advance a uniform range larger than the aerial photographing range A1 as the range in which the aerial photographing positions and the aerial photographing path are generated; it can flexibly arrange the aerial photographing positions AP21 according to where the degree of repetition OV falls short. Compared with determining such a larger range uniformly in advance, the UAV 100 is therefore less likely to place useless aerial photographing positions AP21, and it can both improve the imaging efficiency and ensure the degree of repetition OV.
  • Further, by setting the aerial photographing positions AP11, AP21 and the aerial photographing path AP22, the unmanned aerial vehicle 100 can fly along the aerial photographing path AP22 that it has generated itself and capture images in the air at the aerial photographing positions AP11 and AP21. The UAV 100 can therefore improve the accuracy of the processing applied to the aerially captured images (for example, generation of a composite image or of a stereoscopic image) and improve the image quality of the processed images.
  • When the unmanned aerial vehicle 100 generates the aerial photographing path, the terminal control unit 81 of the terminal 80 can perform processing to assist that generation (for example, through various operations on the operation unit 83 of the terminal 80 or various displays on the display unit 88).
  • For example, in the unmanned aerial vehicle 100, the UAV control unit 110 may transmit to the terminal 80, through the communication interface 150, the repeatability map OM or the information of the degree of repetition OV at each position in the aerial photographing range A1 on which the repeatability map OM is based.
  • The terminal control unit 81 can acquire this information from the unmanned aerial vehicle 100 via the communication unit 85 and display the repeatability map OM on the display unit 88.
  • While checking the repeatability map OM displayed on the display unit 88, the user can, for example via the operation unit 83 of the terminal 80, enter an input to arrange an aerial photographing position AP21 in the vicinity of a position where the degree of repetition OV is insufficient (for example, the insufficient area LA). In this way, the various operation inputs and displays of the terminal 80 can assist the unmanned aerial vehicle 100 in generating the aerial photographing path.
  • In the present embodiment, images are captured in the air by the unmanned aerial vehicle 100, but images may also be captured by a moving body other than the unmanned aerial vehicle 100 (for example, a vehicle).
  • The present embodiment can also be applied when generating an imaging path for capturing images with such a moving body.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Navigation (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

本发明能维持摄像效率,抑制摄像范围内的摄像图像的图像范围的重复度的不足。本发明的移动平台生成用于利用移动体进行摄像的摄像路径,且具备进行与生成摄像路径相关的处理的处理部。处理部获取摄像范围的信息,生成经过用于对摄像范围进行摄像的第1摄像位置的第1摄像路径,针对摄像范围内所含的每个位置,算出在第1摄像位置进行摄像时摄像图像的图像范围的重复度即第1重复度,当存在第1重复度为阈值以下的位置时,生成对摄像范围的摄像进行补充的第2摄像位置,且生成经过第1摄像位置及第2摄像位置的第2摄像路径。

Description

移动平台、摄像路径生成方法、程序、以及记录介质 技术领域
本公开涉及生成用于利用移动体进行摄像的摄像路径的移动平台、摄像路径生成方法、程序、以及记录介质。
背景技术
以往,已知有一面经过预先设定的固定路径一面进行摄像的平台(无人机)。该平台从地面基站接受摄像指示并对摄像对象进行摄像。该平台在对摄像对象进行摄像时,一面沿固定路径飞行,一面根据平台与摄像对象的位置关系,使平台的摄像设备倾斜而进行摄像。
现有技术文献
专利文献
[专利文献1]日本特开2010-61216号公报
发明内容
发明所要解决的技术问题
专利文献1中记载的装置例如当一面在规定的区域内沿固定路径飞行一面等间隔地对地面进行摄像时,可依次获得拍出地面情况的摄像图像。可以按照这些摄像图像中所含的地理范围例如以一定的重复率重复的方式拍摄图像。此情况下认为,在规定区域的内部,越靠近区域的中心部,成为一定的重复率的部位越多。另一方面,在规定的区域的端部,由于在区域的外侧不拍摄图像,所以有重复率变低的 倾向。因此,即便在规定的区域内等间隔地对地面进行摄像,所得的摄像图像的重复率会依赖于区域中的位置而不恒定。此情况下,例如,根据所得的多个摄像图像生成1张合成图像时的画质有时会下降。而且,例如,根据所得的多个摄像图像生成立体图像时的画质有时会下降。
而且,在为了确保区域的端部的重复率而统一地确定大于规定的区域的范围而拍摄图像时,为了确保重复率,可能会拍摄多余的图像。此情况下,会产生无用的摄像,摄像效率下降。然而,难以预先掌握摄像效率高且能确保重复率的区域的大小。
用于解决技术问题的手段
在一个方式中,一种移动平台,其是生成用于利用移动体进行摄像的摄像路径的移动平台,其具备进行与生成摄像路径相关的处理的处理部,处理部获取摄像范围的信息,生成经过用于对摄像范围进行摄像的第1摄像位置的第1摄像路径,针对摄像范围内所含的每个位置,算出在第1摄像位置进行摄像时摄像图像的图像范围的重复度即第1重复度,当存在第1重复度为阈值以下的位置时,生成对摄像范围的摄像进行补充的第2摄像位置,生成经过第1摄像位置及第2摄像位置的第2摄像路径。
处理部可提取第1重复度为阈值以下的不足区域,根据不足区域的位置生成第2摄像位置。
第1摄像路径可包含多个摄像路线。处理部可在多个摄像路线中的经过不足区域的摄像路线中,在存在于摄像路线的不足区域侧的端部的第1摄像位置的外侧的位置生成第2摄像位置。
处理部可针对摄像范围内所含的每个位置,算出在第1摄像位置及第2摄像位置进行摄像时摄像图像的图像范围的重复度即第2重复度,当存在第2重复度为阈值以下的位置时,追加生成第2摄像位置。
处理部可针对摄像范围内所含的每个位置,算出在第1摄像位置及第2摄像位置进行摄像时摄像图像的图像范围的重复度即第2重复度,当存在第2重复度为阈值以下的位置时,追加生成第2摄像位置。
处理部可根据第1摄像位置及在第1摄像位置进行摄像时移动体的移动参数及摄像参数来算出第1重复度。
移动平台可为终端。处理部可将第1摄像位置、第2摄像位置、及第2摄像路径的信息发送给移动体。
移动平台可为终端。处理部可生成表示摄像范围内所含的每个位置的第1重复度的分布的图像,且显示图像。
移动平台可为移动体。处理部可设定第1摄像位置、第2摄像位置、及第2摄像路径。
移动体可包括飞行体。摄像可包括空中摄影。
在一个方式中,一种摄像图像生成方法,其是生成用于利用移动体进行摄像的摄像路径的移动平台的摄像路径生成方法,其具有如下步骤:获取摄像范围的信息;生成经过用于对摄像范围进行摄像的第1摄像位置的第1摄像路径;针对摄像范围内所含的每个位置,算出在第1摄像位置进行摄像时摄像图像的图像范围的重复度的第1重复度;当存在第1重复度为阈值以下的位置时,生成对摄像范围的摄像进行补充的第2摄像位置;及生成经过第1摄像位置及第2摄像位置的第2摄像路径。
生成第2摄像位置的步骤可包含如下步骤:提取第1重复度为阈值以下的不足区域;及根据不足区域的位置生成第2摄像位置。
第1摄像路径可包含多个摄像路线。生成第2摄像位置的步骤可包含如下步骤:在多个摄像路线中的经过不足区域的摄像路线中,在存在于摄像路线的不足区域侧的端部的第1摄像位置的外侧的位置生成第2摄像位置。
摄像路径生成方法可还包含如下步骤:针对摄像范围内所含的每个位置,算出在第1摄像位置及第2摄像位置进行摄像时摄像图像的图像范围的重复度即第2重复度。生成第2摄像位置的步骤可包含如下步骤:当存在第2重复度为阈值以下的位置时,追加生成第2摄像位置。
算出第1重复度的步骤可包含如下步骤:根据第1摄像位置及在第1摄像位置进行摄像时移动体的移动参数及摄像参数,算出第1重 复度。
移动平台可为终端。摄像路径生成方法可还包含如下步骤:将第1摄像位置、第2摄像位置、及第2摄像路径的信息发送给移动体。
移动平台可为终端。摄像路径生成方法可还包含如下步骤:生成表示摄像范围内所含的每个位置的第1重复度的分布的图像。
移动平台可为移动体。摄像路径生成方法可还包含如下步骤:设定第1摄像位置、第2摄像位置、及第2摄像路径。
移动体可包括飞行体。摄像可包括空中摄影。
在一个方式中,一种程序,其是用于使生成用于利用移动体进行摄像的摄像路径的移动平台执行如下步骤的程序:获取摄像范围的信息;生成经过用于对摄像范围进行摄像的第1摄像位置的第1摄像路径;针对摄像范围内所含的每个位置,算出在第1摄像位置进行摄像时摄像图像的图像范围的重复度即第1重复度;当存在第1重复度为阈值以下的位置时,生成对摄像范围的摄像进行补充的第2摄像位置;及生成经过第1摄像位置及第2摄像位置的第2摄像路径。
在一个方式中,一种记录介质,其是记录有使生成用于利用移动体进行摄像的摄像路径的移动平台执行如下步骤的程序的计算机可读记录介质:获取摄像范围的信息;生成经过用于对摄像范围进行摄像的第1摄像位置的第1摄像路径;针对摄像范围内所含的每个位置,算出在第1摄像位置进行摄像时摄像图像的图像范围的重复度即第1重复度;当存在第1重复度为阈值以下的位置时,生成对摄像范围的摄像进行补充的第2摄像位置;及生成经过第1摄像位置及第2摄像位置的第2摄像路径。
另外,所述发明的概要并未列举出本公开的所有特征。而且,这些特征群的子组合也可成为发明。
附图说明
图1是表示第1实施方式中的空中摄影路径生成系统的第1构成例的示意图。
图2是表示第1实施方式中的空中摄影路径生成系统的第2构成 例的示意图。
图3是表示无人飞行器的硬件构成的一例的框图。
图4是表示终端的硬件构成的一例的框图。
图5是表示空中摄影范围的一例的图。
图6是表示经过空中摄影位置AP11的空中摄影路径AP12的一例的图。
图7是用于说明空中摄影范围内的任意位置上的重复度的图。
图8是表示空中摄影范围内的每个位置的重复度的一例的图。
图9是表示空中摄影范围内的不足区域的一例的图。
图10是表示空中摄影位置AP21的配置的一例的图。
图11是表示经过空中摄影位置AP11、AP21的空中摄影路径AP22的一例的图。
图12是表示终端生成空中摄影路径时终端的动作例的流程图。
图13是表示无人飞行器生成空中摄影路径时无人飞行器的动作例的流程图。
具体实施方式
以下,利用发明的实施方式说明本公开,但以下的实施方式并不限定权利要求书中涉及的发明。实施方式中说明的特征的组合并非全部是发明的解决方案所必须的。
权利要求书、说明书、说明书附图以及说明书摘要中包含作为著作权所保护对象的事项。任何人只要如专利局的文档或者记录所表示的那样进行这些文件的复制,著作权人就无法异议。但是,在除此以外的情况下,保留一切的著作权。
以下的实施方式中,作为移动平台,主要对无人飞行器(UAV:Unmanned Aerial Vehicle)进行例示。无人飞行器为飞行体的一例,包括在空中移动的飞行器。飞行体是移动体的一例。本说明书中所附的附图中,也将无人飞行器记作“UAV”。而且,移动平台也可为无人飞行器以外的装置,例如也可为终端、个人计算机(Personal Computer,PC)、或其他装置。摄像路径生成方法规定了移动平台 中的动作。记录介质中记录有程序(例如用于使移动平台执行各种处理的程序)。
(第1实施方式)
图1是表示第1实施方式中的空中摄影路径生成系统10的第1构成例的示意图。空中摄影路径生成系统10具备无人飞行器100及终端80。无人飞行器100及终端80可利用有线通信或无线通信(例如无线LAN(Local Area Network,局域网))相互通信。图1的例示中,终端80为便携式终端(例如智能电话、平板终端)。
图2是表示第1实施方式中的空中摄影路径生成系统10的第2构成例的示意图。图2的例示中,终端80为PC。图1及图2中,终端80可均具有相同功能。
图3是表示无人飞行器100的硬件构成的一例的框图。无人飞行器100的构成为包含UAV控制部110、通信接口150、内存160、存储器170、云台200、旋翼机构210、摄像部220、摄像部230、GPS接收器240、惯性测量装置(IMU:Inertial Measurement Unit)250、磁罗盘260、气压高度计270、超声波传感器280、及激光测定仪290。
UAV控制部110例如使用中央处理单元(Central Processing Unit,CPU)、微处理单元(Micro Processing Unit,MPU)或数字信号处理器(Digital Signal Processor,DSP)构成。UAV控制部110进行用于总体地控制无人飞行器100各部的动作的信号处理、与其他各部之间的数据的输入输出处理、数据的运算处理及数据的存储处理。
UAV控制部110按照内存160中储存的程序控制无人飞行器100的飞行。UAV控制部110可按照由终端80或无人飞行器100生成的空中摄影路径控制飞行。UAV控制部110可按照由终端80或无人飞行器100生成的空中摄影位置对图像进行空中摄影。另外,空中摄影为摄像的一例。
UAV控制部110获取表示无人飞行器100的位置的位置信息。UAV控制部110可从GPS接收器240获取表示无人飞行器100所在的纬度、经度及高度的位置信息。UAV控制部110可分别从GPS接收器240获取表示无人飞行器100所在的纬度及经度的纬度经度信 息,且从气压高度计270获取表示无人飞行器100所在的高度的高度信息,作为位置信息。UAV控制部110可获取利用超声波传感器280得到的超声波的放射点与超声波的反射点的距离作为高度信息。
UAV控制部110可从磁罗盘260获取表示无人飞行器100的朝向的朝向信息。朝向信息可以例如与无人飞行器100的机头的朝向对应的方位进行表示。
UAV控制部110可获取在摄像部220对于应摄像的摄像范围进行摄像时表示无人飞行器100应在的位置的位置信息。UAV控制部110可从内存160获取表示无人飞行器100应在的位置的位置信息。UAV控制部110可通过通信接口150从其他装置获取表示无人飞行器100应在的位置的位置信息。UAV控制部110可参照三维地图数据库识别无人飞行器100可能存在的位置,并获取该位置作为表示无人飞行器100应在的位置的位置信息。
UAV控制部110可获取表示摄像部220及摄像部230各自的摄像范围的摄像范围信息。UAV控制部110可从摄像部220及摄像部230获取表示摄像部220及摄像部230的视角的视角信息,作为用于确定摄像范围的参数。UAV控制部110可获取表示摄像部220及摄像部230的摄像方向的信息,作为用于确定摄像范围的参数。UAV控制部110可从云台200获取表示摄像部220的姿势状态的姿势信息,作为例如表示摄像部220的摄像方向的信息。摄像部220的姿势信息可表示云台200的俯仰轴及偏航轴自基准旋转角度旋转的角度。
UAV控制部110可获取表示无人飞行器100所在的位置的位置信息,作为用于确定摄像范围的参数。UAV控制部110可根据摄像部220及摄像部230的视角及摄像方向、以及无人飞行器100所在的位置,划定表示摄像部220进行摄像的地理范围的摄像范围,生成摄像范围信息,从而获取摄像范围信息。
UAV控制部110可从内存160获取摄像范围信息。UAV控制部110可通过通信接口150获取摄像范围信息。
UAV控制部110控制云台200、旋翼机构210、摄像部220及摄像部230。UAV控制部110可通过变更摄像部220的摄像方向或视角 而控制摄像部220的摄像范围。UAV控制部110可通过控制云台200的旋转机构,而控制由云台200支持的摄像部220的摄像范围。
摄像范围是指由摄像部220或摄像部230进行摄像的地理范围。摄像范围是以纬度、经度及高度定义。摄像范围可为以纬度、经度、及高度定义的三维空间数据的范围。摄像范围也可为以纬度及经度定义的二维空间数据的范围。摄像范围可根据摄像部220或摄像部230的视角及摄像方向、以及无人飞行器100所在的位置进行确定。摄像部220及摄像部230的摄像方向可根据摄像部220及摄像部230的设有摄像镜头的正面所朝的方位及俯角来定义。摄像部220的摄像方向可为根据无人飞行器100的机头的方位、与摄像部220相对于云台200的姿势的状态所确定的方向。摄像部230的摄像方向可为根据无人飞行器100的机头的方位、与摄像部230所设的位置所确定的方向。
UAV控制部110可通过对由多个摄像部230所摄像的多个图像进行解析而识别无人飞行器100周围的环境。UAV控制部110可根据无人飞行器100周围的环境控制飞行,例如避开障碍物。
UAV控制部110可获取表示存在于无人飞行器100周围的对象的立体形状(三维形状)的立体信息(三维信息)。对象可为例如建筑物、道路、车、树等风景的一部分。立体信息例如为三维空间数据。UAV控制部110可通过从由多个摄像部230所得的各个图像生成表示存在于无人飞行器100周围的对象的立体形状的立体信息,从而获取立体信息。UAV控制部110可通过参照内存160或存储器170中储存的三维地图数据库,而获取表示存在于无人飞行器100周围的对象的立体形状的立体信息。UAV控制部110可通过参照网络上存在的服务器所管理的三维地图数据库,而获取与存在于无人飞行器100周围的对象的立体形状相关的立体信息。
UAV控制部110通过控制旋翼机构210而控制无人飞行器100的飞行。即,UAV控制部110通过控制旋翼机构210而控制无人飞行器100的包括纬度、经度、及高度在内的位置。UAV控制部110可通过控制无人飞行器100的飞行而控制摄像部220的摄像范围。UAV控制部110通过控制摄像部220所具备的变焦镜头而控制摄像 部220的视角。UAV控制部110可通过利用摄像部220的数字变焦功能进行数字变焦而控制摄像部220的视角。
摄像部220固定在无人飞行器100上而不使摄像部220移动时,UAV控制部110可通过在特定的时间使无人飞行器100移动到特定的位置,而使摄像部220在所期望的环境下对所期望的摄像范围进行摄像。或者,当摄像部220不具备变焦功能,无法变更摄像部220的视角时,UAV控制部110也可在特定的时间使无人飞行器100移动到特定的位置,从而使摄像部220在所期望的环境下对所期望的摄像范围进行摄像。
通信接口150与终端80进行通信。通信接口150可通过任意的无线通信方式进行无线通信。通信接口150可通过任意的有线通信方式进行有线通信。通信接口150可将空中摄影图像或与空中摄影图像相关的附加信息(元数据)发送给终端80。
内存160中保存UAV控制部110对云台200、旋翼机构210、摄像部220、摄像部230、GPS接收器240、惯性测量装置250、磁罗盘260、气压高度计270、超声波传感器280及激光测定仪290进行控制所需的程序等。内存160可为计算机可读记录介质,可包含静态随机存取内存(Static Random Access Memory,SRAM)、动态随机存取内存(Dynamic Random Access Memory,DRAM)、可擦可编程只读内存(Erasable Programmable Read Only Memory,EPROM)、电可擦可编程只读内存(Electrically Erasable Programmable Read-Only Memory,EEPROM)、及通用串行总线(Universal Serial Bus,USB)存储器等闪存中的至少一种。内存160也可从无人飞行器100卸下。内存160可作为工作内存进行动作。
存储器170可包含硬盘驱动器(Hard Disk Drive,HDD)、固态硬盘(Solid State Drive,SSD)、SD卡、USB存储器、其他存储器中的至少一种。存储器170可保存各种信息、各种数据。存储器170也可从无人飞行器100卸下。存储器170可记录空中摄影图像。
内存160或存储器170可保存由终端80或无人飞行器100生成的空中摄影位置或空中摄影路径的信息。空中摄影位置或空中摄影路 径的信息可作为由无人飞行器100预先设定的空中摄影时的空中摄影参数、或由无人飞行器100预先设定的飞行时的飞行参数中的一个由UAV控制部110设定。此设定信息可保存在内存160或存储器170中。飞行参数为移动参数的一例。
云台200可使摄像部220能以偏航轴、俯仰轴及横滚轴为中心旋转地对其进行支持。云台200通过使摄像部220以偏航轴、俯仰轴、及横滚轴中的至少一个为中心旋转,可变更摄像部220的摄像方向。
偏航轴、俯仰轴及横滚轴可以如下确定。例如,将横滚轴定义为水平方向(平行于地面的方向)。此情况下,将俯仰轴定义为平行于地面且垂直于横滚轴的方向,将偏航轴(参照z轴)定义为垂直于地面且垂直于横滚轴及俯仰轴的方向。
旋翼机构210具有多个旋翼、及使多个旋翼旋转的多个驱动马达。旋翼机构210通过利用UAV控制部110控制旋转而使无人飞行器100飞行。旋翼211的数量可为例如4个,也可为其他数量。而且,无人飞行器100也可为无旋翼的固定翼机。
摄像部220是对所期望的摄像范围内所含的被摄体(例如作为空中摄影对象的上空的景象、山或河流等的景色、地面上的建筑物)进行摄像的摄像用相机。摄像部220对所期望的摄像范围内的被摄体进行摄像而生成摄像图像的数据。通过摄像部220的摄像所得的图像数据(例如空中摄影图像)可储存在摄像部220所具有的内存、或存储器170中。
摄像部230可为对无人飞行器100周围进行摄像以控制无人飞行器100的飞行的传感用相机。2个摄像部230可设在无人飞行器100的机头即正面。而且,另外2个摄像部230可设在无人飞行器100的底面。正面侧的2个摄像部230可成对,作为所谓立体相机发挥功能。底面侧的2个摄像部230也可成对,作为立体相机发挥功能。可根据由多个摄像部230所摄的图像而生成无人飞行器100周围的三维空间数据(三维形状数据)。另外,无人飞行器100所具备的摄像部230的数量并不限于4个。无人飞行器100可具备至少1个摄像部230。无人飞行器100可在无人飞行器100的机头、机尾、侧面、底面及顶 面分别具备至少1个摄像部230。摄像部230可设定的视角可大于摄像部220可设定的视角。摄像部230可具有定焦镜头或鱼眼镜头。摄像部230对无人飞行器100周围进行摄像而生成摄像图像的数据。摄像部230的图像数据可储存在存储器170中。
GPS接收器240接收表示从多个导航卫星(即GPS卫星)发送的表示时刻及各GPS卫星的位置(坐标)的多个信号。GPS接收器240根据接收的多个信号算出GPS接收器240的位置(即无人飞行器100的位置)。GPS接收器240将无人飞行器100的位置信息输出到UAV控制部110。另外,GPS接收器240的位置信息的算出可由UAV控制部110代替GPS接收器240而进行。此情况下,将GPS接收器240接收到的多个信号中包含的表示时刻及各GPS卫星的位置的信息输入到UAV控制部110。
惯性测量装置250检测无人飞行器100的姿势,且将检测结果输出到UAV控制部110。惯性测量装置250可检测无人飞行器100的前后、左右及上下这3轴方向的加速度、及俯仰轴、横滚轴及偏航轴这3轴方向的角速度,作为无人飞行器100的姿势。
磁罗盘260检测无人飞行器100的机头的方位,且将检测结果输出到UAV控制部110。
气压高度计270检测无人飞行器100的飞行高度,且将检测结果输出到UAV控制部110。
超声波传感器280放射出超声波,检测经地面或物体反射的超声波,且将检测结果输出到UAV控制部110。检测结果可表示无人飞行器100到地面的距离即高度。检测结果可表示无人飞行器100到物体(被摄体)的距离。
激光测定仪290向物体照射激光光束,接收经物体反射的反射光,且根据反射光测定无人飞行器100与物体(被摄体)之间的距离。利用激光光束测定距离的方式,作为一例,可为飞行时间(Time Of Flight)方式。
图4是表示终端80的硬件构成的一例的框图。终端80可具备终端控制部81、操作部83、通信部85、内存87、显示部88及存储器 89。终端80可由希望生成空中摄影路径的用户所持。
终端控制部81使用例如CPU、MPU或DSP构成。终端控制部81进行用于总体地控制终端80各部的动作的信号处理、与其他各部之间的数据的输入输出处理、数据的运算处理及数据的存储处理。
终端控制部81可通过通信部85从无人飞行器100获取数据、空中摄影图像或信息。终端控制部81可获取通过操作部83输入的数据或信息(例如各种参数)。终端控制部81可获取内存87内保存的数据、空中摄影图像或信息。终端控制部81可通过通信部85向无人飞行器100发送数据或信息(例如生成的空中摄影位置、空中摄影路径的信息)。终端控制部81可将数据、信息或空中摄影图像传送到显示部88,且使基于该数据、信息或空中摄影图像的显示信息显示在显示部88。
终端控制部81可执行用于生成空中摄影路径的应用程序或用于帮助生成空中摄影路径的应用程序。终端控制部81可生成应用程序中使用的各种数据。
操作部83接受并获取由终端80的用户输入的数据或信息。操作部83可包含按钮、按键、触摸面板、话筒等。这里,主要例示出操作部83与显示部88由触摸面板构成的情况。此情况下,操作部83可接受触摸操作、点击操作、拖动操作等。操作部83可接受各种参数的信息。由操作部83输入的信息可被发送到无人飞行器100。各种参数可包含与空中摄影路径的生成相关的参数(例如重复度的阈值th、沿着空中摄影路径空中摄影时无人飞行器100的飞行参数或摄像参数中的至少一种信息)。
通信部85通过各种无线通信方式与无人飞行器100之间进行无线通信。此无线通信的无线通信方式可包含例如通过无线LAN、Bluetooth(注册商标)或公共无线线路实现的通信。通信部85也可通过任意的有线通信方式进行有线通信。
内存87可具有例如储存有规定终端80的动作的程序或设定值的数据的ROM、及暂时保存在终端控制部81进行处理时使用的各种的信息或数据的RAM。内存87可包括ROM及RAM以外的内存。内 存87可设在终端80的内部。内存87可设置成能从终端80卸下。程序可包括应用程序。
显示部88例如使用液晶显示器(Liquid Crystal Display,LCD)构成,显示从终端控制部81输出的各种的信息、数据或空中摄影图像。显示部88可显示执行应用程序时涉及的各种数据或信息。
存储器89中储存且保存各种数据、信息。存储器89可为HDD、SSD、SD卡、USB存储器等。存储器89可设在终端80的内部。存储器89可设置成能从终端80卸下。存储器89可保存从无人飞行器100获取的空中摄影图像或附加信息。附加信息也可保存在内存87中。
接着,对生成空中摄影路径的相关功能进行说明。这里,主要对终端80的终端控制部81具有与生成空中摄影路径相关的功能的情况进行说明,但也可为无人飞行器100具有与生成空中摄影路径相关的功能。终端控制部81为处理部的一例。终端控制部81进行与生成空中摄影路径相关的处理。
终端控制部81获取空中摄影范围A1。空中摄影范围A1包含由无人飞行器100空中摄影的范围。空中摄影范围A1内,目标为在空中摄影范围A1内所包含的各位置上空中摄影图像的图像范围GH的重复度OV为阈值th以上。即,在空中摄影范围A1内维持一定的重复度OV。另外,重复度OV与表示和多个图像范围GH重复的比例的重复度存在对应关系。例如,若重复度OV为阈值th以上,则可以说重复率也在规定值以上。
图5是表示空中摄影范围A1的一例的图。终端控制部81可从内存87获取空中摄影范围A1。终端控制部81可从内存87或外部服务器获取空中摄影范围A1。终端控制部81可通过操作部83获取空中摄影范围A1。操作部83可接受从地图数据库等获取的地图信息中所表示的希望空中摄影的所期望范围的用户输入,作为空中摄影范围A1。而且,操作部83可输入希望空中摄影的所期望的地名、能识别场所的建筑物或其他信息的名称(也称为地名等)。此情况下,终端控制部81可获取地名等所示的范围作为空中摄影范围A1,也可获取 地名等的周围的规定范围(例如以地名表示的位置为中心的半径100m内的范围)作为空中摄影范围A1。
终端控制部81生成经过空中摄影范围内的空中摄影位置AP11的空中摄影路径AP12。空中摄影路径AP12可利用公知的方法生成。空中摄影位置AP11可利用公知的方法生成。可生成在空中摄影路径AP12上配置在等间隔的位置的空中摄影位置AP11。另外,多个空中摄影位置AP11也可并非等间隔地配置而是以不同的间隔配置。空中摄影位置AP11为第1摄像位置的一例。空中摄影路径AP12为第1摄像路径的一例。
图6是表示经过空中摄影位置AP11的空中摄影路径AP12的一例的图。图6中,空中摄影路径AP12具有线性的4条空中摄影路线c1、c2、c3、c4。图6中,在空中摄影范围A1的内部,空中摄影位置AP11配置在各空中摄影路线c1~c4上。图6中,各空中摄影路线c1~c4上的空中摄影位置AP11可根据空中摄影范围A1的形状而不同。空中摄影路线c1~c4依次连接,从而形成空中摄影路径AP12。例如,空中摄影路线c3、c4上,与空中摄影路线c1、c2相比,空中摄影位置AP11少。而且,图6中,空中摄影路线c4在图6的左右方向上形成为直线,但也可形成在其他方向(例如图6中的上下方向)。
终端控制部81针对空中摄影范围A1内包含的每个位置,算出由无人飞行器100的摄像部220或摄像部230在空中摄影位置AP11进行空中摄影时空中摄影图像的图像范围GH重复的程度(重复度OV)。重复度OV可由例如空中摄影范围A1内包含的各位置包含在空中摄影图像的图像范围GH中的空中摄影图像的张数(重复张数)表示。终端控制部81可将每个位置的重复度OV映射在二维平面上,生成重复度映射图OM。终端控制部81可通过显示部88显示重复度映射图OM,使每个位置的重复度OV可见。终端80通过显示重复度映射图OM,能易于使用户直观地掌握空中摄影范围A1内的各位置上的重复度OV的分布的样子。
由无人飞行器100所空中摄影的空中摄影图像的图像范围GH与 拍到空中摄影图像中的地理范围对应。多个空中摄影图像的图像范围GH可重复。例如,当2张空中摄影图像的图像范围GH重复时,在空中摄影范围内的2张图像范围GH重复的位置上,空中摄影图像的重复张数为2张。即,该位置被拍到2张空中摄影图像中。同样,在空中摄影范围A1内的3张图像范围重复的位置,空中摄影图像的重复张数为3张。即,该位置被拍到3张以上的空中摄影图像中。空中摄影图像的图像范围的重复张数为空中摄影的重复度OV的一例。
图像范围GH可根据无人飞行器100将来飞行时的飞行参数及无人飞行器100所具备的摄像部230或摄像部230进行空中摄影时的摄像参数而确定。飞行参数可包含空中摄影位置信息、空中摄影路径信息、空中摄影时刻信息及其他信息中的至少1种。摄像参数可包含空中摄影视角信息、空中摄影方向信息、空中摄影姿势信息、摄像范围信息、被摄体距离信息及其他信息(例如分辨率、图像范围、重复率的信息)中的至少1种。
空中摄影路径信息表示在空中拍摄空中摄影图像的预先设定的路径(空中摄影路径)。空中摄影路径信息是进行空中摄影时无人飞行器100飞行的路径的信息,可为空中摄影路径AP12。空中摄影位置信息是在空中拍摄空中摄影图像的预先设定的位置(例如三维位置(纬度、经度、高度)),可为空中摄影位置AP11。空中摄影时刻信息表示在空中拍摄空中摄影图像的预先设定的时刻(空中摄影时刻)。
空中摄影视角信息表示在空中拍摄空中摄影图像时的摄像部220或摄像部230的视角FOV(Field of View)的信息。空中摄影方向信息表示在空中拍摄空中摄影图像时的摄像部220或摄像部230的摄像方向(空中摄影方向)。空中摄影姿势信息表示在空中拍摄空中摄影图像时的摄像部220或摄像部230的姿势。摄像范围信息表示在空中拍摄空中摄影图像时的摄像部220或摄像部230的摄像范围,例如可基于云台200的旋转角度。被摄体距离信息表示在空中拍摄空中摄影图像时的自摄像部220或摄像部230到被摄体的距离的信息。
另外,这里,飞行参数及摄像参数并非以前的空中摄影中的参数, 而是将来的预先设定的空中摄影中的参数。将来的预先设定的空中摄影中的参数可具有与以前的空中摄影中的参数相同的参数。
终端控制部81可根据摄像参数及飞行参数中的至少一种而确定多个空中摄影图像的图像范围GH。例如,终端控制部81可根据视角FOV、空中摄影方向、摄像部220的姿势、空中摄影位置(纬度、经度、高度)等信息中的至少一种而算出图像范围GH。
作为一例,空中摄影位置间隔d、空中摄影距离L、在空中拍摄空中摄影图像的摄像部220或摄像部230的视角FOV及空中摄影图像的图像范围GH的重复率or可具有以下的式(1)的关系。
d=L*FOV*(1-or)···(1)
另外,式(1)中,“*”表示乘法运算符号。空中摄影位置间隔d可为例如配置好的空中摄影位置(例如相邻的2个空中摄影位置AP11的间隔)。空中摄影距离L可为例如空中摄影时的无人飞行器100与被摄体(例如地面)的距离、即飞行高度。重复率or可表示图像范围相邻的2个空中摄影图像的图像范围GH的重复比例。
而且,空中摄影位置间隔d、空中摄影图像的图像范围GH的重复率or、空中摄影图像的图像范围GH的宽度w及空中摄影图像OG的分辨率r可具有以下的式(2)的关系。
d=r*w*(1-or)···(2)
终端80的操作部83可接受用户操作且输入摄像参数及飞行参数中的至少一种。例如,操作部83可输入式(1)、式(2)中所含的参数的至少一部分。
终端控制部81能根据式(1)、(2)的各参数而算出图像范围GH的宽度w(例如矩形的一边的长度)。而且,终端控制部81能获取空中摄影位置AP11的二维位置(纬度、经度)。因此,终端控制部81能根据图像范围GH的宽度w及空中摄影位置AP11的二维位置,具体地确定无人飞行器100的摄像部220或摄像部230对地面方向进行摄像时图像范围GH包围的地理范围。因此,能针对空中摄影范围A1内的每个位置,算出空中摄影图像的图像范围GH的重复度OV。
这样,终端80能通过根据多个空中摄影位置AP11及在空中摄影位置AP11进行空中摄影时的飞行参数及摄像参数算出重复度OV,从而导出重复度OV,而无须实际使无人飞行器100飞行或由摄像部220或摄像部230进行空中摄影。因此,能在1个装置上使用飞行参数及摄像参数通过计算容易地求出重复度OV。此情况下,终端控制部81可根据飞行参数及摄像参数求出图像范围GH,且根据多个图像范围GH的位置关系而算出重复度OV。多个图像范围GH的位置关系可根据对空中摄影图像进行摄像的多个空中摄影位置AP11的位置关系而确定。
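Equations (1) and (2) above tie together the capture-position spacing d, the shooting distance L, the field of view FOV, the overlap rate or, the footprint width w, and the resolution r. The small helper below simply evaluates relation (1) numerically; reading L·FOV (with FOV in radians) as the ground footprint width w is an assumption of this sketch, not something the text fixes, and the function name is illustrative.

```python
import math

def spacing_from_overlap(L, fov_deg, overlap):
    """Equation (1): capture-position spacing d = L * FOV * (1 - or),
    treating L * FOV (FOV in radians) as the ground footprint width w."""
    fov = math.radians(fov_deg)
    w = L * fov                       # approximate footprint width on the ground
    d = w * (1.0 - overlap)
    return d, w

# Example: shooting distance (flight height) 100 m, 60-degree field of view, 80 % overlap
d, w = spacing_from_overlap(L=100.0, fov_deg=60.0, overlap=0.8)
# under this reading, w is roughly 104.7 m and d roughly 20.9 m
```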
图7是用于说明空中摄影范围A1内的位置p1上的重复度OV的图。图7中,位置p1包含于3个图像范围GH(GH1、GH2、GH3)内,所以例示出位置p1上的重复张数为3张。另外,图7中,重复度OV表示为重复张数,但也可对重复张数施加任意的加工(例如加权)而生成重复度OV。图7中,例示出位置p1这一处的重复度OV,但同样也可导出空中摄影范围A1内的位置p1以外的位置上的重复度OV。
图8是表示空中摄影范围A1内的每个位置的重复度OV的一例的图,且为表示重复度映射图OM的一例的图。这里,对于每个重复度OV,采用不同图案区分表示。图8中,作为重复度OV,例示出每个位置的重复张数(1张、2张、3张、4张、5张、6张、7张、8张、9张)。另外,重复张数也可为9张以上。参照图8,将空中摄影范围A1的周端部附近与靠近空中摄影范围A1中心的部分进行比较时,可理解有重复度OV变小的倾向。
这样,终端80中,通过显示重复度映射图OM,容易使用户直观地掌握空中摄影范围A1内的各位置上的重复度OV的分布的样子。此情况下,用户也可例如通过终端80的操作部83进行输入,以将空中摄影位置AP21配置在重复度OV不足的位置(例如不足区域LA)的附近。此情况下,也能改善重复度OV的不足。因此,重复度映射图OM可用来辅助用户确定空中摄影位置AP21的配置。
终端控制部81提取不足区域LA。不足区域LA是包含空中摄影 范围A1内的重复度OV(例如重复张数)为阈值th(例如4张)以下的1个以上的位置的区域。即,不足区域LA内的各位置是与空中摄影范围A1内的其他区域相比,重复度OV相对低的位置。不足区域LA容易出现在比空中摄影范围A1的中心部更靠空中摄影范围A1的周端部。另外,根据空中摄影路径AP12或空中摄影位置AP11,也可能在空中摄影范围A1的中心部出现不足区域LA。
图9是表示不足区域LA的一例的图。图9中,不足区域LA出现在空中摄影范围A1的周端的3个部位。
在空中摄影范围A1内存在重复度OV为阈值th以下的位置时,终端控制部81可生成并配置空中摄影位置AP21。空中摄影位置AP21成为用于对空中摄影范围A1的空中摄影进行补充的空中摄影位置。例如,终端控制部81可根据不足区域LA的位置而生成并配置空中摄影位置AP21。空中摄影位置AP21可以以与其他空中摄影位置的间隔(例如多个空中摄影位置AP11的间隔)相同的间隔配置,也可以以与其他空中摄影位置的间隔不同的间隔配置。空中摄影位置AP21为第2摄像位置的一例。
图10是表示空中摄影位置AP21的配置的一例的图。图10中,空中摄影位置AP21配置在不足区域LA的内部或其附近。
具体来说,在空中摄影路线c1上,空中摄影路线c1的两端分别位于不足区域LA的内部,所以,可在空中摄影路线c1的两端的2个空中摄影位置AP11的外侧配置2个空中摄影位置AP21。同样,在空中摄影路线c2上,位于空中摄影路线c2两端的空中摄影位置AP11分别位于不足区域LA的内部,所以,可在空中摄影路线c2两端的2个空中摄影位置AP11的外侧配置2个空中摄影位置AP21。由此,空中摄影路线c1、c2上可空中摄影的空中摄影图像的图像范围GH的重复度OV(OV1)增大,所以,终端80能改善重复度OV1。而且,可获得阈值th以上的重复度OV1,在空中摄影路线c1、c2上可获得用户所期望的重复度OV1。重复度OV1为第1重复度的一例。
在空中摄影路线c3上,空中摄影路线c3的一端(图10中为右端)位于不足区域LA的内部,所以,可在空中摄影路线c3的一端 的1个空中摄影位置AP11的外侧配置2个空中摄影位置AP21。此情况下,沿空中摄影路线c3可空中摄影的空中摄影图像的图像范围GH的重复度OV1也增大,所以,终端80能改善重复度OV1。另外,可在空中摄影路线c3的一端的1个空中摄影位置AP11的外侧配置多个空中摄影位置AP21。由此,例如可获得阈值th以上的重复度OV1,在空中摄影路线c3上可获得用户所期望的重复度OV1。
在空中摄影路线c4上,因空中摄影路线c4中所含的1个空中摄影位置AP11位于不足区域LA的内部,所以,可在该空中摄影位置AP11的两侧配置2个空中摄影位置AP21。另外,在空中摄影路线c4上,通过在空中摄影位置AP11的至少一端侧配置1个空中摄影位置AP21,使空中摄影路线c4可空中摄影的空中摄影图像的图像范围GH的重复度OV1增大。此情况下,终端80也能改善重复度OV1。另外,通过在空中摄影位置AP11的两侧分别配置2个空中摄影位置AP21,可获得阈值th以上的重复度OV1,在空中摄影路线c4上可获得用户所期望的重复度OV1。
终端控制部81可实施以下的处理来确定空中摄影位置AP21的配置位置。例如,终端控制部81可提取经过不足区域LA的空中摄影路线。这里,任一条空中摄影路线c1~c4都经过不足区域LA的一部分。此情况下,终端控制部81可在存在于不足区域LA的内部或其附近的空中摄影位置AP11的附近(旁边),生成并配置1个空中摄影位置AP21。由此,终端80能改善各空中摄影路线c1~c4的重复度OV1,且能在空中摄影路线c1、c2上提供用户所期望的重复度OV1。
然后,终端控制部81可再次算出空中摄影范围A1内的各位置上的重复度OV(OV2)。此情况下,终端控制部81算出由无人飞行器100的摄像部220或摄像部230在空中摄影位置AP11及空中摄影位置AP21空中摄影时的重复度OV2。由此,当假设在空中摄影位置AP11、AP21进行空中摄影时,与假设仅在空中摄影位置AP11进行空中摄影时相比,重复度OV2为阈值th以下的位置减少,即不足区域LA的数量减少或大小减小。重复度OV2为第2重复度的一例。
当假设在空中摄影位置AP11、AP21进行空中摄影时,也可在残留重复度OV2为阈值th以下的位置或不足区域LA的情况下,由终端控制部81追加生成并追加配置空中摄影位置AP21。终端控制部81可根据不足区域LA的位置而追加配置空中摄影位置AP21。例如,在经过被再提取的不足区域LA的空中摄影路线上,可在位于不足区域LA的内部或附近的空中摄影位置AP21的外侧,追加配置空中摄影位置AP21。在空中摄影路线c1、c2上,因有1个空中摄影位置AP21,使得重复度在阈值th以上,所以,这里可在空中摄影路线c1、c2上追加配置空中摄影位置AP21。
然后,终端控制部81可再次算出空中摄影范围A1内的各位置上的重复度OV2。此情况下,终端控制部81算出由无人飞行器100的摄像部220或摄像部230在空中摄影位置AP11及空中摄影位置AP21(也包括追加配置的位置)进行空中摄影时的重复度OV2。以下同样,当残留重复度OV2为阈值th以下的位置或不足区域LA时,终端控制部81可反复实施空中摄影位置AP21的追加配置、重复度OV2的再次算出及不足区域LA的残留确认,直至无不足区域LA残留为止。
由此,不存在重复度OV(OV1、OV2)为阈值th以下的空中摄影路线,终端80能确保用户所期望的一定的重复度OV,即整个空中摄影范围A1内不存在不足区域LA。
这样,终端80能通过根据不足区域LA的位置生成空中摄影位置AP21,而在例如不足区域LA的附近配置空中摄影位置AP21。因此,终端80能改善不足区域LA中重复度OV的不足。
而且,终端80通过在经过不足区域LA的空中摄影路线上,在存在于空中摄影路线的不足区域LA侧的端部的空中摄影位置AP11的外侧的位置生成空中摄影位置AP21,从而能从空中摄影范围A1的周端部侧改善重复度OV。因此,能重点改善重复度OV容易不足的空中摄影范围A1的周端部侧的重复度OV。而且,终端80无需进行用于改善已获得充分重复度OV的位置的重复度OV的空中摄影,所以空中摄影张数可少,能提高重复度OV的改善效率。
而且,当存在重复度OV2为阈值th以下的位置时,终端80通过追加生成空中摄影位置AP21,即便空中摄影位置AP21的配置对重复度OV1的改善不充分,也能预见重复度OV2的进一步改善。因此,终端80若例如追加空中摄影位置AP21直至达到用户所期望的重复度OV2为止,则能使重复度OV2不足的不足区域LA消失。
终端控制部81生成经过空中摄影位置AP11、及以上述方式生成且配置的空中摄影位置AP21的空中摄影路径AP22。例如,可依次连接包含空中摄影位置AP11或空中摄影位置AP21的各空中摄影路线,生成空中摄影路径AP22。例如,可将存在于相邻的空中摄影路线的端部的空中摄影位置AP11或AP21连接,从而生成空中摄影路径AP22。而且,空中摄影路径AP22的生成方法并不限于此,也可将任意的空中摄影位置AP11、AP21连结而生成空中摄影路径AP22。此情况下,空中摄影路径AP22只要能在整个空中摄影范围A1内确保阈值th以上的重复度OV即可,也可并非由各空中摄影位置AP11、AP21以成为最短路径的方式连接而成的路径。空中摄影路径AP22为第2摄像路径的一例。
图11是表示经过空中摄影位置AP11、AP21的空中摄影路径AP22的一例的图。图11中,以如下方式生成空中摄影路径AP22,即,以空中摄影路线c4的右端为起点,从空中摄影路线c4的左端连接到空中摄影路线c3的左端,从空中摄影路线c3的右端连接到空中摄影路线c2的右端,从空中摄影路线c2的左端连接到空中摄影路线c1的左端,以空中摄影路线c1的右端为终点。
接着,说明空中摄影路径生成系统10的动作例。
本实施方式中,空中摄影路径的生成动作例如由终端80实施。图12是表示终端80的动作例的流程图。
首先,终端控制部81获取空中摄影范围A1(S11)。终端控制部81生成经过用于在空中摄影范围A1内进行空中摄影的空中摄影位置AP11的空中摄影路径AP12。终端控制部81针对空中摄影范围A1内的每个位置,算出由无人飞行器100的摄像部220或摄像部230在空中摄影位置AP11进行空中摄影时的重复度OV。即,终端控制 部81算出空中摄影范围A1内的各位置上的重复度分布(S13)。
终端控制部81根据空中摄影范围A1内的各位置上的重复度OV提取不足区域LA(S14)。终端控制部81根据不足区域LA生成并配置空中摄影位置AP21(S15)。利用空中摄影位置AP21,可改善仅在空中摄影位置AP11上的空中摄影中不足的重复度OV。终端控制部81将空中摄影位置AP21追加到空中摄影路径AP12中,生成空中摄影路径AP22(S16)。即,终端控制部81生成经过空中摄影位置AP11、AP21的空中摄影路径AP22。
终端控制部81输出空中摄影位置AP11、AP21及空中摄影路径AP22的信息(S17)。例如,终端控制部81可通过通信部85将空中摄影位置AP11、AP21及空中摄影路径AP22的信息发送到无人飞行器100。终端控制部81可将空中摄影位置AP11、AP21及空中摄影路径AP22的信息写入并记录到作为存储器89的外部记录装置(例如SD卡)中。
无人飞行器100中,UAV控制部110获取从终端80输出的空中摄影位置AP11、AP21及空中摄影路径AP22的信息。例如,UAV控制部110可通过通信接口150接收空中摄影位置AP11、AP21及空中摄影路径AP22的信息。UAV控制部110可通过外部记录装置获取空中摄影位置AP11、AP21及空中摄影路径AP22的信息。并且,UAV控制部110设定所获取的空中摄影位置AP11、AP21及空中摄影路径AP22。此情况下,UAV控制部110可将空中摄影位置AP11、AP21及空中摄影路径AP22的信息保存在内存160中,且设为可将空中摄影位置AP11、AP21及空中摄影路径AP22的信息用于利用UAV控制部110的飞行控制中的状态。由此,无人飞行器100能沿着由终端80所生成的空中摄影路径AP22飞行,且在空中摄影位置AP11、AP21空中摄影图像。该空中摄影图像可用于例如生成空中摄影范围A1内的合成图像或立体图像。
根据这样的动作例,当在空中摄影范围A1内的任一位置上存在重复度OV不足的部位时,终端80能通过配置空中摄影位置AP21弥补重复度OV的不足。因此,终端80能增加多个图像范围GH重 复的重复张数,且能确保一定基准以上的重复度OV。尤其是,虽然在空中摄影范围A1的周端部存在重复度OV容易不足的倾向,但终端80能改善该重复度OV的不足。因此,终端80能抑制当根据所得的多个空中摄影图像生成合成图像或立体图像时画质的下降。
而且,终端80无需预先统一地确定大于空中摄影范围A1的范围来作为空中摄影位置及空中摄影路径的生成对象的范围,能根据重复度OV的不足而灵活地配置空中摄影位置AP21。所以,与预先统一地确定大于空中摄影范围A1的范围的情况相比,终端80配置无用的空中摄影位置AP21的可能性低,能兼顾提高摄像效率和确保重复度OV。
而且,终端80通过将空中摄影位置AP11、AP21、空中摄影路径AP22的信息发送给无人飞行器100,能对无人飞行器100设定空中摄影位置AP11、AP21、空中摄影路径AP22。所以,无人飞行器100能沿着由终端80生成的空中摄影路径AP22飞行,且能在空中摄影位置AP11、AP21上空中摄影图像。
本实施方式的空中摄影路径生成也可由无人飞行器100实施。此情况下,无人飞行器100的UAV控制部110具有和终端80的终端控制部81所具有的与生成空中摄影路径相关的功能相同的功能。UAV控制部110为处理部的一例。UAV控制部110进行与生成空中摄影路径相关的处理。另外,在利用UAV控制部110的与生成空中摄影路径相关的处理中,对于和终端控制部81所进行的与生成空中摄影路径相关的处理相同的处理,省略或简化其说明。
图13是表示无人飞行器100的动作例的流程图。
首先,UAV控制部110获取空中摄影范围A1(S21)。UAV控制部110生成经过用于在空中摄影范围A1内进行空中摄影的空中摄影位置AP11的空中摄影路径AP12(S22)。UAV控制部110针对空中摄影范围A1内的每个位置,算出由无人飞行器100的摄像部220或摄像部230在空中摄影位置AP11进行空中摄影时的重复度OV。即,UAV控制部110算出空中摄影范围A1内的各位置上的重复度分布(S23)。
UAV控制部110根据空中摄影范围A1内的各位置上的重复度OV而提取不足区域LA(S24)。UAV控制部110根据不足区域LA生成并配置空中摄影位置AP21(S25)。利用空中摄影位置AP21,可改善仅在空中摄影位置AP11的空中摄影中不足的重复度OV。UAV控制部110将空中摄影位置AP21追加到空中摄影路径AP12中,并生成空中摄影路径AP22(S26)。即,UAV控制部110生成经过空中摄影位置AP11、AP21的空中摄影路径AP22。
UAV控制部110设定所生成的空中摄影位置AP11、AP21及空中摄影路径AP22(S27)。此情况下,UAV控制部110可将所生成的空中摄影位置AP11、AP21及空中摄影路径AP22的信息保存在内存160中,且设为空中摄影位置AP11、AP21及空中摄影路径AP22的信息可用于利用UAV控制部110的飞行控制中的状态。由此,无人飞行器100能沿着无人飞行器100所生成的空中摄影路径AP22飞行,且能在空中摄影位置AP11、AP21上空中摄影图像。该空中摄影图像可用于例如生成空中摄影范围A1内的合成图像或立体图像。
根据这样的动作例,在空中摄影范围A1内的任一位置上存在重复度OV不足的部位时,无人飞行器100能通过配置空中摄影位置AP21而弥补重复度OV的不足,能确保一定基准以上的重复度OV。尤其是,虽然在空中摄影范围A1的周端部存在重复度OV容易不足的倾向,但无人飞行器100能改善该重复度OV的不足。因此,无人飞行器100能抑制当根据所得的多个空中摄影图像生成合成图像或立体图像时画质的下降。
而且,无人飞行器100无需预先统一地确定大于空中摄影范围A1的范围来作为空中摄影位置及空中摄影路径的生成对象的范围,能根据重复度OV的不足而灵活地配置空中摄影位置AP21。所以,与预先统一地确定大于空中摄影范围A1的范围的情况相比,无人飞行器100配置无用的空中摄影位置AP21的可能性低,而且还能兼顾提高摄像效率和确保重复度OV。
而且,无人飞行器100通过设定空中摄影位置AP11、AP21、空中摄影路径AP22,而能沿着由无人飞行器100所生成的空中摄影路 径AP22飞行,且能在空中摄影位置AP11、AP21上空中摄影图像。所以,无人飞行器100能提高空中摄影出的图像的加工(例如合成图像的生成或立体图像的生成)的加工精度,能提高经加工所得的图像的画质。
另外,当无人飞行器100生成空中摄影路径时,终端80中,终端控制部81可进行处理以帮助(例如对于终端80的操作部83的各种操作或利用显示部88的各种显示)生成空中摄影路径。例如,无人飞行器100中,UAV控制部110可通过通信接口150,将重复度映射图OM或作为重复度映射图OM的基础的空中摄影范围A1内的每个位置的重复度OV的信息发送给终端80。终端控制部81可通过通信部85从无人飞行器100获取信息,并将重复度映射图OM显示在显示部88。
而且,用户可一面对显示部88上显示的重复度映射图OM进行确认,一面例如通过终端80的操作部83进行输入以在重复度OV不足的位置(例如不足区域LA)的附近配置空中摄影位置AP21。这样,利用终端80的各种操作输入及显示可辅助无人飞行器100生成空中摄影路径。
本实施方式中,假设的是由无人飞行器100空中摄影图像,但也可由无人飞行器100以外的移动体(例如车辆)拍摄图像。当生成用于利用这样的移动体来拍摄图像的摄像路径时,也可应用本实施方式。
以上,使用实施方式对本公开进行了说明,但本公开的技术范围并不限于上述实施方式中记载的范围。对本领域普通技术人员来说,显然可以对上述实施方式加以各种变更或改良。从权利要求书的记载也可明白,加以了这样的变更或改良的方式也都可包含在本公开的技术范围之内。
权利要求书、说明书、以及说明书附图中所示的装置、系统、程序以及方法中的动作、顺序、步骤、以及阶段等各项处理的执行顺序,只要没有特别明示“在...之前”、“事先”等,只要前面处理的输出并不用在后面的处理中,则可以以任意顺序实现。关于权利要求书、说明 书以及说明书附图中的动作流程,为方便起见而使用“首先”、“接着”等进行了说明,但并不意味着必须按照这样的顺序实施。
符号说明
10  空中摄影路径生成系统
80  终端
81  终端控制部
83  操作部
85  通信部
87  内存
88  显示部
89  存储器
100 无人飞行器
110 UAV控制部
150 通信接口
160 内存
170 存储器
200 云台
210 旋翼机构
220、230摄像部
240 GPS接收器
250 惯性测量装置
260 磁罗盘
270 气压高度计
280 超声波传感器
290 激光测定仪
A1  空中摄影范围
AP11、AP21空中摄影位置
AP12、AP22空中摄影路径
c1、c2、c3、c4空中摄影路线
GH1、GH2、GH3图像范围
LA  不足区域
p1  位置

Claims (19)

  1. 一种移动平台,其是生成用于利用移动体进行摄像的摄像路径的移动平台,该移动平台具备进行与生成所述摄像路径相关的处理的处理部,
    所述处理部获取摄像范围的信息,生成经过用于对所述摄像范围进行摄像的第一摄像位置的第一摄像路径,针对所述摄像范围内所含的每个位置,算出在所述第一摄像位置进行摄像时摄像图像的图像范围的重复度即第一重复度,当存在所述第一重复度为阈值以下的位置时,生成对所述摄像范围的摄像进行补充的第二摄像位置,生成经过所述第一摄像位置及所述第二摄像位置的第二摄像路径。
  2. 如权利要求1所述的移动平台,其中,
    所述处理部提取所述第一重复度为所述阈值以下的不足区域,根据所述不足区域的位置,生成所述第二摄像位置。
  3. 如权利要求2所述的移动平台,其中,
    所述第一摄像路径包含多个摄像路线,
    所述处理部在所述多个摄像路线中的经过所述不足区域的摄像路线上,在存在于所述摄像路线的所述不足区域侧的端部的所述第一摄像位置的外侧的位置生成所述第二摄像位置。
  4. 如权利要求1至3中任一项所述的移动平台,其中,
    所述处理部针对所述摄像范围内所含的每个位置,算出在所述第一摄像位置及所述第二摄像位置进行摄像时摄像图像的图像范围的重复度即第二重复度,当存在所述第二重复度为所述阈值以下的位置时,追加生成所述第二摄像位置。
  5. 如权利要求1至4中任一项所述的移动平台,其中,
    所述处理部根据所述第一摄像位置及在所述第一摄像位置进行摄像时所述移动体的移动参数及摄像参数,算出所述第一重复度。
  6. 如权利要求1至5中任一项所述的移动平台,其中,
    所述移动平台是终端,
    所述处理部将所述第一摄像位置、所述第二摄像位置、及所述第二摄像路径的信息发送给所述移动体。
  7. 如权利要求1至6中任一项所述的移动平台,其中,
    所述移动平台是终端,
    所述处理部生成表示所述摄像范围内所含的每个位置的所述第一重复度的分布的图像,并显示所述图像。
  8. 如权利要求1至5中任一项所述的移动平台,其中,
    所述移动平台是所述移动体,
    所述处理部设定所述第一摄像位置、所述第二摄像位置、及所述第二摄像路径。
  9. 如权利要求1至8中任一项所述的移动平台,其中,
    所述移动体包含飞行体,
    所述摄像包含航拍。
  10. 一种摄像路径生成方法,其是生成用于利用移动体进行摄像的摄像路径的移动平台的摄像路径生成方法,其具有如下步骤:
    获取摄像范围的信息;
    生成经过用于对所述摄像范围进行摄像的第一摄像位置的第一摄像路径;
    针对所述摄像范围内所含的每个位置,算出在所述第一摄像位置进行摄像时摄像图像的图像范围的重复度即第一重复度;
    当存在所述第一重复度为阈值以下的位置时,生成对所述摄像范围的摄像进行补充的第二摄像位置;及
    生成经过所述第一摄像位置及所述第二摄像位置的第二摄像路径。
  11. 如权利要求10所述的摄像路径生成方法,其中,
    生成所述第二摄像位置的步骤包括如下步骤:
    提取所述第一重复度为所述阈值以下的不足区域;及
    根据所述不足区域的位置,生成所述第二摄像位置。
  12. 如权利要求11所述的摄像路径生成方法,其中,
    所述第一摄像路径包含多个摄像路线,
    生成所述第二摄像位置的步骤包括如下步骤:在所述多个摄像路线中的经过所述不足区域的摄像路线上,在存在于所述摄像路线的所述不足区域侧的端部的所述第一摄像位置的外侧的位置生成所述第二摄像位置。
  13. 如权利要求10至12中任一项所述的摄像路径生成方法,
    其还包括如下步骤:针对所述摄像范围内所含的每个位置,算出在所述第一摄像位置及所述第二摄像位置进行摄像时摄像图像的图像范围的重复度即第二重复度,
    生成所述第二摄像位置的步骤包括如下步骤:当存在所述第二重复度为所述阈值以下的位置时,追加生成所述第二摄像位置。
  14. 如权利要求10至13中任一项所述的摄像路径生成方法,其中,
    算出所述第一重复度的步骤包括如下步骤:根据所述第一摄像位置及在所述第一摄像位置进行摄像时所述移动体的移动参数及摄像参数,算出所述第一重复度。
  15. 如权利要求10至14中任一项所述的摄像路径生成方法,其中,
    所述移动平台是终端,
    还包括如下步骤:将所述第一摄像位置、所述第二摄像位置、及所述第二摄像路径的信息发送给所述移动体。
  16. 如权利要求10至15中任一项所述的摄像路径生成方法,其中,
    所述移动平台是终端,
    还包括如下步骤:
    生成表示所述摄像范围内所含的每个位置的所述第一重复度的分布的图像;及
    显示所述图像。
  17. 如权利要求10至14中任一项所述的摄像路径生成方法,其中,
    所述移动平台是所述移动体,
    还包括如下步骤:设定所述第一摄像位置、所述第二摄像位置、及所述第二摄像路径。
  18. 如权利要求10至17中任一项所述的摄像路径生成方法,其中,
    所述移动体包含飞行体,
    所述摄像包含空中摄影。
  19. 一种计算机可读记录介质,其记录有用于使生成用于利用移动体进行摄像的摄像路径的移动平台执行如下步骤的程序:
    获取摄像范围的信息;
    生成经过用于对所述摄像范围进行摄像的第一摄像位置的第一摄像路径;
    针对所述摄像范围内所含的每个位置,算出在所述第一摄像位置进行摄像时摄像图像的图像范围的重复度即第一重复度;
    当存在所述第一重复度为阈值以下的位置时,生成对所述摄像范围的摄像进行补充的第二摄像位置;及
    生成经过所述第一摄像位置及所述第二摄像位置的第二摄像路径。
PCT/CN2017/116542 2017-09-28 2017-12-15 移动平台、摄像路径生成方法、程序、以及记录介质 WO2019061859A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780065313.7A CN109891188B (zh) 2017-09-28 2017-12-15 移动平台、摄像路径生成方法、程序、以及记录介质
US16/818,617 US20200217665A1 (en) 2017-09-28 2020-03-13 Mobile platform, image capture path generation method, program, and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-188023 2017-09-28
JP2017188023A JP2019060827A (ja) 2017-09-28 2017-09-28 モバイルプラットフォーム、撮像経路生成方法、プログラム、及び記録媒体

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/818,617 Continuation US20200217665A1 (en) 2017-09-28 2020-03-13 Mobile platform, image capture path generation method, program, and recording medium

Publications (1)

Publication Number Publication Date
WO2019061859A1 true WO2019061859A1 (zh) 2019-04-04

Family

ID=65900460

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/116542 WO2019061859A1 (zh) 2017-09-28 2017-12-15 移动平台、摄像路径生成方法、程序、以及记录介质

Country Status (4)

Country Link
US (1) US20200217665A1 (zh)
JP (1) JP2019060827A (zh)
CN (1) CN109891188B (zh)
WO (1) WO2019061859A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7274978B2 (ja) * 2019-08-20 2023-05-17 株式会社クボタ 飛行体の支援システム
JP7486805B2 (ja) 2020-09-16 2024-05-20 国立研究開発法人農業・食品産業技術総合研究機構 情報処理装置、システム、情報処理方法、及びプログラム
WO2024053307A1 (ja) * 2022-09-09 2024-03-14 富士フイルム株式会社 空撮計画作成装置及び方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2787319A1 (de) * 2013-04-05 2014-10-08 Leica Geosystems AG Steuerung einer Bildauslösung zur Luftbilderfassung in Nadir-Ausrichtung für ein unbemanntes Fluggerät
CN105444740A (zh) * 2016-01-01 2016-03-30 三峡大学 一种基于小型无人机遥感辅助滑坡应急治理工程勘查设计方法
CN105606073A (zh) * 2016-01-11 2016-05-25 谭圆圆 无人飞行器处理系统及其飞行状态数据处理方法
CN106296816A (zh) * 2016-08-01 2017-01-04 清华大学深圳研究生院 用于三维模型重建的无人机路径确定方法及装置
CN106647804A (zh) * 2016-12-01 2017-05-10 深圳创动科技有限公司 一种自动巡检方法及系统

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4523833B2 (ja) * 2004-11-18 2010-08-11 株式会社パスコ 撮影計画支援装置及びそのためのプログラム
JP4988673B2 (ja) * 2008-09-01 2012-08-01 株式会社日立製作所 撮影計画作成システム

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2787319A1 (de) * 2013-04-05 2014-10-08 Leica Geosystems AG Steuerung einer Bildauslösung zur Luftbilderfassung in Nadir-Ausrichtung für ein unbemanntes Fluggerät
CN105444740A (zh) * 2016-01-01 2016-03-30 三峡大学 一种基于小型无人机遥感辅助滑坡应急治理工程勘查设计方法
CN105606073A (zh) * 2016-01-11 2016-05-25 谭圆圆 无人飞行器处理系统及其飞行状态数据处理方法
CN106296816A (zh) * 2016-08-01 2017-01-04 清华大学深圳研究生院 用于三维模型重建的无人机路径确定方法及装置
CN106647804A (zh) * 2016-12-01 2017-05-10 深圳创动科技有限公司 一种自动巡检方法及系统

Also Published As

Publication number Publication date
JP2019060827A (ja) 2019-04-18
US20200217665A1 (en) 2020-07-09
CN109891188B (zh) 2022-03-04
CN109891188A (zh) 2019-06-14

Similar Documents

Publication Publication Date Title
JP6962775B2 (ja) 情報処理装置、空撮経路生成方法、プログラム、及び記録媒体
JP6803800B2 (ja) 情報処理装置、空撮経路生成方法、空撮経路生成システム、プログラム、及び記録媒体
JP6765512B2 (ja) 飛行経路生成方法、情報処理装置、飛行経路生成システム、プログラム及び記録媒体
JP6583840B1 (ja) 検査システム
JP6878194B2 (ja) モバイルプラットフォーム、情報出力方法、プログラム、及び記録媒体
JP2019115012A (ja) 情報処理装置、飛行制御指示方法、プログラム、及び記録媒体
WO2018214401A1 (zh) 移动平台、飞行体、支持装置、便携式终端、摄像辅助方法、程序以及记录介质
CN111344650B (zh) 信息处理装置、飞行路径生成方法、程序以及记录介质
US20200217665A1 (en) Mobile platform, image capture path generation method, program, and recording medium
US20210185235A1 (en) Information processing device, imaging control method, program and recording medium
US20210229810A1 (en) Information processing device, flight control method, and flight control system
JP2019028560A (ja) モバイルプラットフォーム、画像合成方法、プログラム、及び記録媒体
JP6875269B2 (ja) 情報処理装置、飛行制御指示方法、プログラム、及び記録媒体
JP6790206B1 (ja) 制御装置、制御方法、プログラム、及び記録媒体
WO2020001629A1 (zh) 信息处理装置、飞行路径生成方法、程序以及记录介质
WO2020119572A1 (zh) 形状推断装置、形状推断方法、程序以及记录介质
CN114586335A (zh) 图像处理装置、图像处理方法、程序及记录介质
CN112313942A (zh) 一种进行图像处理和框架体控制的控制装置
JP2020016664A (ja) 検査システム
WO2020108290A1 (zh) 图像生成装置、图像生成方法、程序以及记录介质
CN111615616A (zh) 位置推定装置、位置推定方法、程序以及记录介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17927669

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17927669

Country of ref document: EP

Kind code of ref document: A1