WO2018193574A1 - Flight path generation method, information processing device, flight path generation system, program, and recording medium

Flight path generation method, information processing device, flight path generation system, program, and recording medium

Info

Publication number
WO2018193574A1
Authority
WO
WIPO (PCT)
Prior art keywords
flight path
imaging
imaging position
processing unit
generating
Prior art date
Application number
PCT/JP2017/015876
Other languages
English (en)
Japanese (ja)
Inventor
磊 顧
宗耀 瞿
Original Assignee
SZ DJI Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co., Ltd.
Priority to JP2019513156A priority Critical patent/JP6765512B2/ja
Priority to PCT/JP2017/015876 priority patent/WO2018193574A1/fr
Publication of WO2018193574A1 publication Critical patent/WO2018193574A1/fr

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/222 - Studio circuitry; Studio devices; Studio equipment
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 - UAVs specially adapted for particular uses or applications
    • B64U2101/30 - UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B64U2101/32 - UAVs specially adapted for particular uses or applications for imaging, photography or videography for cartography or topography
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 - UAVs characterised by their flight controls
    • B64U2201/20 - Remote controls
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G5/00 - Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/003 - Flight plan management
    • G08G5/0034 - Assembly of a flight plan
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G5/00 - Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047 - Navigation or guidance aids for a single aircraft
    • G08G5/0052 - Navigation or guidance aids for a single aircraft for cruising
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G5/00 - Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047 - Navigation or guidance aids for a single aircraft
    • G08G5/0069 - Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G5/00 - Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0073 - Surveillance aids
    • G08G5/0086 - Surveillance aids for monitoring terrain

Definitions

  • the present disclosure relates to a flight path generation method, an information processing apparatus, a flight path generation system, a program, and a recording medium for generating a flight path of a flying object.
  • A platform (for example, an unmanned air vehicle) that is equipped with a photographing device and performs photographing while flying along a preset fixed route is known (see, for example, Patent Document 1).
  • This platform receives a command such as a flight route and a shooting instruction from the ground base, flies in accordance with the command, performs shooting, and sends an acquired image to the ground base.
  • While flying along the set fixed path, the platform tilts its imaging device based on the positional relationship between the platform and the imaging target.
  • The platform of Patent Document 1 captures images while passing along a fixed path, but it does not sufficiently consider objects (for example, buildings) located in the vertical direction from the fixed path. It is therefore difficult to adequately acquire captured images of the side surfaces of such an object, or of portions hidden behind the parts of the object that can be observed from above. As a result, the captured images available for estimating a three-dimensional shape are insufficient, and the estimation accuracy of the three-dimensional shape is lowered.
  • Conventionally, the flight route on which the unmanned aircraft flies is manually determined in advance. For example, a desired position around the object is designated as an imaging position, and its position in three-dimensional space (latitude, longitude, altitude) is specified by user input. In this case, since each imaging position is determined by user input, user convenience is reduced. In addition, since detailed information about the object is required in advance to determine the flight path, preparation takes time.
  • In one aspect, a flight path generation method for generating a flight path of a flying object that images a subject includes a step of acquiring a schematic shape of an object included in the subject, a step of extracting a side surface of the schematic shape, a step of setting an imaging position corresponding to the side surface, and a step of generating a flight path passing through the imaging position.
  • the step of setting the imaging position may include a step of setting an imaging position facing the side surface for each extracted side surface.
  • the step of setting the imaging position may include a step of setting a plurality of imaging positions having a predetermined imaging position interval corresponding to the side surface.
  • The step of generating the flight path may include a step of determining an imaging path passing through the plurality of imaging positions and generating a flight path that includes the imaging path.
  • The flight path generation method may further include a step of generating an imaging plane parallel to the side surface at a predetermined imaging distance, and the step of setting the imaging position may include a step of setting a plurality of imaging positions on the imaging plane at a predetermined imaging position interval.
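  • As a minimal illustrative sketch, assuming each side surface is a vertical rectangle given by its two ground corners and a height, the imaging positions on an imaging plane offset by the imaging distance may be laid out as follows (the coordinate frame, helper names, and numeric values are assumptions made for illustration):

```python
# Illustrative sketch: place imaging positions (waypoints) on an imaging plane
# that is parallel to a vertical side surface, offset outward by a fixed
# imaging distance, with a fixed spacing between positions.
from dataclasses import dataclass
import math

@dataclass
class Waypoint:
    x: float  # local east coordinate [m]
    y: float  # local north coordinate [m]
    z: float  # altitude above ground [m]

def imaging_positions_for_side(p0, p1, height, imaging_distance, interval):
    """p0, p1: (x, y) ground corners of a vertical side face; height: face height [m]."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length          # unit vector along the face
    nx, ny = uy, -ux                           # normal of the face (sign chosen per face)
    ox, oy = nx * imaging_distance, ny * imaging_distance  # offset to the imaging plane
    positions = []
    n_cols = max(2, int(length // interval) + 1)
    n_rows = max(2, int(height // interval) + 1)
    for r in range(n_rows):
        z = min(r * interval, height)
        cols = range(n_cols) if r % 2 == 0 else reversed(range(n_cols))  # serpentine order
        for c in cols:
            s = min(c * interval, length)
            positions.append(Waypoint(p0[0] + ux * s + ox, p0[1] + uy * s + oy, z))
    return positions

# Example: a 30 m long, 20 m high facade imaged from 10 m away, every 5 m.
path = imaging_positions_for_side((0.0, 0.0), (30.0, 0.0), 20.0, 10.0, 5.0)
```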
  • The step of setting the imaging position may use, as the predetermined imaging position interval, an interval at which the captured images taken at adjacent imaging positions partially overlap one another.
  • The flight path generation method may further include a step of calculating a polyhedron surrounding the general shape of the object, and the step of extracting the side surface may include a step of extracting, as a side surface, a face of the polyhedron that lies along the vertical direction or stands within a predetermined angle range of the vertical direction.
  • The flight path generation method may further include a step of calculating a polyhedron that simplifies the schematic shape of the object, and the step of extracting the side surface may include a step of extracting, as a side surface, a face of the polyhedron that lies along the vertical direction or stands within a predetermined angle range of the vertical direction.
  • The step of calculating the polyhedron may include a step of calculating polyhedra corresponding to a plurality of approximate shapes of the object and combining a plurality of adjacent polyhedra.
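  • A minimal sketch of the side-surface extraction described above, assuming each face of the polyhedron is represented by its outward unit normal (the angle threshold and function name are illustrative):

```python
# Illustrative sketch: classify a polyhedron face as a "side surface" when the
# face stands vertically or within a predetermined angle of the vertical.
# A face is described here by its outward normal; for a vertical wall the
# normal is horizontal, so the normal's z component is near zero.
import math

def is_side_face(normal, max_tilt_deg=15.0):
    """normal: (nx, ny, nz) outward normal of a face.
    Returns True if the face is vertical to within max_tilt_deg (assumed threshold)."""
    nx, ny, nz = normal
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    # The angle between the face plane and the vertical axis equals the angle
    # between the face normal and the horizontal plane.
    tilt = math.degrees(math.asin(abs(nz) / norm))
    return tilt <= max_tilt_deg

# Example: a wall facing east is a side face; a flat roof facing up is not.
print(is_side_face((1.0, 0.0, 0.0)))   # True
print(is_side_face((0.0, 0.0, 1.0)))   # False
```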
  • The step of generating the flight path may include a step of generating a flight path that passes through the imaging positions on one side surface and through the imaging positions on the next side surface adjacent to that side surface.
  • The flight path generation method may further include a step of acquiring a captured image obtained by imaging the object from above (facing downward), and the step of acquiring the approximate shape may include a step of acquiring three-dimensional shape data of the approximate shape of the object using the captured image.
  • In one aspect, an information processing apparatus generates a flight path of a flying object that captures an image of a subject and includes a processing unit that executes processing related to the flight path. The processing unit acquires a schematic shape of an object included in the subject, extracts a side surface of the schematic shape, sets an imaging position corresponding to the side surface, and generates a flight path passing through the imaging position.
  • the processing unit may set an imaging position facing the side surface for each extracted side surface in setting the imaging position.
  • the processing unit may set a plurality of imaging positions having a predetermined imaging position interval corresponding to the side surface in setting the imaging position.
  • the processing unit may determine a shooting path that passes through a plurality of imaging positions and generate a flight path including the shooting path.
  • The processing unit may further generate an imaging plane parallel to the side surface at a predetermined imaging distance and, in setting the imaging position, may set a plurality of imaging positions on the imaging plane at a predetermined imaging position interval.
  • The processing unit may use, as the predetermined imaging position interval, an interval at which the captured images taken at adjacent imaging positions partially overlap one another.
  • the processing unit may further calculate a polyhedron surrounding the approximate shape of the object, and in extracting the side surface, a surface along the vertical direction of the polyhedron or a surface standing within a predetermined angle range in the vertical direction may be extracted as the side surface.
  • The processing unit may further calculate a polyhedron that simplifies the schematic shape of the object and, in extracting the side surface, may extract, as a side surface, a face of the polyhedron that lies along the vertical direction or stands within a predetermined angle range of the vertical direction.
  • the processing unit may calculate a polyhedron corresponding to a plurality of approximate shapes of the object, and combine a plurality of adjacent polyhedrons.
  • the processing unit may generate a flight path that passes through the imaging position on one of the side surfaces, and a flight path that passes through the imaging position on the next side surface adjacent to the side surface.
  • the processing unit may further acquire a captured image obtained by capturing the object downward, and acquire the three-dimensional shape data of the approximate shape of the object using the captured image in acquiring the approximate shape.
  • In one aspect, a flight path generation system includes a flying object that captures an image of a subject and a processing unit that generates a flight path of the flying object. The processing unit acquires a schematic shape of an object included in the subject, extracts a side surface of the schematic shape, sets an imaging position corresponding to the side surface, and generates a flight path that passes through the imaging position.
  • In one aspect, a program causes a computer that generates a flight path of a flying object imaging a subject to execute a step of acquiring a rough shape of an object included in the subject, a step of extracting a side surface of the rough shape, a step of setting an imaging position corresponding to the side surface, and a step of generating a flight path passing through the imaging position.
  • In one aspect, a recording medium is a computer-readable recording medium recording a program that causes a computer that generates a flight path of a flying object imaging a subject to execute a step of obtaining a rough shape of an object included in the subject, a step of extracting a side surface of the rough shape, a step of setting an imaging position corresponding to the side surface, and a step of generating a flight path passing through the imaging position.
  • the flight path generation system includes a flying object as an example of a moving object and a platform for remotely controlling the operation or processing of the flying object.
  • the information processing apparatus is a computer included in at least one of the platform and the flying object, and executes various processes related to the operation of the flying object.
  • the flying object includes an aircraft (for example, drone, helicopter) moving in the air.
  • The flying body may be an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle) having an imaging device.
  • The flying object flies along a predetermined flight path in order to image a subject within an imaging range (for example, the ground shapes of buildings, roads, parks, and the like within a certain area), and images the subject at a plurality of imaging positions set on the flight path.
  • the subject includes objects such as buildings and roads, for example.
  • the platform is a computer, for example, a transmitter for instructing remote control of various processes including movement of the flying object, or a communication terminal connected to the transmitter or the flying object so as to be able to input and output information and data.
  • the communication terminal may be, for example, a mobile terminal, a PC (Personal Computer), or the like. Note that the flying object itself may be included as a platform.
  • the flight path generation method defines various processes (steps) in an information processing apparatus (platform, flying object) or flight path generation system.
  • the program according to the present disclosure is a program for causing an information processing device (platform, flying object) or a flight path generation system to execute various processes (steps).
  • the recording medium stores a program (that is, a program for causing the information processing apparatus (platform, flying object) or the flight path generation system to execute various processes (steps)).
  • an unmanned aerial vehicle (UAV) is exemplified as the flying object.
  • the unmanned air vehicle sets a flight path including an imaging position at which the side surface of the object can be imaged.
  • FIG. 1 is a schematic diagram illustrating a first configuration example of a flight path generation system 10 according to the embodiment.
  • the flight path generation system 10 includes an unmanned air vehicle 100, a transmitter 50, and a portable terminal 80.
  • the unmanned air vehicle 100, the transmitter 50, and the portable terminal 80 can communicate with each other using wired communication or wireless communication (for example, a wireless local area network (LAN) or Bluetooth (registered trademark)).
  • the transmitter 50 is used in a state of being held by both hands of a person who uses the transmitter 50 (hereinafter referred to as “user”), for example.
  • FIG. 2 is a diagram showing an example of the appearance of the unmanned air vehicle 100.
  • FIG. 3 is a diagram illustrating an example of a specific external appearance of the unmanned air vehicle 100.
  • a side view when the unmanned air vehicle 100 flies in the moving direction STV0 is shown in FIG. 2, and a perspective view when the unmanned air vehicle 100 flies in the moving direction STV0 is shown in FIG.
  • the unmanned air vehicle 100 is an example of a moving body that includes the imaging devices 220 and 230 as an example of an imaging unit and moves.
  • the moving body is a concept including, in addition to the unmanned air vehicle 100, other aircraft that moves in the air, vehicles that move on the ground, ships that move on the water, and the like.
  • the roll axis (see the x-axis in FIGS. 2 and 3) is defined in a direction parallel to the ground and along the movement direction STV0.
  • A pitch axis (see the y-axis in FIGS. 2 and 3) is defined in a direction parallel to the ground and perpendicular to the roll axis, and a yaw axis (see the z-axis in FIGS. 2 and 3) is defined in a direction perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.
  • the unmanned air vehicle 100 includes a UAV main body 102, a gimbal 200, an imaging device 220, and a plurality of imaging devices 230.
  • the unmanned aerial vehicle 100 moves based on a remote control instruction transmitted from a transmitter 50 as an example of a platform.
  • the movement of the unmanned air vehicle 100 means a flight, and includes at least ascending, descending, left turning, right turning, left horizontal movement, and right horizontal movement.
  • the UAV main body 102 includes a plurality of rotor blades (propellers).
  • the UAV main body 102 moves the unmanned air vehicle 100 by controlling the rotation of a plurality of rotor blades.
  • the UAV main body 102 moves the unmanned aerial vehicle 100 using, for example, four rotary wings.
  • the number of rotor blades is not limited to four.
  • the unmanned air vehicle 100 may be a fixed wing aircraft that does not have rotating wings.
  • the imaging device 220 is an imaging camera that images a subject (for example, a building on the ground) included in a desired imaging range.
  • The subject may include, for example, objects such as buildings, as well as scenery such as mountains and rivers that are targets of aerial imaging by the unmanned air vehicle 100.
  • the plurality of imaging devices 230 are sensing cameras that image the surroundings of the unmanned air vehicle 100 in order to control the movement of the unmanned air vehicle 100.
  • Two imaging devices 230 may be provided on the front surface that is the nose of the unmanned air vehicle 100.
  • the other two imaging devices 230 may be provided on the bottom surface of the unmanned air vehicle 100.
  • the two imaging devices 230 on the front side may be paired and function as a so-called stereo camera.
  • the two imaging devices 230 on the bottom side may also be paired and function as a stereo camera.
  • Three-dimensional spatial data around the unmanned air vehicle 100 may be generated based on images captured by the plurality of imaging devices 230.
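  • For illustration, such a stereo pair yields depth through the standard rectified-stereo relation, depth = focal length (in pixels) x baseline / disparity; a minimal sketch, with illustrative values:

```python
# Illustrative sketch of the rectified-stereo relation used to obtain depth
# (and hence 3D spatial data) from a stereo camera pair:
#   depth = focal_length_px * baseline / disparity_px
def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: 700 px focal length, 10 cm baseline, 20 px disparity -> 3.5 m.
print(stereo_depth(700.0, 0.10, 20.0))
```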
  • the number of imaging devices 230 included in the unmanned air vehicle 100 is not limited to four.
  • the unmanned air vehicle 100 only needs to include at least one imaging device 230.
  • the unmanned air vehicle 100 may include at least one imaging device 230 on each of the nose, the tail, the side surface, the bottom surface, and the ceiling surface of the unmanned air vehicle 100.
  • the angle of view that can be set by the imaging device 230 may be wider than the angle of view that can be set by the imaging device 220.
  • the imaging device 230 may have a single focus lens or a fisheye lens.
  • FIG. 4 is a block diagram showing an example of a hardware configuration of the unmanned air vehicle 100 constituting the flight path generation system 10 of FIG.
  • The unmanned air vehicle 100 includes a processing unit 110, a communication interface 150, a memory 160, a storage 170, a battery 190, a gimbal 200, a rotary wing mechanism 210, an imaging device 220, imaging devices 230, a GPS receiver 240, an inertial measurement unit (IMU: Inertial Measurement Unit) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring instrument 290.
  • the communication interface 150 is an example of a communication unit.
  • the processing unit 110 is configured using a processor such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
  • the processing unit 110 performs signal processing for overall control of operations of each unit of the unmanned air vehicle 100, data input / output processing with other units, data calculation processing, and data storage processing.
  • the processing unit 110 controls the flight of the unmanned air vehicle 100 according to the program stored in the memory 160.
  • the processing unit 110 controls the movement (that is, the flight) of the unmanned air vehicle 100 according to a command received from the remote transmitter 50 via the communication interface 150.
  • the memory 160 may be removable from the unmanned air vehicle 100.
  • the processing unit 110 acquires image data of a subject imaged by the imaging device 220 and the imaging device 230 (hereinafter sometimes referred to as “captured image”).
  • the processing unit 110 controls the gimbal 200, the rotary blade mechanism 210, the imaging device 220, and the imaging device 230.
  • the processing unit 110 controls the imaging range of the imaging device 220 by changing the imaging direction or angle of view of the imaging device 220.
  • the processing unit 110 controls the imaging range of the imaging device 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
  • the imaging range refers to a geographical range captured by the imaging device 220 or the imaging device 230.
  • the imaging range is defined by latitude, longitude, and altitude.
  • the imaging range may be a range in three-dimensional spatial data defined by latitude, longitude, and altitude.
  • the imaging range is specified based on the angle of view and imaging direction of the imaging device 220 or the imaging device 230, and the position where the unmanned air vehicle 100 is present.
  • The imaging directions of the imaging device 220 and the imaging device 230 are defined by the azimuth and the depression angle toward which the front surfaces provided with their imaging lenses are directed.
  • the imaging direction of the imaging device 220 is a direction specified from the nose direction of the unmanned air vehicle 100 and the posture state of the imaging device 220 with respect to the gimbal 200.
  • the imaging direction of the imaging device 230 is a direction specified from the nose direction of the unmanned air vehicle 100 and the position where the imaging device 230 is provided.
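  • As a minimal worked sketch for the simple straight-down (nadir) case, the ground footprint implied by the angle of view and the altitude can be computed as follows (values are illustrative; oblique imaging directions require the full camera geometry):

```python
# Illustrative sketch: size of the ground footprint for a straight-down (nadir)
# shot, from the altitude and the horizontal/vertical angles of view.
import math

def nadir_footprint(altitude_m: float, h_fov_deg: float, v_fov_deg: float):
    width = 2.0 * altitude_m * math.tan(math.radians(h_fov_deg) / 2.0)
    height = 2.0 * altitude_m * math.tan(math.radians(v_fov_deg) / 2.0)
    return width, height

# Example: 50 m altitude with an 84 x 62 degree field of view -> roughly 90 m x 60 m.
print(nadir_footprint(50.0, 84.0, 62.0))
```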
  • the processing unit 110 controls the flight of the unmanned air vehicle 100 by controlling the rotary wing mechanism 210. That is, the processing unit 110 controls the position including the latitude, longitude, and altitude of the unmanned air vehicle 100 by controlling the rotary wing mechanism 210.
  • the processing unit 110 may control the imaging ranges of the imaging device 220 and the imaging device 230 by controlling the flight of the unmanned air vehicle 100.
  • the processing unit 110 may control the angle of view of the imaging device 220 by controlling a zoom lens included in the imaging device 220.
  • the processing unit 110 may control the angle of view of the imaging device 220 by digital zoom using the digital zoom function of the imaging device 220.
  • At an imaging position (a waypoint, described later) along the set flight path, the processing unit 110 causes the imaging device 220 or the imaging device 230 to image the subject in the horizontal direction, a predetermined angle direction, or the vertical direction.
  • the direction of the predetermined angle is a direction of a predetermined angle suitable for the information processing apparatus (unmanned air vehicle or platform) to estimate the three-dimensional shape of the subject.
  • The processing unit 110 moves the unmanned air vehicle 100 to a specific position at a specific date and time, so that the imaging device 220 can capture a desired imaging range under the desired environment.
  • the communication interface 150 communicates with the transmitter 50.
  • the communication interface 150 receives various commands for the processing unit 110 from the remote transmitter 50.
  • the memory 160 is an example of a storage unit.
  • The memory 160 stores programs and the like necessary for controlling the gimbal 200, the rotary wing mechanism 210, the imaging device 220, the imaging device 230, the GPS receiver 240, the inertial measurement device 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring instrument 290.
  • the memory 160 stores captured images captured by the imaging devices 220 and 230.
  • The memory 160 may be a computer-readable recording medium and may include at least one of an SRAM (Static Random Access Memory), a DRAM (Dynamic Random Access Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), and a flash memory such as a USB memory.
  • the memory 160 may be provided inside the UAV main body 102. It may be provided so as to be removable from the UAV main body 102.
  • the storage 170 is an example of a storage unit.
  • the storage 170 accumulates and holds various data and information.
  • the storage 170 may be an HDD (Hard Disk Drive), an SSD (Solid State Drive), a memory card, a USB memory, or the like.
  • the storage 170 may be provided inside the UAV main body 102.
  • the storage 170 may be provided so as to be removable from the UAV main body 102.
  • the battery 190 has a function as a drive source of each part of the unmanned air vehicle 100 and supplies necessary power to each part of the unmanned air vehicle 100.
  • the gimbal 200 supports the imaging device 220 to be rotatable about at least one axis.
  • the gimbal 200 may support the imaging device 220 rotatably about the yaw axis, pitch axis, and roll axis.
  • the gimbal 200 may change the imaging direction of the imaging device 220 by rotating the imaging device 220 about at least one of the yaw axis, the pitch axis, and the roll axis.
  • the rotary blade mechanism 210 includes a plurality of rotary blades and a plurality of drive motors that rotate the plurality of rotary blades.
  • the imaging device 220 captures a subject within a desired imaging range and generates captured image data.
  • Image data obtained by imaging by the imaging device 220 is stored in a memory included in the imaging device 220 or the memory 160.
  • the imaging device 230 captures the surroundings of the unmanned air vehicle 100 and generates captured image data. Image data of the imaging device 230 is stored in the memory 160.
  • the GPS receiver 240 receives a plurality of signals indicating times and positions (coordinates) of each GPS satellite transmitted from a plurality of navigation satellites (that is, GPS satellites).
  • the GPS receiver 240 calculates the position of the GPS receiver 240 (that is, the position of the unmanned air vehicle 100) based on the received signals.
  • the GPS receiver 240 outputs position information of the unmanned air vehicle 100 to the processing unit 110.
  • the calculation of the position information of the GPS receiver 240 may be performed by the processing unit 110 instead of the GPS receiver 240. In this case, information indicating the time and the position of each GPS satellite included in the plurality of signals received by the GPS receiver 240 is input to the processing unit 110.
  • the inertial measurement device 250 detects the attitude of the unmanned air vehicle 100 and outputs the detection result to the processing unit 110.
  • The inertial measurement device 250 detects, as the attitude of the unmanned air vehicle 100, accelerations in the three axial directions of front-rear, left-right, and up-down, and angular velocities about the three axes of pitch, roll, and yaw.
  • the magnetic compass 260 detects the nose direction of the unmanned air vehicle 100 and outputs the detection result to the processing unit 110.
  • the barometric altimeter 270 detects the altitude at which the unmanned air vehicle 100 flies, and outputs the detection result to the processing unit 110.
  • the ultrasonic sensor 280 emits ultrasonic waves, detects ultrasonic waves reflected by the ground or an object, and outputs the detection results to the processing unit 110.
  • the detection result may indicate a distance (that is, altitude) from the unmanned air vehicle 100 to the ground, for example.
  • the detection result may indicate a distance from the unmanned air vehicle 100 to the object, for example.
  • Laser measuring device 290 irradiates the object with laser light, receives the reflected light reflected by the object, and measures the distance between unmanned air vehicle 100 and the object using the reflected light.
  • the distance measurement result is input to the processing unit 110.
  • the distance measurement method using laser light may be a time-of-flight method.
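  • For illustration, the time-of-flight relation is distance = speed of light x round-trip time / 2; a minimal sketch with an illustrative value:

```python
# Illustrative sketch of the time-of-flight relation: the laser pulse travels to
# the object and back, so distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light [m/s]

def tof_distance_m(round_trip_time_s: float) -> float:
    return C * round_trip_time_s / 2.0

# Example: a round-trip time of 200 ns corresponds to about 30 m.
print(tof_distance_m(200e-9))
```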
  • the processing unit 110 may specify the environment around the unmanned air vehicle 100 by analyzing a plurality of images captured by the plurality of imaging devices 230. Based on the environment around the unmanned air vehicle 100, the processing unit 110 controls flight while avoiding obstacles, for example.
  • the processing unit 110 may generate three-dimensional spatial data around the unmanned air vehicle 100 based on a plurality of images captured by the plurality of imaging devices 230, and control the flight based on the three-dimensional spatial data.
  • the processing unit 110 acquires date / time information indicating the current date / time.
  • the processing unit 110 may acquire date / time information indicating the current date / time from the GPS receiver 240.
  • the processing unit 110 may acquire date / time information indicating the current date / time from a timer (not shown) mounted on the unmanned air vehicle 100.
  • the processing unit 110 acquires position information indicating the position of the unmanned air vehicle 100.
  • the processing unit 110 may acquire position information indicating the latitude, longitude, and altitude where the unmanned air vehicle 100 exists from the GPS receiver 240.
  • the processing unit 110 receives latitude and longitude information indicating the latitude and longitude where the unmanned air vehicle 100 exists from the GPS receiver 240, and altitude information indicating the altitude where the unmanned air vehicle 100 exists from the barometric altimeter 270 or the ultrasonic sensor 280, respectively. It may be acquired as position information.
  • the processing unit 110 may acquire orientation information indicating the orientation of the unmanned air vehicle 100 from the magnetic compass 260.
  • the orientation information may indicate an orientation corresponding to the nose orientation of the unmanned air vehicle 100, for example.
  • the processing unit 110 may acquire position information indicating a position where the unmanned air vehicle 100 should be present when the imaging device 220 captures an imaging range to be imaged.
  • the processing unit 110 may acquire position information indicating a position where the unmanned air vehicle 100 should exist from the memory 160.
  • the processing unit 110 may acquire position information indicating a position where the unmanned air vehicle 100 should be present from another device such as the transmitter 50 via the communication interface 150.
  • The processing unit 110 may refer to a three-dimensional map database, specify a position at which the unmanned air vehicle 100 can be present in order to capture the imaging range to be imaged, and acquire that position as position information indicating the position where the unmanned air vehicle 100 should be present.
  • the processing unit 110 may acquire imaging information indicating the imaging ranges of the imaging device 220 and the imaging device 230, respectively.
  • the processing unit 110 may acquire angle-of-view information indicating the angle of view of the imaging device 220 and the imaging device 230 from the imaging device 220 and the imaging device 230 as parameters for specifying the imaging range.
  • the processing unit 110 may acquire information indicating the imaging direction of the imaging device 220 and the imaging device 230 as a parameter for specifying the imaging range.
  • the processing unit 110 may acquire posture information indicating the posture state of the imaging device 220 from the gimbal 200 as information indicating the imaging direction of the imaging device 220, for example.
  • the information indicating the posture state of the imaging device 220 may be indicated by, for example, a rotation angle from the reference rotation angle of the pitch axis and yaw axis of the gimbal 200.
  • the processing unit 110 may acquire information indicating the orientation of the unmanned air vehicle 100 as information indicating the imaging direction of the imaging device 220, for example.
  • the processing unit 110 may acquire position information indicating a position where the unmanned air vehicle 100 exists as a parameter for specifying the imaging range.
  • The processing unit 110 may acquire the imaging information by defining the imaging range indicating the geographical range captured by the imaging device 220, based on the angle of view and imaging direction of the imaging device 220 or the imaging device 230 and the position where the unmanned air vehicle 100 exists, and generating imaging information indicating that imaging range.
  • the processing unit 110 may acquire imaging information indicating an imaging range to be imaged by the imaging device 220.
  • the processing unit 110 may acquire imaging information to be imaged by the imaging device 220 from the memory 160.
  • the processing unit 110 may acquire imaging information to be imaged by the imaging device 220 from another device such as the transmitter 50 via the communication interface 150.
  • The processing unit 110 acquires three-dimensional information indicating the three-dimensional shape of objects existing around the unmanned air vehicle 100.
  • the object is a part of a landscape such as a building, a road, a car, and a tree.
  • the three-dimensional information is, for example, three-dimensional space data.
  • the processing unit 110 may acquire the three-dimensional information by generating the three-dimensional information indicating the three-dimensional shape of the object existing around the unmanned air vehicle 100 from the respective images obtained from the plurality of imaging devices 230.
  • the processing unit 110 may acquire the three-dimensional information indicating the three-dimensional shape of the object existing around the unmanned air vehicle 100 by referring to the three-dimensional map database stored in the memory 160.
  • the processing unit 110 may acquire three-dimensional information related to a three-dimensional shape of an object existing around the unmanned air vehicle 100 by referring to a three-dimensional map database managed by a server existing on the network.
  • FIG. 5 is a diagram illustrating an example of the appearance of the transmitter 50 to which the mobile terminal 80 is attached.
  • a smartphone is shown as an example of the mobile terminal 80.
  • the mobile terminal 80 may be a smartphone, a tablet terminal, or the like, for example.
  • The up, down, front, rear, left, and right directions with respect to the transmitter 50 are assumed to follow the directions of the arrows shown in FIG. 5.
  • the transmitter 50 is used in a state of being held by both hands of a user who uses the transmitter 50, for example.
  • the transmitter 50 includes, for example, a resin casing 50B having a substantially rectangular parallelepiped shape (in other words, a substantially box shape) having a substantially square bottom surface and a height shorter than one side of the bottom surface.
  • a left control rod 53L and a right control rod 53R are provided in a projecting manner at approximately the center of the housing surface of the transmitter 50.
  • The left control rod 53L and the right control rod 53R are used in operations by the user for remotely controlling the movement of the unmanned air vehicle 100 (for example, moving the unmanned air vehicle 100 back and forth, left and right, up and down, and changing its direction).
  • In FIG. 5, the left control rod 53L and the right control rod 53R are shown in their initial positions, in which no external force is applied by the user's hands.
  • the left control rod 53L and the right control rod 53R automatically return to a predetermined position (for example, the initial position shown in FIG. 5) after the external force applied by the user is released.
  • the power button B1 of the transmitter 50 is disposed on the front side (in other words, the user side) of the left control rod 53L.
  • When the power button B1 is pressed once by the user, for example, the remaining capacity of the battery built into the transmitter 50 is displayed on the battery remaining amount display unit L2. When the power button B1 is pressed again by the user, for example, the transmitter 50 is powered on, and power is supplied to each part of the transmitter 50 (see FIG. 6) so that it can be used.
  • RTH (Return To Home) button B2 is disposed on the front side (in other words, the user side) of the right control rod 53R.
  • When the RTH button B2 is pressed, the transmitter 50 transmits a signal for automatically returning the unmanned air vehicle 100 to a predetermined position. In this way, the transmitter 50 can automatically return the unmanned air vehicle 100 to a predetermined position (for example, a take-off position stored in the unmanned air vehicle 100).
  • The RTH button B2 can be used, for example, when the user loses sight of the airframe of the unmanned air vehicle 100 during outdoor aerial shooting, or when operation becomes impossible due to radio interference or an unexpected problem.
  • a remote status display unit L1 and a battery remaining amount display unit L2 are arranged on the front side (in other words, the user side) of the power button B1 and the RTH button B2.
  • The remote status display unit L1 is configured using, for example, an LED (Light Emitting Diode), and displays the wireless connection state between the transmitter 50 and the unmanned air vehicle 100.
  • the battery remaining amount display unit L2 is configured using, for example, an LED, and displays the remaining amount of battery capacity built in the transmitter 50.
  • Two antennas AN1 and AN2 project from the rear side of the housing 50B of the transmitter 50 and rearward from the left control rod 53L and the right control rod 53R.
  • the antennas AN1 and AN2 transmit a signal for controlling the movement of the unmanned air vehicle 100 to the unmanned air vehicle 100 based on the user's operation of the left control rod 53L and the right control rod 53R.
  • the antennas AN1 and AN2 can cover a transmission / reception range of 2 km, for example.
  • When the unmanned air vehicle 100 wirelessly connected to the transmitter 50 transmits captured images taken by its imaging devices 220 and 230, or various data acquired by the unmanned air vehicle 100, the antennas AN1 and AN2 can receive these images or data.
  • the transmitter 50 does not include a display unit, but may include a display unit.
  • the portable terminal 80 may be mounted on the holder HLD.
  • the holder HLD may be bonded and attached to the transmitter 50. Thereby, the portable terminal 80 is attached to the transmitter 50 via the holder HLD.
  • the portable terminal 80 and the transmitter 50 may be connected via a wired cable (for example, a USB cable).
  • the portable terminal 80 and the transmitter 50 may be connected by wireless communication (for example, Bluetooth (registered trademark)).
  • the portable terminal 80 may not be attached to the transmitter 50, and the portable terminal 80 and the transmitter 50 may be provided independently.
  • FIG. 6 is a block diagram illustrating an example of a hardware configuration of the transmitter 50 configuring the flight path generation system 10 of FIG.
  • The transmitter 50 includes a left control rod 53L, a right control rod 53R, a processing unit 61, a wireless communication unit 63, an interface unit 65, a memory 67, a battery 69, a power button B1, an RTH button B2, an operation unit set OPS, a remote status display unit L1, and a battery remaining amount display unit L2.
  • the transmitter 50 is an example of an operation terminal for remotely controlling the unmanned air vehicle 100.
  • the processing unit 61 is configured using a processor (for example, a CPU, MPU, or DSP).
  • the processing unit 61 performs signal processing for overall control of operations of each unit of the transmitter 50, data input / output processing with other units, data calculation processing, and data storage processing.
  • The processing unit 61 may acquire data of captured images taken by the imaging device 220 of the unmanned air vehicle 100 via the wireless communication unit 63, store it in the memory 67, and output it to the portable terminal 80 via the interface unit 65. In other words, the processing unit 61 may cause the portable terminal 80 to display a captured image captured by the imaging device 220 of the unmanned air vehicle 100.
  • The processing unit 61 may generate a signal for controlling the movement of the unmanned air vehicle 100 designated by the user's operation of the left control rod 53L and the right control rod 53R.
  • the processing unit 61 may transmit the generated signal to the unmanned air vehicle 100 via the wireless communication unit 63 and the antennas AN1 and AN2 to remotely control the unmanned air vehicle 100. Thereby, the transmitter 50 can control the movement of the unmanned air vehicle 100 remotely.
  • the processing unit 61 may acquire map information of a map database accumulated by an external server or the like via the wireless communication unit 63.
  • the left control rod 53L is used for an operation for remotely controlling the movement of the unmanned air vehicle 100 by, for example, the user's left hand.
  • the right control rod 53R is used for an operation for remotely controlling the movement of the unmanned air vehicle 100 by, for example, the user's right hand.
  • The movement of the unmanned air vehicle 100 includes, for example, forward movement, backward movement, leftward movement, rightward movement, ascent, descent, left turning, right turning, or a combination thereof.
  • the battery 69 has a function as a drive source for each part of the transmitter 50 and supplies necessary power to each part of the transmitter 50.
  • The processing unit 61 displays the remaining capacity of the battery 69 built into the transmitter 50 on the battery remaining amount display unit L2. This allows the user to easily check the remaining battery capacity of the transmitter 50.
  • the processing unit 61 may display the remaining amount of the capacity of the battery built in the unmanned air vehicle 100 in the battery remaining amount display unit L2.
  • When the power button B1 is pressed, the processing unit 61 instructs the battery 69 built into the transmitter 50 to supply power to each unit of the transmitter 50. As a result, the user can turn on the transmitter 50 and easily start using it.
  • When the RTH button B2 is pressed, the processing unit 61 generates a signal for automatically returning the unmanned air vehicle 100 to a predetermined position (for example, the take-off position of the unmanned air vehicle 100) and transmits it to the unmanned air vehicle 100 via the wireless communication unit 63 and the antennas AN1 and AN2.
  • the user can automatically return (return) the unmanned air vehicle 100 to a predetermined position by a simple operation on the transmitter 50.
  • the operation unit set OPS is configured using a plurality of operation units (for example, operation units OP1,..., Operation unit OPn) (n: an integer of 2 or more).
  • The operation unit set OPS includes various operation units, other than the left control rod 53L, the right control rod 53R, the power button B1, and the RTH button B2 shown in FIG. 5, that support remote control of the unmanned air vehicle 100 by the transmitter 50.
  • The various operation units referred to here include, for example, a button for instructing the capture of a still image with the imaging device 220 of the unmanned air vehicle 100, a button for instructing the start and end of video recording with the imaging device 220 of the unmanned air vehicle 100, a control for adjusting the tilt direction of the gimbal 200 (see FIG. 4) of the unmanned air vehicle 100, a button for switching the flight mode of the unmanned air vehicle 100, and a dial for making settings of the imaging device 220 of the unmanned air vehicle 100.
  • the remote status display unit L1 and the remaining battery level display unit L2 have been described with reference to FIG.
  • the wireless communication unit 63 is connected to two antennas AN1 and AN2.
  • the wireless communication unit 63 transmits / receives information and data to / from the unmanned air vehicle 100 via the two antennas AN1 and AN2 using a predetermined wireless communication method (for example, WiFi (registered trademark)).
  • the interface unit 65 inputs and outputs information and data between the transmitter 50 and the portable terminal 80.
  • the interface unit 65 may be a USB port (not shown) provided in the transmitter 50, for example.
  • the interface unit 65 may be an interface other than the USB port.
  • the memory 67 is an example of a storage unit.
  • The memory 67 includes, for example, a ROM (Read Only Memory) that stores a program defining the operation of the processing unit 61 and setting-value data, and a RAM (Random Access Memory) that temporarily stores various kinds of information and data used when the processing unit 61 performs processing.
  • The program and setting-value data stored in the ROM of the memory 67 may be copied to a predetermined recording medium (for example, a CD-ROM or a DVD-ROM).
  • The memory 67 may also store data of captured images captured by the imaging devices 220 and 230 of the unmanned air vehicle 100.
  • FIG. 7 is a block diagram illustrating an example of a hardware configuration of the mobile terminal 80 that configures the flight path generation system 10 of FIG.
  • the portable terminal 80 may include a processing unit 81, an interface unit 82, an operation unit 83, a wireless communication unit 85, a memory 87, a display unit 88, a storage 89, and a battery 99.
  • the portable terminal 80 has a function as an example of an information processing device, and the processing unit 81 of the portable terminal 80 is an example of a processing unit of the information processing device.
  • the processing unit 81 is configured using a processor (for example, CPU, MPU, or DSP).
  • the processing unit 81 performs signal processing for overall control of operations of each unit of the mobile terminal 80, data input / output processing with other units, data calculation processing, and data storage processing.
  • the processing unit 81 may acquire data and information from the unmanned air vehicle 100 via the wireless communication unit 85.
  • the processing unit 81 may acquire data and information from the transmitter 50 or another device via the interface unit 82.
  • the processing unit 81 may acquire data and information input via the operation unit 83.
  • the processing unit 81 may acquire data and information held in the memory 87.
  • the processing unit 81 may send data and information to the display unit 88 and cause the display unit 88 to display display information based on the data and information.
  • the processing unit 81 may send data and information to the storage 89 and store the data and information.
  • the processing unit 81 may acquire data and information stored in the storage 89.
  • the processing unit 81 may execute an application for instructing control of the unmanned air vehicle 100.
  • the processing unit 81 may generate various data used in the application.
  • the interface unit 82 inputs and outputs information and data between the transmitter 50 or another device and the portable terminal 80.
  • the interface unit 82 may be a USB connector (not shown) provided in the mobile terminal 80, for example.
  • The interface unit 82 may be an interface other than the USB connector.
  • the operation unit 83 receives data and information input by the operator of the mobile terminal 80.
  • the operation unit 83 may include buttons, keys, a touch panel, a microphone, and the like.
  • the operation unit 83 and the display unit 88 are mainly configured by a touch panel.
  • the operation unit 83 can accept a touch operation, a tap operation, a drag operation, and the like.
  • the wireless communication unit 85 communicates with the unmanned air vehicle 100 by various wireless communication methods.
  • the wireless communication method may include, for example, wireless LAN, Bluetooth (registered trademark), short-range wireless communication, or communication via a public wireless line.
  • the wireless communication unit 85 may transmit and receive data and information by communicating with other devices.
  • the memory 87 is an example of a storage unit.
  • The memory 87 may include, for example, a ROM that stores a program defining the operation of the portable terminal 80 and setting-value data, and a RAM that temporarily stores various kinds of information and data used during processing by the processing unit 81.
  • the memory 87 may include memories other than ROM and RAM.
  • the memory 87 may be provided inside the mobile terminal 80.
  • the memory 87 may be provided so as to be removable from the portable terminal 80.
  • the program may include an application program.
  • the display unit 88 is configured using, for example, an LCD (Liquid Crystal Display) or an organic EL (ElectroLuminescence) display, and displays various information and data output from the processing unit 81.
  • the display unit 88 may display captured image data captured by the imaging devices 220 and 230 of the unmanned air vehicle 100.
  • the storage 89 is an example of a storage unit.
  • the storage 89 stores and holds various data and information.
  • the storage 89 may be a flash memory, an SSD (Solid State Drive), a memory card, a USB memory, or the like.
  • the storage 89 may be provided so as to be removable from the main body of the mobile terminal 80.
  • the battery 99 has a function as a drive source for each part of the mobile terminal 80 and supplies necessary power to each part of the mobile terminal 80.
  • the processing unit 81 as an example of the processing unit of the information processing apparatus includes a flight path processing unit 811 that performs processing related to generation of a flight path of the unmanned air vehicle 100.
  • the processing unit 81 includes a shape data processing unit 812 that performs processing related to estimation and generation of three-dimensional shape data of a subject.
  • the flight path processing unit 811 generates a flight path of the unmanned air vehicle 100 that images the subject.
  • the flight path processing unit 811 may acquire input parameters.
  • the flight path processing unit 811 may acquire the input parameter input by the transmitter 50 by receiving the input parameter via the interface unit 82 or the wireless communication unit 85. Further, the flight path processing unit 811 may acquire at least a part of information included in the input parameter from another device instead of acquiring from the transmitter 50.
  • the flight path processing unit 811 may acquire at least a part of information included in the input parameter from a server or the like existing on the network.
  • the acquired input parameters may be held in the memory 87.
  • the processing unit 81 of the portable terminal 80 may refer to the memory 87 as appropriate (for example, at the time of generating a flight route, at the time of generating three-dimensional shape data).
  • the input parameters may include information on the approximate shape of the object, information on the flight range, information on the flight altitude, information on the imaging distance, and information on the imaging position interval.
  • The input parameters may include set-resolution information. The set resolution is a resolution of the captured images captured by the imaging devices 220 and 230 of the unmanned air vehicle 100 (that is, a resolution for obtaining captured images suitable for estimating the three-dimensional shape of the subject with high accuracy), and may be stored in the memory 160 of the unmanned air vehicle 100 or the memory 67 of the transmitter 50.
  • the input parameters include information on imaging positions (that is, waypoints) in the flight path of the unmanned air vehicle 100 and various parameters for generating a flight path that passes through the imaging positions.
  • the imaging position is a position in a three-dimensional space.
  • the input parameter may include information on the overlapping rate of the imaging range when the unmanned air vehicle 100 images the subject at the imaging position, for example.
  • the input parameter may include information on the interval between imaging positions in the flight path.
  • the imaging position interval is an interval (distance) between two adjacent imaging positions among a plurality of imaging positions (waypoints) arranged on the flight path.
  • the input parameter may include information on the angle of view of the imaging device 220 or 230 of the unmanned air vehicle 100.
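  • For illustration only, the input parameters listed above might be grouped into a single structure such as the following sketch; the field names and default values are assumptions, not an interface defined by the present disclosure:

```python
# Illustrative sketch: grouping the input parameters described above into one structure.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class FlightPathInputParameters:
    flight_range: Tuple[float, float, float, float]  # (lat_min, lon_min, lat_max, lon_max)
    flight_altitude_m: float                         # flight altitude
    imaging_distance_m: float                        # distance from the side surface
    horizontal_overlap: float = 0.8                  # overlap rate of horizontally adjacent images
    vertical_overlap: float = 0.7                    # overlap rate of vertically adjacent images
    imaging_interval_m: Optional[float] = None       # imaging position interval, if given directly
    h_fov_deg: float = 84.0                          # horizontal angle of view of the imaging device
    v_fov_deg: float = 62.0                          # vertical angle of view of the imaging device
    set_resolution_px: Optional[Tuple[int, int]] = None  # set resolution of captured images

params = FlightPathInputParameters(
    flight_range=(35.0000, 139.0000, 35.0010, 139.0010),
    flight_altitude_m=60.0,
    imaging_distance_m=15.0,
)
```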
  • the flight path processing unit 811 may receive and acquire subject identification information.
  • Based on the acquired subject identification information, the flight path processing unit 811 may communicate with an external server via the interface unit 82 or the wireless communication unit 85, and may receive and acquire information on the shape of the subject or the size of the subject corresponding to the subject identification information.
  • the overlapping ratio of the imaging ranges indicates a ratio of overlapping two imaging ranges when images are captured by the imaging device 220 or the imaging device 230 of the unmanned air vehicle 100 at imaging positions adjacent in the horizontal direction or the vertical direction.
  • the information on the overlapping rate of the imaging ranges may include at least one of information on the overlapping rate of the imaging ranges in the horizontal direction (also referred to as the horizontal overlapping rate) and information on the overlapping rate of the imaging ranges in the vertical direction (also referred to as the vertical overlapping rate).
  • the horizontal overlap rate and the vertical overlap rate may be the same or different. When the horizontal overlap rate and the vertical overlap rate are different values, both the horizontal overlap rate information and the vertical overlap rate information may be included in the input parameter. When the horizontal overlap rate and the vertical overlap rate are the same value, information on one overlap rate that is the same value may be included in the input parameter.
  • the imaging position interval is a spatial imaging interval, and is a distance between adjacent imaging positions among a plurality of imaging positions at which the unmanned air vehicle 100 should take an image in the flight path.
  • the imaging position interval may include at least one of an imaging position interval in the horizontal direction (also referred to as a horizontal imaging interval) and an imaging position interval in the vertical direction (also referred to as an upper and lower imaging interval).
  • the flight path processing unit 811 may calculate and acquire an imaging position interval including a horizontal imaging interval and an upper and lower imaging interval, or may acquire it from input parameters.
  • the flight path processing unit 811 may place an imaging position (waypoint) for imaging by the imaging device 220 or 230 on the flight path.
  • the intervals between the imaging positions may be arranged at regular intervals, for example.
  • the imaging positions are arranged so that the imaging ranges related to the captured images at adjacent imaging positions partially overlap. This is to enable estimation of a three-dimensional shape using a plurality of captured images. Since the imaging device 220 or 230 has a predetermined angle of view, a part of both imaging ranges overlaps by shortening the imaging position interval.
  • the flight path processing unit 811 may calculate the imaging position interval based on, for example, the altitude (imaging altitude) at which the imaging position is arranged and the resolution of the imaging device 220 or 230. The higher the imaging altitude or the longer the imaging distance, the larger the imaging range overlap rate, so that the imaging position interval can be made longer (sparse). As the imaging altitude is lower or the imaging distance is shorter, the overlapping ratio of the imaging ranges becomes smaller, so the imaging position interval is shortened (densely). The flight path processing unit 811 may further calculate the imaging position interval based on the angle of view of the imaging device 220 or 230. The flight path processing unit 811 may calculate the imaging position interval by other known methods.
  • the flight path processing unit 811 may acquire the angle of view of the imaging device 220 or the angle of view of the imaging device 230 from the imaging device 220 or the imaging device 230.
  • the angle of view of the imaging device 220 or the angle of view of the imaging device 230 may be the same or different in the horizontal direction and the vertical direction.
  • the angle of view of the imaging device 220 in the horizontal direction or the angle of view of the imaging device 230 is also referred to as a horizontal angle of view.
  • the angle of view of the imaging device 220 or the angle of view of the imaging device 230 in the vertical direction is also referred to as the vertical angle of view.
  • the flight path processing unit 811 may acquire information on one angle of view having the same value when the horizontal angle of view and the vertical angle of view are the same value.
  • the flight path processing unit 811 determines the imaging position (waypoint) of the subject by the unmanned air vehicle 100 based on the flight range and the imaging position interval.
  • the imaging positions by the unmanned air vehicle 100 may be arranged at equal intervals in the horizontal direction, and the distance between the last imaging position and the first imaging position may be shorter than the imaging position interval. This interval is a horizontal imaging interval.
  • the imaging positions by the unmanned air vehicle 100 may be arranged at equal intervals in the vertical direction, and the distance between the last imaging position and the first imaging position may be shorter than the imaging position interval. This interval is the vertical imaging interval.
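One possible reading of the equal-interval arrangement described in the two items above is sketched below in Python: positions are placed every interval along one axis of the flight range, and the spacing of the final pair may be shorter than the nominal interval when the span is not an exact multiple of it. The function name and the exact placement policy are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch (assumed placement policy): place imaging positions at a fixed
# interval along one axis of the flight range; the gap between the last placed
# position and the one before it may be shorter than the nominal interval.
def place_positions(start: float, end: float, interval: float) -> list[float]:
    positions = []
    p = start
    while p < end:
        positions.append(p)
        p += interval
    positions.append(end)  # last position; its gap to the previous one may be < interval
    return positions

# Example: a 12 m horizontal imaging interval over a 50 m span
print(place_positions(0.0, 50.0, 12.0))  # [0.0, 12.0, 24.0, 36.0, 48.0, 50.0]
```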
  • the flight path processing unit 811 generates a flight path passing through the determined imaging position for each imaging plane corresponding to the side surface of the object.
  • the flight path processing unit 811 may generate a flight path that sequentially passes through the imaging positions adjacent to each other in the flight course of one imaging plane, passes through all the imaging positions in that flight course, and then enters the flight course of the next imaging plane.
  • the flight path may be formed such that the altitude decreases as the flight path starts from the sky side.
  • the flight path may be formed such that the altitude increases as the flight path starts from the ground side.
  • the processing unit 110 of the unmanned air vehicle 100 may control the flight of the unmanned air vehicle 100 according to the generated flight path.
  • the processing unit 110 may cause the imaging device 220 or the imaging device 230 to image the subject at an imaging position that exists in the middle of the flight path. Therefore, the imaging device 220 or the imaging device 230 may capture the side surface of the subject at the imaging position in the flight path.
  • the captured image captured by the imaging device 220 or the imaging device 230 may be held in the memory 160 or the storage 170 of the unmanned air vehicle 100 or the memory 87 or the storage 89 of the portable terminal 80.
  • the processing unit 110 may refer to the memory 160 as appropriate (for example, when setting a flight path).
  • the shape data processing unit 812 generates three-dimensional information (three-dimensional shape data) indicating the three-dimensional shape of an object (subject) based on a plurality of captured images captured at different imaging positions by either of the imaging devices 220 and 230. Each captured image may be used as one of the images for restoring the three-dimensional shape data.
  • the captured image for restoring the three-dimensional shape data may be a still image.
  • a known method may be used as the method for generating three-dimensional shape data based on a plurality of captured images. Known methods include, for example, MVS (Multi View Stereo), PMVS (Patch-based MVS), and SfM (Structure from Motion).
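As a generic illustration of the two-view building block underlying SfM-style reconstruction from overlapping captured images, the sketch below uses standard OpenCV calls (feature matching, essential-matrix estimation, pose recovery, triangulation). It is not the patent's method; a full pipeline would add bundle adjustment and dense reconstruction, and the camera intrinsics K are assumed to be known.

```python
import cv2
import numpy as np

# Sketch (not the patent's method): match features between two overlapping
# captured images, recover the relative camera pose, and triangulate a sparse
# set of 3D points (up to scale). K is the 3x3 camera intrinsic matrix.
def two_view_points(img1_gray, img2_gray, K):
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1_gray, None)
    kp2, des2 = sift.detectAndCompute(img2_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])   # first camera at the origin
    P2 = K @ np.hstack([R, t])                           # second camera from recovered pose
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    return (pts4d[:3] / pts4d[3]).T                      # Nx3 sparse 3D points
```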
  • the captured image used for generating the three-dimensional shape data may be a still image.
  • the plurality of captured images used for generating the three-dimensional shape data include two captured images whose imaging ranges partially overlap each other.
  • the higher the overlapping ratio of the imaging ranges (that is, the imaging area overlapping ratio), the more captured images are used for generating the three-dimensional shape data of the same range, so the shape data processing unit 812 can improve the restoration accuracy of the three-dimensional shape.
  • conversely, the lower the overlapping ratio of the imaging ranges, the smaller the number of captured images used when generating the three-dimensional shape data of the same range, so the shape data processing unit 812 can shorten the generation time of the three-dimensional shape data. Note that two captured images whose imaging ranges partially overlap each other may not be included in the plurality of captured images.
  • the shape data processing unit 812 may acquire a plurality of captured images including captured images obtained by imaging the side surfaces of the subject. As a result, compared with the case of acquiring only captured images taken uniformly straight down from the sky, the shape data processing unit 812 can collect many image features of the side surfaces of the subject and can improve the accuracy of restoring the three-dimensional shape around the subject.
  • FIG. 8 is a flowchart illustrating an example of a processing procedure of the flight path generation method according to the embodiment.
  • the illustrated example illustrates a process of performing aerial photography of a target region to acquire a rough shape of an object, and generating a flight path for estimating a three-dimensional shape based on the acquired rough shape.
  • the processing unit 81 of the mobile terminal 80 executes the process independently.
  • the flight path processing unit 811 of the processing unit 81 generates a flight path of the unmanned air vehicle 100 for shooting an object when performing shooting for estimating the three-dimensional shape of the object.
  • the flight path processing unit 811 inputs the flight range of the unmanned air vehicle 100 and designates the range of the imaging target region (S11).
  • the processing unit 110 of the unmanned aerial vehicle 100 inputs information on the designated flight range, flies in that flight range, and performs aerial photography at predetermined imaging positions while looking down vertically at the objects in the imaging target area (S12). In this case, the processing unit 110 roughly images the objects (hereinafter sometimes referred to as “schematic imaging”) at a small number of imaging positions.
  • the processing unit 110 of the unmanned aerial vehicle 100 acquires a bird's-eye shot image at each imaging position and records the captured image in the memory 160.
  • the flight path processing unit 811 of the processing unit 81 acquires a captured image obtained by rough imaging (downward aerial shooting) in the vertical direction of the imaging target region, and stores the acquired image in the memory 87 or the storage 89.
  • the flight path processing unit 811 acquires the approximate shape by estimating the approximate shape of the object (building, ground, etc.) using a known three-dimensional shape restoration technique using the acquired captured image group (S13).
  • the three-dimensional shape data of the approximate shape may include polygon data, for example.
  • the approximate shape of the object may be acquired from a 3D map database held by another device such as the mobile terminal 80 or a server, instead of being acquired by aerial imaging of the target region. In this case, the three-dimensional shape data of the approximate shape may be obtained from three-dimensional information (for example, polygon data) on buildings, roads, and the like included in the database.
  • the flight path processing unit 811 generates a detailed imaging flight path for estimating the three-dimensional shape of the object using the acquired schematic shape of the object (S14). Several examples of the flight path generation procedure using the general shape of the object will be described later.
  • the above operation example can generate a flight path for estimating the three-dimensional shape of an object and automate the detailed imaging of the object.
  • the setting of an appropriate flight path for the object can be automated.
  • FIG. 9 is a diagram for explaining an input example of the flight range A1.
  • the processing unit 81 of the portable terminal 80 inputs information on the flight range A1 through the operation unit 83.
  • the operation unit 83 may accept a user input designating, as the flight range A1, a desired range in the map information M1 for which generation of three-dimensional shape data is desired.
  • the information on the flight range A1 is not limited to a desired range, and may be a predetermined flight range.
  • the predetermined flight range may be, for example, one of ranges for periodically generating 3D shape data and measuring the 3D shape.
  • FIG. 10 is a diagram for explaining schematic imaging in the flight path FPA.
  • the flight path processing unit 811 may set the interval between the imaging positions CP (imaging position interval) to the interval d11 in the flight path FPA.
  • the interval d11 is a sparse interval (for example, an interval of several tens of meters) such that the size of an object (for example, a building) can be estimated.
  • the interval d11 is set to an interval where at least imaging ranges at adjacent imaging positions CP partially overlap. Imaging at each imaging position CP at the interval d11 of the flight path FPA may be referred to as schematic imaging.
  • the unmanned air vehicle 100 can reduce the imaging time by capturing images at sparse intervals as compared to capturing images at dense intervals.
  • a landscape including the building BL and the mountain MT may spread in the vertical direction (direction toward the ground, direction of gravity) of the flight path on which the unmanned air vehicle 100 flies. Therefore, the building BL and the mountain MT exist in the imaging range and are imaging targets.
  • the approximate shape of the object can be acquired from the captured image obtained by the approximate imaging.
  • FIG. 11 is a diagram for explaining generation of three-dimensional shape data having a rough shape based on rough imaging obtained by the flight path FPA.
  • the shape data processing unit 812 generates the three-dimensional shape data SD1 of the approximate shape of the object based on the plurality of captured images CI1 obtained at each imaging position CP by the schematic imaging of the flight path FPA.
  • the user can grasp the approximate shape of the ground existing in the vertical direction of the flight path FPA by confirming the three-dimensional shape data SD1 by display or the like.
  • the user can confirm that the mountain MT exists by confirming the shape (schematic shape) obtained from the three-dimensional shape data SD1 based on the schematic imaging, but cannot confirm the existence of the building BL.
  • this is because the mountain MT has a gentle outline, and even when images are captured from the sky along the flight path FPA, the captured images CI1 contain sufficient information for generating the three-dimensional shape data SD1.
  • in contrast, the outline of the building BL is substantially parallel to the vertical direction, and it is difficult to sufficiently image the side surfaces of the building BL from the imaging positions CP of the flight path FPA, in which the unmanned air vehicle 100 travels horizontally above the building BL. That is, information necessary for three-dimensional shape estimation cannot be acquired from the captured images taken downward in the vicinity of the building BL.
  • the flight path processing unit 811 therefore uses the approximate shape data of the object to generate and set a flight path and imaging positions so that the side surfaces of the object, which run parallel to the vertical direction, are imaged from the horizontal direction (the direction normal to the vertical direction), that is, from the side.
  • the shape data processing unit 812 generates the three-dimensional shape data of the object using the captured image including the captured image of the side of the object captured according to the generated flight path. Thereby, the estimation accuracy of the three-dimensional shape of the object can be improved.
  • FIG. 12 is a flowchart illustrating an example of a processing procedure of the three-dimensional shape estimation method according to the embodiment.
  • the processing unit 81 of the mobile terminal 80 as an example of the processing unit of the information processing apparatus executes the processing independently.
  • the flight path processing unit 811 of the processing unit 81 sets a flight path for the unmanned air vehicle 100 using the generated flight path (S21).
  • the processing unit 110 of the unmanned aerial vehicle 100 flies over the flight range of the imaging target area according to the set flight path, and aerially captures the object laterally at a predetermined imaging position (S22).
  • the processing unit 110 captures an object in detail (hereinafter, may be referred to as “detailed imaging”) by partially overlapping the imaging range at every predetermined imaging position interval.
  • the processing unit 110 of the unmanned air vehicle 100 acquires the captured image at each imaging position and records the captured image in the memory 160.
  • the shape data processing unit 812 of the processing unit 81 acquires a captured image obtained by detailed imaging (side aerial imaging) of the imaging target region and stores it in the memory 87 or the storage 89.
  • the shape data processing unit 812 generates three-dimensional shape data by estimating the three-dimensional shape of an object (building, ground, etc.) from the acquired captured image group using a known three-dimensional shape restoration technique (S23).
  • three-dimensional shape data including the shape of the side surface of the object can be generated using a detailed captured image obtained by capturing the object from the side. Therefore, it is possible to estimate the detailed shape of the side surface that has been difficult to restore in the captured image obtained by capturing the object downward, and to improve the accuracy of the three-dimensional shape data of the object.
  • FIG. 13 is a diagram for describing a first operation example of flight path generation using the schematic shape of an object in the embodiment.
  • the first operation example is an example in which a polyhedron such as a cube surrounding an object is calculated to generate a shooting plane that faces the side of the object.
  • the flight path processing unit 811 of the processing unit 81 uses the acquired schematic shape to calculate a polyhedron that surrounds the outer shape of the object.
  • this polyhedron is a solid that circumscribes the approximate shape of the object or is slightly larger than it.
  • a cube 301 is calculated as an example of a polyhedron.
  • the flight path processing unit 811 extracts at least one side surface 303 in the polyhedron of the cube 301.
  • the side surface 303 may be a surface along the vertical direction in the polyhedron or a surface standing within a predetermined angle range in the vertical direction.
  • the flight path processing unit 811 calculates a normal 304 outward of the polyhedron with respect to the extracted side surface 303.
  • the normal 304 can be calculated by the cross product of two vectors along the side surface 303 (for example, vectors connecting its vertices).
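A minimal sketch of this normal computation is given below: the normal is the cross product of two edge vectors of the side surface, oriented outward by comparing it with the direction from the polyhedron's centroid. The vertex ordering and the centroid-based orientation test are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

# Sketch (assumed details): compute the normal of a planar side face from the
# cross product of two of its edge vectors, then flip it so that it points
# away from the polyhedron's centroid, i.e. outward.
def outward_normal(face_vertices: np.ndarray, polyhedron_centroid: np.ndarray) -> np.ndarray:
    v0, v1, v2 = face_vertices[0], face_vertices[1], face_vertices[2]
    n = np.cross(v1 - v0, v2 - v0)        # normal from two edge vectors of the face
    n = n / np.linalg.norm(n)             # normalize to unit length
    if np.dot(n, v0 - polyhedron_centroid) < 0:
        n = -n                            # flip so the normal points outward
    return n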
  • the flight path processing unit 811 calculates an imaging plane 305 having a predetermined imaging distance and parallel to the side surface 303 using the acquired normal line 304.
  • the imaging plane 305 is located at a predetermined imaging distance from the side surface 303 and is a plane perpendicular to the normal line 304.
  • the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 306 having predetermined imaging position intervals in the generated imaging plane 305 and determines an imaging path 307 passing through each imaging position 306. As a result, a flight path including the imaging path 307 is generated.
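The following sketch illustrates this step under simplifying assumptions: the side surface is treated as a vertical rectangle, the imaging plane is obtained by translating it outward by the shooting distance, and waypoints spaced by the imaging position interval are visited in a serpentine order so that a single path passes through all of them. All function and parameter names are illustrative, not taken from the patent.

```python
import numpy as np

def imaging_path_for_face(origin, u_horiz, v_vert, normal, L, d):
    """Sketch (assumed geometry): 'origin', 'u_horiz' and 'v_vert' describe a
    rectangular vertical side face; 'normal' is its unit outward normal.
    Returns (position, view_direction) waypoints on the imaging plane."""
    plane_origin = origin + L * normal                       # imaging plane at shooting distance L
    len_u, len_v = np.linalg.norm(u_horiz), np.linalg.norm(v_vert)
    n_u = int(np.ceil(len_u / d)) + 1                        # columns, spacing <= d
    n_v = int(np.ceil(len_v / d)) + 1                        # rows, spacing <= d
    step_u = u_horiz / (n_u - 1)
    step_v = v_vert / (n_v - 1)
    waypoints = []
    for j in range(n_v):                                     # one row per altitude
        row = [plane_origin + i * step_u + j * step_v for i in range(n_u)]
        if j % 2 == 1:
            row.reverse()                                    # serpentine ordering of the row
        waypoints.extend(row)
    return [(p, -normal) for p in waypoints]                 # camera faces the side surface
```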
  • the shooting direction at each imaging position 306 is the direction opposite to the normal 304, that is, the direction facing the side surface of the object. In this case, the shooting plane is a vertical plane, and the shooting direction is a horizontal direction perpendicular to the shooting plane.
  • FIG. 14 is a diagram for explaining the setting of a plurality of imaging positions 306 on the imaging plane 305.
  • the flight path processing unit 811 sets a predetermined shooting distance L in the normal direction with respect to the side surface 303 of the polyhedron, and calculates a shooting plane 305 parallel to the side surface 303 at a position away from the side surface 303 by the shooting distance L.
  • the flight path processing unit 811 sets a predetermined imaging position interval d on the imaging plane 305 and determines an imaging position 306 for each imaging position interval d. For example, any of the following methods may be used to set the shooting distance L and the imaging position interval d.
  • the user designates an imaging distance L [m] and an imaging position interval d [m].
  • the processing unit 81 of the portable terminal 80 inputs information on the shooting distance L and the imaging position interval d through the operation unit 83 according to the operation input by the user, and stores the information in the memory 87. Thereby, the imaging position for detailed imaging can be set based on the imaging distance specified by the user and the imaging position interval.
  • the user designates the shooting distance L [m] and the overlapping rate r_side [%] of the imaging ranges, and the imaging position interval d [m] is calculated from the shooting distance L and the overlapping rate r_side.
  • the imaging position interval d can be calculated by the following equation (1), using the shooting distance L, the overlapping rate r_side, and the angle of view FOV (Field of View) of the imaging device.
  • the processing unit 81 of the portable terminal 80 inputs information on the shooting distance L and the overlapping rate r_side via the operation unit 83 according to the operation input by the user, and stores the information in the memory 87.
  • the processing unit 81 acquires information on the angle of view FOV of the imaging device 220 from the unmanned air vehicle 100 by the interface unit 82 or the wireless communication unit 85 and stores the information in the memory 87.
  • the processing unit 81 calculates the imaging position interval d by the above mathematical formula (1). As a result, the imaging position interval can be calculated based on the shooting distance specified by the user and the overlapping ratio of the imaging ranges, and the imaging position for detailed imaging can be set.
  • the user designates the resolution r [m/pixel] of the captured image and the overlapping rate r_side [%] of the imaging ranges, and the imaging position interval d [m] is calculated from the resolution r and the overlapping rate r_side. The shooting distance L [m] is then calculated from the imaging position interval d.
  • the imaging position interval d can be calculated by the following equation (2), using the resolution r, the width w of the captured image, and the overlapping rate r_side.
  • the shooting distance L can be calculated by the following equation (3), using the imaging position interval d, the overlapping rate r_side, and the angle of view FOV of the imaging device.
  • the processing unit 81 of the portable terminal 80 inputs information on the resolution r and the overlapping rate r_side via the operation unit 83 according to the operation input by the user, and stores the information in the memory 87.
  • the processing unit 81 calculates the imaging position interval d by the above mathematical formula (2).
  • the processing unit 81 acquires information on the angle of view FOV of the imaging device 220 from the unmanned air vehicle 100 by the interface unit 82 or the wireless communication unit 85 and stores the information in the memory 87.
  • the processing unit 81 calculates the shooting distance L by the above mathematical formula (3). Thereby, the imaging position interval can be calculated based on the resolution of the captured image specified by the user and the overlapping rate of the imaging ranges, and the imaging position for detailed imaging can be set.
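Equations (1) to (3) are referred to above but are not reproduced in this text. The sketch below implements assumed forms that are consistent with the quantities described (imaging footprint width 2·L·tan(FOV/2), overlap rate r_side, resolution r in m/pixel, image width w in pixels); they are illustrative only and may differ from the patent's actual formulas.

```python
import math

def interval_from_distance(L: float, r_side: float, fov_rad: float) -> float:
    """Assumed form of eq. (1): imaging position interval d from shooting distance and overlap."""
    footprint = 2.0 * L * math.tan(fov_rad / 2.0)   # width of the imaging range at distance L
    return footprint * (1.0 - r_side)

def interval_from_resolution(r: float, w_pixels: int, r_side: float) -> float:
    """Assumed form of eq. (2): imaging position interval d from image resolution and overlap."""
    footprint = r * w_pixels                         # width of the imaging range on the object
    return footprint * (1.0 - r_side)

def distance_from_interval(d: float, r_side: float, fov_rad: float) -> float:
    """Assumed form of eq. (3): shooting distance L recovered from the interval and overlap."""
    return d / (2.0 * (1.0 - r_side) * math.tan(fov_rad / 2.0))

# Example: FOV 84 deg, overlap 70 %, resolution 0.02 m/pixel, image width 4000 px
d = interval_from_resolution(0.02, 4000, 0.70)          # 24.0 m
L = distance_from_interval(d, 0.70, math.radians(84))   # roughly 44 m
```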
  • the flight path processing unit 811 arranges a plurality of imaging positions 306 at equal intervals of the imaging position interval d on the imaging plane 305 based on the set imaging position interval d, and determines the imaging path 307 passing through these imaging positions 306.
  • the imaging positions at the ends of the imaging plane 305, such as the imaging positions at the start point and the end point on the imaging plane 305, may be set within a range of d/2 or less from the side edges of the imaging plane 305.
  • FIG. 15 is a flowchart illustrating a processing procedure of a first operation example of flight path generation using the schematic shape of an object in the embodiment.
  • the flight path processing unit 811 of the processing unit 81 uses the acquired schematic shape to calculate a polyhedron (cube 301) that surrounds the outer shape of the object (S31).
  • the flight path processing unit 811 sequentially extracts at least one (four in the case of a cube) side surfaces 303 in the polyhedron of the cube 301 (S32).
  • the flight path processing unit 811 calculates a normal 304 outward of the polyhedron for one extracted side surface 303 (S33).
  • the flight path processing unit 811 calculates an imaging plane 305 parallel to the side surface 303 at a position separated from it by the predetermined shooting distance L, using the acquired normal 304 (S34).
  • the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 306 at the predetermined imaging position interval d on the one calculated imaging plane 305, and generates an imaging path 307 for shooting in the direction facing the object from each of these imaging positions (S35).
  • the flight path processing unit 811 determines whether the generation of the shooting paths 307 for all the extracted side surfaces 303 has been completed for the object (S36). If the shooting path generation for all the side surfaces 303 has not been completed, the flight path processing unit 811 returns to the process of step S32, extracts the next side surface 303, and repeats the same processing until the shooting path 307 is generated (S32 to S35).
  • in step S36, when the shooting path generation for all the side surfaces 303 is completed, the flight path processing unit 811 combines the shooting paths 307 of the respective shooting planes 305 to generate a flight path (S37).
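The sketch below ties steps S31 to S37 together at a high level under simplifying assumptions (the approximate shape is wrapped in an axis-aligned box, only its four vertical side faces are used, and the serpentine waypoint layout from the earlier sketch is inlined). It illustrates the control flow only, not the patent's implementation.

```python
import numpy as np

def generate_flight_path(rough_points: np.ndarray, L: float, d: float):
    """Sketch of the flow S31-S37 under assumptions: axis-aligned bounding box,
    one imaging plane per vertical side face at shooting distance L, waypoints
    spaced by d, all per-face paths concatenated into one flight path."""
    lo, hi = rough_points.min(axis=0), rough_points.max(axis=0)    # S31: bounding polyhedron
    x0, y0, z0 = lo
    x1, y1, z1 = hi
    # S32: four vertical side faces as (corner, horizontal edge vector, outward normal)
    faces = [
        (np.array([x0, y0, z0]), np.array([x1 - x0, 0.0, 0.0]), np.array([0.0, -1.0, 0.0])),
        (np.array([x1, y0, z0]), np.array([0.0, y1 - y0, 0.0]), np.array([1.0, 0.0, 0.0])),
        (np.array([x1, y1, z0]), np.array([x0 - x1, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])),
        (np.array([x0, y1, z0]), np.array([0.0, y0 - y1, 0.0]), np.array([-1.0, 0.0, 0.0])),
    ]
    height = z1 - z0
    path = []
    for corner, edge, n in faces:                                  # S33-S35 for each side face
        origin = corner + L * n                                    # imaging plane at distance L
        n_u = max(2, int(np.ceil(np.linalg.norm(edge) / d)) + 1)
        n_v = max(2, int(np.ceil(height / d)) + 1)
        for j in range(n_v):
            row = [origin + i * edge / (n_u - 1) + np.array([0.0, 0.0, j * height / (n_v - 1)])
                   for i in range(n_u)]
            if j % 2 == 1:
                row.reverse()                                      # serpentine order within the face
            path.extend((p, -n) for p in row)                      # camera faces the side surface
    return path                                                    # S36-S37: combined flight path
```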
  • the processing unit 110 of the unmanned aerial vehicle 100 communicates with the portable terminal 80 through the communication interface 150, acquires the flight path information generated by the flight path processing unit 811, and sets the flight path of the unmanned air vehicle 100.
  • the processing unit 110 flies around the object according to the set flight path, and images the object by the imaging devices 220 and 230 at each of a plurality of imaging positions (waypoints).
  • the processing unit 110 captures images at each imaging position 306 in order for each imaging plane 305 by a flight path obtained by combining the imaging paths 307 of the imaging planes 305.
  • when the processing unit 110 completes imaging at each imaging position 306 on the imaging plane 305 corresponding to one side surface 303 of the polyhedron (cube 301) approximating the object included in the subject, it moves, for example, to the imaging plane corresponding to the side surface adjacent to the current one, and performs imaging at each imaging position on that plane. In this way, the unmanned aerial vehicle 100 acquires captured images of the sides, imaged toward the side surfaces of the object, at the imaging positions of all the imaging planes set in the flight path.
  • the processing unit 81 of the portable terminal 80 communicates with the unmanned aerial vehicle 100 through the interface unit 82 or the wireless communication unit 85, and acquires a captured image captured by the unmanned aerial vehicle 100.
  • the shape data processing unit 812 of the processing unit 81 generates the three-dimensional shape data of the objects (buildings, ground, etc.) using the acquired captured images of the sides of the objects, and can estimate a three-dimensional shape that includes the details of the shape of the side surfaces of the objects.
  • in addition to the side captured images, the captured images may include downward captured images obtained by performing detailed imaging of the object in the vertical direction at the imaging position interval for detailed imaging.
  • in this case, the flight path processing unit 811 sets an imaging path including a plurality of imaging positions facing the upper surface of the object as well as its side surfaces, and generates a flight path.
  • in the above operation example, by calculating the polyhedron surrounding the approximate shape of the object and extracting, as side surfaces, the surfaces along the vertical direction of the polyhedron or the surfaces standing within a predetermined angle range from the vertical direction, it is possible to extract side surfaces of the approximate shape that can be imaged toward the side of the object. For this reason, imaging positions at which detailed imaging can be performed when the object is viewed from the side can be set using the approximate shape of the object.
  • FIG. 16 is a diagram for describing a second operation example of flight path generation using the schematic shape of an object in the embodiment.
  • the second operation example is an example in which a mesh indicating the schematic shape of an object is simplified and a shooting plane directed to the side of the object is generated.
  • the flight path processing unit 811 of the processing unit 81 uses the acquired approximate shape to simplify the mesh indicating the approximate shape of the object.
  • as the mesh simplification method, a known method may be used. Known methods include, for example, the vertex-clustering method and the incremental decimation method.
  • by simplifying the polygon data, a complicated shape is simplified and smoothed so that the number of polygons representing one surface is reduced.
  • the flight path processing unit 811 performs simplification processing on the schematic shape 311 and calculates a simplified polyhedron 312.
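As a rough illustration of the vertex-clustering idea named above, the sketch below snaps mesh vertices to a coarse grid, merges vertices that fall into the same cell, and drops degenerate triangles. Real simplifiers handle representative-vertex selection, normals, and attribute preservation more carefully; everything here is an assumed, minimal version.

```python
import numpy as np

# Sketch of vertex clustering (details assumed): snap each vertex to a grid
# cell of size 'cell', merge vertices that share a cell, and keep only the
# triangles that remain non-degenerate after merging.
def vertex_clustering(vertices: np.ndarray, faces: list, cell: float):
    keys = [tuple((v // cell).astype(int)) for v in vertices]
    cluster_of = {}
    new_vertices = []
    remap = []
    for v, k in zip(vertices, keys):
        if k not in cluster_of:
            cluster_of[k] = len(new_vertices)
            new_vertices.append(v)            # first vertex acts as the cell representative
        remap.append(cluster_of[k])
    new_faces = []
    for a, b, c in faces:
        ra, rb, rc = remap[a], remap[b], remap[c]
        if len({ra, rb, rc}) == 3:            # drop triangles collapsed by the merge
            new_faces.append((ra, rb, rc))
    return np.array(new_vertices), new_faces
```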
  • the flight path processing unit 811 extracts at least one side surface 313 in the simplified polyhedron 312, and calculates a normal 314 outward of the polyhedron with respect to the extracted side surface 313.
  • the direction of the normal is determined for each plane of the polyhedron 312. For example, when the absolute value of the vertical component Nz of the normalized normal is smaller than 0.1 (|Nz| < 0.1), the plane may be treated as a side surface.
  • the side surface 313 may be a surface along the vertical direction in the polyhedron or a surface standing within a predetermined angle range in the vertical direction.
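A minimal sketch of this side-surface test follows: faces whose unit normal has a small vertical component (here |Nz| < 0.1, the threshold mentioned above) are regarded as side surfaces. The mesh data layout (one normal per face) is an assumption for illustration.

```python
import numpy as np

# Sketch (data layout assumed): given one normal per face, keep the faces whose
# unit normal is nearly horizontal, i.e. |Nz| < threshold, and treat them as
# side surfaces of the simplified polyhedron.
def extract_side_faces(face_normals: np.ndarray, nz_threshold: float = 0.1) -> np.ndarray:
    unit = face_normals / np.linalg.norm(face_normals, axis=1, keepdims=True)
    side_idx = np.where(np.abs(unit[:, 2]) < nz_threshold)[0]
    return side_idx  # indices of the faces regarded as side surfaces
```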
  • the flight path processing unit 811 calculates an imaging plane 315 having a predetermined imaging distance L and parallel to the side surface 313 using the acquired normal line 314.
  • the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 316 at the predetermined imaging position interval d on the calculated imaging plane 315, determines an imaging path 317 passing through each imaging position 316, and generates a flight path including this imaging path 317.
  • in this case, the shooting plane is a plane within a predetermined angle range with respect to the vertical direction, and the shooting direction is a substantially horizontal direction toward the side, that is, a direction facing the side surface of the object.
  • FIG. 17 is a flowchart illustrating a processing procedure of the second operation example of the flight path generation using the schematic shape of the object in the embodiment.
  • the flight path processing unit 811 of the processing unit 81 simplifies the mesh of the approximate shape of the object using the acquired approximate shape, and calculates the polyhedron 312 in which the approximate shape 311 is simplified (S41).
  • the flight path processing unit 811 extracts at least one side surface 313 in the polyhedron 312 (S42).
  • the flight path processing unit 811 calculates a normal 314 outward of the polyhedron with respect to the extracted one side 313 (S43).
  • the flight path processing unit 811 calculates the imaging plane 315 parallel to the side surface 313 at a position separated from it by the predetermined shooting distance L, using the calculated normal 314 (S44).
  • the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 316 at the predetermined imaging position interval d on the one calculated imaging plane 315, and generates a shooting path 317 for shooting in the direction facing the object from each of these imaging positions (S45).
  • the flight path processing unit 811 determines whether the generation of the shooting paths 317 for all the extracted side surfaces 313 has been completed for the object (S46). If the shooting path generation for all the side surfaces 313 has not been completed, the flight path processing unit 811 returns to the process of step S42, extracts the next side surface 313 adjacent to the current one, and repeats the same processing until the shooting path 317 is generated (S42 to S45).
  • in step S46, when the shooting path generation for all the side surfaces 313 is completed, the flight path processing unit 811 combines the shooting paths 317 of the respective shooting planes 315 to generate a flight path (S47).
  • FIG. 18 is a diagram for describing a third operation example of flight path generation using the schematic shape of an object in the embodiment.
  • the third operation example is an example in which a photographing plane directed to the side of the object is generated by combining polyhedrons such as a plurality of cubes surrounding the object.
  • the flight path processing unit 811 of the processing unit 81 calculates a plurality of polyhedrons surrounding the schematic shape of each object, for a plurality of objects such as buildings, using the acquired schematic shape.
  • cubic or rectangular parallelepiped polyhedrons 321A, 321B, and 321C are shown as an example of a plurality of polyhedrons.
  • the flight path processing unit 811 combines the plurality of polyhedrons 321A, 321B, and 321C, and calculates the combined polyhedron 322. By combining adjacent polyhedrons, a collision between the unmanned air vehicle 100 and an object during detailed side imaging is avoided.
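One simple way to realize this combination step (an assumption for illustration, not the patent's procedure) is to represent each polyhedron as an axis-aligned box and merge boxes that touch or overlap, so that the flight path is generated around the combined outline instead of passing through the gaps between adjacent objects.

```python
# Sketch (assumed representation): polyhedrons as axis-aligned boxes given by
# (min corner, max corner). Boxes that overlap or touch, within an optional
# margin, are merged into their common bounding box.
def boxes_touch(a, b, margin=0.0):
    (amin, amax), (bmin, bmax) = a, b
    return all(amin[i] <= bmax[i] + margin and bmin[i] <= amax[i] + margin for i in range(3))

def merge_boxes(boxes, margin=0.0):
    merged = [[list(lo), list(hi)] for lo, hi in boxes]
    changed = True
    while changed:                                   # repeat until no pair can be merged
        changed = False
        for i in range(len(merged)):
            for j in range(i + 1, len(merged)):
                if boxes_touch(merged[i], merged[j], margin):
                    lo = [min(merged[i][0][k], merged[j][0][k]) for k in range(3)]
                    hi = [max(merged[i][1][k], merged[j][1][k]) for k in range(3)]
                    merged[i] = [lo, hi]             # replace the pair by its bounding box
                    del merged[j]
                    changed = True
                    break
            if changed:
                break
    return merged
```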
  • the flight path processing unit 811 extracts at least one side surface 323 from the combined polyhedron 322.
  • the side surface 323 may be a surface along the vertical direction in the polyhedron or a surface standing within a predetermined angle range in the vertical direction.
  • the flight path processing unit 811 calculates a normal line 324 outward of the polyhedron with respect to the extracted side surface 323.
  • the flight path processing unit 811 calculates a shooting plane 325 having a predetermined shooting distance L and parallel to the side surface 323 using the calculated normal 324.
  • the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 326 at the predetermined imaging position interval d on the calculated imaging plane 325, determines an imaging path 327 passing through each imaging position 326, and generates a flight path including the imaging path 327.
  • the shooting direction at each imaging position 326 is the direction opposite to the normal 324, that is, the direction facing the side surface of the object. In this case, the shooting plane is a vertical plane, and the shooting direction is a horizontal direction perpendicular to the shooting plane.
  • FIG. 19 is a flowchart illustrating a processing procedure of a third operation example of the flight path generation using the schematic shape of the object in the embodiment.
  • the flight path processing unit 811 of the processing unit 81 calculates a plurality of polyhedrons 321A, 321B, and 321C surrounding the outer shape of each object for a plurality of objects using the acquired schematic shape (S51).
  • the flight path processing unit 811 combines the polyhedrons 321A, 321B, and 321C, and calculates the combined polyhedron 322 (S52).
  • the flight path processing unit 811 sequentially extracts at least one side surface 323 from the combined polyhedron 322 (S32).
  • in step S36, when the shooting path generation for all the side surfaces 323 has not been completed, the flight path processing unit 811 returns to the process of step S32, extracts the next side surface 323 adjacent to the current one, and repeats the same processing until the shooting path 327 is generated (S32 to S35).
  • the flight path processing unit 811 combines the shooting paths 327 of the shooting planes 325 in the combined polyhedron 322 to generate a flight path (S57).
  • the above-described second operation example and third operation example may be combined: the approximate shape of the object is simplified, a plurality of polyhedrons are combined, side surfaces are extracted, shooting planes and imaging positions are set, shooting paths are determined, and a flight path may be generated from these.
  • in this way, polyhedrons corresponding to a plurality of approximate shapes of objects are calculated, adjacent polyhedrons are combined, and the surfaces along the vertical direction of the combined polyhedron, or the surfaces standing within a predetermined angle range from the vertical direction, are extracted as side surfaces. The mobile terminal 80 functioning as an information processing apparatus can thus set imaging positions at which detailed imaging can be performed when the object is viewed from the side, using the approximate shape of the object, and can set a flight path passing through the imaging positions.
  • according to the present embodiment, imaging positions for detailed imaging when the object is viewed from the side can be set using the approximate shape of the object.
  • by extracting the side surfaces of the approximate shape, it is possible to set side imaging positions for detailed imaging corresponding to the side surfaces of the object.
  • by generating a flight path that passes through the set imaging positions, a flight path for detailed imaging that includes the sides of the object can be set.
  • by setting an imaging position facing each extracted side surface, it is possible to perform detailed imaging in the horizontal direction when the object is viewed from the side.
  • in the present embodiment, by setting a plurality of imaging positions at a predetermined imaging position interval corresponding to the extracted side surfaces, captured images having an appropriate resolution and overlapping rate can be obtained.
  • by determining a shooting path that passes through the plurality of set imaging positions and generating a flight path including the shooting path, it is possible to set a flight path for detailed imaging that includes the sides of the object.
  • the imaging position facing the side surface of the object can be easily determined.
  • a shooting plane parallel to the side surface at a predetermined shooting distance can be easily generated.
  • an imaging position capable of acquiring captured images having an appropriate overlapping rate can be set.
  • by flying in order along the imaging planes corresponding to the plurality of side surfaces of the object, efficient imaging can be performed.
  • FIG. 20 is a schematic diagram illustrating a second configuration example of the flight path generation system 10A in the embodiment.
  • the flight path generation system 10A includes an unmanned air vehicle 100 and a PC (Personal Computer) 70.
  • the unmanned air vehicle 100 and the PC 70 can communicate with each other using wired communication or wireless communication (for example, wireless LAN or Bluetooth (registered trademark)).
  • the PC 70 may be a computer such as a desktop PC, a notebook PC, or a tablet terminal.
  • the PC 70 may be a computer having a server and a client terminal connected via a network.
  • the PC 70 is an example of an information processing apparatus.
  • the PC 70 may include a processor (for example, a CPU, MPU, or DSP) as an example of a processing unit, a memory as an example of a storage unit, a communication interface, a display, an input device, and a storage.
  • the PC 70 as an example of the information processing apparatus has the same functions as the processing unit 81, the flight path processing unit 811, and the shape data processing unit 812 included in the mobile terminal 80 illustrated in FIG. 7.
  • the PC 70 functioning as the information processing apparatus can set imaging positions at which detailed imaging can be performed when the object is viewed from the side, using the approximate shape of the object, and can set a flight path passing through the imaging positions.
  • FIG. 21 is a block diagram illustrating an example of a hardware configuration of an unmanned air vehicle 100A according to a third configuration example of the flight path generation system 10B in the embodiment.
  • the unmanned air vehicle 100A of the flight path generation system 10B includes a processing unit 110A instead of the processing unit 110, as compared with the unmanned air vehicle 100 illustrated in FIG.
  • the unmanned air vehicle 100A has a function as an example of an information processing device, and the processing unit 110A of the unmanned air vehicle 100A is an example of a processing unit of the information processing device.
  • the same components as those of the unmanned air vehicle 100 of FIG. 4 are denoted by the same reference numerals, and description thereof is omitted or simplified.
  • the processing unit 110A as an example of the processing unit of the information processing apparatus includes a flight path processing unit 111 and a shape data processing unit 112.
  • the flight path processing unit 111 has the same function as the flight path processing unit 811 provided in the portable terminal 80 shown in FIG.
  • the shape data processing unit 112 has the same function as the shape data processing unit 812 included in the mobile terminal 80 shown in FIG.
  • the processing unit 110A of the unmanned air vehicle 100A functioning as an information processing apparatus can set imaging positions at which detailed imaging can be performed when the object is viewed from the side, using the approximate shape of the object, and can set a flight path passing through the imaging positions.
  • FIG. 22 is a block diagram illustrating an example of a hardware configuration of a transmitter 50A according to a fourth configuration example of the flight path generation system 10C in the embodiment.
  • the transmitter 50A includes a processing unit 61A instead of the processing unit 61, as compared with the transmitter 50 illustrated in FIG.
  • the transmitter 50A has a function as an example of an information processing device, and the processing unit 61A of the transmitter 50A is an example of a processing unit of the information processing device.
  • the same components as those of the transmitter 50 of FIG. 6 are denoted by the same reference numerals, and description thereof is omitted or simplified.
  • the processing unit 61A as an example of the processing unit of the information processing apparatus includes a flight path processing unit 611 and a shape data processing unit 612.
  • the flight path processing unit 611 has the same function as the flight path processing unit 811 provided in the portable terminal 80 shown in FIG.
  • the shape data processing unit 612 has the same function as the shape data processing unit 812 included in the mobile terminal 80 shown in FIG.
  • the processing unit 61A of the transmitter 50A functioning as an information processing apparatus can set imaging positions at which detailed imaging is possible when the object is viewed from the side, using the approximate shape of the object, and can set a flight path passing through the imaging positions.
  • the acquired flight path may be set in the flying object, and the images obtained by performing imaging, including lateral detailed imaging of the object, while the flying object flies in the imaging target area according to the flight path may be used for generating three-dimensional shape data of an object existing in the imaging target area.
  • a captured image acquired by detailed lateral imaging may be used for inspection of the side surface of the object.
  • the case where the information processing apparatus that executes the steps in the flight path generation method is provided in any one of the mobile terminal 80, the unmanned air vehicle 100A, and the transmitter 50A has been described; however, another device may be provided to perform the steps in the flight path generation method.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Studio Devices (AREA)

Abstract

According to the invention, a flight path generation method for generating a flight path of a flying object for imaging an imaging subject includes a step of acquiring the general shape of an object included in the imaging subject, a step of extracting a side surface of the general shape, a step of setting an imaging position corresponding to the side surface, and a step of generating a flight path passing through the imaging position.
PCT/JP2017/015876 2017-04-20 2017-04-20 Procédé de production de trajectoire de vol, dispositif de traitement d'informations, système de production de trajectoire de vol, programme et support d'enregistrement WO2018193574A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2019513156A JP6765512B2 (ja) 2017-04-20 2017-04-20 飛行経路生成方法、情報処理装置、飛行経路生成システム、プログラム及び記録媒体
PCT/JP2017/015876 WO2018193574A1 (fr) 2017-04-20 2017-04-20 Procédé de production de trajectoire de vol, dispositif de traitement d'informations, système de production de trajectoire de vol, programme et support d'enregistrement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/015876 WO2018193574A1 (fr) 2017-04-20 2017-04-20 Procédé de production de trajectoire de vol, dispositif de traitement d'informations, système de production de trajectoire de vol, programme et support d'enregistrement

Publications (1)

Publication Number Publication Date
WO2018193574A1 true WO2018193574A1 (fr) 2018-10-25

Family

ID=63856531

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/015876 WO2018193574A1 (fr) 2017-04-20 2017-04-20 Procédé de production de trajectoire de vol, dispositif de traitement d'informations, système de production de trajectoire de vol, programme et support d'enregistrement

Country Status (2)

Country Link
JP (1) JP6765512B2 (fr)
WO (1) WO2018193574A1 (fr)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111179413A (zh) * 2019-12-19 2020-05-19 中建科技有限公司深圳分公司 三维重建方法、装置、终端设备及可读存储介质
US10962650B2 (en) 2017-10-31 2021-03-30 United States Of America As Represented By The Administrator Of Nasa Polyhedral geofences
US10983535B2 (en) * 2016-08-05 2021-04-20 SZ DJI Technology Co., Ltd. System and method for positioning a movable object
JP2021066423A (ja) * 2020-07-30 2021-04-30 株式会社センシンロボティクス 飛行体、点検方法及び点検システム
JP2021075262A (ja) * 2020-06-02 2021-05-20 株式会社センシンロボティクス 飛行体、点検方法及び点検システム
JP2021075263A (ja) * 2020-06-02 2021-05-20 株式会社センシンロボティクス 飛行体、点検方法及び点検システム
WO2021177139A1 (fr) * 2020-03-06 2021-09-10 ソニーグループ株式会社 Procédé de traitement d'informations, dispositif de traitement d'informations et programme
JP2021182177A (ja) * 2020-05-18 2021-11-25 防衛装備庁長官 車両操縦システムと車両操縦方法
JP2022067744A (ja) * 2020-10-21 2022-05-09 トヨタ自動車株式会社 ロボットシステム、ロボットシステムの制御方法、及びプログラム
JP7094432B1 (ja) 2021-12-03 2022-07-01 ソフトバンク株式会社 情報処理システム、情報処理装置、プログラム、及び情報処理方法

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014185947A (ja) * 2013-03-25 2014-10-02 Geo Technical Laboratory Co Ltd 3次元復元のための画像撮影方法
US20160253808A1 (en) * 2015-02-26 2016-09-01 Hexagon Technology Center Gmbh Determination of object data by template-based uav control

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014185947A (ja) * 2013-03-25 2014-10-02 Geo Technical Laboratory Co Ltd 3次元復元のための画像撮影方法
US20160253808A1 (en) * 2015-02-26 2016-09-01 Hexagon Technology Center Gmbh Determination of object data by template-based uav control

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10983535B2 (en) * 2016-08-05 2021-04-20 SZ DJI Technology Co., Ltd. System and method for positioning a movable object
US10962650B2 (en) 2017-10-31 2021-03-30 United States Of America As Represented By The Administrator Of Nasa Polyhedral geofences
CN111179413B (zh) * 2019-12-19 2023-10-31 中建科技有限公司深圳分公司 三维重建方法、装置、终端设备及可读存储介质
CN111179413A (zh) * 2019-12-19 2020-05-19 中建科技有限公司深圳分公司 三维重建方法、装置、终端设备及可读存储介质
WO2021177139A1 (fr) * 2020-03-06 2021-09-10 ソニーグループ株式会社 Procédé de traitement d'informations, dispositif de traitement d'informations et programme
JP2021182177A (ja) * 2020-05-18 2021-11-25 防衛装備庁長官 車両操縦システムと車両操縦方法
JP2021075262A (ja) * 2020-06-02 2021-05-20 株式会社センシンロボティクス 飛行体、点検方法及び点検システム
JP2021075263A (ja) * 2020-06-02 2021-05-20 株式会社センシンロボティクス 飛行体、点検方法及び点検システム
JP2021066423A (ja) * 2020-07-30 2021-04-30 株式会社センシンロボティクス 飛行体、点検方法及び点検システム
JP2022067744A (ja) * 2020-10-21 2022-05-09 トヨタ自動車株式会社 ロボットシステム、ロボットシステムの制御方法、及びプログラム
JP7420047B2 (ja) 2020-10-21 2024-01-23 トヨタ自動車株式会社 ロボットシステム
JP7094432B1 (ja) 2021-12-03 2022-07-01 ソフトバンク株式会社 情報処理システム、情報処理装置、プログラム、及び情報処理方法
JP2023083134A (ja) * 2021-12-03 2023-06-15 ソフトバンク株式会社 情報処理システム、情報処理装置、プログラム、及び情報処理方法

Also Published As

Publication number Publication date
JPWO2018193574A1 (ja) 2020-02-27
JP6765512B2 (ja) 2020-10-07

Similar Documents

Publication Publication Date Title
JP6765512B2 (ja) 飛行経路生成方法、情報処理装置、飛行経路生成システム、プログラム及び記録媒体
JP6803919B2 (ja) 飛行経路生成方法、飛行経路生成システム、飛行体、プログラム、及び記録媒体
JP6878567B2 (ja) 3次元形状推定方法、飛行体、モバイルプラットフォーム、プログラム及び記録媒体
US20190318636A1 (en) Flight route display method, mobile platform, flight system, recording medium and program
US11513514B2 (en) Location processing device, flight vehicle, location processing system, flight system, location processing method, flight control method, program and recording medium
US11122209B2 (en) Three-dimensional shape estimation method, three-dimensional shape estimation system, flying object, program and recording medium
JP6788094B2 (ja) 画像表示方法、画像表示システム、飛行体、プログラム、及び記録媒体
US20210185235A1 (en) Information processing device, imaging control method, program and recording medium
CN111344650B (zh) 信息处理装置、飞行路径生成方法、程序以及记录介质
WO2019061859A1 (fr) Plate-forme mobile, procédé de génération de trajet de capture d'image, programme et support d'enregistrement
JP6329219B2 (ja) 操作端末、及び移動体
WO2021115192A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image, programme, et support d'enregistrement
WO2020119572A1 (fr) Dispositif de déduction de forme, procédé de déduction de forme, programme et support d'enregistrement
CN112313942A (zh) 一种进行图像处理和框架体控制的控制装置
WO2018188086A1 (fr) Véhicule aérien sans pilote et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17906712

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019513156

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17906712

Country of ref document: EP

Kind code of ref document: A1