WO2018073879A1 - Flight path generation method, flight path generation system, flying object, program, and recording medium - Google Patents
Flight path generation method, flight path generation system, flying object, program, and recording medium
- Publication number: WO2018073879A1 (international application PCT/JP2016/080752)
- Authority: WIPO (PCT)
- Prior art keywords
- imaging
- flight path
- subject
- flying object
- imaging position
Classifications
- B64U2101/30—UAVs specially adapted for imaging, photography or videography
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0094—Control of position, course, altitude or attitude involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
- G05D1/106—Simultaneous control of position or course in three dimensions specially adapted for aircraft; change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
- H04N23/00—Cameras or camera modules comprising electronic image sensors; control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
- H04N7/185—Closed-circuit television [CCTV] systems for receiving images from a single remote source, from a mobile camera, e.g. for remote control
- B64U20/87—Mounting of imaging devices, e.g. mounting of gimbals
- B64U2201/10—UAVs characterised by autonomous flight controls, e.g. by using inertial navigation systems [INS]
- B64U2201/104—Autonomous flight controls using satellite radio beacon positioning systems, e.g. GPS
- B64U2201/20—UAVs characterised by their flight controls: remote controls
- B64U30/21—Rotary wings
- B64U50/30—Supply or distribution of electrical power
Definitions
- the present invention relates to a flight path generation method, a flight path generation system, a flying object, a program, and a recording medium.
- A known platform is an unmanned aircraft that performs imaging while passing along a preset fixed route (Patent Document 1).
- This platform receives an imaging instruction from a ground base and images an imaging target.
- While flying along the fixed route, the platform tilts its imaging device and captures images based on the positional relationship between the platform and the imaging target.
- The platform of Patent Document 1 captures images while passing along a fixed route, but does not sufficiently take into account a specific subject (for example, a building) located in the vertical direction from the fixed route. It is therefore difficult to sufficiently acquire captured images of the side surface of such a specific subject.
- When the side surface of a specific subject is imaged by an unmanned aerial vehicle, it is conceivable that each imaging position (latitude, longitude, and altitude in three-dimensional space) is designated by user input. In this case, since every imaging position must be determined by user input, user convenience is reduced.
- The flight path generation method is a flight path generation method for a flying object that circulates around the side of a subject and images the subject, and includes a step of determining imaging positions of the subject based on the flight range of the flying object and the imaging position interval at which the subject is imaged by the flying object, and a step of generating a flight path of the flying object that passes through the imaging positions.
- the imaging position interval may include a first imaging position interval that is an interval between imaging positions of a subject at the same altitude.
- The flight path generation method may further include a step of determining the first imaging position interval based on at least the radius of the subject, the radius of the flight range, the angle of view of the imaging unit included in the flying object, and a first overlap rate that is the overlapping rate of the imaging ranges captured by the flying object at adjacent imaging positions.
- The intervals between the first imaging positions in the flight path may be equal.
- the imaging position interval may include a second imaging position interval that is an imaging altitude interval at which the subject is imaged by the flying object.
- The flight path generation method may further include a step of determining the second imaging position interval based on at least the radius of the subject, the radius of the flight range, the angle of view of the imaging unit included in the flying object, and a second overlap rate that is the overlapping rate of the imaging ranges captured by the flying object at adjacent imaging altitudes.
- The intervals between the second imaging positions in the flight path may be equal.
- the flight path may be a flight path that changes from the first altitude to the second altitude after the flying object passes through each imaging position at the first altitude.
- The flight path generation method may further include a step of acquiring a plurality of captured images by imaging the side surface of the subject with the flying object at each imaging position in the flight path.
- The flight path generation method may further include a step of acquiring a plurality of captured images by imaging the side surface of the subject with the flying object at each imaging position in the flight path such that the imaging ranges partially overlap.
- the flight path generation method may further include a step of generating three-dimensional shape data of the subject based on a plurality of captured images.
- The flight path generation system is a flight path generation system that generates a flight path of a flying object that circulates around the side of a subject and images the subject, and includes a processing unit that determines imaging positions of the subject based on the flight range of the flying object and the imaging position interval at which the subject is imaged by the flying object, and generates a flight path of the flying object that passes through the imaging positions.
- the imaging position interval may include a first imaging position interval that is an interval between imaging positions of a subject at the same altitude.
- The processing unit may determine the first imaging position interval based on at least the radius of the subject, the radius of the flight range, the angle of view of the imaging unit included in the flying object, and a first overlap rate that is the overlapping rate of the imaging ranges captured by the flying object at adjacent imaging positions.
- The intervals between the first imaging positions in the flight path may be equal.
- the imaging position interval may include a second imaging position interval that is an imaging altitude interval at which the subject is imaged by the flying object.
- The processing unit may determine the second imaging position interval based on at least the radius of the subject, the radius of the flight range, the angle of view of the imaging unit included in the flying object, and a second overlap rate that is the overlapping rate of the imaging ranges captured by the flying object at adjacent imaging altitudes.
- The intervals between the second imaging positions in the flight path may be equal.
- the flight path may be a flight path that changes from the first altitude to the second altitude after the flying object passes through each imaging position at the first altitude.
- The flight path generation system may further include an imaging unit that images the side surface of the subject with the flying object at each imaging position in the flight path and acquires a plurality of captured images.
- The flight path generation system may further include an imaging unit that acquires a plurality of captured images by imaging the side surface of the subject with the flying object at each imaging position in the flight path such that the imaging ranges partially overlap.
- the processing unit may generate three-dimensional shape data of the subject based on a plurality of captured images.
- The flying object is a flying object that circulates around the side of a subject and images the subject, and includes a processing unit that determines imaging positions of the subject based on the flight range of the flying object and the imaging position interval at which the subject is imaged by the flying object, and generates a flight path of the flying object that passes through the imaging positions.
- the imaging position interval may include a first imaging position interval that is an interval between imaging positions of a subject at the same altitude.
- The processing unit may determine the first imaging position interval based on at least the radius of the subject, the radius of the flight range, the angle of view of the imaging unit included in the flying object, and a first overlap rate that is the overlapping rate of the imaging ranges captured by the flying object at adjacent imaging positions.
- The intervals between the first imaging positions in the flight path may be equal.
- the imaging position interval may include a second imaging position interval that is an imaging altitude interval at which the subject is imaged by the flying object.
- The processing unit may determine the second imaging position interval based on at least the radius of the subject, the radius of the flight range, the angle of view of the imaging unit included in the flying object, and a second overlap rate that is the overlapping rate of the imaging ranges captured by the flying object at adjacent imaging altitudes.
- The intervals between the second imaging positions in the flight path may be equal.
- the flight path may be a flight path that changes from the first altitude to the second altitude after the flying object passes through each imaging position at the first altitude.
- The flying object may further include an imaging unit that images the side surface of the subject at each imaging position in the flight path and acquires a plurality of captured images.
- The flying object may further include an imaging unit that acquires a plurality of captured images by imaging the side surface of the subject at each imaging position in the flight path such that the imaging ranges partially overlap.
- the processing unit may generate three-dimensional shape data of the subject based on a plurality of captured images.
- The processing unit may acquire a parameter including at least one of information on the radius of the subject, information on the radius of the flight range, information on the first overlap rate that is the overlapping rate of the imaging ranges captured by the flying object at adjacent imaging positions, and information on the second overlap rate that is the overlapping rate of the imaging ranges captured by the flying object at adjacent imaging altitudes.
- The program is a program for causing a computer that generates a flight path of a flying object that circulates around the side of a subject and images the subject to execute a procedure for determining imaging positions of the subject based on the flight range of the flying object and the imaging position interval at which the subject is imaged by the flying object, and a procedure for generating a flight path of the flying object that passes through the imaging positions.
- The recording medium is a computer-readable recording medium on which is recorded a program for causing a computer that generates a flight path of a flying object that circulates around the side of a subject and images the subject to execute a procedure for determining imaging positions of the subject based on the flight range of the flying object and the imaging position interval at which the subject is imaged by the flying object, and a procedure for generating a flight path of the flying object that passes through the imaging positions.
- A diagram showing an example of the appearance of an unmanned aerial vehicle
- A diagram showing an example of a specific appearance of an unmanned aerial vehicle
- A block diagram showing an example of the hardware configuration of an unmanned aerial vehicle
- A schematic diagram showing an example of the horizontal angle of view
- A plan view showing each imaging position and the flight order of the imaging positions in an arbitrary flight course
- A front view showing a first example of each imaging position and the flight order of the imaging positions in each flight course
- An unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle) is exemplified as the flying object.
- the flight path generation method defines the operation in the flight path generation system.
- the recording medium is a recording medium recorded with a program (for example, a program that causes at least one of an unmanned aircraft and a transmitter to execute various processes).
- FIG. 1 is a schematic diagram illustrating a configuration example of a flight path generation system 10 according to the first embodiment.
- the flight path generation system 10 includes an unmanned aircraft 100 and a transmitter 50.
- the unmanned aircraft 100 and the transmitter 50 can communicate by wired communication or wireless communication (for example, wireless LAN (Local Area Network), Bluetooth (registered trademark)).
- FIG. 2 is a diagram illustrating an example of the appearance of the unmanned aerial vehicle 100.
- FIG. 3 is a diagram illustrating an example of a specific appearance of the unmanned aerial vehicle 100. A side view of the unmanned aircraft 100 flying in the moving direction STV0 is shown in FIG. 2, and a perspective view of the unmanned aircraft 100 flying in the moving direction STV0 is shown in FIG. 3.
- a roll axis (see x-axis) is defined in a direction parallel to the ground and along the moving direction STV0.
- A pitch axis (see y-axis) is defined in a direction parallel to the ground and perpendicular to the roll axis, and a yaw axis (see z-axis) is defined in a direction perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.
- the unmanned aerial vehicle 100 includes a UAV main body 102, a gimbal 200, an imaging device 220, and a plurality of imaging devices 230.
- Unmanned aerial vehicle 100 is an example of a flying object.
- the imaging devices 220 and 230 are an example of an imaging unit.
- the UAV main body 102 includes a plurality of rotor blades.
- the UAV main body 102 causes the unmanned aircraft 100 to fly by controlling the rotation of a plurality of rotor blades.
- the UAV main body 102 causes the unmanned aircraft 100 to fly using, for example, four rotary wings.
- the number of rotor blades is not limited to four.
- Unmanned aerial vehicle 100 may also be a fixed wing aircraft that does not have rotating wings.
- the imaging device 220 is an imaging camera that captures a subject included in a desired imaging range (for example, an aerial subject, a landscape such as a mountain or a river, a building on the ground).
- the plurality of imaging devices 230 are sensing cameras that image the surroundings of the unmanned aircraft 100 in order to control the flight of the unmanned aircraft 100.
- the two imaging devices 230 may be provided on the front surface that is the nose of the unmanned aircraft 100. Further, the other two imaging devices 230 may be provided on the bottom surface of the unmanned aircraft 100.
- the two imaging devices 230 on the front side may be paired and function as a so-called stereo camera.
- the two imaging devices 230 on the bottom side may also be paired and function as a stereo camera.
- Three-dimensional spatial data around the unmanned aerial vehicle 100 may be generated based on images captured by the plurality of imaging devices 230. Note that the number of imaging devices 230 included in the unmanned aerial vehicle 100 is not limited to four.
- the unmanned aircraft 100 only needs to include at least one imaging device 230.
- the unmanned aerial vehicle 100 may include at least one imaging device 230 on each of the nose, tail, side, bottom, and ceiling of the unmanned aircraft 100.
- the angle of view that can be set by the imaging device 230 may be wider than the angle of view that can be set by the imaging device 220.
- the imaging device 230 may have a single focus lens or a fisheye lens.
- FIG. 4 is a block diagram showing an example of the hardware configuration of the unmanned aerial vehicle 100.
- The unmanned aircraft 100 includes a UAV control unit 110, a communication interface 150, a memory 160, a gimbal 200, a rotary wing mechanism 210, an imaging device 220, imaging devices 230, a GPS receiver 240, an inertial measurement unit (IMU) 250, a magnetic compass 260, and a barometric altimeter 270.
- the UAV control unit 110 is an example of a processing unit.
- the communication interface 150 is an example of a communication unit.
- the UAV control unit 110 is configured using, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
- the UAV control unit 110 performs signal processing for overall control of operations of each unit of the unmanned aircraft 100, data input / output processing with respect to other units, data calculation processing, and data storage processing.
- the UAV control unit 110 controls the flight of the unmanned aircraft 100 according to a program stored in the memory 160.
- UAV control unit 110 controls the flight of unmanned aerial vehicle 100 in accordance with instructions received from remote transmitter 50 via communication interface 150.
- Memory 160 may be removable from unmanned aerial vehicle 100.
- the UAV control unit 110 may specify the environment around the unmanned aircraft 100 by analyzing a plurality of images captured by the plurality of imaging devices 230.
- the UAV control unit 110 controls the flight based on the environment around the unmanned aircraft 100 while avoiding obstacles, for example.
- the UAV control unit 110 acquires date / time information indicating the current date / time.
- the UAV control unit 110 may acquire date / time information indicating the current date / time from the GPS receiver 240.
- the UAV control unit 110 may acquire date / time information indicating the current date / time from a timer (not shown) mounted on the unmanned aircraft 100.
- the UAV control unit 110 acquires position information indicating the position of the unmanned aircraft 100.
- the UAV control unit 110 may acquire position information indicating the latitude, longitude, and altitude at which the unmanned aircraft 100 exists from the GPS receiver 240.
- The UAV control unit 110 may acquire, as position information, latitude and longitude information indicating the latitude and longitude at which the unmanned aircraft 100 exists from the GPS receiver 240, and altitude information indicating the altitude at which the unmanned aircraft 100 exists from the barometric altimeter 270.
- the UAV control unit 110 acquires orientation information indicating the orientation of the unmanned aircraft 100 from the magnetic compass 260.
- The orientation information indicates, for example, the direction corresponding to the nose direction of the unmanned aircraft 100.
- the UAV control unit 110 may acquire position information indicating a position where the unmanned aircraft 100 should be present when the imaging device 220 captures an imaging range to be imaged.
- the UAV control unit 110 may acquire position information indicating the position where the unmanned aircraft 100 should be present from the memory 160.
- the UAV control unit 110 may acquire position information indicating the position where the unmanned aircraft 100 should exist from another device such as the transmitter 50 via the communication interface 150.
- The UAV control unit 110 may refer to a three-dimensional map database, specify a position at which the unmanned aircraft 100 can exist in order to capture the imaging range to be imaged, and acquire that position as position information indicating the position where the unmanned aircraft 100 should be present.
- the UAV control unit 110 acquires imaging information indicating the imaging ranges of the imaging device 220 and the imaging device 230.
- the UAV control unit 110 acquires angle-of-view information indicating the angle of view of the imaging device 220 and the imaging device 230 from the imaging device 220 and the imaging device 230 as parameters for specifying the imaging range.
- the UAV control unit 110 acquires information indicating the imaging direction of the imaging device 220 and the imaging device 230 as a parameter for specifying the imaging range.
- the UAV control unit 110 acquires posture information indicating the posture state of the imaging device 220 from the gimbal 200 as information indicating the imaging direction of the imaging device 220, for example.
- the UAV control unit 110 acquires information indicating the direction of the unmanned aircraft 100.
- Information indicating the posture state of the imaging device 220 indicates a rotation angle from the reference rotation angle of the pitch axis and yaw axis of the gimbal 200.
- the UAV control unit 110 acquires position information indicating a position where the unmanned aircraft 100 exists as a parameter for specifying the imaging range.
- The UAV control unit 110 may acquire the imaging information by generating imaging information indicating the imaging range, that is, the geographical range captured by the imaging device 220, based on the angle of view and the imaging direction of the imaging device 220 and the imaging device 230 and the position where the unmanned aircraft 100 exists.
- the UAV control unit 110 may acquire imaging information indicating an imaging range to be imaged by the imaging device 220.
- the UAV control unit 110 may acquire imaging information to be imaged by the imaging device 220 from the memory 160.
- the UAV control unit 110 may acquire imaging information to be imaged by the imaging device 220 from another device such as the transmitter 50 via the communication interface 150.
- The UAV control unit 110 may acquire three-dimensional information indicating the three-dimensional shape of objects existing around the unmanned aircraft 100.
- the object is a part of a landscape such as a building, a road, a car, and a tree.
- the three-dimensional information is, for example, three-dimensional space data.
- the UAV control unit 110 may acquire the three-dimensional information by generating the three-dimensional information indicating the three-dimensional shape of the object existing around the unmanned aircraft 100 from each image obtained from the plurality of imaging devices 230.
- the UAV control unit 110 may acquire the three-dimensional information indicating the three-dimensional shape of the object existing around the unmanned aircraft 100 by referring to the three-dimensional map database stored in the memory 160.
- the UAV control unit 110 may acquire three-dimensional information related to the three-dimensional shape of an object existing around the unmanned aircraft 100 by referring to a three-dimensional map database managed by a server existing on the network.
- the UAV control unit 110 acquires image data captured by the imaging device 220 and the imaging device 230.
- the UAV control unit 110 controls the gimbal 200, the rotary blade mechanism 210, the imaging device 220, and the imaging device 230.
- the UAV control unit 110 controls the imaging range of the imaging device 220 by changing the imaging direction or angle of view of the imaging device 220.
- the UAV control unit 110 controls the imaging range of the imaging device 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
- the imaging range refers to a geographical range captured by the imaging device 220 or the imaging device 230.
- the imaging range is defined by latitude, longitude, and altitude.
- the imaging range may be a range in three-dimensional spatial data defined by latitude, longitude, and altitude.
- the imaging range is specified based on the angle of view and imaging direction of the imaging device 220 or the imaging device 230, and the position where the unmanned aircraft 100 is present.
- The imaging directions of the imaging device 220 and the imaging device 230 are defined by the azimuth and the depression angle toward which the front surfaces, on which the imaging lenses of the imaging device 220 and the imaging device 230 are provided, are directed.
- the imaging direction of the imaging device 220 is a direction specified from the heading direction of the unmanned aerial vehicle 100 and the posture state of the imaging device 220 with respect to the gimbal 200.
- the imaging direction of the imaging device 230 is a direction specified from the heading of the unmanned aerial vehicle 100 and the position where the imaging device 230 is provided.
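- As an illustration of how such an imaging range can be specified from the angle of view, the imaging direction (azimuth and depression angle), and the position of the unmanned aircraft, the following sketch computes the ground point at the center of the imaging range under a flat-ground assumption. The function name, parameters, and formulas are illustrative assumptions and are not taken from the embodiment.

```python
import math

def imaging_range_center(lat_deg, lon_deg, alt_m, azimuth_deg, depression_deg):
    """Rough center of the geographical imaging range on flat ground.

    Assumes a locally flat Earth and a depression angle measured downward
    from the horizontal; illustrative only, not the patented algorithm.
    """
    if depression_deg <= 0:
        raise ValueError("camera must point below the horizon to reach the ground")
    # Horizontal distance from the point directly below the aircraft.
    ground_dist_m = alt_m / math.tan(math.radians(depression_deg))
    # Offset along the azimuth, converted into small latitude/longitude deltas.
    d_north = ground_dist_m * math.cos(math.radians(azimuth_deg))
    d_east = ground_dist_m * math.sin(math.radians(azimuth_deg))
    earth_radius_m = 6378137.0
    dlat = math.degrees(d_north / earth_radius_m)
    dlon = math.degrees(d_east / (earth_radius_m * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon
```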
- the UAV control unit 110 controls the flight of the unmanned aircraft 100 by controlling the rotary wing mechanism 210. That is, the UAV control unit 110 controls the position including the latitude, longitude, and altitude of the unmanned aircraft 100 by controlling the rotary wing mechanism 210.
- the UAV control unit 110 may control the imaging ranges of the imaging device 220 and the imaging device 230 by controlling the flight of the unmanned aircraft 100.
- the UAV control unit 110 may control the angle of view of the imaging device 220 by controlling a zoom lens included in the imaging device 220.
- the UAV control unit 110 may control the angle of view of the imaging device 220 by digital zoom using the digital zoom function of the imaging device 220.
- By moving the unmanned aircraft 100 to a specific position at a specific date and time, the UAV control unit 110 can cause the imaging device 220 to capture a desired imaging range under a desired environment.
- the UAV control unit 110 includes a function as a flight path processing unit 111 that performs processing related to flight path generation.
- the UAV control unit 110 may include a function as the shape data processing unit 112 that performs processing related to generation of three-dimensional shape data.
- the flight path processing unit 111 may acquire input parameters. Alternatively, the flight path processing unit 111 may acquire the input parameter input by the transmitter 50 by receiving it via the communication interface 150.
- the acquired input parameters may be stored in the memory 160.
- the input parameters include various parameters for generating an imaging position (aerial imaging position) (Waypoint) of the image by the unmanned aircraft 100 and a flight path passing through the imaging position.
- the imaging position is a position in a three-dimensional space.
- The input parameters may include at least one of flight range information, flight range radius (flight path radius) information, flight range center position information, subject radius information, subject height information, imaging range overlap rate information, and resolution information of the imaging device 220 or the imaging device 230.
- the input parameter may include at least one of information on an initial altitude of the flight path, information on an end altitude of the flight path, and information on an initial imaging position of the flight course.
- the input parameter may include information on the imaging position interval.
- the flight path processing unit 111 may acquire at least a part of information included in the input parameter from another device instead of acquiring from the transmitter 50.
- the flight path processing unit 111 may receive and acquire subject identification information specified by the transmitter 50.
- For example, the flight path processing unit 111 may communicate with an external server via the communication interface 150 based on the specified subject identification information, and receive and acquire the subject radius information and subject height information corresponding to that identification information.
- the overlap ratio of the imaging ranges indicates the ratio at which two imaging ranges overlap when images are captured by the imaging device 220 or the imaging device 230 at adjacent imaging positions in the horizontal direction or the vertical direction.
- Information on the overlap rate of the imaging ranges may include at least one of information on the overlap rate of the imaging ranges in the horizontal direction (also referred to as the horizontal overlap rate) and information on the overlap rate of the imaging ranges in the vertical direction (also referred to as the vertical overlap rate).
- the horizontal overlap rate and the vertical overlap rate may be the same or different. When the horizontal overlap rate and the vertical overlap rate are different values, both the horizontal overlap rate information and the vertical overlap rate information may be included in the input parameter. When the horizontal overlap rate and the vertical overlap rate are the same value, information on one overlap rate that is the same value may be included in the input parameter.
- the horizontal overlap rate is an example of a first overlap rate.
- the vertical overlap rate is an example of a second overlap rate.
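- For illustration only, the input parameters described above could be carried in a simple container such as the following; every field name is hypothetical and merely mirrors the items listed in this description.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class InputParameters:
    """Hypothetical container mirroring the input parameters described above."""
    flight_range_radius_m: float                 # radius of the flight range (flight path radius)
    flight_range_center: Tuple[float, float]     # latitude, longitude of the flight range center
    subject_radius_m: float
    subject_height_m: float
    horizontal_overlap_rate: float               # first overlap rate, e.g. 0.7 for 70 %
    vertical_overlap_rate: float                 # second overlap rate
    camera_resolution_px: Optional[Tuple[int, int]] = None
    initial_altitude_m: Optional[float] = None
    end_altitude_m: Optional[float] = None
    initial_imaging_position: Optional[Tuple[float, float, float]] = None
    horizontal_imaging_interval_m: Optional[float] = None   # may instead be calculated
    vertical_imaging_interval_m: Optional[float] = None     # may instead be calculated
```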
- the imaging position interval is a spatial imaging interval, and is a distance between adjacent imaging positions among a plurality of imaging positions at which the unmanned aircraft 100 should capture images in the flight path.
- The imaging position interval may include at least one of an imaging position interval in the horizontal direction (also referred to as a horizontal imaging interval) and an imaging position interval in the vertical direction (also referred to as a vertical imaging interval).
- the horizontal imaging interval is an example of a first imaging position interval.
- the vertical imaging interval is an example of a second imaging position interval.
- The flight path processing unit 111 may calculate and acquire the imaging position interval, including the horizontal imaging interval and the vertical imaging interval, or may acquire it as part of the input parameters.
- The flight range is a range that includes the flight path along which the unmanned aircraft 100 circulates around the periphery of the subject.
- the flight range may be a range in which a cross-sectional shape of the flight range viewed from directly above is approximated to a circular shape.
- the cross-sectional shape of the flight range viewed from directly above may be a shape other than a circle (for example, a polygonal shape).
- the flight path may have a plurality of flight courses having different altitudes (imaging altitudes).
- the flight path processing unit 111 may calculate the flight range based on information on the center position of the subject (for example, information on latitude and longitude) and information on the radius of the subject.
- the flight path processing unit 111 may calculate the flight range by approximating the subject to a circular shape based on the center position of the subject and the radius of the subject. In addition, the flight path processing unit 111 may acquire information on the flight range generated by the transmitter 50 included in the input parameters.
- the flight path processing unit 111 may acquire the angle of view of the imaging device 220 or the angle of view of the imaging device 230 from the imaging device 220 or the imaging device 230.
- the angle of view of the imaging device 220 or the angle of view of the imaging device 230 may be the same or different in the horizontal direction and the vertical direction.
- The angle of view of the imaging device 220 or the imaging device 230 in the horizontal direction is also referred to as the horizontal angle of view.
- the angle of view of the imaging device 220 or the angle of view of the imaging device 230 in the vertical direction is also referred to as the vertical angle of view.
- the flight path processing unit 111 may acquire information on one angle of view having the same value when the horizontal angle of view and the vertical angle of view are the same value.
- The flight path processing unit 111 may calculate the horizontal imaging interval based on the radius of the subject, the radius of the flight range, the horizontal angle of view of the imaging device 220 or the imaging device 230, and the horizontal overlap rate of the imaging range.
- The flight path processing unit 111 may calculate the vertical imaging interval based on the radius of the subject, the radius of the flight range, the vertical angle of view of the imaging device 220 or the imaging device 230, and the vertical overlap rate of the imaging range.
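- The embodiment does not spell out the formulas for these calculations. One plausible geometry, assuming the camera looks horizontally inward at the side of a roughly cylindrical subject from a circular flight course, is sketched below; the function names and formulas are assumptions, not the patented computation.

```python
import math

def horizontal_imaging_interval(subject_radius_m, flight_radius_m,
                                horizontal_fov_deg, horizontal_overlap):
    """Plausible distance between adjacent waypoints on the same flight course.

    The image footprint on the subject's side is approximated from the camera-to-side
    distance and the horizontal angle of view, then thinned by the overlap rate.
    """
    distance_to_side_m = flight_radius_m - subject_radius_m
    footprint_m = 2.0 * distance_to_side_m * math.tan(math.radians(horizontal_fov_deg) / 2.0)
    return footprint_m * (1.0 - horizontal_overlap)

def vertical_imaging_interval(subject_radius_m, flight_radius_m,
                              vertical_fov_deg, vertical_overlap):
    """Plausible altitude difference between adjacent flight courses."""
    distance_to_side_m = flight_radius_m - subject_radius_m
    footprint_m = 2.0 * distance_to_side_m * math.tan(math.radians(vertical_fov_deg) / 2.0)
    return footprint_m * (1.0 - vertical_overlap)
```

- For example, with an assumed subject radius of 20 m, a flight range radius of 50 m, an 80-degree horizontal angle of view, and a 60 % horizontal overlap rate, this sketch yields a horizontal imaging interval of roughly 20 m.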
- the flight path processing unit 111 determines the imaging position (Waypoint) of the subject by the unmanned aircraft 100 based on the flight range and the imaging position interval.
- the imaging positions by the unmanned aerial vehicle 100 may be arranged at equal intervals in the horizontal direction, and the distance between the last imaging position and the first imaging position may be shorter than the imaging position interval. This interval is a horizontal imaging interval.
- the imaging positions by the unmanned aircraft 100 may be arranged at equal intervals in the vertical direction, and the distance between the last imaging position and the first imaging position may be shorter than the imaging position interval. This interval is the vertical imaging interval.
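- Under the same assumptions, the imaging positions of one flight course can be placed at equal spacing on the circle of the flight range, as in the following sketch (names hypothetical); rounding the count up keeps the actual spacing equal and no larger than the horizontal imaging interval, consistent with the last-to-first gap being allowed to be shorter.

```python
import math

def imaging_positions_at_altitude(center_xy, flight_radius_m, altitude_m,
                                  horizontal_interval_m):
    """Place waypoints at equal spacing on one circular flight course (sketch only)."""
    circumference_m = 2.0 * math.pi * flight_radius_m
    count = max(3, math.ceil(circumference_m / horizontal_interval_m))
    cx, cy = center_xy
    positions = []
    for i in range(count):
        theta = 2.0 * math.pi * i / count
        positions.append((cx + flight_radius_m * math.cos(theta),
                          cy + flight_radius_m * math.sin(theta),
                          altitude_m))
    return positions
```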
- the flight path processing unit 111 generates a flight path that passes through the determined imaging position.
- The flight path processing unit 111 may generate a flight path that sequentially passes through the horizontally adjacent imaging positions in one flight course and moves on to the next flight course after passing through all the imaging positions in that flight course.
- the flight path may be formed such that the altitude increases as the flight path starts from the ground side.
- the flight path may be formed such that the altitude decreases as the flight path starts from the sky side.
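- Reusing the imaging_positions_at_altitude sketch above, one way to chain the flight courses into a single flight path with ascending altitude is the following; for a path that starts from the sky side, the course order would simply be reversed. This is an assumed outline, not the embodiment's implementation.

```python
def build_flight_path(center_xy, flight_radius_m, horizontal_interval_m,
                      vertical_interval_m, start_altitude_m, end_altitude_m):
    """Visit all waypoints of one flight course before moving to the next altitude."""
    path = []
    altitude = start_altitude_m
    while altitude <= end_altitude_m:
        path.extend(imaging_positions_at_altitude(center_xy, flight_radius_m,
                                                  altitude, horizontal_interval_m))
        altitude += vertical_interval_m
    return path
```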
- the flight path processing unit 111 may control the flight of the unmanned aircraft 100 according to the generated flight path.
- the flight path processing unit 111 may cause the imaging device 220 or the imaging device 230 to image a subject at an imaging position that exists in the middle of the flight path.
- the unmanned aerial vehicle 100 may orbit around the side of the subject and fly according to the flight path. Therefore, the imaging device 220 or the imaging device 230 may capture the side surface of the subject at the imaging position in the flight path.
- a captured image captured by the imaging device 220 or the imaging device 230 may be held in the memory 160.
- the UAV control unit 110 may refer to the memory 160 as appropriate (for example, when generating three-dimensional shape data).
- The shape data processing unit 112 generates three-dimensional information (three-dimensional shape data) indicating the three-dimensional shape of an object (subject) based on a plurality of captured images captured at different imaging positions by either the imaging device 220 or the imaging device 230. Each captured image may therefore be used as one of the images for restoring the three-dimensional shape data.
- the captured image for restoring the three-dimensional shape data may be a still image.
- A known method may be used to generate the three-dimensional shape data based on the plurality of captured images. Known methods include, for example, MVS (Multi View Stereo), PMVS (Patch-based MVS), and SfM (Structure from Motion).
- the captured image used for generating the three-dimensional shape data may be a still image.
- the plurality of captured images used for generating the three-dimensional shape data include two captured images whose imaging ranges partially overlap each other.
- The higher the overlap rate of the imaging ranges, the larger the number of captured images used when generating the three-dimensional shape data of the same range. Therefore, the shape data processing unit 112 can improve the reconstruction accuracy of the three-dimensional shape.
- The lower the overlap rate of the imaging ranges, the smaller the number of captured images used when generating the three-dimensional shape data of the same range. Therefore, the shape data processing unit 112 can shorten the generation time of the three-dimensional shape data. Note that the plurality of captured images need not include two captured images whose imaging ranges partially overlap each other.
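- As a rough, purely hypothetical illustration of this trade-off: if one image covers a 10 m wide strip of the subject's side, a 50 % horizontal overlap advances the coverage by 5 m per image, whereas a 90 % overlap advances it by only 1 m, so roughly five times as many images per flight course must be captured and processed to cover the same circumference.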
- The shape data processing unit 112 acquires a plurality of captured images including captured images in which the side surface of the subject is captured. Therefore, compared with the case of acquiring captured images taken uniformly downward from the sky, the shape data processing unit 112 can collect a large number of image features of the side surface of the subject and improve the reconstruction accuracy of the three-dimensional shape around the subject.
- the communication interface 150 communicates with the transmitter 50 (see FIG. 4).
- the communication interface 150 receives various commands and information for the UAV control unit 110 from the remote transmitter 50.
- The memory 160 stores programs and the like necessary for the UAV control unit 110 to control the gimbal 200, the rotary wing mechanism 210, the imaging device 220, the imaging device 230, the GPS receiver 240, the inertial measurement unit 250, the magnetic compass 260, and the barometric altimeter 270. The memory 160 also stores programs and the like necessary for the UAV control unit 110 to execute the processing of the flight path processing unit 111 and the shape data processing unit 112.
- The memory 160 may be a computer-readable recording medium and may include at least one of SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), EPROM (Erasable Programmable Read Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), and flash memory such as USB memory.
- the memory 160 may be provided inside the UAV main body 102. It may be provided so as to be removable from the UAV main body 102.
- the gimbal 200 supports the imaging device 220 to be rotatable about at least one axis.
- the gimbal 200 may support the imaging device 220 rotatably about the yaw axis, pitch axis, and roll axis.
- the gimbal 200 may change the imaging direction of the imaging device 220 by rotating the imaging device 220 about at least one of the yaw axis, the pitch axis, and the roll axis.
- the rotary blade mechanism 210 includes a plurality of rotary blades and a plurality of drive motors that rotate the plurality of rotary blades.
- the imaging device 220 captures a subject within a desired imaging range and generates captured image data.
- Image data obtained by imaging by the imaging device 220 is stored in a memory included in the imaging device 220 or the memory 160.
- the imaging device 230 captures the surroundings of the unmanned aircraft 100 and generates captured image data. Image data of the imaging device 230 is stored in the memory 160.
- the GPS receiver 240 receives a plurality of signals indicating times and positions (coordinates) of each GPS satellite transmitted from a plurality of navigation satellites (that is, GPS satellites).
- the GPS receiver 240 calculates the position of the GPS receiver 240 (that is, the position of the unmanned aircraft 100) based on the plurality of received signals.
- the GPS receiver 240 outputs the position information of the unmanned aircraft 100 to the UAV control unit 110.
- the calculation of the position information of the GPS receiver 240 may be performed by the UAV control unit 110 instead of the GPS receiver 240. In this case, the UAV control unit 110 receives information indicating the time and the position of each GPS satellite included in a plurality of signals received by the GPS receiver 240.
- the inertial measurement device 250 detects the attitude of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
- The inertial measurement unit 250 detects, for example, the acceleration of the unmanned aircraft 100 in the three axial directions of front-rear, left-right, and up-down, and the angular velocity about the three axes of pitch, roll, and yaw.
- the magnetic compass 260 detects the heading of the unmanned aircraft 100 and outputs the detection result to the UAV control unit 110.
- the barometric altimeter 270 detects the altitude at which the unmanned aircraft 100 flies and outputs the detection result to the UAV control unit 110.
- FIG. 5 is a perspective view illustrating an example of the appearance of the transmitter 50.
- the up / down / front / rear / left / right directions with respect to the transmitter 50 are assumed to follow the directions of arrows shown in FIG.
- the transmitter 50 is used in a state of being held by both hands of a person using the transmitter 50 (hereinafter referred to as “operator”), for example.
- the transmitter 50 includes, for example, a resin casing 50B having a substantially rectangular parallelepiped shape (in other words, a substantially box shape) having a substantially square bottom surface and a height shorter than one side of the bottom surface.
- a specific configuration of the transmitter 50 will be described later with reference to FIG.
- a left control rod 53L and a right control rod 53R are provided in a projecting manner at approximately the center of the housing surface of the transmitter 50.
- the left control rod 53L and the right control rod 53R are used in operations for remotely controlling the movement of the unmanned aircraft 100 by the operator (for example, moving the unmanned aircraft 100 back and forth, moving left and right, moving up and down, and changing the direction).
- In FIG. 5, the left control rod 53L and the right control rod 53R are shown in their positions in an initial state in which no external force is applied by the operator's hands.
- the left control rod 53L and the right control rod 53R automatically return to predetermined positions (for example, the initial position shown in FIG. 5) after the external force applied by the operator is released.
- the power button B1 of the transmitter 50 is disposed on the front side (in other words, the operator side) of the left control rod 53L.
- When the power button B1 is pressed once by the operator, for example, the remaining capacity of the battery (not shown) built into the transmitter 50 is displayed in the remaining battery capacity display portion L2.
- When the power button B1 is pressed once more by the operator, for example, the power of the transmitter 50 is turned on, and power is supplied to each part of the transmitter 50 (see FIG. 6) so that it can be used.
- The RTH (Return To Home) button B2 is arranged on the front side (in other words, the operator side) of the right control rod 53R.
- When the RTH button B2 is pressed, the transmitter 50 transmits a signal for automatically returning the unmanned aircraft 100 to a predetermined position.
- In this way, the transmitter 50 can automatically return the unmanned aircraft 100 to a predetermined position (for example, the take-off position stored in the unmanned aircraft 100).
- The RTH button B2 can be used, for example, when the operator loses sight of the fuselage of the unmanned aircraft 100 during aerial shooting outdoors, or when operation becomes impossible due to radio interference or unexpected trouble.
- the remote status display part L1 and the remaining battery capacity display part L2 are arranged on the front side (in other words, the operator side) of the power button B1 and the RTH button B2.
- The remote status display unit L1 is configured using, for example, an LED (Light Emitting Diode), and displays the wireless connection state between the transmitter 50 and the unmanned aircraft 100.
- the battery remaining amount display unit L2 is configured using, for example, an LED, and displays the remaining amount of the capacity of a battery (not shown) built in the transmitter 50.
- Two antennas AN1 and AN2 project from the rear side of the housing 50B of the transmitter 50 and rearward from the left control rod 53L and the right control rod 53R.
- The antennas AN1 and AN2 transmit to the unmanned aircraft 100 the signals generated by the transmitter control unit 61 based on the operator's operation of the left control rod 53L and the right control rod 53R (that is, signals for controlling the movement of the unmanned aircraft 100).
- the antennas AN1 and AN2 can cover a transmission / reception range of 2 km, for example.
- The antennas AN1 and AN2 can also receive images captured by the imaging devices 220 and 230 of the unmanned aircraft 100 wirelessly connected to the transmitter 50, and various data acquired by the unmanned aircraft 100, when these images or data are transmitted from the unmanned aircraft 100.
- The display unit DP includes, for example, an LCD (Liquid Crystal Display).
- the display unit DP displays various data.
- the shape, size, and arrangement position of the display unit DP are arbitrary, and are not limited to the example of FIG.
- FIG. 6 is a block diagram illustrating an example of a hardware configuration of the transmitter 50.
- The transmitter 50 includes a left control rod 53L, a right control rod 53R, a transmitter control unit 61, a wireless communication unit 63, a power button B1, an RTH button B2, an operation unit set OPS, a remote status display unit L1, a remaining battery capacity display unit L2, and a display unit DP.
- the transmitter 50 is an example of a communication terminal.
- the wireless communication unit 63 is an example of a communication unit.
- the left control rod 53L is used for an operation for remotely controlling the movement of the unmanned aircraft 100 by, for example, the left hand of the operator.
- the right control rod 53R is used for an operation for remotely controlling the movement of the unmanned aircraft 100 by, for example, the operator's right hand.
- The movement of the unmanned aircraft 100 includes, for example, moving forward, moving backward, moving left, moving right, moving up, moving down, rotating the unmanned aircraft 100 to the left or right, and combinations of these.
- The transmitter control unit 61 displays the remaining capacity of the battery (not shown) built into the transmitter 50 on the remaining battery capacity display unit L2. Thus, the operator can easily check the remaining capacity of the battery built into the transmitter 50.
- When the power button B1 is pressed twice, a signal indicating that the power button B1 has been pressed twice is passed to the transmitter control unit 61.
- the transmitter control unit 61 instructs a battery (not shown) built in the transmitter 50 to supply power to each unit in the transmitter 50. As a result, the operator turns on the power of the transmitter 50 and can easily start using the transmitter 50.
- When the RTH button B2 is pressed, a signal indicating that the RTH button B2 has been pressed is input to the transmitter control unit 61.
- In response, the transmitter control unit 61 generates a signal for automatically returning the unmanned aircraft 100 to a predetermined position (for example, the takeoff position of the unmanned aircraft 100) and transmits it to the unmanned aircraft 100 via the wireless communication unit 63 and the antennas AN1 and AN2.
- the operator can automatically return (return) the unmanned aircraft 100 to a predetermined position by a simple operation on the transmitter 50.
- the operation unit set OPS is configured using a plurality of operation units (for example, operation units OP1,..., Operation unit OPn) (n: an integer of 2 or more).
- The operation unit set OPS includes various operation units, other than the left control rod 53L, the right control rod 53R, the power button B1, and the RTH button B2 shown in FIG. 5, that support remote control of the unmanned aircraft 100 by the transmitter 50.
- The various operation units referred to here include, for example, a button for instructing the capture of a still image using the imaging device 220 of the unmanned aircraft 100, and a button for instructing the start and end of moving image recording using the imaging device 220 of the unmanned aircraft 100.
- The operation unit set OPS has a parameter operation unit OPA for inputting information on the input parameters used to generate the imaging position interval, the imaging positions, or the flight path of the unmanned aircraft 100.
- the parameter operation unit OPA may be formed by a stick, a button, a key, a touch panel, or the like.
- the parameter operation unit OPA may be formed by the left control rod 53L and the right control rod 53R.
- the timing for inputting each parameter included in the input parameters by the parameter operation unit OPA may be the same or different.
- The input parameters may include at least one of flight range information, flight range radius (flight path radius) information, flight range center position information, subject radius information, subject height information, horizontal overlap rate information, vertical overlap rate information, and resolution information of the imaging device 220 or the imaging device 230.
- the input parameter may include at least one of information on an initial altitude of the flight path, information on an end altitude of the flight path, and information on an initial imaging position of the flight course.
- the input parameter may include at least one of information on the horizontal imaging interval and information on the vertical imaging interval.
- by inputting specific values or ranges of latitude and longitude, the parameter operation unit OPA may input at least one of the flight range information, the flight range radius (flight path radius) information, the flight range center position information, the subject radius information, the subject height (for example, initial altitude and end altitude) information, the horizontal overlap rate information, the vertical overlap rate information, and the resolution information of the imaging device 220 or 230.
- likewise, by inputting specific values or ranges of latitude and longitude, the parameter operation unit OPA may input at least one of information on the initial altitude of the flight path, information on the end altitude of the flight path, and information on the initial imaging position of the flight course.
- the parameter operation unit OPA may also input at least one of the horizontal imaging interval information and the vertical imaging interval information in the same manner. A sketch of how these input parameters could be grouped is shown below.
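- as an illustration only (not part of the original disclosure), the input parameters described above could be collected into a single structure such as the following Python sketch; every field name here is hypothetical.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical grouping of the input parameters described above; field names
# are illustrative and do not appear in the original disclosure.
@dataclass
class InputParameters:
    flight_radius_m: float                   # radius of the flight path (R_flight)
    subject_radius_m: float                  # radius of the subject (R_obj)
    subject_center: Tuple[float, float]      # (latitude, longitude) of the subject center
    subject_height_m: float                  # height of the subject
    horizontal_overlap: float = 0.9          # horizontal overlap rate (r_forward)
    vertical_overlap: float = 0.6            # vertical overlap rate (r_side)
    initial_altitude_m: Optional[float] = None     # initial altitude of the flight path
    end_altitude_m: Optional[float] = None         # end altitude of the flight path
    horizontal_interval_m: Optional[float] = None  # may be given directly instead of computed
    vertical_interval_m: Optional[float] = None    # may be given directly instead of computed
```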
- the remote status display unit L1 and the remaining battery level display unit L2 have been described with reference to FIG.
- the transmitter controller 61 is configured using a processor (for example, CPU, MPU or DSP).
- the transmitter control unit 61 performs signal processing for overall control of operations of the respective units of the transmitter 50, data input / output processing with other units, data calculation processing, and data storage processing.
- the transmitter control unit 61 generates a signal for controlling the movement of the unmanned aircraft 100 designated by the operation of the operator by operating the left control rod 53L and the right control rod 53R.
- the transmitter control unit 61 transmits the generated signal to the unmanned aircraft 100 via the wireless communication unit 63 and the antennas AN1 and AN2, and remotely controls the unmanned aircraft 100.
- the transmitter 50 can control the movement of the unmanned aircraft 100 remotely.
- the transmitter control unit 61 acquires map information of a map database stored in an external server or the like via the wireless communication unit 63.
- the transmitter control unit 61 may display the map information via the display unit DP and, through a touch operation or the like on the displayed map via the parameter operation unit OPA, select the flight range and acquire information such as the radius of the flight range (the radius of the flight path).
- the transmitter control unit 61 may select a subject by touch operation or the like with map information via the parameter operation unit OPA, and acquire information on the subject radius and height of the subject.
- the transmitter control unit 61 may calculate and acquire information on the initial altitude of the flight path and information on the end altitude of the flight path based on the information on the height of the subject.
- the initial altitude and end altitude may be calculated within a range in which the end of the side surface of the subject can be imaged.
- the transmitter control unit 61 transmits the input parameter input by the parameter operation unit OPA to the unmanned aircraft 100 via the wireless communication unit 63.
- the transmission timing of each parameter included in the input parameter may be the same timing or different timing.
- the transmitter control unit 61 acquires information on input parameters obtained by the parameter operation unit OPA, and sends the information to the display unit DP and the wireless communication unit 63.
- the wireless communication unit 63 is connected to two antennas AN1 and AN2.
- the wireless communication unit 63 transmits / receives information and data to / from the unmanned aircraft 100 via the two antennas AN1 and AN2 using a predetermined wireless communication method (for example, WiFi (registered trademark)).
- the wireless communication unit 63 transmits the input parameter information from the transmitter control unit 61 to the unmanned aircraft 100.
- the display unit DP may display various data processed by the transmitter control unit 61.
- the display unit DP displays information on the input parameters that have been input. Therefore, the operator of the transmitter 50 can confirm the contents of the input parameters by referring to the display unit DP.
- instead of including the display unit DP, the transmitter 50 may be connected to a display terminal (not shown) by wire or wirelessly.
- information on the input parameters may be displayed on the display terminal in the same manner as on the display unit DP.
- the display terminal may be a smartphone, a tablet terminal, a PC (Personal Computer), or the like. Further, the display terminal may input at least one of the input parameters and send it to the transmitter 50 by wired or wireless communication, and the wireless communication unit 63 of the transmitter 50 may then transmit the input parameter to the unmanned aircraft 100.
- FIG. 7A is a plan view of the periphery of the subject BL as seen from above.
- FIG. 7B is a front view of the subject BL as seen from the front.
- the front of the subject BL is an example of a side view of the subject BL viewed from the side (horizontal direction).
- the subject BL may be a building.
- the flight path processing unit 111 may calculate the horizontal imaging interval d_forward, which indicates the interval between imaging positions in the horizontal direction, using (Equation 1).
- the meaning of each parameter in (Equation 1) is as follows.
- R_flight: radius of the flight path
- R_obj: radius of the subject BL (radius of the approximate circle representing the subject BL)
- FOV1 (Field of View): horizontal angle of view of the imaging device 220 or the imaging device 230
- r_forward: horizontal overlap rate
- the flight path processing unit 111 may receive information (for example, information on latitude and longitude) of the center position BLc of the subject BL included in the input parameter from the transmitter 50 via the communication interface 150.
- the flight path processing unit 111 may calculate the radius R_flight of the flight path based on the resolution of the imaging device 220 or the imaging device 230. In this case, the flight path processing unit 111 may receive the resolution information included in the input parameters from the transmitter 50 via the communication interface 150. The flight path processing unit 111 may receive the information on the radius R_flight of the flight path included in the input parameters from the transmitter 50 via the communication interface 150. The flight path processing unit 111 may receive the information on the radius R_obj of the subject BL included in the input parameters from the transmitter 50 via the communication interface 150.
- Information on the horizontal angle of view FOV1 may be held in the memory 160 as hardware information related to the unmanned aerial vehicle 100, or may be acquired from the transmitter 50.
- the flight path processing unit 111 may read information on the horizontal field angle FOV1 from the memory 160 when calculating the horizontal imaging interval.
- the flight path processing unit 111 may receive information on the horizontal overlap rate r_forward from the transmitter 50 via the communication interface 150.
- the horizontal overlap rate r_forward is, for example, 90%.
- FIG. 8 is an explanatory diagram for the calculation of the horizontal imaging interval d_forward according to (Equation 1).
- the horizontal angle of view FOV1 can be approximated using the horizontal component ph1 of the imaging range of the imaging device 220 or the imaging device 230 and the distance (R_flight − R_obj) to the subject BL as the imaging distance: FOV1 ≈ ph1 / (R_flight − R_obj).
- accordingly, the flight path processing unit 111 calculates ph1 = (R_flight − R_obj) * FOV1, which is a part of (Equation 1). As is apparent from this expression, the angle of view FOV (here FOV1) is expressed as a ratio of lengths (distances).
- the asterisk "*" denotes multiplication.
- the flight path processing unit 111 may partially overlap the imaging ranges of two adjacent captured images when the imaging device 220 or the imaging device 230 acquires a plurality of captured images.
- the flight path processing unit 111 can generate three-dimensional shape data by partially overlapping a plurality of imaging ranges.
- as a part of (Equation 1), the flight path processing unit 111 obtains the non-overlapping portion of the horizontal component ph1 of the imaging range, that is, the portion that does not overlap the horizontally adjacent imaging range: ph1 * (1 − r_forward). The flight path processing unit 111 then scales this non-overlapping portion by the ratio between the radius R_flight of the flight path and the radius R_obj of the subject, and uses the result as the horizontal imaging interval d_forward.
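- combining the relations above (a reconstruction consistent with this description, not a quotation of the original formula), the horizontal imaging interval of (Equation 1) can be written as:

```latex
d_{\mathrm{forward}}
  = \frac{R_{\mathrm{flight}}}{R_{\mathrm{obj}}}\, ph1 \,(1 - r_{\mathrm{forward}})
  = \frac{R_{\mathrm{flight}}}{R_{\mathrm{obj}}}\,(R_{\mathrm{flight}} - R_{\mathrm{obj}})\,\mathrm{FOV1}\,(1 - r_{\mathrm{forward}})
```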
- the flight path processing unit 111 may calculate the horizontal angle θ_forward instead of the horizontal imaging interval d_forward.
- FIG. 9 is a schematic diagram illustrating an example of the horizontal angle θ_forward.
- the horizontal angle is calculated using, for example, (Equation 2).
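- since d_forward is an arc length on the flight path of radius R_flight measured about the center of the subject, one form of (Equation 2), in radians, consistent with the above (a reconstruction, not a quotation of the original formula) is:

```latex
\theta_{\mathrm{forward}}
  = \frac{d_{\mathrm{forward}}}{R_{\mathrm{flight}}}
  = \frac{R_{\mathrm{flight}} - R_{\mathrm{obj}}}{R_{\mathrm{obj}}}\,\mathrm{FOV1}\,(1 - r_{\mathrm{forward}})
```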
- the flight path processing unit 111 may calculate the vertical imaging interval d_side, which indicates the interval between imaging positions in the vertical direction, using (Equation 3).
- the meaning of each parameter in (Equation 3) is as follows; the parameters already described for (Equation 1) are omitted.
- FOV2 (Field of View): vertical angle of view of the imaging device 220 or the imaging device 230
- r_side: vertical overlap rate
- information on the vertical angle of view FOV2 is held in the memory 160 as hardware information.
- the flight path processing unit 111 may read the information on the vertical angle of view FOV2 from the memory 160 when calculating the vertical imaging interval.
- the flight path processing unit 111 may receive information on the vertical overlap rate r_side included in the input parameters from the transmitter 50 via the communication interface 150.
- the vertical overlap rate r_side is, for example, 60%.
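- by analogy with the horizontal case (the vertical interval is a straight-line distance along the side of the subject, so no radius ratio is needed), a plausible reconstruction of (Equation 3) is:

```latex
d_{\mathrm{side}}
  = (R_{\mathrm{flight}} - R_{\mathrm{obj}})\,\mathrm{FOV2}\,(1 - r_{\mathrm{side}})
```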
- the above description has mainly exemplified the case where the flight path processing unit 111 calculates, and thereby acquires, the imaging position intervals. Instead, the flight path processing unit 111 may receive and acquire information on the imaging position intervals from the transmitter 50 via the communication interface 150.
- the unmanned aircraft 100 can arrange the imaging positions on the same flight course. Therefore, the unmanned aerial vehicle 100 can pass through a plurality of imaging positions without changing the altitude, and can fly stably. In addition, the unmanned aircraft 100 can stably acquire a captured image by making a round around the subject BL in the horizontal direction. In addition, since a large number of captured images of the same subject BL can be acquired at different angles, the restoration accuracy of the three-dimensional shape data can be improved over the entire circumference of the subject BL.
- the flight path processing unit 111 may determine the horizontal imaging interval based on at least the radius of the subject, the radius of the flight range, the horizontal angle of view of the imaging device 220 or 230, and the horizontal overlap rate. Therefore, the unmanned aerial vehicle 100 can suitably acquire the horizontally distributed captured images required for three-dimensional reconstruction, taking into consideration various parameters such as the size of the specific subject and the flight range. Further, if the interval between imaging positions is narrowed, for example by increasing the horizontal overlap rate, the number of captured images in the horizontal direction increases, and the unmanned aircraft 100 can further improve the accuracy of the three-dimensional restoration.
- the unmanned aircraft 100 can acquire captured images at different positions in the vertical direction, that is, at different altitudes. That is, the unmanned aerial vehicle 100 can acquire captured images at different altitudes, which are difficult to acquire especially with uniform imaging from the sky. Therefore, it is possible to suppress the occurrence of a missing area when generating the three-dimensional shape data.
- the flight path processing unit 111 may determine the vertical imaging interval based on at least the radius of the subject, the radius of the flight range, the vertical angle of view of the imaging device 220 or 230, and the vertical overlap rate.
- the unmanned aerial vehicle 100 can therefore suitably acquire the vertically distributed captured images required for three-dimensional reconstruction, taking into consideration various parameters such as the size of the specific subject BL and the flight range. Further, if the interval between imaging positions is narrowed, for example by increasing the vertical overlap rate, the number of captured images in the vertical direction increases, and the unmanned aircraft 100 can further improve the accuracy of the three-dimensional restoration. A sketch of this interval calculation is given below.
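- the following Python sketch implements the interval calculation as reconstructed above; it assumes the reconstructed forms of (Equation 1) and (Equation 3), so the exact formulas of the original disclosure may differ.

```python
import math

def imaging_intervals(r_flight, r_obj, fov1_rad, fov2_rad,
                      r_forward=0.9, r_side=0.6):
    """Compute the horizontal and vertical imaging position intervals.

    Distances are in meters and angles of view in radians. This follows the
    reconstructed forms of (Equation 1) and (Equation 3) discussed above.
    """
    imaging_distance = r_flight - r_obj        # camera-to-subject distance
    ph1 = imaging_distance * fov1_rad          # horizontal extent of one image at the subject
    ph2 = imaging_distance * fov2_rad          # vertical extent of one image at the subject
    # Non-overlapping horizontal strip, scaled from the subject circle to the flight circle.
    d_forward = (r_flight / r_obj) * ph1 * (1.0 - r_forward)
    # Vertical (altitude) interval needs no scaling.
    d_side = ph2 * (1.0 - r_side)
    return d_forward, d_side

# Example with assumed values: 80 m flight radius, 30 m subject radius,
# 60-degree horizontal and 40-degree vertical angles of view.
d_fwd, d_sd = imaging_intervals(80.0, 30.0, math.radians(60), math.radians(40))
```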
- FIG. 10A is a plan view showing the imaging positions CP and the flight order of the imaging positions CP in an arbitrary flight course FC.
- the flight path processing unit 111 calculates the imaging position CP (Waypoint) of each flight course FC in the flight path based on the acquired (calculated or received) imaging position interval.
- the flight path processing unit 111 may arrange the imaging positions CP at equal intervals for each horizontal imaging interval in each flight course FC.
- the flight path processing unit 111 may arrange the imaging positions CP at equal intervals at every vertical imaging interval between the flight courses FC adjacent in the vertical direction.
- the flight path processing unit 111 determines and arranges one initial imaging position CP1 (initial imaging position CP) on an arbitrary flight course FC and, using the initial imaging position CP1 as a base point, may arrange the imaging positions CP in order along the flight course FC at equal intervals of the horizontal imaging interval.
- the flight path processing unit 111 may not place an imaging position CP at the same position as the initial imaging position CP1 after one round of the flight course FC. That is, the 360 degrees of one round of the flight course may not be divided into equal intervals by the imaging positions CP. Therefore, an interval on the same flight course FC may differ from the horizontal imaging interval.
- in that case, the distance between the last imaging position CP and the initial imaging position CP1 is equal to or shorter than the horizontal imaging interval.
- the flight path processing unit 111 generates a flight path FP that passes through each of the arranged imaging positions CP.
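- a minimal sketch of arranging the imaging positions CP on one flight course, assuming a local metric frame centered on the subject (the conversion to latitude and longitude is omitted), could look as follows; like the behavior described above, the final gap back to the initial imaging position may be shorter than the horizontal imaging interval.

```python
import math

def place_course_waypoints(center_xy, r_flight, altitude, d_forward, start_angle=0.0):
    """Place imaging positions at equal horizontal intervals on one circular flight course.

    `center_xy` is the subject center in a local (x, y) frame in meters.
    Returns a list of (x, y, altitude) tuples in flight order.
    """
    cx, cy = center_xy
    step_angle = d_forward / r_flight          # central angle per imaging position
    waypoints = []
    angle = start_angle
    while angle - start_angle < 2.0 * math.pi: # stop after one full round
        x = cx + r_flight * math.cos(angle)
        y = cy + r_flight * math.sin(angle)
        waypoints.append((x, y, altitude))
        angle += step_angle
    return waypoints
```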
- the flight path processing unit 111 may determine a maximum altitude or a minimum altitude as an initial altitude among a plurality of flight courses FC that orbit around the side of the subject BL and have different altitudes. Information on at least one of the highest altitude and the lowest altitude may be included in the input parameter.
- the flight path FP may pass through each imaging position CP in the flight course FC at the initial altitude, then change the altitude to the flight course FC immediately above or immediately below, and pass through each imaging position CP in the flight course FC after the change.
- that is, the flight path FP may be a flight path that sequentially changes to the flight course FC immediately above or immediately below after passing through all the imaging positions CP of one flight course FC.
- the flight direction on each flight course FC may be clockwise or counterclockwise.
- the flight direction between the plurality of flight courses FC may be upward or downward.
- FIG. 10B is a front view showing a first example of the imaging positions CP and the flight order of the imaging positions CP in each flight course FC.
- the respective imaging positions CP may have the same horizontal position (latitude, longitude).
- after arranging the imaging positions for one round at equal intervals of the horizontal imaging interval on an arbitrary flight course, the flight path processing unit 111 changes to another flight course (for example, an adjacent flight course), arranges the initial imaging position CP1 on this flight course, and continues arranging the imaging positions based on the horizontal imaging interval.
- the flight path FP may be a flight path in which, after the course change, the unmanned aircraft passes through the imaging positions CP while traveling horizontally in the same direction as the flight direction on the flight course FC before the change.
- the unmanned aircraft 100 may display the generated flight course, flight path, and imaging position on the display unit DP of the transmitter 50.
- the flight path processing unit 111 may transmit the generated flight course and flight path and information on the determined imaging position to the transmitter 50 via the communication interface 150.
- the transmitter control unit 61 may receive and acquire information on the flight course, the flight path, and the imaging position via the wireless communication unit 63.
- the transmitter control unit 61 may display display information based on the flight course, flight path, and imaging position information via the display unit DP.
- the unmanned aerial vehicle 100 flies according to the generated flight path.
- the UAV control unit 110 may transmit information (passing information) indicating that the unmanned aircraft 100 has passed an imaging position to the transmitter 50 via the communication interface 150.
- when the transmitter control unit 61 receives the passing information of an imaging position from the unmanned aircraft 100 via the wireless communication unit 63, it may change the display color of that imaging position among the imaging positions displayed on the display unit DP. Thereby, a person checking the display unit DP can confirm the current flight position of the unmanned aircraft 100 in the flight path.
- the unmanned aircraft 100 may display display information based on the flight course, the flight route, the imaging position, and the passing information of the imaging position on a display terminal connected to the transmitter 50 instead of the display unit DP.
- FIG. 11A is a front view showing a second example of the imaging positions CP and the flight order of the imaging positions CP in each flight course FC.
- the positions (latitude, longitude) of the respective imaging positions CP in the horizontal direction may be different.
- the flight path FP may be a flight path that passes through the imaging position CP that has traveled in the same direction as the flight direction in the horizontal direction on the flight course FC before the change.
- FIG. 11B is a front view showing a third example of the imaging positions CP and the flight order of the imaging positions CP in each flight course FC.
- the flight path processing unit 111 arranges the imaging positions CP for one round on an arbitrary flight course FC at equal horizontal imaging intervals, then changes the altitude to another flight course (for example, the adjacent flight course FC) without changing the horizontal position (latitude and longitude), arranges the initial imaging position CP1 on this flight course FC, and continues arranging the imaging positions CP based on the horizontal imaging interval.
- the flight path FP may be a flight path that passes through the imaging position CP without changing the horizontal position in the flight course FC before and after the change when changing the altitude to the flight course FC immediately above or directly below.
- the flight path generated by the flight path processing unit 111 may be a flight path that changes from the first altitude, at which the first flight course FC exists, to the second altitude, at which the second flight course FC exists, after passing through each imaging position CP at the first altitude.
- the unmanned aerial vehicle 100 can transition to the next altitude after the imaging at each imaging position CP at the same altitude where the flight posture is stable is completed. Therefore, the unmanned aerial vehicle 100 can capture a desired image with high accuracy while stabilizing the flight.
- by setting the horizontal imaging intervals at equal intervals, the flight path processing unit 111 ensures that the captured images captured at the imaging positions on the same flight course evenly cover the side surface of the subject BL. Therefore, deviations in the horizontal positions of the plurality of captured images used for three-dimensional restoration are suppressed, and the unmanned aerial vehicle 100 can improve the restoration accuracy of the three-dimensional shape data.
- by setting the vertical imaging intervals at equal intervals, the flight path processing unit 111 ensures that the captured images captured at the imaging positions of different flight courses evenly cover the subject BL in the height direction. Therefore, deviations in the vertical positions of the plurality of captured images used for three-dimensional restoration are suppressed, and the flight path generation system 10 and the unmanned aircraft 100 can improve the restoration accuracy of the three-dimensional shape data.
- FIG. 12 is a flowchart showing an operation example of the flight path generation system 10.
- FIG. 12 illustrates the generation of a flight path that gradually lowers the flight altitude.
- the parameter operation unit OPA accepts an input parameter input according to a user (operator of the transmitter 50) instruction.
- the wireless communication unit 63 transmits the input parameter to the unmanned aircraft 100.
- the communication interface 150 receives and acquires input parameters from the transmitter 50, and stores them in the memory 160 (S11).
- the input parameters held in the memory 160 are read from the memory 160 when necessary and referred to by the flight path processing unit 111 or the like.
- the flight path processing unit 111 calculates the imaging position interval based on the input parameters (S12). That is, the flight path processing unit 111 calculates the horizontal imaging interval d forward and the vertical imaging interval d side .
- the flight path processing unit 111 acquires the initial altitude information from the memory 160 (S13).
- the flight path processing unit 111 arranges (sets) the initial imaging position (initial waypoint) in the flight course (for example, the flight course of the initial altitude) of the current imaging position arrangement target (waypoint addition target) (S14).
- the information on the initial imaging position may be included in the input parameter input to the transmitter 50 and acquired from the memory 160.
- the initial imaging position may be determined by the flight path processing unit 111 based on a random number.
- the rotation position is defined as a position on the flight course that is separated from the current imaging position (for example, the initial imaging position) by the length of the horizontal imaging interval in an arbitrary direction (for example, clockwise or counterclockwise), with the current imaging position as the base point.
- the flight path processing unit 111 determines whether the angle (rotation angle) between the initial imaging position and the rotation position is 360 degrees or more with the center position of the subject as a base point (S15). That is, as a result of the rotation, the flight path processing unit 111 determines whether or not the rotation position is a position that has made one or more laps in the flight course to be arranged at the imaging position.
- if the rotation angle is less than 360 degrees (S15: NO), the flight path processing unit 111 additionally arranges (sets) an imaging position at the rotation position on the same flight course as the current imaging position (S16). After the process of S16, the process returns to S15.
- if the rotation angle is 360 degrees or more (S15: YES), the flight path processing unit 111 proceeds to S17 without arranging an imaging position at the rotation position.
- the flight path processing unit 111 then makes a transition to the flight course at the next altitude (S17). In other words, the flight path processing unit 111 sets the flight course at the next altitude as the flight course targeted for imaging position arrangement.
- the flight path processing unit 111 determines whether or not the flight altitude of the flight course after the transition is less than the end altitude (S18).
- the information on the end altitude may be included in the input parameter and may be held in the memory 160.
- if the flight altitude is not less than the end altitude (S18: NO), the imaging positions to be set are within the vertical flight range, and the process returns to S14.
- if the flight altitude is less than the end altitude (S18: YES), further imaging positions would be outside the vertical flight range, so the flight path processing unit 111 ends the additional arrangement of imaging positions. The flight path processing unit 111 then outputs the information on each imaging position arranged in each flight course to the memory 160 and holds it there (S19).
- when the flight path processing unit 111 has determined each imaging position in each flight course, it generates a flight path that passes through each imaging position. The flight path processing unit 111 outputs the generated flight path information to the memory 160 and holds it there. A sketch of this overall procedure is given below.
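- a minimal sketch of the S13-S19 loop of FIG. 12 (gradually lowering the altitude), assuming the hypothetical InputParameters structure and the place_course_waypoints helper sketched earlier, could look as follows; initial-waypoint selection, latitude/longitude conversion, and storage in the memory 160 are omitted.

```python
def generate_flight_path(params, d_forward, d_side):
    """Build an ordered list of imaging positions, course by course, from the
    initial altitude down to the end altitude (S13-S19 in FIG. 12).

    Assumes params.initial_altitude_m and params.end_altitude_m are both set.
    """
    center = (0.0, 0.0)                        # subject center in the local frame
    altitude = params.initial_altitude_m       # S13: start at the initial altitude
    flight_path = []
    while altitude >= params.end_altitude_m:   # S18: stop once below the end altitude
        course = place_course_waypoints(       # S14-S16: one full round of imaging positions
            center, params.flight_radius_m, altitude, d_forward)
        flight_path.extend(course)
        altitude -= d_side                     # S17: move to the flight course at the next lower altitude
    return flight_path                         # S19: ordered imaging positions forming the flight path
```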
- the unmanned aerial vehicle 100 acquires the input parameters input by the transmitter 50.
- the unmanned aircraft 100 can determine the imaging position interval and the imaging position at different altitudes on the sides of the subject BL based on the input parameters.
- the unmanned aerial vehicle 100 can set a flight path that sequentially passes through the imaging positions.
- the unmanned aerial vehicle 100 flies along a flight path and can capture the side surface of the subject BL by capturing an image toward the subject BL at each imaging position, that is, in the horizontal direction.
- according to the flight path generation system 10 and the unmanned aircraft 100, imaging positions can be determined and a flight path can be generated that allow many images of the side surface of the specific subject BL to be acquired, which cannot be obtained simply by flying uniformly over the sky along a fixed path.
- the imager does not have to capture the side surface of the subject BL by holding the imaging device. Therefore, the user who desires to acquire the image of the side surface of the subject BL does not need to move to the periphery of the subject BL and image the subject BL, and the convenience for the user is improved.
- the flight path generation system 10 and the unmanned aircraft 100 have a high possibility of acquiring captured images in a desired state (for example, a desired imaging position of the subject, a desired imaging size of the subject, and a desired imaging direction of the subject).
- the unmanned aircraft 100 can calculate the imaging position and the flight path based on the flight range and the imaging position interval without requiring user input of the position (latitude, longitude, altitude) of the three-dimensional space around the subject. Therefore, user convenience is improved.
- the unmanned aerial vehicle 100 may actually fly according to the generated flight path and capture an image at the determined imaging position. Thereby, since the imaging position determined by the flight path processing unit 111 and the generated flight path are used, an image of the side surface of the subject BL can be acquired easily and with high accuracy.
- the unmanned aerial vehicle 100 may actually fly according to the generated flight path and capture a plurality of images by partially overlapping the imaging range at the determined imaging position. Thereby, the unmanned aerial vehicle 100 can easily and accurately acquire a captured image necessary for three-dimensional restoration.
- the shape data processing unit 112 may generate three-dimensional shape data based on the actually captured images. Thereby, everything from the determination of the imaging positions and the generation of the flight path to the generation of the three-dimensional shape data can be realized with a single system.
- the unmanned aerial vehicle 100 can suppress a shortage of captured images of the side surface of the subject BL, and can improve the three-dimensional shape restoration accuracy.
- in the first embodiment, the unmanned aircraft performs the various processes for generating a flight path (for example, calculation of the imaging position intervals, determination of the imaging positions, and generation of the flight path).
- the second embodiment exemplifies a case where a device other than the unmanned aerial vehicle (for example, a transmitter) performs the various processes for generating a flight path.
- FIG. 13 is a schematic diagram illustrating a configuration example of a flight path generation system 10A according to the second embodiment.
- the flight path generation system 10A includes an unmanned aerial vehicle 100A and a transmitter 50A.
- Unmanned aerial vehicle 100A and transmitter 50A can communicate by wired communication or wireless communication (for example, wireless LAN, Bluetooth (registered trademark)).
- the description of the same matters as in the first embodiment is omitted or simplified.
- FIG. 14 is a block diagram illustrating an example of a hardware configuration of the transmitter 50A.
- the transmitter 50A includes a transmitter control unit 61A instead of the transmitter control unit 61.
- the same components as those of the transmitter 50 of FIG. 6 are denoted by the same reference numerals, and description thereof is omitted or simplified.
- the transmitter control unit 61A includes a function as a flight path processing unit 65 that performs processing related to generation of a flight path in addition to the function of the transmitter control unit 61.
- the transmitter control unit 61A may include a function as a shape data processing unit 66 that performs processing related to generation of three-dimensional shape data.
- the flight path processing unit 65 is the same as the flight path processing unit 111 of the UAV control unit 110 of the unmanned aircraft 100 according to the first embodiment.
- the shape data processing unit 66 is the same as the shape data processing unit 112 of the UAV control unit 110 of the unmanned aerial vehicle 100 in the first embodiment.
- the transmitter controller 61A may not include the shape data processor 66.
- the flight path processing unit 65 receives the input parameter input to the parameter operation unit OPA.
- the flight path processing unit 65 stores the input parameters in the memory 64 as necessary.
- the flight path processing unit 65 reads at least some of the input parameters from the memory 64 as necessary (for example, when calculating the imaging position interval, determining the imaging position, and generating the flight path).
- the memory 64 stores programs and the like necessary for controlling each unit in the transmitter 50A.
- the memory 64 stores the programs necessary for the transmitter control unit 61A to function as the flight path processing unit 65 and the shape data processing unit 66.
- the memory 64 may be a computer-readable recording medium and may include at least one of an SRAM (Static Random Access Memory), a DRAM (Dynamic Random Access Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), and a flash memory such as a USB memory.
- the memory 64 may be provided inside the transmitter 50A, or may be provided so as to be removable from the transmitter 50A.
- the flight path processing unit 65 may acquire (for example, calculate) an imaging position interval, determine an imaging position, generate a flight path, and the like in the same manner as the flight path processing unit 111 of the first embodiment. Detailed description is omitted here.
- the transmitter 50A can perform processing with one device from input of an input parameter by the parameter operation unit OPA to acquisition (for example, calculation) of an imaging position interval, determination of an imaging position, and generation of a flight path. Therefore, since communication does not occur in determining the imaging position and generating the flight path, it is possible to determine the imaging position and generate the flight path regardless of whether the communication environment is good or bad.
- the flight path processing unit 65 transmits information on the determined imaging position and information on the generated flight path to the unmanned aircraft 100A via the wireless communication unit 63.
- the shape data processing unit 66 may receive and acquire a captured image captured by the unmanned aircraft 100A via the wireless communication unit 63. The received captured image may be held in the memory 64.
- the shape data processing unit 66 may generate three-dimensional information (three-dimensional shape data) indicating the three-dimensional shape of an object (subject) based on the acquired plurality of captured images.
- a known method may be used to generate the three-dimensional shape data based on a plurality of captured images. Examples of known methods include MVS (Multi-View Stereo), PMVS (Patch-based Multi-View Stereo), and SfM (Structure from Motion).
- FIG. 15 is a block diagram illustrating an example of a hardware configuration of the unmanned aircraft 100A.
- unmanned aerial vehicle 100A includes UAV control unit 110A instead of UAV control unit 110.
- the UAV control unit 110A does not include the flight path processing unit 111 and the shape data processing unit 112.
- the UAV control unit 110A may include a shape data processing unit 112. That is, the unmanned aircraft 100A may generate three-dimensional shape data based on a plurality of captured images.
- the same components as those of the unmanned aircraft 100 of FIG. 4 are denoted by the same reference numerals, and their description is omitted or simplified.
- the UAV control unit 110A may receive and acquire information on each imaging position and flight path information from the transmitter 50A via the communication interface 150. Information on the imaging position and information on the flight path may be held in the memory 160.
- the UAV control unit 110A controls the flight of the unmanned aircraft 100A based on the imaging position information and the flight path information acquired from the transmitter 50A, and images the side surface of the subject at each imaging position in the flight path. Each captured image may be held in the memory 160.
- the UAV control unit 110A may transmit the captured image captured by the imaging device 220 or 230 to the transmitter 50A via the communication interface 150.
- the parameter operation unit OPA of the transmitter 50A inputs input parameters.
- the flight path processing unit 65 determines an imaging position using the input parameters, and generates a flight path that passes through the imaging position.
- the UAV control unit 110A acquires information on the determined imaging position and information on the generated flight path from the transmitter 50A via the communication interface 150, and stores the information in the memory 160.
- the UAV control unit 110A performs flight control according to the acquired flight path.
- the UAV control unit 110A causes the imaging device 220 or 230 to capture an image (aerial image) at an imaging position (aerial position) (Waypoint) in the flight path.
- the captured image may be used, for example, as one of the images for restoring a three-dimensional shape.
- according to the flight path generation system 10A and the transmitter 50A, imaging positions can be determined and a flight path can be generated that allow many images of the side surface of the specific subject BL to be acquired, which cannot be obtained simply by flying uniformly over the sky along a fixed path.
- the imager does not have to capture the side surface of the subject BL by holding the imaging device. Therefore, the user who desires to acquire the image of the side surface of the subject BL does not need to move to the periphery of the subject BL and image the subject BL, and the convenience for the user is improved.
- the flight path generation system 10A and the transmitter 50A have a high possibility of acquiring captured images in a desired state (for example, a desired imaging position of the subject, a desired imaging size of the subject, and a desired imaging direction of the subject).
- the transmitter 50A can calculate the imaging position and the flight path based on the flight range and the imaging position interval without requiring user input of the position (latitude, longitude, altitude) of the three-dimensional space around the subject. Therefore, user convenience is improved.
- information on the imaging position determined by the transmitter 50A and the generated flight path may be set in the unmanned aircraft 100A.
- the unmanned aircraft 100A may actually fly according to the generated flight path and capture an image at the determined imaging position. Thereby, the unmanned aerial vehicle 100A can easily and highly accurately acquire an image of the side surface of the subject BL.
- the unmanned aircraft 100A may actually fly according to the generated flight path, and capture a plurality of images by partially overlapping the imaging range at the determined imaging position. Thereby, the unmanned aerial vehicle 100A can easily and accurately acquire a captured image necessary for three-dimensional restoration.
- the transmitter 50A may acquire this captured image from the unmanned aircraft 100A and generate three-dimensional shape data.
- in the second embodiment, the transmitter performs the various processes for generating the flight path (for example, calculation of the imaging position intervals, determination of the imaging positions, and generation of the flight path).
- in the third embodiment, a communication terminal other than the transmitter (for example, a PC) performs the various processes for generating a flight path.
- FIG. 16 is a schematic diagram illustrating a configuration example of a flight path generation system 10B according to the third embodiment.
- the flight path generation system 10B includes an unmanned aircraft 100A and a PC 70.
- the unmanned aircraft 100A and the PC 70 can communicate by wired communication or wireless communication (for example, wireless LAN, Bluetooth (registered trademark)).
- the PC 70 may include a communication device, a memory, a processor, an input device, and a display.
- the PC 70 may have the functions of the parameter operation unit OPA and the flight path processing unit 65 provided in the transmitter 50A of the second embodiment.
- the PC 70 may have the function of the shape data processing unit 66 provided in the transmitter 50A.
- the PC 70 may be installed with a program (application) for realizing the flight path generation method.
- according to the flight path generation system 10B and the PC 70, the imaging positions can be determined and the flight path can be generated easily by using the highly versatile PC 70, without using the transmitter 50A.
- in the above embodiments, the unmanned aerial vehicle 100 is exemplified as the flying object, but a manned aircraft carrying a person and flying automatically may be used instead.
- the object serving as the subject is not limited to an object constructed on the ground, and may be, for example, an object constructed on the sea.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Aviation & Aerospace Engineering (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Studio Devices (AREA)
Abstract
Description
FIG. 1 is a schematic diagram showing a configuration example of the flight path generation system 10 according to the first embodiment. The flight path generation system 10 includes the unmanned aerial vehicle 100 and the transmitter 50. The unmanned aerial vehicle 100 and the transmitter 50 can communicate by wired communication or wireless communication (for example, wireless LAN (Local Area Network) or Bluetooth (registered trademark)).
R_flight: radius of the flight path
R_obj: radius of the subject BL (radius of the approximate circle representing the subject BL)
FOV1 (Field of View): horizontal angle of view of the imaging device 220 or the imaging device 230
r_forward: horizontal overlap rate
FOV1 = ph1 / (R_flight − R_obj)
Accordingly, the flight path processing unit 111 calculates (R_flight − R_obj) * FOV1 = ph1, which is a part of (Equation 1). As is apparent from this expression, the angle of view FOV (here FOV1) is expressed as a ratio of lengths (distances). The asterisk "*" denotes multiplication.
FOV2 (Field of View): vertical angle of view of the imaging device 220 or the imaging device 230
r_side: vertical overlap rate
FIG. 12 is a flowchart showing an operation example of the flight path generation system 10. FIG. 12 illustrates the generation of a flight path that gradually lowers the flight altitude.
The first embodiment exemplified the case where the unmanned aerial vehicle performs the various processes for generating a flight path (for example, calculation of the imaging position intervals, determination of the imaging positions, and generation of the flight path). The second embodiment exemplifies a case where a device other than the unmanned aerial vehicle (for example, a transmitter) performs the various processes for generating a flight path.
The second embodiment exemplified the case where the transmitter performs the various processes for generating a flight path (for example, calculation of the imaging position intervals, determination of the imaging positions, and generation of the flight path). In the third embodiment, a communication terminal other than the transmitter (for example, a PC) performs the various processes for generating a flight path.
50, 50A Transmitter
50B Housing
53L Left control rod
53R Right control rod
61 Transmitter control unit
63 Wireless communication unit
64 Memory
65 Flight path processing unit
66 Shape data processing unit
70 PC
100, 100A Unmanned aerial vehicle
102 UAV body
110, 110A UAV control unit
111 Flight path processing unit
112 Shape data processing unit
150 Communication interface
160 Memory
200 Gimbal
210 Rotary wing mechanism
220, 230 Imaging device
240 GPS receiver
250 Inertial measurement unit
260 Magnetic compass
270 Barometric altimeter
AN1, AN2 Antenna
B1 Power button
B2 RTH button
BL Subject
CP Imaging position
CP1 Initial imaging position
FP Flight path
FC Flight course
L1 Remote status display unit
L2 Remaining battery amount display unit
OPA Parameter operation unit
OPS Operation unit set
Claims (36)
- 1. A flight path generation method for a flying object that orbits the side of a subject and images the subject, the method comprising: determining imaging positions of the subject by the flying object based on a flight range of the flying object and an imaging position interval at which the subject is imaged by the flying object; and generating a flight path of the flying object that passes through the imaging positions.
- 2. The flight path generation method according to claim 1, wherein the imaging position interval includes a first imaging position interval that is an interval between imaging positions of the subject at the same altitude.
- 3. The flight path generation method according to claim 2, further comprising determining the first imaging position interval based on at least a radius of the subject, a radius of the flight range, an angle of view of an imaging unit included in the flying object, and a first overlap rate that is an overlap rate of imaging ranges imaged by the flying object at adjacent imaging positions.
- 4. The flight path generation method according to claim 2 or 3, wherein the first imaging position intervals in the flight path are equal intervals.
- 5. The flight path generation method according to claim 1, wherein the imaging position interval includes a second imaging position interval that is an interval between imaging altitudes at which the subject is imaged by the flying object.
- 6. The flight path generation method according to claim 5, further comprising determining the second imaging position interval based on at least a radius of the subject, a radius of the flight range, an angle of view of an imaging unit included in the flying object, and a second overlap rate that is an overlap rate of imaging ranges imaged by the flying object at adjacent imaging altitudes.
- 7. The flight path generation method according to claim 5 or 6, wherein the second imaging position intervals in the flight path are equal intervals.
- 8. The flight path generation method according to any one of claims 1 to 7, wherein the flight path changes from a first altitude to a second altitude after the flying object has passed through each imaging position at the first altitude.
- 9. The flight path generation method according to any one of claims 1 to 8, further comprising imaging a side surface of the subject with the flying object at each imaging position in the flight path to acquire a plurality of captured images.
- 10. The flight path generation method according to any one of claims 1 to 8, further comprising imaging a side surface of the subject with the flying object at each imaging position in the flight path while partially overlapping the imaging ranges, to acquire a plurality of captured images.
- 11. The flight path generation method according to claim 10, further comprising generating three-dimensional shape data of the subject based on the plurality of captured images.
- 12. A flight path generation system that generates a flight path of a flying object that orbits the side of a subject and images the subject, the system comprising a processing unit that determines imaging positions of the subject by the flying object based on a flight range of the flying object and an imaging position interval at which the subject is imaged by the flying object, and generates a flight path of the flying object that passes through the imaging positions.
- 13. The flight path generation system according to claim 12, wherein the imaging position interval includes a first imaging position interval that is an interval between imaging positions of the subject at the same altitude.
- 14. The flight path generation system according to claim 13, wherein the processing unit determines the first imaging position interval based on at least a radius of the subject, a radius of the flight range, an angle of view of an imaging unit included in the flying object, and a first overlap rate that is an overlap rate of imaging ranges imaged by the flying object at adjacent imaging positions.
- 15. The flight path generation system according to claim 13 or 14, wherein the first imaging position intervals in the flight path are equal intervals.
- 16. The flight path generation system according to claim 12, wherein the imaging position interval includes a second imaging position interval that is an interval between imaging altitudes at which the subject is imaged by the flying object.
- 17. The flight path generation system according to claim 16, wherein the processing unit determines the second imaging position interval based on at least a radius of the subject, a radius of the flight range, an angle of view of an imaging unit included in the flying object, and a second overlap rate that is an overlap rate of imaging ranges imaged by the flying object at adjacent imaging altitudes.
- 18. The flight path generation system according to claim 16 or 17, wherein the second imaging position intervals in the flight path are equal intervals.
- 19. The flight path generation system according to any one of claims 12 to 18, wherein the flight path changes from a first altitude to a second altitude after the flying object has passed through each imaging position at the first altitude.
- 20. The flight path generation system according to any one of claims 12 to 19, further comprising an imaging unit that images a side surface of the subject from the flying object at each imaging position in the flight path and acquires a plurality of captured images.
- 21. The flight path generation system according to any one of claims 12 to 19, further comprising an imaging unit that images a side surface of the subject from the flying object at each imaging position in the flight path while partially overlapping the imaging ranges, and acquires a plurality of captured images.
- 22. The flight path generation system according to claim 21, wherein the processing unit generates three-dimensional shape data of the subject based on the plurality of captured images.
- 23. A flying object that orbits the side of a subject and images the subject, the flying object comprising a processing unit that determines imaging positions of the subject based on a flight range of the flying object and an imaging position interval at which the subject is imaged, and generates a flight path of the flying object that passes through the imaging positions.
- 24. The flying object according to claim 23, wherein the imaging position interval includes a first imaging position interval that is an interval between imaging positions of the subject at the same altitude.
- 25. The flying object according to claim 24, wherein the processing unit determines the first imaging position interval based on at least a radius of the subject, a radius of the flight range, an angle of view of an imaging unit included in the flying object, and a first overlap rate that is an overlap rate of imaging ranges imaged by the flying object at adjacent imaging positions.
- 26. The flying object according to claim 24 or 25, wherein the first imaging position intervals in the flight path are equal intervals.
- 27. The flying object according to claim 23, wherein the imaging position interval includes a second imaging position interval that is an interval between imaging altitudes at which the subject is imaged by the flying object.
- 28. The flying object according to claim 27, wherein the processing unit determines the second imaging position interval based on at least a radius of the subject, a radius of the flight range, an angle of view of an imaging unit included in the flying object, and a second overlap rate that is an overlap rate of imaging ranges imaged by the flying object at adjacent imaging altitudes.
- 29. The flying object according to claim 27 or 28, wherein the second imaging position intervals in the flight path are equal intervals.
- 30. The flying object according to any one of claims 23 to 29, wherein the flight path changes from a first altitude to a second altitude after the flying object has passed through each imaging position at the first altitude.
- 31. The flying object according to any one of claims 23 to 30, further comprising an imaging unit that images a side surface of the subject at each imaging position in the flight path and acquires a plurality of captured images.
- 32. The flying object according to any one of claims 23 to 30, further comprising an imaging unit that images a side surface of the subject at each imaging position in the flight path while partially overlapping the imaging ranges, and acquires a plurality of captured images.
- 33. The flying object according to claim 32, wherein the processing unit generates three-dimensional shape data of the subject based on the plurality of captured images.
- 34. The flying object according to any one of claims 23 to 33, wherein the processing unit acquires a parameter including at least one of information on the radius of the subject, information on the radius of the flight range, information on a first overlap rate that is an overlap rate of imaging ranges imaged by the flying object at adjacent imaging positions, and information on a second overlap rate that is an overlap rate of imaging ranges imaged by the flying object at adjacent imaging altitudes.
- 35. A program for causing a computer that generates a flight path of a flying object that orbits the side of a subject and images the subject to execute: a procedure of determining imaging positions of the subject by the flying object based on a flight range of the flying object and an imaging position interval at which the subject is imaged by the flying object; and a procedure of generating a flight path of the flying object that passes through the imaging positions.
- 36. A computer-readable recording medium recording a program for causing a computer that generates a flight path of a flying object that orbits the side of a subject and images the subject to execute: a procedure of determining imaging positions of the subject by the flying object based on a flight range of the flying object and an imaging position interval at which the subject is imaged by the flying object; and a procedure of generating a flight path of the flying object that passes through the imaging positions.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/080752 WO2018073879A1 (ja) | 2016-10-17 | 2016-10-17 | 飛行経路生成方法、飛行経路生成システム、飛行体、プログラム、及び記録媒体 |
JP2018545736A JP6803919B2 (ja) | 2016-10-17 | 2016-10-17 | 飛行経路生成方法、飛行経路生成システム、飛行体、プログラム、及び記録媒体 |
CN201680090118.5A CN109952755B (zh) | 2016-10-17 | 2016-10-17 | 飞行路径生成方法、飞行路径生成系统、飞行体以及记录介质 |
US16/385,501 US11377211B2 (en) | 2016-10-17 | 2019-04-16 | Flight path generation method, flight path generation system, flight vehicle, program, and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/080752 WO2018073879A1 (ja) | 2016-10-17 | 2016-10-17 | 飛行経路生成方法、飛行経路生成システム、飛行体、プログラム、及び記録媒体 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/385,501 Continuation US11377211B2 (en) | 2016-10-17 | 2019-04-16 | Flight path generation method, flight path generation system, flight vehicle, program, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018073879A1 true WO2018073879A1 (ja) | 2018-04-26 |
Family
ID=62019295
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/080752 WO2018073879A1 (ja) | 2016-10-17 | 2016-10-17 | 飛行経路生成方法、飛行経路生成システム、飛行体、プログラム、及び記録媒体 |
Country Status (4)
Country | Link |
---|---|
US (1) | US11377211B2 (ja) |
JP (1) | JP6803919B2 (ja) |
CN (1) | CN109952755B (ja) |
WO (1) | WO2018073879A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020036163A (ja) * | 2018-08-29 | 2020-03-05 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | 情報処理装置、撮影制御方法、プログラム及び記録媒体 |
JP2020072366A (ja) * | 2018-10-31 | 2020-05-07 | 株式会社センシンロボティクス | 撮像システム及び撮像方法 |
JP2020070006A (ja) * | 2019-05-16 | 2020-05-07 | 株式会社センシンロボティクス | 撮像システム及び撮像方法 |
KR20200058079A (ko) * | 2018-11-19 | 2020-05-27 | 네이버시스템(주) | 3차원 모델링 및 정사영상을 생성하기 위한 항공 촬영 장치 및 방법 |
CN111344650A (zh) * | 2018-09-13 | 2020-06-26 | 深圳市大疆创新科技有限公司 | 信息处理装置、飞行路径生成方法、程序以及记录介质 |
JP2021093592A (ja) * | 2019-12-09 | 2021-06-17 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | 画像処理装置、画像処理方法、プログラム、及び記録媒体 |
CN113206958A (zh) * | 2021-04-30 | 2021-08-03 | 成都睿铂科技有限责任公司 | 一种航线拍摄方法 |
US20220392079A1 (en) * | 2019-10-25 | 2022-12-08 | Sony Group Corporation | Information processing apparatus, information processing method, program, and flight object |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109564434B (zh) * | 2016-08-05 | 2023-07-25 | 深圳市大疆创新科技有限公司 | 用于定位可移动物体的系统和方法 |
KR20180056068A (ko) * | 2016-11-18 | 2018-05-28 | 삼성전자주식회사 | 무인 비행체를 제어하기 위한 전자 장치 및 방법 |
WO2021016907A1 (zh) * | 2019-07-31 | 2021-02-04 | 深圳市大疆创新科技有限公司 | 确定环绕航线的方法、航拍方法、终端、无人飞行器及系统 |
CN112106006A (zh) * | 2019-08-30 | 2020-12-18 | 深圳市大疆创新科技有限公司 | 无人飞行器的控制方法、装置及计算机可读存储介质 |
JP2021094890A (ja) * | 2019-12-13 | 2021-06-24 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | 飛行体、及び制御方法 |
CN111444441B (zh) * | 2020-03-12 | 2024-03-29 | 维沃移动通信有限公司 | 信息提示方法、电子设备及存储介质 |
CN111891356A (zh) * | 2020-08-17 | 2020-11-06 | 成都市玄上科技有限公司 | 一种无人机无头自旋飞行倾斜摄影航拍方法 |
CN112182278B (zh) * | 2020-09-23 | 2023-11-17 | 深圳市超时空探索科技有限公司 | 一种打卡足迹的模拟方法和装置 |
CN112261588B (zh) * | 2020-10-19 | 2022-07-05 | 中国科学院合肥物质科学研究院 | 有人车引导的多无人车自适应编队组网方法、系统及设备 |
KR102254961B1 (ko) * | 2021-03-08 | 2021-05-24 | 주식회사 지오멕스소프트 | 무인항공기를 이용하여 3차원 모델링 효율 향상을 위한 측면촬영 기법을 포함하는 무인항공기 사전비행 시뮬레이터 시스템 |
WO2022205210A1 (zh) * | 2021-03-31 | 2022-10-06 | 深圳市大疆创新科技有限公司 | 拍摄方法、装置及计算机可读存储介质,终端设备 |
JPWO2023089983A1 (ja) | 2021-11-19 | 2023-05-25 | ||
CN118285111A (zh) | 2021-11-19 | 2024-07-02 | 富士胶片株式会社 | 移动体、移动体摄影系统及移动体摄影方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014139538A (ja) * | 2013-01-21 | 2014-07-31 | Mitsubishi Heavy Ind Ltd | 地形情報取得装置、地形情報取得システム、地形情報取得方法及びプログラム |
JP2014185947A (ja) * | 2013-03-25 | 2014-10-02 | Geo Technical Laboratory Co Ltd | 3次元復元のための画像撮影方法 |
WO2015163012A1 (ja) * | 2014-04-25 | 2015-10-29 | ソニー株式会社 | 情報処理装置、情報処理方法、プログラムおよび撮像システム |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3410391B2 (ja) * | 1999-05-31 | 2003-05-26 | 株式会社エヌ・ティ・ティ・データ | 遠隔撮影システム、撮影指示装置、撮影装置、情報表示装置及び遠隔撮影方法 |
BRPI0519214A2 (pt) * | 2005-11-15 | 2009-01-06 | Bell Helicopter Textron Inc | sistema e mÉtodo de controle de vâo e aeronave |
US7970532B2 (en) * | 2007-05-24 | 2011-06-28 | Honeywell International Inc. | Flight path planning to reduce detection of an unmanned aerial vehicle |
JP4988673B2 (ja) | 2008-09-01 | 2012-08-01 | 株式会社日立製作所 | 撮影計画作成システム |
US20100286859A1 (en) * | 2008-11-18 | 2010-11-11 | Honeywell International Inc. | Methods for generating a flight plan for an unmanned aerial vehicle based on a predicted camera path |
EP2511656A1 (de) * | 2011-04-14 | 2012-10-17 | Hexagon Technology Center GmbH | Vermessungssystem zur Bestimmung von 3D-Koordinaten einer Objektoberfläche |
US9612598B2 (en) * | 2014-01-10 | 2017-04-04 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US11157021B2 (en) * | 2014-10-17 | 2021-10-26 | Tyco Fire & Security Gmbh | Drone tours in security systems |
KR101740312B1 (ko) * | 2015-01-09 | 2017-06-09 | 주식회사 대한항공 | 무인 항공기의 카메라 조종정보를 이용한 무인 항공기 유도제어 방법 |
WO2016140985A1 (en) * | 2015-03-02 | 2016-09-09 | Izak Van Cruyningen | Flight planning for unmanned aerial tower inspection |
-
2016
- 2016-10-17 JP JP2018545736A patent/JP6803919B2/ja active Active
- 2016-10-17 CN CN201680090118.5A patent/CN109952755B/zh active Active
- 2016-10-17 WO PCT/JP2016/080752 patent/WO2018073879A1/ja active Application Filing
-
2019
- 2019-04-16 US US16/385,501 patent/US11377211B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014139538A (ja) * | 2013-01-21 | 2014-07-31 | Mitsubishi Heavy Ind Ltd | 地形情報取得装置、地形情報取得システム、地形情報取得方法及びプログラム |
JP2014185947A (ja) * | 2013-03-25 | 2014-10-02 | Geo Technical Laboratory Co Ltd | 3次元復元のための画像撮影方法 |
WO2015163012A1 (ja) * | 2014-04-25 | 2015-10-29 | ソニー株式会社 | 情報処理装置、情報処理方法、プログラムおよび撮像システム |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020036163A (ja) * | 2018-08-29 | 2020-03-05 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | 情報処理装置、撮影制御方法、プログラム及び記録媒体 |
CN111344650B (zh) * | 2018-09-13 | 2024-04-16 | 深圳市大疆创新科技有限公司 | 信息处理装置、飞行路径生成方法、程序以及记录介质 |
CN111344650A (zh) * | 2018-09-13 | 2020-06-26 | 深圳市大疆创新科技有限公司 | 信息处理装置、飞行路径生成方法、程序以及记录介质 |
JP2020072366A (ja) * | 2018-10-31 | 2020-05-07 | 株式会社センシンロボティクス | 撮像システム及び撮像方法 |
WO2020090406A1 (ja) * | 2018-10-31 | 2020-05-07 | 株式会社センシンロボティクス | 撮像システム及び撮像方法 |
KR20200058079A (ko) * | 2018-11-19 | 2020-05-27 | 네이버시스템(주) | 3차원 모델링 및 정사영상을 생성하기 위한 항공 촬영 장치 및 방법 |
KR102117641B1 (ko) * | 2018-11-19 | 2020-06-02 | 네이버시스템(주) | 3차원 모델링 및 정사영상을 생성하기 위한 항공 촬영 장치 및 방법 |
JP2020070006A (ja) * | 2019-05-16 | 2020-05-07 | 株式会社センシンロボティクス | 撮像システム及び撮像方法 |
US20220392079A1 (en) * | 2019-10-25 | 2022-12-08 | Sony Group Corporation | Information processing apparatus, information processing method, program, and flight object |
US11854210B2 (en) * | 2019-10-25 | 2023-12-26 | Sony Group Corporation | Information processing apparatus, information processing method, program, and flight object |
JP6997164B2 (ja) | 2019-12-09 | 2022-01-17 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッド | 画像処理装置、画像処理方法、プログラム、及び記録媒体 |
JP2021093592A (ja) * | 2019-12-09 | 2021-06-17 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | 画像処理装置、画像処理方法、プログラム、及び記録媒体 |
CN113206958A (zh) * | 2021-04-30 | 2021-08-03 | 成都睿铂科技有限责任公司 | 一种航线拍摄方法 |
CN113206958B (zh) * | 2021-04-30 | 2023-06-09 | 成都睿铂科技有限责任公司 | 一种航线拍摄方法 |
Also Published As
Publication number | Publication date |
---|---|
CN109952755B (zh) | 2021-08-20 |
JPWO2018073879A1 (ja) | 2019-09-26 |
US20190241263A1 (en) | 2019-08-08 |
CN109952755A (zh) | 2019-06-28 |
US11377211B2 (en) | 2022-07-05 |
JP6803919B2 (ja) | 2020-12-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018073879A1 (ja) | 飛行経路生成方法、飛行経路生成システム、飛行体、プログラム、及び記録媒体 | |
JP6878567B2 (ja) | 3次元形状推定方法、飛行体、モバイルプラットフォーム、プログラム及び記録媒体 | |
JP6765512B2 (ja) | 飛行経路生成方法、情報処理装置、飛行経路生成システム、プログラム及び記録媒体 | |
US20190318636A1 (en) | Flight route display method, mobile platform, flight system, recording medium and program | |
CN105045279A (zh) | 一种利用无人飞行器航拍自动生成全景照片的系统及方法 | |
CN110249281B (zh) | 位置处理装置、飞行体、及飞行系统 | |
US11122209B2 (en) | Three-dimensional shape estimation method, three-dimensional shape estimation system, flying object, program and recording medium | |
CN113875222B (zh) | 拍摄控制方法和装置、无人机及计算机可读存储介质 | |
JP6788094B2 (ja) | 画像表示方法、画像表示システム、飛行体、プログラム、及び記録媒体 | |
WO2020048365A1 (zh) | 飞行器的飞行控制方法、装置、终端设备及飞行控制系统 | |
CN111213107B (zh) | 信息处理装置、拍摄控制方法、程序以及记录介质 | |
JP6329219B2 (ja) | 操作端末、及び移動体 | |
CN110785724B (zh) | 发送器、飞行体、飞行控制指示方法、飞行控制方法、程序及存储介质 | |
WO2018138882A1 (ja) | 飛行体、動作制御方法、動作制御システム、プログラム及び記録媒体 | |
WO2023097494A1 (zh) | 全景图像拍摄方法、装置、无人机、系统及存储介质 | |
JP6997170B2 (ja) | 形状生成方法、画像取得方法、モバイルプラットフォーム、飛行体、プログラム及び記録媒体 | |
WO2018188086A1 (zh) | 无人机及其控制方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16919438 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2018545736 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 31.07.2019) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16919438 Country of ref document: EP Kind code of ref document: A1 |