WO2018214401A1 - Mobile platform, flying body, support device, portable terminal, imaging assistance method, program, and recording medium

Mobile platform, flying body, support device, portable terminal, imaging assistance method, program, and recording medium

Info

Publication number
WO2018214401A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
image
composition
subject
imaging
Prior art date
Application number
PCT/CN2017/108413
Other languages
English (en)
French (fr)
Inventor
周杰旻
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201780064135.6A (published as CN109863745A)
Publication of WO2018214401A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 - Type of UAV
    • B64U10/10 - Rotorcrafts
    • B64U10/13 - Flying platforms
    • B64U10/14 - Flying platforms with four distinct rotor axes, e.g. quadcopters
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C - AEROPLANES; HELICOPTERS
    • B64C13/00 - Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02 - Initiating means
    • B64C13/16 - Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/20 - Initiating means actuated automatically, e.g. responsive to gust detectors using radiated signals
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 - Special procedures for taking photographs; Apparatus therefor
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00 - Details of cameras or camera bodies; Accessories therefor
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00 - Details of cameras or camera bodies; Accessories therefor
    • G03B17/56 - Accessories
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/222 - Studio circuitry; Studio devices; Studio equipment
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00 - Constructional aspects of UAVs
    • B64U20/80 - Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87 - Mounting of imaging devices, e.g. mounting of gimbals

Definitions

  • the present disclosure relates to a mobile platform, a flying body, a supporting device, a portable terminal, a camera assisting method, a program, and a recording medium that support imaging of an image.
  • a virtual camera system uses a proven camera technique to calculate an image capture mode to visualize a computer generated plot, reproduce the event, and dynamically update the camera view.
  • This virtual camera system analyzes the camera configuration position for displaying the subject at a desired angle, distance, and with minimal occlusion.
  • This virtual camera system is mainly used in a 3D game camera system, and three-dimensional image data representing arbitrary virtual three-dimensional space is prepared in advance.
  • This virtual camera system can represent an area viewed from a particular viewpoint in a virtual three-dimensional space to form an arbitrary composition.
  • For this virtual camera system, compositions can be prepared in advance, for example a restricted composition, a composition based on the rule of thirds to which one subject has been added, a composition based on the rule of thirds to which all elements have been added, and a balanced composition.
  • a part of the three-dimensional image data is displayed on the display (see Non-Patent Document 1).
  • Non-Patent Document 1: William Bares, "A Photographic Composition Assistant for Intelligent Virtual 3D Camera Systems", Millsaps College, Department of Computer Science, Jackson MS 39210, USA, Internet <URL: http://link.springer.com/chapter/10.1007/11795018_16>
  • When the virtual camera system described in Non-Patent Document 1 is applied to a camera system that processes images captured in real space, the camera system composes images based on three-dimensional image data captured in advance. Therefore, the camera system cannot determine the composition of an image that has not yet been captured. As a result, for example, when a user photographing a subject is not familiar with imaging, it is difficult to capture a desired image attractively.
  • a mobile platform is a mobile platform that assists imaging of a second image by an imaging device, and includes: an image acquisition unit that acquires a first image; an information acquisition unit that acquires information of a first subject from one or more subjects included in the first image, and acquires information of a first composition from one or more compositions specifying the position, in the second image, of one or more subjects including the first subject; and a generating unit that generates, according to the first composition, motion information related to an action of the imaging device for capturing the second image.
  • the information acquisition section may select and acquire the first subject from the plurality of subjects included in the first image.
  • the information acquisition unit may acquire information of the first subject based on the color component of the subject included in the first image.
  • the information acquisition section may acquire information of the first subject based on the spatial frequency of the subject included in the first image.
  • the information acquisition section may acquire location information of the imaging apparatus, and acquire information of the first subject based on the location information of the imaging apparatus.
  • the information acquisition unit can acquire information of the first subject based on an imaging mode at the time of imaging of the second image by the imaging device.
  • the information acquisition section may select and acquire the first composition from the plurality of compositions.
  • the mobile platform may further include an identification portion for identifying the shape of the first subject.
  • the information acquisition section may acquire information of the first composition according to the shape of the first subject.
  • the mobile platform may further include an identification portion for identifying a scene when the second image is imaged.
  • the information acquisition unit may acquire the information of the first composition according to the scene.
  • the generating unit may generate, as the motion information, rotation information related to the rotation of the support member that rotatably supports the imaging device.
  • the generating portion may determine the amount of rotation and the direction of rotation of the support member based on the position of the first subject in the first image and the position of the first subject in the first composition.
  • the generating unit can generate movement information related to the movement of the imaging device as the motion information.
  • the generating portion may determine the amount of movement of the image pickup device in the gravity direction according to the size of the first subject in the first image and the size of the first subject in the first composition.
  • the generating portion may determine the amount of movement and the direction of movement of the imaging device based on the position of the first subject in the first image, the position of the first subject in the first composition, and the correspondence relationship between a moving distance in the first image and a moving distance in real space.
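  • as a minimal sketch of how such amounts might be derived (a pinhole-camera, small-angle assumption made here; not the patent's formula), the rotation of the support member could follow from the pixel offset between the two positions, and the change of imaging distance from the size ratio:

```python
def gimbal_rotation(subject_px, target_px, image_size, fov_deg):
    """Yaw/pitch rotation (degrees) that would move the subject from its current
    pixel position to the target position given by the first composition.
    Small-angle pinhole approximation: a pixel offset maps linearly onto the
    angle of view (a hypothetical helper, not the patent's exact formula)."""
    (w, h), (fov_h, fov_v) = image_size, fov_deg
    dx = target_px[0] - subject_px[0]   # +x: subject should appear further right
    dy = target_px[1] - subject_px[1]   # +y: subject should appear further down
    yaw = -dx / w * fov_h               # camera turns opposite to the desired image shift
    pitch = dy / h * fov_v
    return yaw, pitch

def distance_change(subject_size_px, target_size_px, distance_m):
    """Change of camera-to-subject distance so that the subject's apparent size
    matches the composition; apparent size scales roughly inversely with distance
    (negative result: move closer / descend)."""
    return distance_m / (target_size_px / subject_size_px) - distance_m

# Example: move a subject at (1500, 300) to (1280, 360) in a 1920x1080 frame
# with an assumed 84 x 53 degree field of view.
print(gimbal_rotation((1500, 300), (1280, 360), (1920, 1080), (84, 53)))
```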
  • a prompt portion may also be included for prompting action information.
  • the first image may be an image captured by the imaging device.
  • the mobile platform may be a flying body including an image pickup device and a support member that rotatably supports the image pickup device, and further includes a control portion that controls the flight of the flying body or the rotation of the support member according to the motion information.
  • the mobile platform may be a support device that is held by a user at the time of use and includes a support member that rotatably supports the image pickup device, and further includes a control portion that controls rotation of the support member according to the action information.
  • the mobile platform may be a portable terminal, and also includes a communication portion that transmits action information to the flying body or the supporting device.
  • a flying body includes: an image pickup device; a support member rotatably supporting the image pickup device; an action information acquisition portion that acquires motion information generated by the mobile platform; and a control portion that is based on the action information To control the flight of the flying body or the rotation of the support components.
  • a support device includes: a support member rotatably supporting an image pickup device; an action information acquisition portion that acquires action information generated by the mobile platform; and a control portion that controls support according to the action information The rotation of the part.
  • a camera assisting method is an imaging assistance method in a mobile platform that assists imaging of a second image by an imaging device, and includes the following steps: a step of acquiring a first image; a step of acquiring information of a first subject from one or more subjects included in the first image; a step of acquiring information of a first composition from one or more compositions specifying the position, in the second image, of one or more subjects including the first subject; and a step of generating, according to the first composition, motion information related to an action of the imaging device for capturing the second image.
  • the step of acquiring the information of the first subject may include the step of selecting and acquiring the first subject from the plurality of subjects included in the first image.
  • the step of acquiring the information of the first subject may include the step of acquiring information of the first subject based on the color component of the subject included in the first image.
  • the step of acquiring the information of the first subject may include the step of acquiring information of the first subject based on the spatial frequency of the subject included in the first image.
  • the camera assisting method may further include the step of acquiring position information of the camera.
  • the step of acquiring the information of the first subject may include the step of acquiring information of the first subject based on the position information of the imaging apparatus.
  • the step of acquiring the information of the first subject may include the step of acquiring information of the first subject in accordance with an imaging mode at the time of imaging of the second image by the imaging device.
  • the step of acquiring the information of the first composition may include the step of selecting and acquiring the first composition from the plurality of compositions.
  • the image assisting method may further include the step of identifying the shape of the first subject.
  • the step of acquiring the information of the first composition may include the step of acquiring the information of the first composition according to the shape of the first subject.
  • the image assisting method may further include the step of identifying a scene when the second image is imaged.
  • the step of acquiring the information of the first composition may include the step of acquiring information of the first composition according to the scene.
  • the step of generating the action information may include the step of generating rotation information related to the rotation of the support member rotatably supporting the image pickup device as the action information.
  • the step of generating the motion information may include the step of determining the amount of rotation and the direction of rotation of the support member based on the position of the first subject in the first image and the position of the first subject in the first composition.
  • the step of generating the action information may include the step of generating the movement information related to the movement of the camera device as the action information.
  • the step of generating the motion information may include the step of determining the amount of movement of the image pickup device in the gravity direction according to the size of the first subject in the first image and the size of the first subject in the first composition.
  • the step of generating the motion information may include the step of determining the amount of movement and the direction of movement of the imaging device based on the position of the first subject in the first image, the position of the first subject in the first composition, and the correspondence relationship between a moving distance in the first image and a moving distance in real space.
  • the image capturing assistance method may further include the step of prompting the action information at the prompting portion.
  • the first image may be an image captured by the imaging device.
  • the mobile platform may be a flying body including an imaging device and a support member that rotatably supports the imaging device.
  • the camera assist method may further include the step of controlling the flight of the flying body or the rotation of the support member based on the motion information.
  • the mobile platform may be a support device that is held by the user in use and includes a support member that rotatably supports the camera.
  • the camera assisting method may further include the step of controlling the rotation of the support member based on the action information.
  • the mobile platform can be a portable terminal.
  • the camera assisting method may further comprise the step of transmitting the action information to the flying body or the supporting device.
  • a program is a program for causing a mobile platform that assists imaging of a second image by an imaging device to perform the following steps: a step of acquiring a first image; a step of acquiring information of a first subject among one or more subjects included in the first image; a step of acquiring information of a first composition from one or more compositions specifying the position, in the second image, of one or more subjects including the first subject; and a step of generating, according to the first composition, motion information related to an action of the imaging device for capturing the second image.
  • a recording medium is a computer-readable recording medium on which is recorded a program for causing a mobile platform that assists imaging of a second image by an imaging device to perform the following steps: a step of acquiring a first image; a step of acquiring information of a first subject among one or more subjects included in the first image; a step of acquiring information of a first composition from one or more compositions specifying the position, in the second image, of one or more subjects including the first subject; and a step of generating, according to the first composition, motion information related to an action of the imaging device for capturing the second image.
  • FIG. 1 is a schematic diagram showing a configuration example of an imaging assistance system in the first embodiment.
  • FIG. 2 is a block diagram showing one example of a hardware configuration of an unmanned aerial vehicle in the first embodiment.
  • FIG. 3 is a block diagram showing an example of a functional configuration of a UAV control unit in the first embodiment.
  • FIG. 4 is a block diagram showing one example of a hardware configuration of the portable terminal in the first embodiment.
  • FIG. 5 is a view for explaining an outline of an operation of the imaging assistance system.
  • FIG. 6A is a diagram showing one example of a live view image.
  • FIG. 6B is a diagram showing one example of a color division image that divides a live view image by color.
  • FIG. 6C is a diagram showing a selection example of the main subject.
  • FIG. 7 is a diagram showing a selection example of composition.
  • FIG. 8A is a diagram showing an example of rotation of an imaging range for aerial photography with the determined composition.
  • FIG. 8B is a diagram showing an example of movement of an unmanned aerial vehicle for aerial photography with the determined composition.
  • Fig. 8C is a view for explaining the movement of the unmanned aerial vehicle viewed from the horizontal direction.
  • FIG. 9 is a flowchart showing an operation example of the imaging assistance system in the first embodiment.
  • FIG. 10 is a schematic diagram showing a configuration example of the imaging assistance system in the second embodiment.
  • Fig. 11 is a block diagram showing one example of a hardware configuration of an unmanned aerial vehicle in the second embodiment.
  • FIG. 12 is a block diagram showing an example of a functional configuration of a UAV control unit in the second embodiment.
  • FIG. 13 is a block diagram showing one example of a hardware configuration of the portable terminal in the second embodiment.
  • FIG. 14 is a block diagram showing an example of a functional configuration of a terminal control unit in the second embodiment.
  • FIG. 15 is a flowchart showing an operation example of the imaging assistance system in the second embodiment.
  • FIG. 16 is a perspective view showing a configuration example of an imaging assistance system including a gimbal device and a portable terminal in the third embodiment.
  • Fig. 17A is a front perspective view showing a configuration example of the gimbal device in the fourth embodiment.
  • FIG. 17B is a rear perspective view showing a configuration example of an imaging assistance system including a gimbal device and a portable terminal in the fourth embodiment.
  • the aircraft is exemplified by an unmanned aerial vehicle (UAV).
  • the flying body includes an aircraft that moves in the air.
  • in the drawings, the unmanned aerial vehicle is labeled "UAV".
  • the mobile platform is exemplified by a flying body, a portable terminal, a gimbal device, and a gimbal camera device.
  • the mobile platform can also be other devices, such as a transmitter, a PC (Personal Computer), or other mobile platform.
  • the camera assist method specifies the actions in the mobile platform.
  • a program (for example, a program for causing the mobile platform to perform various processes) is recorded in the recording medium.
  • FIG. 1 is a schematic diagram showing a configuration example of the imaging assistance system 10 in the first embodiment.
  • the camera assistance system 10 includes an unmanned aerial vehicle 100, a transmitter 50, and a portable terminal 80.
  • the UAV 100, the transmitter 50, and the portable terminal 80 can communicate with each other by wired communication or wireless communication such as a wireless LAN (Local Area Network).
  • the UAV 100 can fly in accordance with a remote operation by the transmitter 50 or in accordance with a predetermined flight path.
  • the unmanned aerial vehicle 100 determines a composition for aerial photography and generates motion information of the unmanned aerial vehicle 100 to become the determined composition.
  • the unmanned aerial vehicle 100 controls the operation of the unmanned aerial vehicle 100 in accordance with the motion information.
  • the transmitter 50 can indicate the control of the flight of the UAV 100 by remote operation. That is, the transmitter 50 can operate as a remote controller.
  • the portable terminal 80 can be carried, together with the transmitter 50, by the user who performs aerial photography using the unmanned aerial vehicle 100.
  • the portable terminal 80 assists in the determination of the composition by the unmanned aerial vehicle 100 and assists in imaging.
  • FIG. 2 is a block diagram showing one example of the hardware configuration of the unmanned aerial vehicle 100.
  • the unmanned aerial vehicle 100 is configured to include a UAV control unit 110, a communication interface 150, a memory 160, a gimbal 200, a rotor mechanism 210, an imaging unit 220, an imaging unit 230, a GPS receiver 240, an inertial measurement unit (IMU) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring instrument 290.
  • the gimbal 200 is an example of a support member.
  • the imaging unit 220 and the imaging unit 230 are an example of an imaging device.
  • the UAV control unit 110 is configured by, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
  • the UAV control unit 110 performs signal processing for overall controlling the operation of each part of the UAV 100, input/output processing of data with other parts, arithmetic processing of data, and storage processing of data.
  • the UAV control unit 110 controls the flight of the UAV 100 in accordance with a program stored in the memory 160.
  • the UAV control section 110 may control the flight of the UAV 100 to achieve a composition determined in cooperation with the portable terminal 80.
  • the UAV control section 110 controls the flight of the UAV 100 in accordance with an instruction received from the remote transmitter 50 through the communication interface 150.
  • the memory 160 can be detached from the unmanned aerial vehicle 100.
  • the UAV control unit 110 acquires position information indicating the position of the unmanned aerial vehicle 100.
  • the UAV control unit 110 can acquire position information indicating the latitude, longitude, and altitude in which the unmanned aerial vehicle 100 is located from the GPS receiver 240.
  • the UAV control unit 110 can acquire latitude and longitude information indicating the latitude and longitude of the unmanned aerial vehicle 100 from the GPS receiver 240, and acquire height information indicating the height of the unmanned aerial vehicle 100 from the barometric altimeter 270 as position information.
  • the UAV control unit 110 acquires orientation information indicating the orientation of the unmanned aerial vehicle 100 from the magnetic compass 260.
  • the orientation information indicates, for example, the orientation of the nose of the UAV 100.
  • the UAV control unit 110 acquires imaging range information indicating the imaging range of each of the imaging unit 220 and the imaging unit 230.
  • the UAV control unit 110 acquires angle of view information indicating the angle of view of the imaging unit 220 and the imaging unit 230 from the imaging unit 220 and the imaging unit 230 as parameters for specifying the imaging range.
  • the UAV control unit 110 acquires information indicating the imaging directions of the imaging unit 220 and the imaging unit 230 as parameters for specifying the imaging range.
  • the UAV control unit 110 acquires posture information indicating the posture state of the imaging unit 220 from the gimbal 200, for example, as information indicating the imaging direction of the imaging unit 220.
  • the UAV control unit 110 acquires information indicating the orientation of the UAV 100.
  • the information indicating the posture state of the imaging unit 220 indicates the angle at which the gimbal 200 rotates from the reference rotation angle of the pitch axis and the yaw axis.
  • the UAV control unit 110 acquires position information indicating the position where the unmanned aerial vehicle 100 is located as a parameter for determining the imaging range.
  • the UAV control unit 110 can generate imaging range information indicating the geographical range captured by the imaging unit 220, based on the angles of view and the imaging directions of the imaging unit 220 and the imaging unit 230 and the position of the UAV 100.
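  • a rough illustration of how an imaging range can follow from the angle of view and the position of the UAV 100 is sketched below, assuming for simplicity a camera pointed straight down (the nadir assumption and the example numbers are not from the patent):

```python
import math

def nadir_footprint(altitude_m, fov_h_deg, fov_v_deg):
    """Ground area covered when the camera points straight down (nadir).
    Simplified geometry only; the patent's imaging-range computation also
    uses the imaging direction and latitude/longitude, omitted here."""
    width = 2 * altitude_m * math.tan(math.radians(fov_h_deg) / 2)
    height = 2 * altitude_m * math.tan(math.radians(fov_v_deg) / 2)
    return width, height

# Example: at 100 m altitude with an 84 x 62 degree field of view,
# the footprint is roughly 180 m x 120 m.
print(nadir_footprint(100, 84, 62))
```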
  • the UAV control unit 110 controls the gimbal 200, the rotor mechanism 210, the imaging unit 220, and the imaging unit 230.
  • the UAV control unit 110 controls the imaging range of the imaging unit 220 by changing the imaging direction or the angle of view of the imaging unit 220.
  • the UAV control unit 110 controls the imaging range of the imaging unit 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
  • the imaging range refers to a geographical range captured by the imaging unit 220 or the imaging unit 230.
  • the camera range is defined by latitude, longitude and altitude.
  • the imaging range can be a range of three-dimensional spatial data defined by latitude, longitude, and altitude.
  • the imaging range is determined based on the angle of view and the imaging direction of the imaging unit 220 or the imaging unit 230 and the position of the UAV 100.
  • the imaging directions of the imaging unit 220 and the imaging unit 230 are defined by the orientation and depression angle of the imaging unit 220 and the imaging unit 230 on the front side of the imaging lens.
  • the imaging direction of the imaging unit 220 is a direction determined by the orientation of the head of the UAV 100 and the posture state of the imaging unit 220 with respect to the gimbal 200.
  • the imaging direction of the imaging unit 230 is determined by the orientation of the nose of the unmanned aerial vehicle 100 and the position at which the imaging unit 230 is mounted.
  • the UAV control unit 110 may add information related to the aerial image to the captured image (aeronautical image) captured by the imaging unit 220 or the imaging unit 230 as additional information (an example of metadata).
  • the additional information includes information (flight information) related to the flight of the unmanned aerial vehicle 100 at the time of aerial photography, and information (imaging information) related to imaging of the imaging unit 220 or the imaging unit 230 at the time of aerial photography.
  • the flight information may include at least one of aerial position information, aerial path information, and aerial time information.
  • the imaging information may include at least one of aerial viewing angle information, aerial shooting direction information, aerial shooting posture information, imaging range information, and subject distance information.
  • the aerial position information indicates the position (aerial position) at which the aerial image was captured.
  • the aerial position information may be based on location information acquired by the GPS receiver 240.
  • the aerial path information indicates the path (aerial path) along which the aerial image was captured.
  • the aerial path information may be composed of a set of aerial positions arranged in sequence.
  • the aerial time information indicates the time (aerial time) at which the aerial image was captured.
  • the aerial time information may be based on time information of a timer referred to by the UAV control unit 110.
  • the aerial view angle information indicates the angle of view information of the imaging unit 220 or the imaging unit 230 at the time of aerial photography.
  • the aerial direction information indicates the imaging direction of the imaging unit 220 or the imaging unit 230 (the aerial direction) when the aerial image is captured.
  • the aerial posture information indicates the posture information of the imaging unit 220 or the imaging unit 230 at the time of aerial photography.
  • the imaging range information indicates the imaging range of the imaging unit 220 or the imaging unit 230 when the aerial image is captured.
  • the subject distance information indicates information on the distance from the imaging unit 220 or the imaging unit 230 to the subject. The subject distance information may be based on the detection information measured by the ultrasonic sensor 280 or the laser measuring instrument 290.
  • as for the subject distance information, it is also possible to calculate the distance to the subject by capturing a plurality of images including the same subject and using these images as stereoscopic images. Further, the imaging information may also include information on the orientation of the unmanned aerial vehicle 100 at the time of aerial photography.
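  • the stereoscopic approach mentioned above can be illustrated with the standard depth-from-disparity relation; the baseline, focal length, and disparity values below are hypothetical:

```python
def stereo_distance(baseline_m, focal_px, disparity_px):
    """Standard depth-from-disparity relation Z = f * B / d, usable when the same
    subject is captured from two positions a known baseline apart (a generic
    formula, offered only to illustrate the stereoscopic approach above)."""
    return focal_px * baseline_m / disparity_px

# e.g. a 2 m baseline, 1000 px focal length and 40 px disparity give a 50 m distance.
print(stereo_distance(2.0, 1000, 40))
```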
  • the communication interface 150 communicates with the transmitter 50 and the portable terminal 80.
  • the communication interface 150 can transmit an aerial image to the portable terminal 80.
  • the communication interface 150 can transmit at least a portion of the aerial image and its additional information to the portable terminal 80.
  • the communication interface 150 may receive information for determining a composition of an aerial image to be imaged by the imaging unit 220 or the imaging unit 230.
  • the information for determining the composition of the aerial image to be captured may include, for example, selection information for selecting a main subject in the aerial image (live view image), selection information for selecting a composition.
  • the communication interface 150 can transmit an aerial image (for example, an aerial video such as a live view image or an aerial still image) imaged by the imaging unit 220 or the imaging unit 230 to the portable terminal 80.
  • the communication interface 150 can communicate directly with the portable terminal 80, or can communicate with the portable terminal 80 via the transmitter 50.
  • the memory 160 stores programs and the like required for the UAV control unit 110 to control the gimbal 200, the rotor mechanism 210, the imaging unit 220, the imaging unit 230, the GPS receiver 240, the inertial measurement unit 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring instrument 290.
  • the memory 160 may be a computer-readable recording medium, and may include at least one of an SRAM (Static Random Access Memory), a DRAM (Dynamic Random Access Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), and a flash memory such as a USB (Universal Serial Bus) memory.
  • the memory 160 holds various kinds of information and various data acquired through the communication interface 150.
  • the memory 160 can store sample information for various compositions of the captured image.
  • the sample information of the composition can be stored in a table form.
  • the memory 160 can hold the information of the composition determined by the UAV control section 110.
  • the memory 160 may store motion information related to the actions of the UAV 100 for achieving imaging under the determined composition.
  • the motion information of the UAV 100 can be read from the memory 160 during aerial photography, and the UAV 100 can operate based on the motion information.
  • the gimbal 200 rotatably supports the imaging unit 220 about the yaw axis, the pitch axis, and the roll axis.
  • the gimbal 200 can change the imaging direction of the imaging unit 220 by rotating the imaging unit 220 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the yaw axis, the pitch axis, and the roll axis can be determined as follows.
  • the roll axis is defined as a horizontal direction (a direction parallel to the ground).
  • the pitch axis is determined to be parallel to the ground and perpendicular to the roll axis.
  • the yaw axis (the z-axis) is determined to be perpendicular to the ground and perpendicular to the roll axis and the pitch axis.
  • the imaging unit 220 captures a subject of a desired imaging range and generates data of the captured image.
  • the image data obtained by the imaging by the imaging unit 220 is stored in a memory included in the imaging unit 220 or in the memory 160.
  • the imaging unit 230 captures the periphery of the UAV 100 and generates data of the captured image.
  • the image data of the imaging unit 230 is stored in the memory 160.
  • the GPS receiver 240 receives a plurality of signals indicating the time transmitted from a plurality of navigation satellites (i.e., GPS satellites) and the position (coordinates) of each GPS satellite.
  • the GPS receiver 240 calculates the position of the GPS receiver 240 (i.e., the position of the UAV 100) based on the received plurality of signals.
  • the GPS receiver 240 outputs the position information of the UAV 100 to the UAV control unit 110.
  • the UAV control unit 110 may calculate the position of the unmanned aerial vehicle 100 instead of the GPS receiver 240. In this case, the UAV control unit 110 receives the information, included in the plurality of signals received by the GPS receiver 240, indicating the time and the position of each GPS satellite.
  • the inertial measurement device 250 detects the posture of the unmanned aerial vehicle 100 and outputs the detection result to the UAV control unit 110.
  • the inertial measurement unit 250 detects, as the posture of the unmanned aerial vehicle 100, the accelerations in the three axial directions of front-rear, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw.
  • the magnetic compass 260 detects the orientation of the nose of the unmanned aerial vehicle 100, and outputs the detection result to the UAV control unit 110.
  • the barometric altimeter 270 detects the altitude at which the UAV 100 flies and outputs the detection result to the UAV control unit 110.
  • the height of the UAV 100 flight may be detected by a sensor other than the barometric altimeter 270.
  • the ultrasonic sensor 280 emits ultrasonic waves, detects ultrasonic waves reflected from the ground and the objects, and outputs the detection results to the UAV control unit 110.
  • the detection result may indicate the distance from the unmanned aerial vehicle 100 to the ground, that is, the height.
  • the detection result may indicate the distance from the unmanned aerial vehicle 100 to the object (subject).
  • the laser measuring instrument 290 irradiates the object with laser light, receives the reflected light reflected by the object, and measures the distance between the UAV 100 and the object (subject) by the reflected light.
  • as the laser-based distance measuring method, a time-of-flight method may be used.
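  • as a brief illustration of the time-of-flight principle (generic physics, not a detail taken from the patent), the round-trip time of the laser pulse converts to distance as follows:

```python
SPEED_OF_LIGHT = 299_792_458  # m/s

def tof_distance(round_trip_time_s):
    """Time-of-flight: the pulse travels to the object and back, so the one-way
    distance is half the round-trip time multiplied by the speed of light."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2

# A 200 ns round trip corresponds to roughly 30 m.
print(tof_distance(200e-9))
```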
  • FIG. 3 is a block diagram showing an example of a functional configuration of the UAV control section 110.
  • the UAV control unit 110 includes an image acquisition unit 111, a main subject determination unit 112, a composition determination unit 113, an operation information generation unit 114, and an operation control unit 115.
  • the main subject determination section 112 and the composition determination section 113 are one example of the information acquisition section.
  • the composition determination unit 113 is an example of the recognition unit.
  • the motion information generating unit 114 is an example of a generating unit.
  • the motion control unit 115 is an example of a control unit.
  • the image acquisition unit 111 can acquire an image stored in the memory 160 (for example, an aerial image captured by the imaging unit 220 or the imaging unit 230).
  • the image acquisition unit 111 can acquire an aerial image in the aerial photography of the imaging unit 220 or the imaging unit 230.
  • the aerial image can be a moving image or a still image.
  • the aerial image in aerial photography is also referred to as a live view image (an example of the first image).
  • the aerial image acquired by the image acquisition unit 111 is mainly taken as an example of a live view image.
  • the main subject determination unit 112 determines (determines) the main subject (one example of the first subject) among one or more subjects included in the live view image acquired by the image acquisition unit 111.
  • the determination of the main subject is an example of information acquisition of the main subject.
  • the determination of the main subject can be performed manually by the user of the portable terminal 80, for example, or automatically by the unmanned aerial vehicle 100.
  • the main subject can also be a subject not included in the live view image (for example, an arbitrary subject included in map information corresponding to the aerial range desired by the user). In this case, a composition suitable for the desired subject can be determined beyond the current imaging range of the unmanned aerial vehicle 100.
  • the composition determination section 113 determines a composition for imaging the determined main subject.
  • the determination of the composition for capturing the main subject is an example of information acquisition for composing the composition of the main subject.
  • This composition is also referred to as a pre-composition (an example of the first composition) because it is a composition for an image that has not yet been captured.
  • the composition may be information that specifies the positional relationship of one or more subjects in the image.
  • the composition determination section 113 can refer to the sample information of the composition stored in the memory 160, and determine the pre-composition based on the information of the main subject.
  • the determination of the pre-composition map can be performed manually by the user of the portable terminal 80, for example, or automatically by the unmanned aerial vehicle 100.
  • the sample information of the composition may include, as sample information, information of at least one of, for example, a rule-of-thirds composition, a dichotomy composition, a triangle composition, a diagonal composition, a letter composition, a center composition, an edge composition, a sandwich composition, and a tunnel composition. Further, the sample information of the composition may also include information that places the main subject at a predetermined intersection point or division point (for example, a golden point) of each composition.
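  • as an illustrative sketch (not code from the patent), the rule-of-thirds sample information could be reduced to its four intersection points, one of which serves as the target position of the main subject in the pre-composition; choosing the intersection nearest to the subject's current position is an assumption made here to keep the required motion small:

```python
def rule_of_thirds_points(width, height):
    """The four intersection points of the rule-of-thirds grid, one of which can
    serve as the target position of the main subject in the pre-composition."""
    xs = (width / 3, 2 * width / 3)
    ys = (height / 3, 2 * height / 3)
    return [(x, y) for x in xs for y in ys]

def nearest_target(points, subject_xy):
    """Pick the grid intersection closest to the subject's current position,
    so the required camera motion stays small (a design choice assumed here,
    not prescribed by the patent)."""
    sx, sy = subject_xy
    return min(points, key=lambda p: (p[0] - sx) ** 2 + (p[1] - sy) ** 2)

points = rule_of_thirds_points(1920, 1080)
print(nearest_target(points, (1500, 300)))   # -> (1280.0, 360.0)
```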
  • the motion information generating unit 114 generates motion information of the unmanned aerial vehicle 100 for realizing aerial photography in accordance with the determined pre-composition.
  • the motion information of the UAV 100 may include, for example, at least a part of movement information related to the movement of the UAV 100 (e.g., the amount and direction of movement of the UAV 100), rotation information related to the rotation of the UAV 100 (e.g., the amount and direction of rotation of the UAV 100), rotation information related to the rotation of the gimbal 200 (e.g., the amount and direction of rotation of the gimbal 200), and other operation information of the UAV 100.
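  • one possible way to represent such motion information is a small record holding the movement and rotation amounts; the field names below are illustrative assumptions, not terms from the patent:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MotionInfo:
    """Illustrative container for the motion information listed above."""
    move_distance_m: Optional[float] = None                       # amount of movement of the UAV 100
    move_direction: Optional[Tuple[float, float, float]] = None   # direction of movement (x, y, z)
    uav_yaw_deg: Optional[float] = None                           # rotation of the UAV 100 itself
    gimbal_pitch_deg: Optional[float] = None                      # rotation of the gimbal 200
    gimbal_yaw_deg: Optional[float] = None
```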
  • the motion control unit 115 can control the flight of the UAV 100 in accordance with the generated motion information (for example, the amount of movement of the UAV 100 and the moving direction).
  • the motion control unit 115 can control the orientation of the UAV 100 in accordance with the generated motion information (for example, the amount of rotation of the UAV 100 and the direction of rotation).
  • the motion control unit 115 can change the imaging range of the imaging unit 220 by moving the unmanned aerial vehicle 100.
  • the motion control unit 115 can control the rotation of the gimbal 200 in accordance with the generated motion information (for example, the amount of rotation of the gimbal 200 and the direction of rotation).
  • the motion control unit 115 may control the posture of the UAV 100 in accordance with the motion information, and control the posture of the gimbal 200 by the rotation of the gimbal 200. In this way, the motion control unit 115 can change the imaging range of the imaging unit 220 by rotating the gimbal 200.
  • FIG. 4 is a block diagram showing one example of the hardware configuration of the portable terminal 80.
  • the portable terminal 80 may include a terminal control unit 81, an interface unit 82, an operation unit 83, a wireless communication unit 85, a memory 87, and a display unit 88.
  • the operation unit 83 is an example of an information acquisition unit.
  • the wireless communication unit 85 is an example of a communication unit.
  • the terminal control unit 81 can be configured using, for example, a CPU, an MPU, or a DSP.
  • the terminal control unit 81 performs signal processing for overall controlling the operation of each part of the portable terminal 80, input/output processing of data with other parts, arithmetic processing of data, and storage processing of data.
  • the terminal control unit 81 can acquire data and information from the unmanned aerial vehicle 100 via the wireless communication unit 85.
  • the terminal control unit 81 can acquire data and information from the transmitter 50 via the interface unit 82.
  • the terminal control unit 81 can acquire data and information input through the operation unit 83.
  • the terminal control unit 81 can acquire data and information stored in the memory 87.
  • the terminal control unit 81 can transmit data and information to the display unit 88, and display display information based on the data and information on the display unit 88.
  • the terminal control unit 81 can execute an imaging assistance application.
  • the camera assist application may be an application that performs assistance for aerial photography by the unmanned aerial vehicle 100 under a desired composition.
  • the terminal control unit 81 can generate various data used in the application.
  • the interface unit 82 performs input and output of information and data between the transmitter 50 and the portable terminal 80.
  • the interface unit 82 can input and output data, for example, via a USB cable.
  • the interface unit 82 may also be an interface other than USB.
  • the operation unit 83 accepts and acquires data and information input by the user of the portable terminal 80.
  • the operation unit 83 may include buttons, keys, a touch display screen, a microphone, and the like.
  • the operation unit 83 and the display unit 88 are constituted by a touch display screen.
  • the operation unit 83 can accept a touch operation, a click operation, a drag operation, and the like.
  • the operation unit 83 can acquire selection information of the main subject by accepting a selection operation for selecting the main subject.
  • the operation unit 83 can acquire the selection information of the composition by accepting the selection operation for selecting the composition.
  • the wireless communication unit 85 performs wireless communication with the UAV 100 by various wireless communication methods.
  • the wireless communication method of this wireless communication may include, for example, communication via a wireless LAN, Bluetooth (registered trademark), or a public wireless network.
  • the wireless communication unit 85 can transmit the selection information of the main subject and the selection information of the composition to the unmanned aerial vehicle 100.
  • the memory 87 may include a ROM that stores data specifying a program and a setting value for the operation of the mobile terminal 80, and a RAM that temporarily stores various information and data used when the terminal control unit 81 performs processing.
  • the memory 87 may include a memory other than the ROM and the RAM.
  • the memory 87 can be disposed inside the portable terminal 80.
  • the memory 87 can be configured to be detachable from the portable terminal 80.
  • the program can include an application.
  • the display unit 88 is configured by, for example, an LCD (Liquid Crystal Display), and displays various kinds of information and data output from the terminal control unit 81.
  • the display unit 88 can display various data and information related to the execution of the imaging assistance application.
  • the display section 88 can display a selection screen for selecting a main subject, and a selection screen for selecting a composition.
  • the portable terminal 80 can be mounted on the transmitter 50 via a bracket.
  • the portable terminal 80 and the transmitter 50 can be connected by a wired cable such as a USB cable. It is also possible not to mount the portable terminal 80 on the transmitter 50, but to provide the portable terminal 80 and the transmitter 50 separately.
  • the camera assist system 10 may also not include the transmitter 50.
  • FIG. 5 is a view for explaining an outline of the operation of the imaging assistance system 10.
  • In Fig. 5, there is a road R1 on the mountain M1 and a person H1 on the road R1.
  • the unmanned aerial vehicle 100 performs aerial photography while flying over the mountain M1.
  • the UAV 100 captures a live view image of the mountain M1 and transmits it to the portable terminal 80.
  • the portable terminal 80 receives the live view image G1 from the unmanned aerial vehicle 100, and displays the live view image G1 on the display unit 88. Thereby, the user of the portable terminal 80 can confirm the live view image G1.
  • the portable terminal 80 accepts an operation of instructing the composition adjustment by the operation unit 83, and transmits the composition adjustment command to the unmanned aerial vehicle 100.
  • the unmanned aerial vehicle 100 determines a main subject (for example, person H1) in the live view image G1, determines a pre-composition, and generates action information of the unmanned aerial vehicle 100.
  • the UAV 100 moves or the like according to the action information, and notifies the portable terminal 80 that it has moved to the desired position (movement completed).
  • the portable terminal 80 transmits an imaging command to the unmanned aerial vehicle 100 based on, for example, a user instruction via the operation unit 83.
  • the portable terminal 80 can acquire the live view image at the position of the unmanned aerial vehicle 100 after the movement by the wireless communication unit 85.
  • the user can confirm the live view image after the movement on the display, and can judge whether or not aerial photography should be performed at that position.
  • the UAV 100 performs aerial photography by the imaging unit 220 or 230 in accordance with the imaging command, and obtains an aerial image (an example of the second image).
  • the UAV 100 can acquire a desired pre-composed aerial image.
  • in the live view image, the person H1 as the main subject is located at an arbitrary position, but in the aerial image after composition adjustment, the person H1 as the main subject is located at an intersection of the dividing lines of the rule-of-thirds composition. In this way, the UAV 100 can take an action (here, move) before aerial photography in order to achieve the desired composition.
  • the imaging command may not be transmitted by the portable terminal 80 but transmitted by the transmitter 50.
  • the transmitter 50 can cooperate with the portable terminal 80 using communication or the like, and transmit information of pressing of the imaging button (not shown) of the transmitter 50 to the unmanned aerial vehicle 100.
  • FIG. 6A is a diagram showing one example of a live view image G1 imaged by the unmanned aerial vehicle 100.
  • FIG. 6B is a diagram showing one example of the color division image G2 that divides the live view image G1 by color.
  • FIG. 6C is a diagram showing a selection example of the main subject using the color division image G2.
  • the live view image G1 includes an ocean having a plurality of color components (for example, blue and light blue) and an island with a forest having a green component.
  • the communication interface 150 transmits the live view image G1 to the portable terminal 80.
  • the wireless communication section 85 can receive the live view image G1 from the unmanned aerial vehicle 100, and the display section 88 displays the live view image G1.
  • the main subject determination section 112 can divide the live view image G1 into a plurality of image blocks (for example, 16 ⁇ 16 blocks).
  • the main subject determination unit 112 can divide the live view image G1 into one or more regions based on the color components of the respective image blocks, and generate the color division image G2.
  • the blue portion of the ocean may be divided into region A
  • the light blue portion of the ocean may be divided into region B
  • the green portion of the island may be divided into region C.
  • the wireless communication section 85 can receive the color division image G2 from the unmanned aerial vehicle 100, and the display section 88 displays the color division image G2.
  • the main subject determination unit 112 can determine any one of the areas in the color division image G2 as the main subject. As shown in FIG. 6C, the display portion 88 can display the color division image G2 according to the imaging assistance application, and display guidance for selecting which region (here, ZA indicating region A, ZB indicating region B, or ZC indicating region C) is to be the main subject. In FIG. 6C, region C is selected as the main subject via the operation unit 83. In this case, in the portable terminal 80, the wireless communication portion 85 transmits the selection information of the main subject obtained by the operation unit 83 to the unmanned aerial vehicle 100. In the unmanned aerial vehicle 100, the communication interface 150 receives the selection information of the main subject. The main subject determination section 112 determines the main subject based on the selection information of the main subject. In this case, the main subject is the set of pixels corresponding to the imaging target desired by the user in the captured image.
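  • a minimal sketch of the color-division step described above is given below; the 16 x 16 block split and the grouping by color follow the text, while the quantization used for grouping is an assumption:

```python
import numpy as np

def color_division(image, blocks=16, levels=4):
    """Split the live view image into blocks x blocks tiles, quantize each tile's
    mean colour, and give tiles with the same quantized colour the same region
    label (e.g. one label for the sea, another for the island's forest)."""
    h, w, _ = image.shape
    bh, bw = h // blocks, w // blocks
    labels = np.zeros((blocks, blocks), dtype=np.int32)
    codes = {}
    for i in range(blocks):
        for j in range(blocks):
            tile = image[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            key = tuple((tile.reshape(-1, 3).mean(axis=0) // (256 // levels)).astype(int))
            labels[i, j] = codes.setdefault(key, len(codes))
    return labels
```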
  • the main subject determination unit 112 may also interpolate the pixel information of the subject.
  • the main subject determination section 112 can interpolate the pixel information of the subject based on the pixel information (for example, a pixel value) around the subject in the image of the live view image G1.
  • the main subject determination unit 112 can interpolate pixel information (for example, a pixel value) around the subject in the image of the live view image G1 as the pixel information of the subject.
  • the main subject determination section 112 can collect a plurality of pieces of pixel information around the subject, and generate a new color by weighting or averaging based on the plurality of pieces of pixel information, and perform interpolation as pixel information of the subject.
  • as the interpolation method, the main subject determination section 112 can use nearest-neighbor, bilinear, bicubic interpolation, or the like.
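  • a crude sketch of filling the selected subject's pixels from the surrounding pixel values by averaging, as mentioned above (the window margin and the plain average are assumptions; nearest-neighbor, bilinear, or bicubic interpolation could be substituted):

```python
import numpy as np

def fill_from_surroundings(image, mask, margin=8):
    """Replace the pixels of the selected subject (where `mask` is True) with the
    average of nearby non-subject pixels, per the averaging idea above."""
    ys, xs = np.where(mask)
    y0, y1 = max(ys.min() - margin, 0), min(ys.max() + margin + 1, image.shape[0])
    x0, x1 = max(xs.min() - margin, 0), min(xs.max() + margin + 1, image.shape[1])
    window = image[y0:y1, x0:x1]
    surroundings = window[~mask[y0:y1, x0:x1]]   # pixels around, not inside, the subject
    image[mask] = surroundings.mean(axis=0)
    return image
```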
  • the main subject determination section 112 can acquire the selection information of the main subject through the communication interface 150, and can determine the main subject based on the selection information.
  • the UAV 100 can determine the main subject desired by the user from among the subjects included in the color division image G2 based on the live view image G1. Therefore, the UAV 100 can perform composition adjustment based on the main subject desired by the user.
  • the main subject determination unit 112 may determine the main subject without using the selection information of the subject. In addition, it is also possible to determine a plurality of main subjects.
  • the main subject determination section 112 can determine a predetermined area among the areas of the color components in the color division image G2 as the main subject. That is, the main subject determination section 112 can group the live view image G1 by color and recognize a predetermined color group as the main subject. In this case, the main subject determination unit 112 can determine, for example, a region located at the center and surrounded by the other color components (for example, the island region in FIG. 6B, that is, region C) as the main subject. Thereby, the UAV 100 can take, as the main subject, a subject that stands out by being surrounded by the surrounding area.
  • the main subject determination section 112 can determine an area having a predetermined size or less (for example, the smallest area) among the areas divided by the color components as the main subject. Thereby, the UAV 100 can adjust the composition based on a small area in the live view image G1 that is harder to notice than other areas. For example, a person lost in the mountains can be determined as the main subject.
  • the main subject determination section 112 can determine the predetermined color region obtained from the operation section 83 and the memory 87 as the main subject. Thereby, the UAV 100 can determine the subject of the color desired by the user and the subject of the color determined in advance as the main imaging target as the main subject.
  • the UAV 100 can roughly discriminate the subject in various colors such as mountains, sea, people's clothes, and the like. Therefore, the UAV 100 can register, for example, a specific color component as a subject to be noted, and can automatically distinguish the main subject from the color.
  • the main subject determination section 112 can divide the live view image G1 into one or more regions based on the spatial frequency, and determine the predetermined region as the main subject. That is, the main subject determination section 112 can group the live view image G1 by the spatial frequency, and recognize the group of the predetermined spatial frequency range as the main subject.
  • the main subject determination section 112 can determine the area in which the spatial frequency in the area divided by the spatial frequency is equal to or higher than the predetermined frequency (for example, the highest) as the main subject. Thereby, the UAV 100 can adjust the composition based on a relatively clear area.
  • the main subject determination section 112 can determine, as the main subject, an area having a predetermined spatial frequency obtained from the operation section 83 or the memory 87.
  • the UAV 100 can determine the subject of the spatial frequency desired by the user and the subject of the spatial frequency determined in advance as the main imaging target as the main subject.
  • the main subject determination unit 112 can determine a predetermined subject among one or more subjects included in the live view image G1 as a main subject based on the aerial position information of the live view image G1 being captured.
  • the main subject determination section 112 can acquire map information of a map database stored by an external server or the like through the communication interface 150.
  • the map information may include information of a subject type such as a mountain, a river, a sea, a building.
  • the main subject determination section 112 can determine the main subject based on the type of the subject and the size of the subject.
  • the UAV 100 can adjust the composition based on the geographic information and terrain information of the UAV 100.
  • the main subject determination section 112 can acquire evaluation information related to the aerial image of the subject included in the live view image G1 from the external server or the like through the communication interface 150.
  • the main subject determination unit 112 can determine the subject whose evaluation information is equal to or higher than a predetermined criterion (for example, the highest evaluation) as the main subject.
  • the UAV 100 can adjust the composition based on the subject highly evaluated by others.
  • the aerial position information can be obtained from the information acquired by the GPS receiver 240.
• the main subject determination section 112 can determine the main subject based on the imaging mode set when the main subject is to be aerial photographed. For example, in the case where a sunset mode is set, the sun, the vicinity of the horizon where the sun sinks, and the like can be determined as the main subject.
• Thereby, the UAV 100 can take the imaging mode into consideration when determining the main subject. Therefore, the UAV 100 can determine a main subject suitable for the imaging information (camera parameters) set according to the imaging mode, and a clear aerial image can be expected.
  • the main subject determination section 112 can store the determination information of the main subject, for example, based on any of the above-described methods, together with the information of the imaging range of the live view image G1 or the like in the memory 160.
  • the main subject determination section 112 can determine the subject included in the live view image G1 based on the determination information of the main subject stored in the memory 160 (that is, the determination information of the main subject having the past performance).
• For example, the main subject determination section 112 can determine, as the main subject, the subject that has been determined to be the main subject a predetermined number of times or more (for example, the most times), or with a predetermined frequency or more, in the same (or substantially the same) imaging range in the past.
  • the UAV 100 can determine the main subject in a machine learning manner based on past performance, in other words, based on the user's selection tendency and the determined tendency of the UAV 100.
  • the UAV 100 can adjust the composition by using the main subject determined in a machine learning manner as a reference.
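• A minimal sketch of the "past performance" idea above follows; the record format (a per-imaging-range counter of subject labels) and the threshold are illustrative assumptions.

```python
# A minimal sketch: determination records are kept per imaging range, and the
# subject most frequently chosen as the main subject in (nearly) the same
# imaging range is picked again.
from collections import Counter, defaultdict

history = defaultdict(Counter)   # imaging_range_key -> Counter of subject labels

def record_determination(range_key: str, subject: str) -> None:
    history[range_key][subject] += 1

def determine_from_history(range_key: str, min_count: int = 2):
    """Return the most frequently chosen subject if chosen at least `min_count` times."""
    if not history[range_key]:
        return None
    subject, count = history[range_key].most_common(1)[0]
    return subject if count >= min_count else None

record_determination("lat35.0_lon139.0_alt120", "river")
record_determination("lat35.0_lon139.0_alt120", "river")
record_determination("lat35.0_lon139.0_alt120", "mountain")
print(determine_from_history("lat35.0_lon139.0_alt120"))   # -> "river"
```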
  • FIG. 7 is a diagram showing an example of composition selection.
• the composition determination section 113 acquires information of candidate compositions for aerial photography from the memory 160 based on the main subject. For example, the composition determination unit 113 may compare the composition of the live view image with the sample information of compositions stored in the memory 160, and determine one or more pieces of composition sample information whose degree of matching with the composition of the live view image is equal to or higher than a predetermined degree as candidate compositions. The composition determination section 113 can determine the degree of matching between the composition of the live view image and the composition sample information based on at least one of the shape, the position, and the size of the main subject in the two compositions, and the shapes, the positions, the sizes, and the positional relationship of the plurality of subjects in the two compositions; a sketch of such a matching score is shown below.
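• The degree-of-matching computation is not specified in detail in the text; the sketch below shows one possible scoring of position and size differences between the live view layout and composition sample information, with an illustrative formula of this sketch's own choosing.

```python
# A minimal sketch of ranking candidate compositions by a degree-of-matching
# score between the live view layout and composition sample information.
import math

def match_score(live_subjects, sample_subjects):
    """Each subject is (cx, cy, size) with coordinates and size normalized to 0..1.
    A higher score means the live view layout is closer to the sample composition."""
    score = 0.0
    for (lx, ly, ls), (sx, sy, ss) in zip(live_subjects, sample_subjects):
        pos_err = math.hypot(lx - sx, ly - sy)        # position difference
        size_err = abs(ls - ss)                       # size difference
        score += 1.0 / (1.0 + pos_err + size_err)
    return score / max(len(live_subjects), 1)

live = [(0.45, 0.55, 0.30), (0.20, 0.25, 0.15)]       # e.g. river, mountain
samples = {
    "diagonal": [(0.50, 0.50, 0.30), (0.20, 0.20, 0.15)],
    "thirds":   [(0.33, 0.50, 0.30), (0.66, 0.25, 0.15)],
}
ranked = sorted(samples, key=lambda k: match_score(live, samples[k]), reverse=True)
print("candidate compositions in order of matching degree:", ranked)
```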
  • the main subject determination section 112 selects a river having an elongated shape as a main subject.
• the composition determining portion 113 can refer to the composition sample information saved in the memory 160, and acquire, as candidate compositions, for example, a diagonal composition, a three-division (rule-of-thirds) composition, or the like suitable for aerial photography of a subject having an elongated shape.
• In the diagonal composition, the river as the main subject can be arranged along the diagonal line of the composition.
• In the three-division composition, the river as the main subject can be arranged along, or overlapping, one of the dividing lines that divide the frame into three parts.
  • the composition determination section 113 can transmit the acquired candidate composition to the portable terminal 80 through the communication interface 150.
  • the wireless communication section 85 can acquire information of candidate composition.
  • the display portion 88 can display information of the candidate composition.
• the display unit 88 can display the candidate compositions according to the imaging assistance application, and perform a guidance display for deciding which candidate composition (here, the diagonal composition or the three-division composition) is to be selected as the pre-composition.
• For example, the diagonal composition is selected as the pre-composition through the operation unit 83.
  • the wireless communication section 85 transmits the selection information of the composition obtained by the operation section 83 to the unmanned aerial vehicle 100.
  • the communication interface 150 receives the selected selection information of the composition.
  • the composition determination section 113 determines a pre-composition map based on the selection information of the composition.
  • the image of the candidate composition displayed on the display portion 88 may be an image stored in the memory 160 and transmitted to the portable terminal 80 (for example, a patterned image as sample information of the composition).
  • the image of the candidate composition displayed on the display unit 88 may be an image generated by the composition determining unit 113 and transmitted to the portable terminal 80.
• the composition determination section 113 can generate an image of the candidate composition to be displayed based on the information of the composition corresponding to the shape of the main subject and the live view image G1. For example, when there is a river as the main subject in the live view image G1 and mountains are present as subjects on both sides of it, the composition determining portion 113 can, for example, simplify the shapes of these objects and generate an image in which they are arranged according to the positional relationship of the candidate composition.
• the terminal control portion 81 of the portable terminal 80 may generate an image of the candidate composition based on the information of the composition corresponding to the shape of the main subject acquired from the unmanned aerial vehicle 100 and the live view image G1.
• the composition determination section 113 can take, for example, the shape of the main subject into consideration to determine the candidate composition.
  • the display portion 88 of the portable terminal 80 can display the determined candidate composition and prompt the user to make a selection.
  • the display portion 88 may display the candidate composition in a still image or may display the candidate composition in a preview of the moving image.
• the composition determination section 113 can acquire the selection information of the composition through the communication interface 150 and determine the pre-composition based on the selection information.
  • the UAV 100 can determine the composition in which the main subject is disposed at the desired position in the aerial image to be aerial photographed, and the main subject can be attractively aerial photographed. Further, the composition determination section 113 can automatically determine the candidate composition, and can preferentially present the candidate composition from the sample information of the various compositions. Since the composition determination section 113 determines the pre-composition based on the selection information, the user's will can be reflected in the selection of the pre-composition.
• the composition determination unit 113 may determine the composition based on the shape of the main subject without presenting candidate compositions. Thereby, the UAV 100 can take the shape of the main subject into consideration, perform aerial photography using a well-balanced pre-composition, and attractively aerial photograph the main subject.
  • the composition determination section 113 can determine the composition without using the selection information of the composition.
  • the composition determination section 113 can recognize the scene of the live view image G1 by the scene recognition algorithm.
• the composition determination section 113 can determine a composition suitable for the scene, or suggest candidate compositions, based on the scene recognition result. For example, when recognizing that the live view image G1 is a sunrise scene (the sun rising), the composition determination section 113 may determine a composition suitable for imaging that scene, such as a center composition or a bisection composition, or suggest these as candidate compositions.
• For the scene recognition, for example, deep learning can be used, and a convolutional neural network can also be used.
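• A minimal sketch of mapping a recognized scene label to candidate compositions follows; the scene label itself would come from a scene recognition model (for example a convolutional neural network), which is not implemented here, and the mapping table is an illustrative assumption.

```python
# A minimal sketch of suggesting candidate compositions from a scene label.
SCENE_TO_COMPOSITIONS = {
    "sunrise":  ["center", "bisection"],
    "river":    ["diagonal", "thirds"],
    "portrait": ["center", "thirds"],
}

def suggest_compositions(scene_label: str):
    # Fall back to a generic suggestion if the scene is unknown.
    return SCENE_TO_COMPOSITIONS.get(scene_label, ["thirds"])

print(suggest_compositions("sunrise"))   # -> ['center', 'bisection']
```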
  • the unmanned aerial vehicle 100 can determine the composition of the aerial photography that can be attractively combined with the scene in which the live view image G1 is aerially photographed.
  • the composition determination section 113 can store the determination information of the composition based on, for example, any one of the above methods, together with the information of the imaging range of the live view image G1 or the like into the memory 160.
  • the composition determination section 113 can determine the composition in consideration of the main subject based on the determination information of the composition stored in the memory 160 (that is, the determination information of the composition having the actual performance in the past).
• For example, the composition determination section 113 may determine, as the composition for aerial photography, a composition that has been used a predetermined number of times or more (for example, the most times), or with a predetermined frequency or more (for example, the highest frequency), in the same (or substantially the same) imaging range in the past.
  • the UAV 100 can determine the composition in a machine learning manner based on past performance, in other words, based on the user's selection tendency and the determined tendency of the UAV 100.
  • the UAV 100 can attractively take aerial photography of the main subject by a composition determined in a machine learning manner.
  • composition determination section 113 can acquire evaluation information related to the aerial image using the composition from the external server or the like through the communication interface 150.
• the composition determination section 113 can determine a composition whose evaluation information is equal to or higher than a predetermined criterion (for example, the highest evaluation) as the pre-composition.
  • the UAV 100 can determine a composition highly evaluated by others as a pre-composition. Therefore, the UAV 100 can acquire an aerial image in which an object is disposed with an objectively preferable composition.
  • the motion information generating unit 114 determines the imaging range to perform aerial photography in accordance with the determined composition.
• the imaging range may be determined by the position of the UAV 100, the orientation of the UAV 100, the orientation of the imaging unit 220, the angle of view of the imaging unit 220 or the imaging unit 230, and the like. Therefore, the motion information generating unit 114 can generate, as the motion information, information for changing the operating state of the unmanned aerial vehicle 100 at the time the live view image G1 is captured to the operating state of the unmanned aerial vehicle 100 for realizing the determined composition.
• the action information generating section 114 may include, as motion information, movement information for moving the unmanned aerial vehicle 100 from the position of the unmanned aerial vehicle 100 before moving (when the live view image G1 is aerial photographed) to the position of the unmanned aerial vehicle 100 for realizing the determined composition.
• the motion information generating unit 114 may include, as motion information, rotation information of the UAV 100 for changing the orientation of the unmanned aerial vehicle 100 before the change (when the live view image G1 is captured) to the orientation of the unmanned aerial vehicle 100 for realizing the determined composition.
• the motion information generating unit 114 may include, as motion information, rotation information of the gimbal 200 for changing the rotation state (for example, the rotation angle) of the gimbal 200 before the change (corresponding to the orientation of the imaging unit 220) to the rotation state of the gimbal 200 after the change.
• the motion information generating unit 114 may include, as motion information, angle-of-view change information for changing the angle of view of the imaging unit 220 or the imaging unit 230 before the change to the angle of view of the imaging unit 220 or the imaging unit 230 after the change.
  • the angle of view of the imaging unit 220 or the imaging unit 230 may correspond to the zoom magnification of the imaging unit 220 or the imaging unit 230.
  • the motion information generating unit 114 may include zoom magnification change information for changing the zoom magnification of the imaging unit 220 or the imaging unit 230 before the change to the zoom magnification of the imaging unit 220 or the imaging unit 230 after the change, as the motion information.
  • FIG. 8A is a diagram showing an example of rotation of an imaging range for aerial photography with the determined composition.
  • FIG. 8B is a diagram showing a movement example of the unmanned aerial vehicle 100 for aerial photography with the determined composition.
  • FIG. 8C is a diagram for explaining the movement of the UAV 100 as viewed from the horizontal direction.
• In FIG. 8A, a current composition C1 and a pre-composition C2 are shown as compositions that simplify the representation of the live view image.
• In FIG. 8A, as in FIG. 7, there are mountains M11 and M12 on both sides of the river RV11. The pre-composition C2 is assumed to be a diagonal composition.
  • the motion information generating unit 114 compares the size of the subject in the current composition C1 with the size of the subject in the pre-composition C2.
• the motion information generating section 114 can calculate the amount of change in the height of the unmanned aerial vehicle 100 (that is, the amount of movement in the gravity direction) based on the size of the subject in the current composition C1 and the size of the subject in the pre-composition C2. For example, when the size of the subject in the pre-composition C2 is twice the size of the subject in the current composition C1, the motion information generating portion 114 can calculate the moving direction and the amount of movement so that the height of the unmanned aerial vehicle 100 becomes 1/2.
• Conversely, when the size of the subject in the pre-composition C2 is 1/2 of the size in the current composition C1, the motion information generating unit 114 may calculate the moving direction and the amount of movement so as to double the height of the UAV 100.
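• A minimal sketch of this height calculation follows, assuming a simple pinhole-camera proportionality between subject size and aerial height (the constants are illustrative).

```python
# A minimal sketch: if the subject should appear k times larger in the
# pre-composition, the aerial height is divided by k.
def target_height(current_height_m: float, size_in_current: float, size_in_target: float) -> float:
    k = size_in_target / size_in_current          # how much larger the subject should become
    return current_height_m / k

print(target_height(100.0, 1.0, 2.0))   # subject doubled -> height 50 m
print(target_height(100.0, 1.0, 0.5))   # subject halved  -> height 200 m
```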
  • the height information of the UAV 100 in the current composition C1 may be an aerial height when the live view image G1 is aerial photographed, and may be acquired by the barometric altimeter 270 or the like.
• In this example, the size of the river RV11 and the sizes of the mountains M11 and M12 are the same in the current composition C1 and the pre-composition C2.
• Therefore, the motion information generating portion 114 can determine that the height of the UAV 100 does not change, that is, that the amount of movement in the gravity direction is zero.
  • the UAV 100 can easily calculate the amount of change in height (i.e., the amount of movement in the direction of gravity). Therefore, the UAV 100 can move not only in a two-dimensional space (horizontal direction) but also in a three-dimensional space.
  • the zoom magnification of the imaging unit 220 or the imaging unit 230 may be changed.
  • the motion information generating unit 114 can compare the positional relationship of each subject in the current composition C1 with the positional relationship of each subject in the pre-composition C2.
• the motion information generating unit 114 can calculate the amount of rotation and the direction of rotation of the composition, that is, the amount of rotation and the direction of rotation of the UAV 100, or the amount of rotation and the direction of rotation of the imaging unit 220 or the imaging unit 230, based on the positional relationship of each subject in the current composition C1 and the positional relationship of each subject in the pre-composition C2. The rotation may be, for example, along the horizontal direction.
  • information of the positional relationship of each subject can be calculated and acquired by the mapping based on the mathematical coordinate transformation.
• Suppose that the positional relationship of each subject in the pre-composition C2 is rotated counterclockwise by 30 degrees with respect to the positional relationship of each subject in the current composition C1.
• In this case, the motion information generating unit 114 calculates a rotation of 30 degrees counterclockwise around the optical axis of the imaging unit 220 or the imaging unit 230.
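• The exact coordinate transformation is not specified in the text; the sketch below estimates the rotation of the composition from corresponding subject positions with a least-squares 2D rotation fit, reproducing the 30-degree counterclockwise example above.

```python
# A minimal sketch of estimating the rotation amount of the composition from the
# positional relationship of the same subjects in C1 and C2.
import math

def estimate_rotation_deg(points_c1, points_c2) -> float:
    """points_c1/points_c2: lists of (x, y) for the same subjects in each composition."""
    cx1 = sum(p[0] for p in points_c1) / len(points_c1)
    cy1 = sum(p[1] for p in points_c1) / len(points_c1)
    cx2 = sum(p[0] for p in points_c2) / len(points_c2)
    cy2 = sum(p[1] for p in points_c2) / len(points_c2)
    dot = cross = 0.0
    for (x1, y1), (x2, y2) in zip(points_c1, points_c2):
        ax, ay = x1 - cx1, y1 - cy1
        bx, by = x2 - cx2, y2 - cy2
        dot += ax * bx + ay * by
        cross += ax * by - ay * bx
    return math.degrees(math.atan2(cross, dot))    # positive = counterclockwise

c1 = [(1, 0), (0, 1), (-1, 0)]
c2 = [(math.cos(math.radians(30)), math.sin(math.radians(30))),
      (-math.sin(math.radians(30)), math.cos(math.radians(30))),
      (-math.cos(math.radians(30)), -math.sin(math.radians(30)))]
print(round(estimate_rotation_deg(c1, c2), 1))     # -> 30.0 (counterclockwise)
```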
  • the UAV 100 can easily calculate the amount of rotation of the composition.
  • the motion information generating unit 114 can generate rotation information of the gimbal 200.
• the motion information generating unit 114 can calculate the rotation information of the gimbal 200 based on the angle of view information of the imaging unit 220, the screen position of a subject on the display unit 88 of the portable terminal 80 in the live view image G1, and the screen position of the same subject on the display unit 88 in the pre-composition.
  • the moving distance of the same subject on the screen is proportional to the amount of change (rotation amount) of the rotation angle of the gimbal 200.
• Suppose that the distance w1 (corresponding to the moving distance on the screen) between the screen positions of the same subject M21 is 1/6 of the length of the side w of the screen of the display unit 88.
  • the imaging angle of view indicated by the angle of view information of the imaging unit 220 is 90 degrees.
• In this case, the rotation angle (angle θ2) of the gimbal 200 for realizing the pre-composition C2 is 15 degrees (90 degrees × 1/6).
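• A minimal sketch of this proportional relationship follows, reproducing the example of a 90-degree angle of view and an on-screen movement of 1/6 of the screen width.

```python
# A minimal sketch: the on-screen movement of a subject, as a fraction of the
# screen width, multiplied by the imaging angle of view gives the required
# gimbal rotation (the text states the relationship as proportional).
def gimbal_rotation_deg(screen_move: float, screen_width: float, view_angle_deg: float) -> float:
    return view_angle_deg * (screen_move / screen_width)

# Example from the text: movement of 1/6 of the screen width, 90-degree angle of view.
print(gimbal_rotation_deg(1.0, 6.0, 90.0))   # -> 15.0 degrees
```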
  • the motion information generating unit 114 can derive (for example, calculate) the rotational direction of the gimbal 200 based on the correspondence relationship between the moving direction in the real space and the moving direction on the screen of the display unit 88.
  • the moving direction in the real space may be opposite to the moving direction on the screen of the display portion 88.
• For example, when the gimbal 200 is rotated in the direction of gravity (downward), the position of the island in the aerial image moves upward as in FIG. 6A.
  • the motion information generating unit 114 can generate the motion information of the UAV 100. For example, the motion information generating unit 114 instructs the motion control unit 115 to cause the UAV 100 to fly a predetermined distance (for example, a predetermined short distance). The UAV 100 is operated by a predetermined distance under the control of the motion control unit 115. The motion information generating unit 114 can determine the correspondence relationship between the moving distance of the flight in the real space and the moving distance on the screen of the display unit 88 in cooperation with the terminal control unit 81 of the portable terminal 80.
  • the action information generating unit 114 can notify the portable terminal 80 of the predetermined distance information related to the flight through the communication interface 150.
• the terminal control unit 81 of the portable terminal 80 can detect, from the aerial images during the flight, the moving distance on the screen of the display unit 88 of the same subject accompanying the movement of the UAV 100 by the predetermined distance.
  • the terminal control unit 81 can transmit the moving distance information to the unmanned aerial vehicle 100 via the wireless communication unit 85.
  • the action information generating unit 114 can receive the information of the moving distance through the communication interface 150.
  • the motion information generating unit 114 can determine the correspondence relationship between the moving distance of the flight in the real space and the moving distance on the screen, and store the information of the corresponding relationship in the memory 87 or the like in advance.
  • the memory 160 may store information in which the moving distance in the real space is ⁇ times the moving distance on the screen.
  • the terminal control unit 81 can also store information of the correspondence relationship between the moving direction in the real space and the moving direction on the screen in the memory 160.
  • the position before and after the movement in the real space can be acquired by the GPS receiver 240.
  • the same subject M21 imaged by the imaging unit 220 is moved from the position p1 to the position p2.
• the moving distance of the unmanned aerial vehicle 100 for realizing the pre-composition can be calculated based on the distance d1 between the position p1 and the position p2 (corresponding to the moving distance on the screen) and the information of the correspondence relationship (for example, α times) stored in the memory 160.
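• A minimal sketch of this calibration step follows; the numbers are illustrative.

```python
# A minimal sketch: the UAV flies a known short distance, the on-screen
# displacement of the same subject is measured, and the ratio alpha
# (real distance per screen distance) is stored and later used to convert the
# remaining on-screen offset into the required flight distance.
def calibrate_alpha(flown_distance_m: float, screen_move_px: float) -> float:
    return flown_distance_m / screen_move_px            # "alpha times" in the text

def required_flight_distance(alpha: float, remaining_screen_offset_px: float) -> float:
    return alpha * remaining_screen_offset_px

alpha = calibrate_alpha(flown_distance_m=2.0, screen_move_px=80.0)             # 0.025 m per pixel
print(required_flight_distance(alpha, remaining_screen_offset_px=240.0))       # -> 6.0 m
```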
• Thereby, the UAV 100 can acquire the amount of movement and the moving direction for realizing the pre-composition C2, and can determine that it has reached the position at which the pre-composition C2 is realized.
  • FIG. 8C is a diagram showing an example of the relationship between the change in the angle of view of the imaging unit 220 before and after the movement and the moving distance in the horizontal direction.
  • the points indicating the respective positions in Fig. 8C show the cases observed from the side.
• the motion information generating unit 114 can acquire the angle θ1 of the gimbal 200 with respect to the gravity direction at the time of aerial photography of the live view image G1.
• the angle θ1 can be calculated based on the inclination of the UAV 100 with respect to the gravity direction and the inclination of the gimbal 200 with respect to the UAV 100 at the time of aerial photography of the live view image G1.
• the motion information generating unit 114 can perform the following calculation based on the height of the UAV 100 and the angle θ1 of the gimbal 200.
• the motion information generating unit 114 can calculate the rotation information of the gimbal 200 (for example, the rotation angle corresponding to the angle θ2) based on the angle of view information of the imaging unit 220, the screen position of a subject in the current composition C1, and the screen position of the same subject on the display unit 88 in the pre-composition C2.
• the motion information generating unit 114 can refer to the information, stored in the memory 160, of the correspondence relationship between the moving distance of the flight in the real space and the moving distance on the screen.
• the motion information generating unit 114 can calculate, based on the height h of the UAV 100 and the rotated angle (θ1 + θ2) of the gimbal 200 with respect to the gravity direction, the horizontal positional relationship between the position p11 of the UAV 100 and the center of the imaging range after the rotation.
• the motion information generating unit 114 can then calculate the difference between the horizontal position p12 of the center portion of the imaging range of the unmanned aerial vehicle 100 before the rotation (when the live view image G1 is aerial photographed) and the horizontal position p13 of the center portion of the imaging range when the pre-composition C2 is realized, that is, the moving distance d12 corresponding to the angle θ2.
• Thereby, the unmanned aerial vehicle 100 can calculate the amount of movement and the direction of movement for realizing the pre-composition C2.
  • the motion information generating unit 114 calculates the moving distance d12 in one axial direction (for example, the x direction shown in FIG. 8C) in the horizontal direction by the above method. Similarly, the motion information generating unit 114 can calculate the moving distance d22 (not shown) in the other direction (for example, the y direction) orthogonal to one axis in the horizontal direction. The motion information generating unit 114 can calculate the moving distance in the horizontal direction (xy direction) by synthesizing the moving distance d12 and the moving distance d22. Further, the motion information generating unit 114 can calculate the moving direction by synthesizing the moving distances of the x-direction component and the y-direction component.
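• A minimal sketch of this geometry follows, with illustrative heights and angles; the x and y components are then combined into the horizontal moving distance and direction.

```python
# A minimal sketch: with aerial height h and gimbal angles theta1 (before) and
# theta1 + theta2 (after), measured from the gravity direction, the ground point
# at the centre of the imaging range shifts horizontally by
# h * (tan(theta1 + theta2) - tan(theta1)).
import math

def horizontal_shift(h: float, theta1_deg: float, theta2_deg: float) -> float:
    t1 = math.radians(theta1_deg)
    t2 = math.radians(theta2_deg)
    return h * (math.tan(t1 + t2) - math.tan(t1))

d12 = horizontal_shift(h=50.0, theta1_deg=30.0, theta2_deg=15.0)   # x-direction component
d22 = horizontal_shift(h=50.0, theta1_deg=20.0, theta2_deg=10.0)   # y-direction component (illustrative)
distance = math.hypot(d12, d22)                                    # combined moving distance
direction = math.degrees(math.atan2(d22, d12))                     # moving direction in the xy plane
print(round(d12, 2), round(d22, 2), round(distance, 2), round(direction, 1))
```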
• By generating the motion information, the UAV 100 can grasp the action of the UAV 100 required for changing the current composition C1 into the pre-composition C2.
• By generating the rotation information of the gimbal 200 as the motion information, the unmanned aerial vehicle 100 can change the imaging range through the adjustment of the gimbal 200, and can provide the operation information for realizing the desired pre-composition C2.
• Since the UAV 100 can change the imaging range by the rotation of the gimbal 200, it is not necessary to move the UAV 100, the power consumption required for the UAV 100 to fly can be saved, and cost can be reduced.
• the UAV 100 can calculate the rotation angle of the gimbal 200 by a relatively simple operation based on the positions of the subjects in the current composition C1 and the pre-composition C2. Therefore, the unmanned aerial vehicle 100 can keep the arithmetic processing load low.
• By generating the movement information of the unmanned aerial vehicle 100 as the motion information, the unmanned aerial vehicle 100 can change the imaging range through the geographical position adjustment of the unmanned aerial vehicle 100, and can provide the operation information for realizing the desired pre-composition C2. That is, the imaging range can be changed by the spatial movement of the UAV 100, and an appropriate pre-composition C2 can be realized.
  • FIG. 9 is a flowchart showing an operation example of the imaging assistance system 10.
  • the imaging unit 220 captures an aerial image (S101).
  • the communication interface 150 transmits the captured aerial image (for example, a live view image) to the portable terminal 80 (S102).
  • the aerial image of S101 will be described as a live view image.
  • the wireless communication unit 85 receives the live view image (S151).
  • the display portion 88 can display, for example, a live view image.
  • the user can perform an operation of adjusting the composition by the operation unit 83 when confirming the display of the live view image and wishing to adjust the composition.
  • the terminal control unit 81 activates the imaging assistance application, and transmits the composition adjustment command to the UAV 100 via the wireless communication unit 85 (S152).
  • the UAV control unit 110 activates the imaging assistance application.
  • the main subject determination section 112 determines the main subject in the live view image (S104).
  • the main subject determination section 112 can provide the portable terminal 80 with screen information for selecting the main subject through the communication interface 150.
• the wireless communication section 85 can receive the screen information for selecting the main subject from the unmanned aerial vehicle 100, the operation section 83 accepts a selection operation of the main subject, and the wireless communication section 85 transmits the selection information based on the selection operation of the main subject to the unmanned aerial vehicle 100 (S153).
  • the main subject determination section 112 can acquire selection information of the main subject through the communication interface 150, and determine the main subject based on the selection information of the main subject.
  • the composition determination section 113 determines the composition based on the determined main subject (S105).
  • the composition determination section 113 can provide the portable terminal 80 with screen information for selecting the main composition through the communication interface 150.
• the wireless communication section 85 receives screen information for selecting a composition from the unmanned aerial vehicle 100, the operation section 83 accepts a selection operation of the composition, and the wireless communication section 85 transmits the selection information based on the selection operation of the composition to the unmanned aerial vehicle 100 (S154).
  • the composition determination section 113 can acquire the selection information of the composition through the communication interface 150, and determine the composition based on the selection information of the composition.
  • the motion information generating unit 114 generates (for example, calculates) motion information of the UAV 100 (for example, information on the moving direction and the moving amount of the UAV 100) based on the determined composition (S106).
• the moving direction and the moving amount of the UAV 100 may be, for example, the moving direction and the moving amount from the aerial position at which the live view image was captured in S101 to the aerial position for realizing the pre-composition.
  • the motion control unit 115 performs flight control to the destination based on the calculated moving direction and the amount of movement, and moves (S107).
• This destination is the position reached by moving in the calculated moving direction by the calculated amount of movement from the position at which the live view image was captured before the movement.
  • the operation control unit 115 determines whether or not the movement to the destination is completed (S108). In the case where the movement to the destination has not been completed, the process goes to S107.
• When the movement to the destination is completed, the communication interface 150 transmits a movement completion notification to the portable terminal 80 (S109).
  • the wireless communication unit 85 receives the movement completion notification from the unmanned aerial vehicle 100 (S155).
  • the operation unit 83 can receive an operation for imaging by the imaging unit 220 or the imaging unit 230 included in the unmanned aerial vehicle 100 after the movement.
  • the terminal control unit 81 transmits an imaging command to the UAV 100 via the wireless communication unit 85 (S156).
  • the communication interface 150 receives an imaging command from the portable terminal 80 (S110).
  • the imaging unit 220 or the imaging unit 230 captures an aerial image in accordance with an imaging command (S111).
  • This aerial image is an aerial image taken at a position where the UAV 100 is moved to realize the pre-composition, and is an image conforming to the pre-composition.
  • the unmanned aerial vehicle 100 can perform aerial photography at the position after the movement when the movement based on the motion information is completed or completed.
  • processing of S101 to S106 may be performed during the period in which the UAV 100 does not move (when it is not moved and located at a predetermined position), or may be performed during the movement of the UAV 100.
  • the motion information may be the rotation information of the gimbal 200 instead of the movement information of the UAV 100.
  • the motion information generating unit 114 generates (for example, calculates) motion information of the UAV 100 (for example, information on the rotational direction and the amount of rotation of the gimbal 200) based on the determined composition.
• the rotation direction and the rotation amount of the gimbal 200 may be, for example, the rotation direction and the rotation amount from the angle of the gimbal 200 at the aerial position at which the live view image was captured in S101 to the angle of the gimbal 200 for realizing the pre-composition.
  • the operation control unit 115 performs rotation control to the target rotation position based on the calculated rotation direction and the rotation amount, and rotates the gimbal 200.
• This target rotation position is the position reached by rotating in the calculated rotation direction by the calculated amount of rotation from the angle of the gimbal 200 at the time of imaging of the live view image before the movement.
  • the operation control unit 115 determines whether or not the rotation to the target rotational position is completed. When the rotation to the target rotation position has not been completed, the process proceeds to S107, and the operation control unit 115 continues the rotation operation.
• the operation control unit 115 may perform either one of the rotation control of the gimbal 200 and the flight control of the UAV 100, or both. Further, since the change of the imaging range caused by the rotation of the gimbal 200 is not accompanied by the movement of the UAV 100, the degree of change in the imaging range is small. On the other hand, since the change of the imaging range caused by the movement of the unmanned aerial vehicle 100 is accompanied by the movement of the unmanned aerial vehicle 100, the degree of change of the imaging range is large. Therefore, when the flight control of the UAV 100 is performed after the rotation control of the gimbal 200, the flight control of the UAV 100 can assist in achieving the desired composition even in a case where the desired composition cannot be achieved by the rotation of the gimbal 200 alone. That is, the UAV 100 can save energy through the rotation control of the gimbal 200 while reliably achieving the desired composition through the flight control of the UAV 100. A sketch of this control order is shown below.
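• A minimal sketch, with hypothetical helper functions, of the control order described above: gimbal rotation first, then UAV flight only if the desired composition has not yet been achieved.

```python
# A minimal sketch of "rotate the gimbal first, fly only if needed".
def achieve_composition(composition_error, rotate_gimbal, fly_uav, tolerance=0.05):
    """composition_error() returns a scalar mismatch between current and pre-composition.
    rotate_gimbal() and fly_uav() are assumed hooks into the motion control unit."""
    rotate_gimbal()                      # low energy cost, small change of imaging range
    if composition_error() > tolerance:  # rotation alone was not enough
        fly_uav()                        # flight control finishes the adjustment
    return composition_error() <= tolerance

# Dummy usage with stand-in functions:
state = {"err": 0.2}
print(achieve_composition(lambda: state["err"],
                          lambda: state.update(err=0.08),
                          lambda: state.update(err=0.01)))   # -> True
```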
• In this way, a desired subject can be taken into consideration to determine a composition for attractively aerial photographing the desired subject. That is, the UAV 100 and the imaging assistance system 10 can not only include the desired subject in the captured image, but also improve the composition of the captured image, thereby assisting the imaging of the image. Therefore, even in the case where the user does not have sufficient expertise in photography, the determination of the composition can be assisted by the unmanned aerial vehicle 100, and the aerial photography of the desired subject can be assisted. Further, the unmanned aerial vehicle 100 can perform an action that matches the composition (for example, the movement of the UAV 100, the adjustment of the rotation angle of the gimbal 200), and thus the pre-composition can be used for future aerial photography.
  • the UAV 100 can quickly perform the operation of the UAV 100 based on the motion information by performing the determination of the main subject related to the imaging assistance, the determination of the composition, and the generation of the motion information.
• the unmanned aerial vehicle 100 can reduce the processing load of the portable terminal 80 by performing the determination of the main subject related to the imaging assistance, the determination of the composition, and the generation of the motion information, and can also reduce the communication load with the portable terminal 80. Therefore, the portable terminal 80 can contribute to the processing related to the imaging assistance in cooperation with the unmanned aerial vehicle 100 while reducing the processing load of the portable terminal 80 itself.
• the unmanned aerial vehicle 100 can generate a pre-composition based on a desired subject included in the live view image by performing the determination of the main subject and the determination of the composition based on the live view images of S101 and S102, and the like. Therefore, the unmanned aerial vehicle 100 can perform aerial photography of a desired subject with a desired composition within a single imaging flow. Further, the UAV 100 can perform the main imaging with a composition suitable for the main subject included among the subjects reflected in the temporary imaging performed before the main imaging.
  • the display unit 88 may display the motion information.
  • the display information related to the motion information may be "Please move 10 meters to the east", "Please rotate the gimbal 200 by 20 degrees in the direction of gravity", and the like.
  • the other presentation unit may present the motion information.
  • the sound output unit (not shown) may output sound information related to the motion information, or the vibration unit (not shown) may perform vibration indicating the motion information.
• Thereby, the user can confirm the content of the motion information. Therefore, the user who has confirmed the motion information can operate the transmitter 50 to transmit an operation instruction from the transmitter 50 to the unmanned aerial vehicle 100.
• the UAV 100 can also obtain an operation instruction via the communication interface 150 to move the UAV 100 to the aerial position for achieving the pre-composition. In this case, the UAV 100 can perform the operation for realizing the pre-composition even without the function of the motion control unit 115.
• In the first embodiment, the unmanned aerial vehicle is exemplified as performing the determination of the main subject, the determination of the composition, and the generation of the motion information.
• In the second embodiment, the portable terminal is exemplified as performing the determination of the main subject, the determination of the composition, and the generation of the motion information.
• Descriptions of the same configurations and operations as those of the first embodiment are omitted or simplified.
  • FIG. 10 is a schematic diagram showing a configuration example of the imaging assistance system 10A in the second embodiment.
• the camera assisting system 10A includes an unmanned aerial vehicle 100A, a transmitter 50, and a portable terminal 80A.
  • the UAV 100A, the transmitter 50, and the portable terminal 80A can communicate with each other by wired communication or wireless communication such as a wireless LAN (Local Area Network).
• the portable terminal 80A determines the composition of the aerial photography performed by the unmanned aerial vehicle 100A, and generates motion information for the unmanned aerial vehicle 100A to achieve the determined composition.
  • the unmanned aerial vehicle 100A controls the operation of the unmanned aerial vehicle 100A in accordance with the motion information.
  • the portable terminal 80A, together with the transmitter 50, is carried by a user who is scheduled to perform aerial photography using the unmanned aerial vehicle 100A.
  • the portable terminal 80A assists the aerial photography performed by the unmanned aerial vehicle 100A.
  • FIG. 11 is a block diagram showing one example of the hardware configuration of the unmanned aerial vehicle 100A.
  • the UAV 100A has a UAV control unit 110A instead of the UAV control unit 110 as compared with the UAV 100 in the first embodiment.
  • the same components as those of the unmanned aerial vehicle 100 shown in Fig. 2 are denoted by the same reference numerals, and their description will be omitted or simplified.
  • the memory 160 may not store information related to image capturing assistance (for example, sample information of a composition, information of a correspondence between a distance in a real space and a distance on a screen).
  • FIG. 12 is a block diagram showing an example of a functional configuration of the UAV control unit 110A.
  • the UAV control unit 110A includes an operation control unit 115 and an operation information acquisition unit 116.
  • the same configurations as those of the UAV control unit 110 shown in FIG. 3 are denoted by the same reference numerals, and the description thereof will be omitted or simplified.
  • the motion information acquisition unit 116 acquires motion information of the UAV 100A from the mobile terminal 80A via the communication interface 150, for example.
  • the motion control unit 115 controls the operation of the UAV 100A in accordance with the acquired motion information.
  • the content of the motion control of the unmanned aerial vehicle 100A can be the same as that in the first embodiment.
  • FIG. 13 is a block diagram showing one example of the hardware configuration of the portable terminal 80A.
  • the mobile terminal 80A has a terminal control unit 81A instead of the terminal control unit 81 as compared with the mobile terminal 80 in the first embodiment.
• In the portable terminal 80A of FIG. 13, the same components as those of the portable terminal 80 shown in FIG. 4 are denoted by the same reference numerals, and their description will be omitted or simplified.
• the memory 87 can store information related to imaging assistance (for example, sample information of compositions, information of the correspondence between a distance in the real space and a distance on the screen, and information related to machine learning).
  • FIG. 14 is a block diagram showing an example of a functional configuration of the terminal control unit 81A.
  • the terminal control unit 81A includes an image acquisition unit 811, a main subject determination unit 812, a composition determination unit 813, and an operation information generation unit 814.
  • the main subject determination section 812 and the composition determination section 813 are one example of the information acquisition section.
• the motion information generating unit 814 is an example of a generating unit.
  • the image acquisition unit 811 can acquire an image stored in the memory 87 (for example, an aerial image captured by the imaging unit 220 of the UAV 100A or the imaging unit 230).
  • the image acquisition unit 811 can acquire, for example, an aerial image in the aerial photography of the imaging unit 220 or the imaging unit 230 via the communication interface 150.
  • the aerial image can be a moving image or a still image.
  • the aerial motion picture in aerial photography is also called a live view image.
  • the aerial image acquired by the image acquisition unit 811 is mainly exemplified by a live view image.
  • the main subject determination unit 812 determines (determines) the main subject among one or more subjects included in the live view image acquired by the image acquisition unit 811.
  • the determination of the main subject is an example of information acquisition of the main subject.
  • the main subject determination method of the main subject determination unit 812 can be the same as the main subject determination method of the main subject determination unit 112 included in the unmanned aerial vehicle 100 in the first embodiment.
  • the composition determination section 813 determines a composition for imaging the determined main subject.
  • the determination of the composition for capturing the main subject is an example of information acquisition for composing the composition of the main subject.
  • the composition determination method of the composition determination portion 813 can be the same as the composition determination method of the composition determination portion 113 included in the unmanned aerial vehicle 100 in the first embodiment.
  • the motion information generating unit 814 generates motion information of the unmanned aerial vehicle 100A for realizing aerial photography in accordance with the determined composition.
• The motion information generation method of the motion information generating unit 814 can be the same as the motion information generation method of the motion information generating unit 114 included in the unmanned aerial vehicle 100 in the first embodiment.
  • the generated motion information can be transmitted to the unmanned aerial vehicle 100A by the wireless communication unit 85, for example.
  • FIG. 15 is a flowchart showing an operation example of the imaging assistance system 10A.
  • the unmanned aerial vehicle 100A performs the processes of S101 and S102.
  • the portable terminal 80A performs the processing of S151.
  • the display portion 88 can display, for example, a live view image.
  • the user can perform an operation for adjusting the composition by the operation unit 83 when confirming the display of the live view image and wishing to adjust the composition.
  • This operation is an example of a composition adjustment start indication.
  • the terminal control unit 81 activates the imaging assistance application (S161).
  • the main subject determination section 812 determines the main subject in the live view image (S162). In S162, the main subject determination section 812 can cause the display section 88 to display a selection screen for selecting the main subject.
  • the operation unit 83 can acquire the selection information of the main subject by accepting the selection operation of the main subject.
  • the main subject determination section 812 can determine the main subject based on the selection information of this main subject.
  • the composition determination section 813 determines a composition based on the determined main subject (S163).
  • the composition determining portion 813 can cause the display portion 88 to display a selection screen for selecting a pre-composition.
  • the operation unit 83 can acquire the selection information of the pre-composition map by accepting the selection operation of the composition.
  • the composition determination section 813 can determine the composition based on the selection information of this composition.
  • the motion information generating unit 814 generates motion information of the UAV 100A based on the determined composition (S164).
  • the wireless communication unit 85 transmits the generated motion information of the unmanned aerial vehicle 100A to the unmanned aerial vehicle 100A (S165).
• the communication interface 150 receives the motion information of the unmanned aerial vehicle 100A from the portable terminal 80A (S121).
  • the unmanned aerial vehicle 100A performs the processing of S107 to S111, and the portable terminal 80A performs the processing of S155 and S156.
• In this way, a desired subject can be taken into consideration to determine a composition for attractively aerial photographing the desired subject. That is, the portable terminal 80A and the imaging assistance system 10A can not only include the desired subject in the captured image, but also improve the composition of the captured image, thereby assisting the imaging of the image. Therefore, even in the case where the user does not have sufficient expertise in photography, the determination of the composition can be assisted by the portable terminal 80A, and the aerial photography of the desired subject can be assisted. Further, the unmanned aerial vehicle 100A can perform an action matching the composition (for example, the movement of the UAV 100A, the adjustment of the rotation angle of the gimbal 200), and thus the pre-composition can be used for future aerial photography.
• the portable terminal 80A can reduce the processing load of the unmanned aerial vehicle 100A by performing the determination of the main subject related to the imaging assistance, the determination of the composition, and the generation of the motion information, so that the unmanned aerial vehicle 100A can concentrate on the processing of aerial images, flight control, and the like. Further, since the UAV 100A can operate in accordance with the motion information generated by the portable terminal 80A or the like as another device, even if the processing load of the UAV 100A is reduced, the desired operation for realizing the desired composition can be performed.
• In addition, an information processing device other than the portable terminal 80A (for example, the transmitter 50, a PC, or another information processing device) may have the imaging assistance functions (for example, the main subject determination function, the composition determination function, and the motion information generation function).
• In the first and second embodiments, the aerial photography of the unmanned aerial vehicle is assisted.
• In the third embodiment, the imaging of an imaging device mounted on a gimbal device is assisted.
• Descriptions of the same configurations and operations as those of the first and second embodiments are omitted or simplified.
  • FIG. 16 is a perspective view showing a configuration example of the imaging assistance system 10B in the third embodiment.
  • the camera assist system 10B includes a gimbal device 300 and a portable terminal 80B.
  • the gimbal device 300 and the portable terminal 80B can communicate with each other by wired communication (for example, USB communication) or wireless communication (for example, wireless LAN, Bluetooth (registered trademark), short-range communication, public wireless line).
  • the gimbal device 300 is an example of a support device.
  • the portable terminal 80B can determine a composition for imaging by the imaging unit 820 included in the portable terminal 80B mounted to the gimbal device 300, and can generate motion information of the gimbal device 300 to become the determined composition.
• Alternatively, the gimbal device 300 can determine a composition for imaging by the imaging unit 820 included in the portable terminal 80B mounted to the gimbal device 300, and can generate motion information of the gimbal device 300 to achieve the determined composition.
  • the gimbal device 300 controls the operation of the gimbal device 300 in accordance with the motion information.
  • the gimbal device 300 can be carried by a user who is scheduled to perform imaging using the portable terminal 80B.
  • the portable terminal 80B or the gimbal device 300 assists in imaging of the portable terminal 80B attached to the gimbal device 300.
  • the imaging unit 820 is an example of an imaging device.
  • the gimbal device 300 includes a gimbal 310, a mounting portion 315, and a grip portion 330.
  • the mounting portion 315 mounts the portable terminal 80B to the gimbal device 300, and fixes the position and orientation of the portable terminal 80B with respect to the gimbal device 300.
  • the gimbal 310 can rotatably support the portable terminal 80B centering on the yaw axis, the pitch axis, and the roll axis.
  • the gimbal 310 can change the imaging direction of the imaging unit 820 included in the mobile terminal 80B by rotating the portable terminal 80B around at least one of the yaw axis, the pitch axis, and the roll axis. Since the position of the imaging unit 820 in the portable terminal 80B is fixed, it can be said that the rotation of the portable terminal 80B corresponds to the rotation of the imaging unit 820.
  • the grip portion 330 can be held by the user.
• the grip portion 330 shown in FIG. 16 is an example; the shape of the grip portion 330, the position of the grip portion 330 with respect to the gimbal device 300, and the orientation of the grip portion 330 with respect to the gimbal 310 may be different from those in FIG. 16.
• In FIG. 16, the gimbal 310 is shown by a dotted line in the vicinity of the portable terminal 80B, which indicates that the gimbal 310 is located further to the back side than the portable terminal 80B.
  • the gimbal device 300 has at least a portion of the hardware configuration of the UAV 100 or the UAV 100A. Although not shown, the gimbal device 300 has at least a portion of the functional configuration of the UAV 100 or the UAV 100A.
  • the portable terminal 80B may be identical in hardware configuration to the portable terminal 80 or the portable terminal 80A. Although not illustrated, the portable terminal 80B may have the same functional configuration as that of the portable terminal 80 or the portable terminal 80A.
  • the gimbal device 300 can have the function of the unmanned aerial vehicle 100 of the first embodiment.
  • the gimbal device 300 can have a function of determining the main subject in the image (for example, a live view image) captured by the imaging unit 820, a function of determining the composition, and a function of generating the motion information of the gimbal device 300.
  • the gimbal device 300 may have a function of controlling the motion of the gimbal device 300 based on the motion information.
  • the gimbal device 300 can have the function of the unmanned aerial vehicle 100A of the second embodiment.
  • the portable terminal 80B can have a function of determining the main subject in the image (for example, a live view image) captured by the imaging unit 820, a function of determining the composition, and a function of generating the motion information of the gimbal device 300.
• The gimbal device 300 may have a function of acquiring the motion information of the gimbal device 300. Further, the gimbal device 300 may have a function of controlling the motion of the gimbal device 300 based on the motion information.
• Flight is not considered for the gimbal device 300. Therefore, the motion information may be the rotation information of the gimbal 310, and the gimbal device 300 can control the rotation of the gimbal 310 based on the motion information.
• In this way, a desired subject can be taken into consideration to determine a composition for attractively imaging the desired subject.
• the gimbal device 300, the portable terminal 80B, and the imaging assistance system 10B can not only include the desired subject in the captured image, but also improve the composition of the captured image, thereby assisting the imaging of the image. Therefore, even in the case where the user does not have sufficient expertise in photography, the determination of the composition can be assisted by the gimbal device 300 or the portable terminal 80B, and the imaging of the desired subject can be assisted. Further, the gimbal device 300 can perform an action that matches the composition (for example, the adjustment of the rotation angle of the gimbal 310), so the pre-composition can be used for future imaging.
• the gimbal device 300 can quickly perform the operation of the gimbal device 300 based on the motion information by performing the determination of the main subject related to the imaging assistance, the determination of the composition, and the generation of the motion information. Further, the gimbal device 300 can reduce the processing load of the portable terminal 80B by performing these, and can also reduce the communication load with the portable terminal 80B. Therefore, the portable terminal 80B can contribute to the processing related to the imaging assistance in cooperation with the gimbal device 300 while reducing the processing load of the portable terminal 80B itself.
  • the gimbal device 300 can use the motion information generated by the imaging assistance for the rotation control of the gimbal 310.
  • that is, the imaging assistance can be applied not only to aerial imaging by an unmanned aerial vehicle as in the first and second embodiments, but also to imaging that uses the gimbal device 300.
  • the portable terminal 80B can reduce the processing load of the gimbal device 300 by performing the determination of the main subject, the determination of the composition, and the generation of the motion information related to the imaging assistance, so that the gimbal device 300 can concentrate on processing such as the captured image. Further, since the gimbal device 300 can operate in accordance with the motion information generated by another device such as the portable terminal 80B, the desired action for realizing the desired composition can be carried out even though the processing load of the gimbal device 300 is reduced.
  • the imaging of the imaging device mounted on the gimbal device is assisted.
  • the imaging of the imaging unit included in the gimbal device is assisted.
  • the same configurations and operations as those of the first to third embodiments are omitted or simplified.
  • FIG. 17A is a front perspective view showing a configuration example of the gimbal device 300C in the fourth embodiment.
  • FIG. 17B is a rear perspective view showing a configuration example of the imaging assistance system 10C in the fourth embodiment.
  • the imaging assistance system 10C includes a gimbal device 300C and a portable terminal 80C.
  • the gimbal device 300C and the portable terminal 80C can communicate with each other by wired communication (for example, USB communication) or wireless communication (for example, wireless LAN, Bluetooth (registered trademark), short-range communication, public wireless line).
  • the gimbal device 300C is an example of a support device.
  • the portable terminal 80C can determine a composition for imaging by the imaging unit 320 built into the gimbal device 300C, and can generate motion information of the gimbal device 300C so that the determined composition is achieved.
  • alternatively, the gimbal device 300C may determine a composition for imaging by the imaging unit 320, and may generate motion information of the gimbal device 300C so that the determined composition is achieved.
  • the gimbal device 300C controls the operation of the gimbal device 300C in accordance with the motion information.
  • the gimbal device 300C can be carried by a user who is scheduled to perform imaging using the gimbal device 300C.
  • the portable terminal 80C or the gimbal device 300C assists the imaging of the gimbal device 300C.
  • the gimbal device 300C includes a gimbal 310C, an imaging unit 320, and a grip portion 330.
  • the imaging unit 320 is an example of an imaging device.
  • the same configurations as those of the gimbal device 300 and the imaging assistance system 10B shown in FIG. 16 are denoted by the same reference numerals, and the description thereof will be omitted or simplified.
  • the gimbal 310C can rotatably support the imaging unit 320 around the yaw axis, the pitch axis, and the roll axis.
  • the gimbal 310C can change the imaging direction of the imaging unit 320 by rotating the imaging unit 320 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • for example, the imaging unit 320 can capture images in the depth direction with respect to the plane of the drawing.
  • the imaging unit 320 can change the imaging direction.
  • the grip 330 can be held by, for example, the user's hand HD1.
  • the gimbal device 300C has at least a portion of the hardware configuration of the UAV 100 or the UAV 100A. Although not shown, the gimbal device 300C has at least a part of the functional configuration of the unmanned aerial vehicle 100 or the unmanned aerial vehicle 100A.
  • the portable terminal 80C may be identical in hardware configuration to the portable terminal 80 or the portable terminal 80A. Although not illustrated, the portable terminal 80C may have the same functional configuration as that of the portable terminal 80 or the portable terminal 80A.
  • in the imaging assistance system 10C, when the portable terminal 80C has the functions of the portable terminal 80 of the first embodiment, the gimbal device 300C may have the functions of the unmanned aerial vehicle 100 of the first embodiment.
  • the gimbal device 300C can have a function of determining the main subject in the image (for example, a live view image) captured by the imaging unit 320, a function of determining the composition, and a function of generating the motion information of the gimbal device 300C.
  • the gimbal device 300C may have a function of controlling the operation of the gimbal device 300C based on the motion information.
  • in the imaging assistance system 10C, when the portable terminal 80C has the functions of the portable terminal 80A of the second embodiment, the gimbal device 300C may have the functions of the unmanned aerial vehicle 100A of the second embodiment.
  • the portable terminal 80C can have a function of determining the main subject in the image (for example, a live view image) captured by the imaging unit 320, a function of determining the composition, and a function of generating the motion information of the gimbal device 300C.
  • the gimbal device 300C can have an acquisition function of the motion information of the gimbal device 300C.
  • the gimbal device 300C may have a function of controlling the operation of the gimbal device 300C based on the motion information.
  • unlike the unmanned aerial vehicles 100 and 100A, the gimbal device 300C does not fly. Therefore, the motion information may be the rotation information of the gimbal 310C, and the gimbal device 300C can control the rotation of the gimbal 310C based on the motion information.
  • a desired subject can be taken into account to determine a composition for capturing that desired subject attractively.
  • the gimbal device 300C, the portable terminal 80C, and the imaging assistance system 10C can not only bring a desired subject into the captured image, but also improve the composition of the captured image to assist the imaging. Therefore, even when the user does not have sufficient expertise in photography, the determination of the composition can be assisted by the gimbal device 300C or the portable terminal 80C, and the imaging of the desired subject can be assisted. Further, the gimbal device 300C can perform an action that matches the composition (for example, adjustment of the rotation angle of the gimbal 310C), so the pre-composition (planned composition) can be used for future imaging.
  • the gimbal device 300C can quickly carry out its operation based on the motion information by itself performing the determination of the main subject, the determination of the composition, and the generation of the motion information related to the imaging assistance. Further, by performing these determinations and the generation of the motion information, the gimbal device 300C can reduce the processing load of the portable terminal 80C and can also reduce the communication load with the portable terminal 80C. Therefore, the portable terminal 80C can contribute to the processing related to the imaging assistance in cooperation with the gimbal device 300C while its own processing load is kept low.
  • the gimbal device 300C can use the motion information generated by the imaging assistance for the rotation control of the gimbal 310C.
  • that is, the imaging assistance can be applied not only to aerial imaging by an unmanned aerial vehicle as in the first and second embodiments, but also to imaging that uses the gimbal device 300C.
  • the portable terminal 80C can reduce the processing load of the gimbal device 300C by performing the determination of the main subject, the determination of the composition, and the generation of the motion information related to the imaging assistance, so that the gimbal device 300C can concentrate on processing such as the captured image. Further, since the gimbal device 300C can operate in accordance with the motion information generated by another device such as the portable terminal 80C, the desired action for realizing the desired composition can be carried out even though the processing load of the gimbal device 300C is reduced.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Mechanical Engineering (AREA)
  • Studio Devices (AREA)
  • Details Of Cameras Including Film Mechanisms (AREA)
  • Accessories Of Cameras (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

即使在用户不熟悉摄像被摄体的相机的摄像的情况下,也能够摄像出期望的图像。一种移动平台,其是对由摄像装置进行的第二图像的摄像进行辅助的移动平台,其包含:获取部,其获取第一图像;信息获取部,其在第一图像所包含的一个以上的被摄体中获取第一被摄体的信息,并在对第二图像中的包括第一被摄体的一个以上的被摄体的位置进行规定的一个以上的构图中获取第一构图的信息;以及生成部,其根据第一构图,生成与用于摄像第二图像的摄像装置的动作相关的动作信息。

Description

移动平台、飞行体、支持装置、便携式终端、摄像辅助方法、程序以及记录介质 技术领域
本公开涉及支援图像的摄像的移动平台、飞行体、支持装置、便携式终端、摄像辅助方法、程序以及记录介质。
背景技术
现已知一种虚拟相机系统,其使用经验证的摄像技术来计算摄像方式,从而对计算机生成的情节进行可视化,对事件进行再现,动态地更新相机视图。此虚拟相机系统对用于在期望的角度、距离下、以最小程度的遮挡来显示被摄体的相机配置位置进行分析。
此虚拟相机系统主要用于3D游戏相机系统,预先准备了表现任意虚拟三维空间的三维图像数据。此虚拟相机系统可以表现虚拟三维空间中的从特定视点观看到的区域,以形成任意构图。此虚拟相机系统例如可以按照有限制的构图、使用了已添加一个被摄体的三分法的构图、使用了已添加全部要素的三分法的构图、以及经平衡化的构图,从预先准备的虚拟三维图像数据中,将三维图像数据的一部分显示在显示器上(参见非专利文献1)。
现有技术文献
非专利文献
非专利文献1:William Bares,“A Photographic Composition Assistant for Intelligent Virtual 3D Camera Systems”,Millsaps College,Department of Computer Science,Jackson MS 39210,USA,互联网< URL:http://link.springer.com/chapter/10.1007/11795018_16>
发明内容
发明所要解决的技术问题
当将非专利文献1所述的虚拟相机系统应用于处理在真实空间中摄像的图像的相机系统时,相机系统会基于预先摄像的三维图像数据加工成任意构图。因此,相机系统不能确定尚未摄像的图像的构图。因此,例如在摄像被摄体的相机用户不熟悉摄像时,就难以有吸引力地摄像出期望的图像。
用于解决技术问题的手段
在一个方式中,一种移动平台,其是对由摄像装置进行的第二图像的摄像进行辅助的移动平台,其包含:图像获取部,其获取第一图像;信息获取部,其在第一图像所包含的一个以上的被摄体中获取第一被摄体的信息,并在对第二图像中的包括第一被摄体的一个以上的被摄体的位置进行规定的一个以上的构图中获取第一构图的信息;以及生成部,其根据第一构图,生成与用于摄像第二图像的摄像装置的动作相关的动作信息。
信息获取部可以从第一图像所包含的多个被摄体中选择并获取第一被摄体。
信息获取部可以根据第一图像所包含的被摄体的颜色成分来获取第一被摄体的信息。
信息获取部可以根据第一图像所包含的被摄体的空间频率来获取第一被摄体的信息。
信息获取部可以获取摄像装置的位置信息,并且根据摄像装置的位置信息来获取第一被摄体的信息。
信息获取部可以根据由摄像装置进行的第二图像的摄像时的摄像模式来获取第一被摄体的信息。
信息获取部可以从多个构图中选择并获取第一构图。
移动平台可以还包含识别部,其用于识别第一被摄体的形状。信息获取部可以根据第一被摄体的形状来获取第一构图的信息。
移动平台可以还包含识别部,其用于识别第二图像被摄像时的场景。信息获取部可以根据场景来获取第一构图的信息。
生成部可以生成与可旋转地支持摄像装置的支持部件的旋转相关的旋转信息作为动作信息。
生成部可以根据第一图像中的第一被摄体的位置和第一构图中的第一被摄体的位置来确定支持部件的旋转量和旋转方向。
生成部可以生成与摄像装置的移动相关的移动信息作为动作信息。
生成部可以根据第一图像中的第一被摄体的大小和第一构图中的第一被摄体的大小来确定沿重力方向的摄像装置的移动量。
生成部可以根据第一图像中的第一被摄体的位置、第一构图中的第一被摄体的位置、以及第一图像中的移动距离与真实空间中的移动距离的对应关系来确定摄像装置的移动量和移动方向。
可以还包含提示部,其用于提示动作信息。
第一图像可以是由摄像装置摄像的图像。
移动平台可以是包含摄像装置以及可旋转地支持摄像装置的支持部件的飞行体,并且还包含控制部,其根据动作信息来控制飞行体的飞行或支持部件的旋转。
移动平台可以是在使用时由用户握持的、包含可旋转地支持摄像装置的支持部件的支持装置,并且还包含控制部,其根据动作信息来控制支持部件的旋转。
移动平台可以是便携式终端,并且还包含通信部,其将动作信息发送到飞行体或支持装置。
在一个方式中,一种飞行体,其包含:摄像装置;支持部件,其可旋转地支持摄像装置;动作信息获取部,其获取由移动平台生成的动作信息;以及控制部,其根据动作信息来控制飞行体的飞行或支持部件的旋转。
在一个方式中,一种支持装置,其包含:支持部件,其可旋转地支持摄像装置;动作信息获取部,其获取由移动平台生成的动作信息;以及控制部,其根据动作信息来控制支持部件的旋转。
在一个方式中,一种摄像辅助方法,其是对由摄像装置进行的第二图像的摄像进行辅助的移动平台中的摄像辅助方法,其具有以下步骤:获取第一图像的步骤;在第一图像所包含的一个以上的被摄体中获取第一被摄体的信息的步骤;在对第二图像中的包括第一被摄体的一个以上的被摄体的位置进行规定的一个以上的构图中获取第一构图的信息的步骤;以及根据第一构图,生成与用于摄像第二图像的摄像装置的动作相关的动作信息的步骤。
获取第一被摄体的信息的步骤可以包括从第一图像所包含的多个被摄体中选择并获取第一被摄体的步骤。
获取第一被摄体的信息的步骤可以包括根据第一图像所包含的被摄体的颜色成分来获取第一被摄体的信息的步骤。
获取第一被摄体的信息的步骤可以包括根据第一图像所包含的被摄体的空间频率来获取第一被摄体的信息的步骤。
摄像辅助方法可以还包括获取摄像装置的位置信息的步骤。获取第一被摄体的信息的步骤可以包括根据摄像装置的位置信息来获取第一被摄体的信息的步骤。
获取第一被摄体的信息的步骤可以包括根据由摄像装置进行的第二图像的摄像时的摄像模式来获取第一被摄体的信息的步骤。
获取第一构图的信息的步骤可以包括从多个构图中选择并获取第一构图的步骤。
摄像辅助方法可以还包括识别第一被摄体的形状的步骤。获取第一构图的信息的步骤可以包括根据第一被摄体的形状来获取第一构图的信息的步骤。
摄像辅助方法可以还包括识别第二图像被摄像时的场景的步骤。获取第一构图的信息的步骤可以包括根据场景来获取第一构图的信息的步骤。
生成动作信息的步骤可以包括生成与可旋转地支持摄像装置的支持部件的旋转相关的旋转信息作为动作信息的步骤。
生成动作信息的步骤可以包括根据第一图像中的第一被摄体的位置和第一构图中的第一被摄体的位置来确定支持部件的旋转量和旋转方向的步骤。
生成动作信息的步骤可以包括生成与摄像装置的移动相关的移动信息作为动作信息的步骤。
生成动作信息的步骤可以包括根据第一图像中的第一被摄体的大小和第一构图中的第一被摄体的大小来确定沿重力方向的摄像装置的移动量的步骤。
生成动作信息的步骤可以包括根据第一图像中的第一被摄体的 位置、第一构图中的第一被摄体的位置、以及第一图像中的移动距离与真实空间中的移动距离的对应关系来确定摄像装置的移动量和移动方向的步骤。
摄像辅助方法可以还包括在提示部提示动作信息的步骤。
第一图像可以是由摄像装置摄像的图像。
移动平台可以是包含摄像装置以及可旋转地支持摄像装置的支持部件的飞行体。摄像辅助方法可以还包括根据动作信息来控制飞行体的飞行或支持部件的旋转的步骤。
移动平台可以是在使用时由用户握持的、包含可旋转地支持摄像装置的支持部件的支持装置。摄像辅助方法可以还包括根据动作信息来控制支持部件的旋转的步骤。
移动平台可以是便携式终端。摄像辅助方法可以还包括将动作信息发送到飞行体或支持装置的步骤。
在一个方式中,一种程序,其是用于使对由摄像装置进行的第二图像的摄像进行辅助的移动平台执行以下步骤的程序:获取第一图像的步骤;获取第一图像所包含的一个以上的被摄体中的第一被摄体的信息的步骤;获取对第二图像中的包括第一被摄体的一个以上的被摄体的位置进行规定的一个以上的构图中的第一构图的信息的步骤;以及根据第一构图,生成与用于摄像第二图像的摄像装置的动作相关的动作信息的步骤。
在一个方式中,一种记录介质,其是记录有用于使对由摄像装置进行的第二图像的摄像进行辅助的移动平台执行以下步骤的程序的计算机可读记录介质:获取第一图像的步骤;获取第一图像所包含的一个以上的被摄体中的第一被摄体的信息的步骤;获取对第二图像中的包括第一被摄体的一个以上的被摄体的位置进行规定的一个以上的构图中的第一构图的信息的步骤;以及根据第一构图,生成与用于 摄像第二图像的摄像装置的动作相关的动作信息的步骤。
另外,上述的发明内容中没有穷举本公开的所有特征。此外,这些特征组的子组合也可以构成发明。
附图说明
图1是示出第一实施方式中的摄像辅助系统的构成示例的示意图。
图2是示出第一实施方式中的无人飞行器的硬件构成的一个示例的框图。
图3是示出第一实施方式中的UAV控制部的功能构成的一个示例的框图。
图4是示出第一实施方式中的便携式终端的硬件构成的一个示例的框图。
图5是用于说明摄像辅助系统的动作概要的图。
图6A是示出实时取景图像的一个示例的图。
图6B是示出通过颜色划分实时取景图像的颜色划分图像的一个示例的图。
图6C是示出主被摄体的选择示例的图。
图7是示出构图的选择示例的图。
图8A是示出用于以所确定的构图进行航拍的摄像范围的旋转示例的图。
图8B是示出用于以所确定的构图进行航拍的无人飞行器的移动示例的图。
图8C是用于对从水平方向观察到的无人飞行器的移动进行说明的图。
图9是示出第一实施方式中的摄像辅助系统的动作示例的流程图。
图10是示出第二实施方式中的摄像辅助系统的构成示例的示意图。
图11是示出第二实施方式中的无人飞行器的硬件构成的一个示例的框图。
图12是示出第二实施方式中的UAV控制部的功能构成的一个示例的框图。
图13是示出第二实施方式中的便携式终端的硬件构成的一个示例的框图。
图14是示出第二实施方式中的终端控制部的功能构成的一个示例的框图。
图15是示出第二实施方式中的摄像辅助系统的动作示例的流程图。
图16是示出第三实施方式中的包括万向支架装置和便携式终端的摄像辅助系统的构成示例的立体图。
图17A是示出第四实施方式中的万向支架装置的构成示例的正面立体图。
图17B是示出第四实施方式中的包括万向支架装置和便携式终端的摄像辅助系统的构成示例的背面立体图。
具体实施方式
以下,通过发明的实施方式来对本公开进行说明,但是以下实施方式并非限制权利要求书所涉及的发明。实施方式中说明的特征的组合并非全部是发明的解决方案所必须的。
权利要求书、说明书、附图以及说明书摘要中包含作为著作权所保护对象的事项。任何人只要如专利局的文档或者记录所表示的那样进行这些文件的复制,著作权人就不会异议。但是,在除此以外的情况下,保留一切的著作权。
在以下的实施方式中,飞行体以无人飞行器(UAV:Unmanned Aerial Vehicle)为例。飞行体包括在空中移动的飞行器。在本说明书的附图中,无人飞行器标记为“UAV”。此外,移动平台以飞行体、便携式终端、万向支架装置、万向支架相机装置为例。另外,移动平台还可以为除此之外的其他装置,例如发送器、PC(Personal Computer,个人电脑)、或其他移动平台。摄像辅助方法规定了移动平台中的动作。记录介质中记录有程序(例如用于使移动平台执行各种处理的程序)。
(第一实施方式)
图1是示出了第一实施方式中的摄像辅助系统10的构成示例的示意图。摄像辅助系统10包含无人飞行器100、发送器50和便携式终端80。无人飞行器100、发送器50和便携式终端80互相之间可以通过有线通信或无线通信(例如无线LAN(Local Area Network,局域网))进行通信。
无人飞行器100可以按照由发送器50进行的远程操作飞行、或者按照预先设定的飞行路径飞行。无人飞行器100确定用于航拍的构图,并生成无人飞行器100的动作信息,以成为所确定的构图。无人飞行器100按照动作信息控制无人飞行器100的动作。发送器50可以通过远程操作指示无人飞行器100的飞行的控制。即,发送器50可以作为遥控器工作。便携式终端80可以与发送器50一起,被预定 了使用无人飞行器100进行航拍的用户所携带。便携式终端80对由无人飞行器100进行的构图的确定进行辅助,并辅助摄像。
图2是示出了无人飞行器100的硬件构成的一个示例的框图。无人飞行器100的构成为包括UAV控制部110、通信接口150、存储器160、万向支架200、旋翼机构210、摄像部220、摄像部230、GPS接收器240、惯性测量装置(IMU:Inertial Measurement Unit)250、磁罗盘260、气压高度计270、超声波传感器280和激光测量仪290。万向支架200是支持部件的一个示例。摄像部220、摄像部230是摄像装置的一个示例。
UAV控制部110例如使用CPU(Central Processing Unit:中央处理单元)、MPU(Micro Processing Unit:微处理单元)或DSP(Digital Signal Processor:数字信号处理器)构成。UAV控制部110执行用于总体控制无人飞行器100的各部分的动作的信号处理、与其他各部分间的数据的输入输出处理、数据的运算处理和数据的存储处理。
UAV控制部110按照存储于存储器160的程序来控制无人飞行器100的飞行。例如,UAV控制部110可以控制无人飞行器100的飞行,以实现与便携式终端80协作确定的构图。UAV控制部110按照通过通信接口150从远程的发送器50接收到的指令来控制无人飞行器100的飞行。存储器160可以从无人飞行器100上拆卸下来。
UAV控制部110获取表示无人飞行器100的位置的位置信息。UAV控制部110可以从GPS接收器240获取表示无人飞行器100所在的纬度、经度和高度的位置信息。UAV控制部110可以分别从GPS接收器240获取表示无人飞行器100所在的纬度以及经度的纬度经度信息、并从气压高度计270获取表示无人飞行器100所在的高度的高度信息,作为位置信息。
UAV控制部110从磁罗盘260获取表示无人飞行器100的朝向的朝向信息。朝向信息表示例如与无人飞行器100的机头的朝向对应 的方位。
UAV控制部110获取表示摄像部220以及摄像部230各自的摄像范围的摄像范围信息。UAV控制部110从摄像部220以及摄像部230获取表示摄像部220以及摄像部230的视角的视角信息,作为用于确定摄像范围的参数。UAV控制部110获取表示摄像部220以及摄像部230的摄像方向的信息,作为用于确定摄像范围的参数。UAV控制部110例如从万向支架200获取表示摄像部220的姿势状态的姿势信息,作为表示摄像部220的摄像方向的信息。UAV控制部110获取表示无人飞行器100的朝向的信息。表示摄像部220的姿势状态的信息表示万向支架200从俯仰轴和偏航轴的基准旋转角度旋转的角度。UAV控制部110获取表示无人飞行器100所在的位置的位置信息,作为用于确定摄像范围的参数。UAV控制部110可以基于摄像部220和摄像部230的视角和摄像方向、以及无人飞行器100所在的位置,通过划定表示摄像部220摄像的地理范围的摄像范围并生成摄像范围信息,来获取摄像范围信息。
UAV控制部110控制万向支架200、旋翼机构210、摄像部220以及摄像部230。UAV控制部110通过变更摄像部220的摄像方向或视角,来控制摄像部220的摄像范围。UAV控制部110通过控制万向支架200的旋转机构,来控制被万向支架200支持的摄像部220的摄像范围。
摄像范围是指由摄像部220或摄像部230摄像的地理范围。摄像范围由纬度、经度和高度定义。摄像范围可以是由纬度、经度和高度定义的三维空间数据的范围。摄像范围基于摄像部220或摄像部230的视角和摄像方向、以及无人飞行器100所在的位置而确定。摄像部220和摄像部230的摄像方向由摄像部220和摄像部230的设置有摄像镜头的正面所朝的方位和俯角来定义。摄像部220的摄像方向是由无人飞行器100的机头的方位和摄像部220相对于万向支架200的姿势状态来确定的方向。摄像部230的摄像方向是由无人飞行器100的 机头的方位和设置摄像部230的位置来确定的方向。
UAV控制部110可以对由摄像部220或摄像部230所摄像的摄像图像(航拍图像),附加与此航拍图像相关的信息作为附加信息(元数据的一个示例)。附加信息包括与航拍时的无人飞行器100的飞行相关的信息(飞行信息)和与航拍时的摄像部220或摄像部230的摄像相关的信息(摄像信息)。飞行信息可以包括航拍位置信息、航拍路径信息以及航拍时间信息中的至少一个。摄像信息可以包括航拍视角信息、航拍方向信息、航拍姿势信息、摄像范围信息以及被摄体距离信息中的至少一个。
航拍位置信息表示航拍航拍图像的位置(航拍位置)。航拍位置信息可以基于GPS接收器240所获取的位置信息。航拍路径信息表示航拍航拍图像的路径(航拍路径)。航拍路径信息可以由连续排列航拍位置的航拍位置的集合构成。航拍时间信息表示航拍航拍图像的时间(航拍时间)。航拍时间信息可以基于UAV控制部110所参照的计时器的时间信息。
航拍视角信息表示航拍航拍图像时的摄像部220或摄像部230的视角信息。航拍方向信息表示航拍航拍图像时的摄像部220或摄像部230的摄像方向(航拍方向)。航拍姿势信息表示航拍航拍图像时的摄像部220或摄像部230的姿势信息。摄像范围信息表示航拍航拍图像时的摄像部220或摄像部230的摄像范围。被摄体距离信息表示从摄像部220或摄像部230到被摄体的距离的信息。被摄体距离信息可以基于超声波传感器280或激光测量仪290测得的检测信息。关于被摄体距离信息,也可以通过摄像包括同一被摄体的多张图像,利用这些图像作为立体图像,计算出到被摄体的距离。此外,摄像信息也可以包括航拍时的无人飞行器100的朝向的信息。
通信接口150与发送器50和便携式终端80进行通信。通信接口150可以将航拍图像发送到便携式终端80。通信接口150可以将航拍 图像以及其附加信息的至少一部分发送到便携式终端80。
通信接口150可以接收用于确定摄像部220或摄像部230要摄像的航拍图像的构图的信息。用于确定要摄像的航拍图像的构图的信息可以包括例如用于选择航拍图像(实时取景图像)中的主被摄体的选择信息、用于选择构图的选择信息。通信接口150可以将由摄像部220或摄像部230摄像的航拍图像(例如实时取景图像等航拍动态图像、航拍静止图像)发送到便携式终端80。通信接口150可以直接与便携式终端80之间进行通信,也可以通过发送器50与便携式终端80之间进行通信。
存储器160存储UAV控制部110对万向支架200、旋翼机构210、摄像部220、摄像部230、GPS接收器240、惯性测量装置250、磁罗盘260、气压高度计270、超声波传感器280以及激光测量仪290进行控制所需的程序等。存储器160可以为计算机可读记录介质,可以包括SRAM(Static Random Access Memory:静态随机存取存储器)、DRAM(Dynamic Random Access Memory:动态随机存取存储器)、EPROM(Erasable Programmable Read Only Memory:可擦除可编程只读存储器)、EEPROM(Electrically Erasable Programmable Read-Only Memory:电可擦除可编程只读存储器)以及USB(Universal Serial Bus:通用串行总线)存储器等闪存中的至少一个。
存储器160保存通过通信接口150获取的各种信息、各种数据。存储器160可以存储用于摄像图像的各种构图的样本信息。构图的样本信息可以以表格形式存储。存储器160可以保存由UAV控制部110确定的构图的信息。存储器160可以保存与用于实现确定的构图下的摄像的无人飞行器100的动作相关的动作信息。无人飞行器100的动作信息可以在航拍时从存储器160读出,无人飞行器100可以根据此动作信息进行动作。
万向支架200可以以偏航轴、俯仰轴以及横滚轴为中心可旋转地 支持摄像部220。万向支架200可以通过使摄像部220以偏航轴、俯仰轴以及横滚轴中的至少一个为中心旋转,从而变更摄像部220的摄像方向。
偏航轴、俯仰轴以及横滚轴可以如下确定。例如,将横滚轴定义为水平方向(与地面平行的方向)。此时,将俯仰轴确定为与地面相平行、并与横滚轴垂直的方向,将偏航轴(参照z轴)确定为与地面垂直、并与横滚轴以及俯仰轴垂直的方向。
摄像部220摄像期望的摄像范围的被摄体并生成摄像图像的数据。通过摄像部220的摄像而得到的图像数据存储于摄像部220所具有的存储器、或存储器160中。
摄像部230摄像无人飞行器100的周边并生成摄像图像的数据。摄像部230的图像数据存储于存储器160中。
GPS接收器240接收表示从多个导航卫星(即GPS卫星)发送的时间以及各GPS卫星的位置(坐标)的多个信号。GPS接收器240根据接收到的多个信号,计算出GPS接收器240的位置(即无人飞行器100的位置)。GPS接收器240将无人飞行器100的位置信息输出到UAV控制部110。另外,可以用UAV控制部110代替GPS接收器240来进行GPS接收器240的位置信息的计算。在此情况下,在UAV控制部110中输入GPS接收器240所接收到的多个信号中包含的表示时间以及各GPS卫星的位置的信息。
惯性测量装置250检测无人飞行器100的姿势,并将检测结果输出到UAV控制部110。惯性测量装置IMU250检测无人飞行器100的前后、左右以及上下的3轴方向的加速度和俯仰轴、横滚轴以及偏航轴的3轴方向的角速度,作为无人飞行器100的姿势。
磁罗盘260检测无人飞行器100的机头的方位,并将检测结果输出到UAV控制部110。
气压高度计270检测无人飞行器100飞行的高度,并将检测结果输出到UAV控制部110。另外,也可以通过气压高度计270以外的传感器检测无人飞行器100飞行的高度。
超声波传感器280发射超声波,检测地面、物体反射的超声波,并将检测结果输出到UAV控制部110。检测结果可以表示从无人飞行器100到地面的距离即高度。检测结果可以表示从无人飞行器100到物体(被摄体)的距离。
激光测量仪290对物体照射激光,接收物体反射的反射光,并通过反射光测量无人飞行器100与物体(被摄体)之间的距离。作为基于激光的距离测量方法的一个示例,可以为飞行时间法。
图3是示出UAV控制部110的功能构成的一个示例的框图。UAV控制部110包含图像获取部111、主被摄体确定部112、构图确定部113、动作信息生成部114和动作控制部115。主被摄体确定部112和构图确定部113是信息获取部的一个示例。构图确定部113是识别部的一个示例。动作信息生成部114是生成部的一个示例。动作控制部115是控制部的一个示例。
图像获取部111可以获取保存于存储器160的图像(例如由摄像部220或摄像部230航拍的航拍图像)。图像获取部111可以获取摄像部220或摄像部230航拍中的航拍图像。航拍图像可以是动态图像,也可以是静止图像。航拍中的航拍动态图像也被称为实时取景图像(第一图像的一个示例)。由图像获取部111获取的航拍图像主要以实时取景图像为例。
主被摄体确定部112在由图像获取部111获取的实时取景图像中包含的一个以上的被摄体中确定(决定)主被摄体(第一被摄体的一个示例)。主被摄体的确定是主被摄体的信息获取的一个示例。主被摄体的确定例如可以由便携式终端80的用户手动进行,也可以由无人飞行器100自动进行。主被摄体也可以是实时取景图像外的被摄体 (例如与用户期望的航拍范围相对应的地图信息中包含的任意的被摄体)。此时,可以超出无人飞行器100的摄像范围,来确定适于期望的被摄体的构图。
构图确定部113确定用于摄像所确定的主被摄体的构图。用于摄像主被摄体的构图的确定是用于摄像主被摄体的构图的信息获取的一个示例。此构图由于是尚未摄像的要摄像的构图,因此也称为预构图(第一构图的一个示例)。构图可以是规定图像中一个以上的被摄体的位置关系的信息。构图确定部113可以参考保存在存储器160中的构图的样本信息,并根据主被摄体的信息来确定预构图。预构图的确定例如可以由便携式终端80的用户手动进行,也可以由无人飞行器100自动进行。
构图的样本信息可以包括例如三分法构图(Rule of Thirds)、二分法构图、三角形构图、对角线构图、字母构图、中心构图、边缘构图、三明治构图、隧道构图中的至少一个构图的信息,作为样本信息。此外,构图的样本信息也可以包括将主被摄体配置在各构图的预定交叉点、分割点(例如黄金分割点)上的信息。
动作信息生成部114生成用于实现按照所确定的预构图进行的航拍的无人飞行器100的动作信息。无人飞行器100的动作信息可以包括例如与无人飞行器100的移动相关的移动信息(例如无人飞行器100的移动量、移动方向)、与无人飞行器100的旋转相关的旋转信息(例如无人飞行器100的旋转量、旋转方向)、与万向支架200的旋转相关的旋转信息(例如万向支架200的旋转量、旋转方向)、以及其他的无人飞行器100的动作信息中的至少一部分。
动作控制部115可以按照所生成的动作信息(例如无人飞行器100的移动量、移动方向)控制无人飞行器100的飞行。动作控制部115可以按照所生成的动作信息(例如无人飞行器100的旋转量、旋转方向)控制无人飞行器100的朝向。这样,动作控制部115可以通 过使无人飞行器100移动来变更摄像部220的摄像范围。
动作控制部115可以按照所生成的动作信息(例如万向支架200的旋转量、旋转方向)控制万向支架200的旋转。动作控制部115也可以按照动作信息控制无人飞行器100的姿势,并且通过万向支架200的旋转控制万向支架200的姿势。这样,动作控制部115可以通过使万向支架200旋转来变更摄像部220的摄像范围。
图4是示出便携式终端80的硬件构成的一个示例的框图。便携式终端80可以具有终端控制部81、接口部82、操作部83、无线通信部85、存储器87以及显示部88。操作部83是信息获取部的一个示例。无线通信部85是通信部的一个示例。
终端控制部81例如可以使用CPU、MPU或DSP构成。终端控制部81执行用于总体控制便携式终端80的各部分的动作的信号处理、与其他各部分之间的数据的输入输出处理、数据的运算处理和数据的存储处理。
终端控制部81可以通过无线通信部85获取来自无人飞行器100的数据、信息。终端控制部81可以通过接口部82获取来自发送器50的数据、信息。终端控制部81可以获取通过操作部83输入的数据、信息。终端控制部81可以获取保存在存储器87中的数据、信息。终端控制部81可以将数据、信息发送到显示部88,将基于此数据、信息的显示信息显示于显示部88。
终端控制部81可以执行摄像辅助应用程序。摄像辅助应用程序可以是进行用于通过无人飞行器100在期望的构图下进行航拍的辅助的应用程序。终端控制部81可以生成应用程序中使用的各种数据。
接口部82进行发送器50与便携式终端80之间的信息、数据的输入输出。接口部82例如可以通过USB电缆进行输入输出。接口部82还可以是USB以外的接口。
操作部83接受并获取由便携式终端80的用户输入的数据、信息。操作部83可以包括按钮、按键、触控显示屏、话筒等。这里主要例示了操作部83和显示部88由触控显示屏构成。在此情况下,操作部83可以接受触控操作、点击操作、拖动操作等。操作部83可以通过接受用于选择主被摄体的选择操作,获取主被摄体的选择信息。操作部83可以通过接受用于选择构图的选择操作,获取构图的选择信息。
无线通信部85通过各种无线通信方式与无人飞行器100之间进行无线通信。此无线通信的无线通信方式例如可以包括通过无线LAN、Bluetooth（注册商标）、或公共无线网络进行的通信。无线通信部85可以将主被摄体的选择信息、构图的选择信息发送到无人飞行器100。
存储器87例如可以具有存储有对便携式终端80的动作进行规定的程序、设定值的数据的ROM、以及暂时保存终端控制部81进行处理时使用的各种信息、数据的RAM。存储器87可以包括ROM和RAM以外的存储器。存储器87可以设置在便携式终端80的内部。存储器87可以设置成可从便携式终端80拆卸下来。程序可以包括应用程序。
显示部88例如使用LCD(Liquid Crystal Display:液晶显示器)构成,显示从终端控制部81输出的各种信息、数据。显示部88可以显示与摄像辅助应用程序的执行相关的各种数据、信息。显示部88可以显示用于选择主被摄体的选择画面、用于选择构图的选择画面。
另外,便携式终端80可以通过支架安装在发送器50上。便携式终端80和发送器50可以通过有线电缆(如USB电缆)连接。也可以不将便携式终端80安装在发送器50上,而是将便携式终端80和发送器50分别独立设置。摄像辅助系统10也可以不包含发送器50。
接着,对摄像辅助系统10的动作概要进行说明。
图5是用于说明摄像辅助系统10的动作概要的图。
在图5中,在山M1中有道路R1,在道路R1上有人H1。无人飞行器100边在山M1上空飞行边进行航拍。无人飞行器100摄像山M1的实时取景图像,并发送到便携式终端80。便携式终端80接收来自无人飞行器100的实时取景图像G1,并将实时取景图像G1显示于显示部88。由此,便携式终端80的用户可以确认实时取景图像G1。
假设用户希望调整构图以便更有吸引力地摄像实时取景图像G1中拍到的被摄体。在这种情况下,便携式终端80通过操作部83接受指示构图调整的操作,并将该构图调整命令发送给无人飞行器100。无人飞行器100在接收到构图调整命令时,确定实时取景图像G1中的主被摄体(例如人H1),确定预构图,生成无人飞行器100的动作信息。
无人飞行器100根据动作信息进行移动等,并通知便携式终端80已经移动到期望位置(移动完成)。当接收到移动完成通知时,便携式终端80例如基于通过操作部83的用户指示,向无人飞行器100发送摄像命令。另外,在接收到移动完成通知时,便携式终端80可以通过无线通信部85获取移动后的无人飞行器100的位置处的实时取景图像。在这种情况下,用户可以通过显示确认移动后的实时取景图像,并且可以判断是否应在该位置进行航拍。当接收到摄像命令时,无人飞行器100根据摄像命令通过摄像部220或230进行航拍,得到航拍图像(第二图像的一个示例)。
由此,无人飞行器100可以获取期望的预构图的航拍图像。在图5中,在实时取景图像中,作为主被摄体的人H1被配置在任意位置上,但是在调整构图所航拍的航拍图像中,作为主被摄体的人H1位于三分法构图中三分线的交点上。这样,无人飞行器100可以进行动作(在这里为移动)来航拍,以便实现期望的构图。
另外,摄像命令也可以不由便携式终端80发送,而是由发送器50发送。在这种情况下,发送器50可以使用通信等与便携式终端80协作,并将发送器50具有的摄像按钮(未图示)的按下的信息发送到无人飞行器100。
接着,对主被摄体的确定示例进行说明。
图6A是示出由无人飞行器100摄像的实时取景图像G1的一个示例的图。图6B是示出通过颜色划分实时取景图像G1的颜色划分图像G2的一个示例的图。图6C是示出使用了颜色划分图像G2的主被摄体的选择示例的图。
实时取景图像G1中包含具有多个颜色成分(例如蓝色、浅蓝色)的海洋和存在具有绿色成分的森林的岛屿。在无人飞行器100中,通信接口150将实时取景图像G1发送到便携式终端80。在便携式终端80中,无线通信部85可以从无人飞行器100接收实时取景图像G1,并且显示部88显示实时取景图像G1。
主被摄体确定部112可以将实时取景图像G1分割为多个图像块(例如16×16块)。主被摄体确定部112可以根据各图像块的颜色成分将实时取景图像G1划分成一个以上的区域,生成颜色划分图像G2。在图6B中,海洋的蓝色部分可以被划分为区域A,海洋的浅蓝色部分可以被划分为区域B,岛屿的绿色部分可以被划分为区域C。在便携式终端80中,无线通信部85可以从无人飞行器100接收颜色划分图像G2,并且显示部88显示颜色划分图像G2。
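As a hedged illustration of the block-wise colour grouping described above, the sketch below divides a live-view frame into a fixed grid of blocks, labels each block by its dominant colour, and treats areas with the same label as regions such as A, B and C. The 16-block grid, the coarse hue buckets and all function names are assumptions made for this example only, not details taken from the embodiment.

```python
# Minimal sketch of colour-based region grouping for a live-view image.
# Assumptions: the image is an RGB numpy array; a fixed 16x16 block grid and a
# very coarse colour labelling are used purely for illustration.
import numpy as np

def label_block(block_rgb):
    """Label a block by its mean colour (coarse buckets standing in for real clustering)."""
    r, g, b = block_rgb.reshape(-1, 3).mean(axis=0)
    if b > r and b > g:
        return "blue-ish"      # e.g. sea
    if g > r and g > b:
        return "green-ish"     # e.g. forest / island
    return "other"

def colour_regions(image, blocks=16):
    """Return a (blocks x blocks) grid of colour labels (the 'colour-divided image')."""
    h, w, _ = image.shape
    bh, bw = h // blocks, w // blocks
    return [[label_block(image[by * bh:(by + 1) * bh, bx * bw:(bx + 1) * bw])
             for bx in range(blocks)] for by in range(blocks)]

# Example: a synthetic 256x256 "blue sea with a green island" frame.
img = np.zeros((256, 256, 3), dtype=np.uint8)
img[..., 2] = 180                      # blue sea everywhere
img[96:160, 96:160] = (40, 160, 60)    # green island in the centre
grid = colour_regions(img)
print(grid[8][8], grid[0][0])          # -> green-ish blue-ish
```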
主被摄体确定部112可以将颜色划分图像G2中的任意一个颜色成分的区域确定为主被摄体。如图6C所示,显示部88可以根据摄像辅助应用程序显示颜色划分图像G2,并且进行用于确定选择哪个区域(这里指表示区域A的ZA、表示区域B的ZB、表示区域C的ZC)作为主被摄体的引导显示。在图6C中,例示了通过操作部83选择区域C作为主被摄体。在这种情况下,在便携式终端80中,无线通信 部85将通过操作部83获得的主被摄体的选择信息发送到无人飞行器100。在无人飞行器100中,通信接口150接收主被摄体的选择信息。主被摄体确定部112基于主被摄体的选择信息确定主被摄体。在这种情况下,主被摄体成为与摄像图像中用户想要的摄像对象对应的像素的集合。
在实时取景图像G1的端部处丢失了被摄体的一部分的信息的情况下,换言之,在被摄体在实时取景图像G1的图像内部与图像外部之间被分割的情况下,主被摄体确定部112也可以插值该被摄体的像素信息。例如,主被摄体确定部112可以基于实时取景图像G1的图像中的被摄体周围的像素信息(例如像素值)来插值被摄体的像素信息。主被摄体确定部112可以将实时取景图像G1的图像中的被摄体周围的像素信息(例如像素值)原样插值,作为被摄体的像素信息。主被摄体确定部112可以收集被摄体周围的多个像素信息,并且基于多个像素信息通过加权或平均来生成新颜色,进行插值以作为被摄体的像素信息。作为插值技术,主被摄体确定部112可以使用最近邻插值法(Nearest neighbor)、双线性插值法(Bilinear)、双三次插值法(Bicubic)等。
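Where a subject is cut off at the frame edge, the paragraph above allows its missing pixel information to be interpolated from the surrounding image. A minimal sketch of that idea, in which a plain copy or average of the border pixels stands in for nearest-neighbour, bilinear or bicubic resampling, could look like the following; all names and the padding width are illustrative.

```python
# Hedged sketch: pad a live-view image so that a subject cut off at the right
# edge gets provisional pixel values copied or averaged from the border pixels.
import numpy as np

def extend_right_edge(image, extra_cols=8, mode="copy"):
    """Estimate pixel values beyond the right border of `image`."""
    if mode == "copy":                                      # nearest-neighbour style
        patch = np.repeat(image[:, -1:, :].astype(np.float32), extra_cols, axis=1)
    else:                                                   # simple average of the last 3 columns
        avg = image[:, -3:, :].astype(np.float32).mean(axis=1, keepdims=True)
        patch = np.repeat(avg, extra_cols, axis=1)
    return np.concatenate([image.astype(np.float32), patch], axis=1).astype(np.uint8)

img = np.tile(np.arange(32, dtype=np.uint8).reshape(1, 32, 1), (32, 1, 3))
print(extend_right_edge(img).shape)   # -> (32, 40, 3)
```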
这样,主被摄体确定部112可以通过通信接口150获取主被摄体的选择信息,并且可以基于该选择信息确定主被摄体。由此,无人飞行器100可以从基于实时取景图像G1的颜色划分图像G2中包含的被摄体中确定用户所期望的主被摄体。因此,无人飞行器100可以进行以用户所期望的主被摄体为基准的构图调整。
主被摄体确定部112也可以不使用被摄体的选择信息来确定主被摄体。此外,也可以确定多个主被摄体。
例如,主被摄体确定部112可以将颜色划分图像G2中的颜色成分的区域中的预定的区域确定为主被摄体。即,主被摄体确定部112可以通过颜色将实时取景图像G1分组,并将预定的颜色组识别为主 被摄体。在这种情况下,主被摄体确定部112例如可以将位于由各颜色成分包围的中心的区域(例如图6B中的岛屿区域、即区域C)确定为主被摄体。由此,无人飞行器100可以将通过被周围区域包围而突出的被摄体作为主被摄体。
此外,主被摄体确定部112可以将由颜色成分划分的区域中具有预定大小以下(例如最小)的区域确定为主被摄体。由此,无人飞行器100可以以实时取景图像G1中比其他区域难辨别的小尺寸区域为基准来调整构图。例如,可以将在山中迷路的人确定为主被摄体。
此外,主被摄体确定部112可以将从操作部83、存储器87获得的预定颜色区域确定为主被摄体。由此,无人飞行器100可以将用户所期望的颜色的被摄体、预先作为主要摄像对象确定的颜色的被摄体确定为主被摄体。
这样,无人飞行器100可以按照例如山、海、人的衣服等的各种颜色粗略地辨别被摄体。因此,无人飞行器100可以将例如特定的颜色成分登记为要注意的被摄体,并且可以根据颜色自动辨别主被摄体。
主被摄体确定部112可以基于空间频率将实时取景图像G1划分成一个以上的区域,并且将预定区域确定为主被摄体。即,主被摄体确定部112可以通过空间频率将实时取景图像G1分组,并将预定的空间频率范围的组识别为主被摄体。
空间频率越高,图像中边缘越多,图像越清晰。另一方面,空间频率越低,图像中边缘越少,图像越模糊。因此,主被摄体确定部112可以将由空间频率划分的区域中空间频率在预定频率以上(例如最高)的区域确定为主被摄体。由此,无人飞行器100可以以比较清晰的区域为基准来调整构图。
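One plausible, non-authoritative reading of the spatial-frequency criterion above is to score each block by the variance of a Laplacian filter response (high variance roughly means many edges, i.e. high spatial frequency) and to take the block with the highest score, or any block above a threshold, as the main-subject candidate. The block size and the Laplacian stand-in are assumptions for illustration; an FFT-based band-energy measure would fit the same ranking logic.

```python
# Sketch: pick the block with the highest "spatial frequency" (approximated here
# by the variance of a Laplacian response) as the main-subject candidate.
import numpy as np

LAPLACIAN = np.array([[0, 1, 0],
                      [1, -4, 1],
                      [0, 1, 0]], dtype=np.float32)

def laplacian_variance(gray_block):
    h, w = gray_block.shape
    out = np.zeros((h - 2, w - 2), dtype=np.float32)
    for dy in range(3):                      # valid 3x3 convolution, written out
        for dx in range(3):
            out += LAPLACIAN[dy, dx] * gray_block[dy:dy + h - 2, dx:dx + w - 2]
    return float(out.var())

def sharpest_block(gray, blocks=8):
    h, w = gray.shape
    bh, bw = h // blocks, w // blocks
    best, best_score = None, -1.0
    for by in range(blocks):
        for bx in range(blocks):
            block = gray[by * bh:(by + 1) * bh, bx * bw:(bx + 1) * bw].astype(np.float32)
            score = laplacian_variance(block)
            if score > best_score:
                best, best_score = (by, bx), score
    return best, best_score

gray = np.random.default_rng(0).normal(128, 2, (128, 128)).astype(np.float32)
gray[48:64, 48:64] += np.random.default_rng(1).normal(0, 40, (16, 16))  # one "busy" block
print(sharpest_block(gray))   # expected to land on the busy block at (3, 3)
```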
此外,主被摄体确定部112可以将从操作部83、存储器87获得 的具有预定空间频率的区域确定为主被摄体。由此,无人飞行器100可以将用户所期望的空间频率的被摄体、预先作为主要摄像对象确定的空间频率的被摄体确定为主被摄体。
主被摄体确定部112可以基于实时取景图像G1被航拍的航拍位置信息,将实时取景图像G1中包含的一个以上的被摄体中的预定的被摄体确定为主被摄体。在这种情况下,主被摄体确定部112可以通过通信接口150获取由外部服务器等存储的地图数据库的地图信息。地图信息可以包括被摄体类别(例如山、河流、海、建筑物)的信息。主被摄体确定部112可以基于被摄体的类别、被摄体的大小来确定主被摄体。
由此,无人飞行器100可以基于无人飞行器100的地理信息、地形信息来调整构图。
此外,主被摄体确定部112可以通过通信接口150从外部服务器等获取与实时取景图像G1中包含的被摄体的航拍图像相关的评价信息。主被摄体确定部112可以将该评价信息在预定基准以上(例如最高评价)的被摄体确定为主被摄体。
由此,无人飞行器100可以以被其他人高度评价的被摄体为基准来调整构图。另外,航拍位置信息可以根据GPS接收器240所获取的信息获得。
此外，主被摄体确定部112可以基于航拍图像被摄像时所设置的摄像模式来确定主被摄体。例如，在设置日落模式的情况下，可以将太阳、太阳下沉的水平线附近、地平线附近确定为主被摄体。
由此,无人飞行器100可以添加考虑被摄体的摄像模式来确定主被摄体。因此,无人飞行器100可以确定适合于根据摄像模式所设置的摄像信息(相机参数)的主被摄体,并且有望可以获得清晰的航拍图像。
主被摄体确定部112可以将例如基于上述任意一种方法的主被摄体的确定信息与实时取景图像G1等的摄像范围的信息一起存储到存储器160中。主被摄体确定部112可以基于存储在存储器160中的主被摄体的确定信息(即,过去具有实绩的主被摄体的确定信息)来确定实时取景图像G1中包含的被摄体中的主被摄体。例如,主被摄体确定部112可以在相同的(例如同一个)摄像范围中,将过去被确定为主被摄体预定次数以上(例如最多)的被摄体、过去被确定为主被摄体预定频度以上(例如最高频度)的被摄体确定为主被摄体。
由此,无人飞行器100可以根据过去的实绩,换言之,根据用户的选择倾向、无人飞行器100的确定倾向,以机器学习的方式来确定主被摄体。无人飞行器100可以将以机器学习的方式确定的主被摄体作为基准来调整构图。
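In its simplest reading, the "machine-learning-like" reuse of past selections described above is frequency counting keyed by the imaging range. The sketch below assumes a string key derived from position and altitude and plain counters; both are illustrative stand-ins for whatever the memory 160 actually stores.

```python
# Sketch: remember which subject was chosen as main subject for each imaging
# range, and reuse the most frequent past choice when the same range recurs.
from collections import Counter, defaultdict

history = defaultdict(Counter)   # imaging-range key -> Counter of chosen subjects

def record_choice(range_key, subject):
    history[range_key][subject] += 1

def suggest_main_subject(range_key, candidates):
    """Return the candidate chosen most often in this range, or None if no history."""
    counts = history.get(range_key)
    if not counts:
        return None
    for subject, _ in counts.most_common():
        if subject in candidates:
            return subject
    return None

record_choice("lat24.3,lon118.1,alt120", "island")
record_choice("lat24.3,lon118.1,alt120", "island")
record_choice("lat24.3,lon118.1,alt120", "sea")
print(suggest_main_subject("lat24.3,lon118.1,alt120", {"sea", "island"}))  # -> island
```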
接着,对构图的确定示例进行说明。
图7是示出构图选择的示例的图。
当主被摄体被确定时,构图确定部113基于主被摄体从存储器160获取要航拍的候选构图的信息。例如,构图确定部113可以将实时取景图像的构图与保存在存储器160中的构图的样本信息进行比较,将与实时取景图像的构图具有预定基准以上的一致度的构图的样本信息确定为一个以上的候选构图。构图确定部113可以根据两个构图中的主被摄体的形状、位置、大小、两个构图中的多个被摄体的形状、位置、大小、位置关系等中的至少一个来判定实时取景图像的构图与构图的样本信息之间的一致度。
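A hedged sketch of the coincidence scoring between the live-view composition and the composition samples: each template prescribes anchor points for the main subject in normalised image coordinates, and a template whose nearest anchor is close to the current subject position scores highly. The template table, the distance-based score and the 0.6 threshold are assumptions for illustration only.

```python
# Sketch: score composition templates by how close the current main-subject
# position already is to a template's prescribed anchor points (normalised
# image coordinates in [0, 1]); templates above the threshold become candidates.
TEMPLATES = {
    "rule_of_thirds": [(1/3, 1/3), (2/3, 1/3), (1/3, 2/3), (2/3, 2/3)],
    "center":         [(0.5, 0.5)],
    "diagonal":       [(0.25, 0.25), (0.5, 0.5), (0.75, 0.75)],
}

def coincidence(subject_xy, anchors):
    """1.0 when the subject sits exactly on an anchor, falling off with distance."""
    sx, sy = subject_xy
    d = min(((sx - ax) ** 2 + (sy - ay) ** 2) ** 0.5 for ax, ay in anchors)
    return max(0.0, 1.0 - d)

def candidate_compositions(subject_xy, threshold=0.6):
    scores = {name: coincidence(subject_xy, a) for name, a in TEMPLATES.items()}
    return sorted((n for n, s in scores.items() if s >= threshold),
                  key=lambda n: -scores[n])

print(candidate_compositions((0.4, 0.35)))   # 'rule_of_thirds' ranks first here
```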
作为一个示例,如图7所示,假设主被摄体确定部112选择具有细长形状的河流作为主被摄体。在这种情况下,构图确定部113可以参考由存储器160保存的构图的样本信息,获取例如适于航拍细长形状区域的对角线构图、三分法构图或其他构图作为候选构图。在对角线构图中,作为主被摄体的河流可以沿着构图中的对角线配置。在三 分法构图中,作为主被摄体,可以将河流沿着分成三部分的分割线配置、或与分割线重叠配置。构图确定部113可以通过通信接口150将所获取的候选构图发送到便携式终端80。在便携式终端80中,无线通信部85可以获取候选构图的信息。如图7所示,显示部88可以显示候选构图的信息。
显示部88可以根据摄像辅助应用程序来显示候选构图,并且进行用于决定选择哪个候选构图(这里是对角线构图、三分法构图)作为预构图的引导显示。在图7中,例示了通过操作部83选择对角线构图作为预构图。在这种情况下,在便携式终端80中,无线通信部85将通过操作部83获得的构图的选择信息发送到无人飞行器100。在无人飞行器100中,通信接口150接收构图的选择信息。构图确定部113基于构图的选择信息确定预构图。
显示在显示部88上的候选构图的图像可以是保存在存储器160中并被发送到便携式终端80的图像(例如作为构图的样本信息的构图的图像)。显示在显示部88上的候选构图的图像也可以是由构图确定部113生成并发送到便携式终端80的图像。在这种情况下,构图确定部113可以基于与主被摄体的形状相对应的构图的信息和实时取景图像G1来生成要显示的候选构图的图像。例如,在实时取景图像G1中作为主被摄体具有河流,并且在其两侧作为被摄体存在山体时,构图确定部113可以例如简化这些被摄体的形状,生成按照候选构图的位置关系配置有各被摄体的图像。此外,也可以代替构图确定部113,由便携式终端80的终端控制部81基于与从无人飞行器100获取的主被摄体的形状相对应的构图的信息和实时取景图像G1来生成候选构图的图像。
这样,构图确定部113可以添加例如主被摄体的形状来确定候选构图。便携式终端80的显示部88可以显示所确定的候选构图,并促使用户进行选择。显示部88可以以静止图像显示候选构图,也可以以动态图像的预览来显示候选构图。构图确定部113可以通过通信接 口150获取构图的选择信息,并基于此选择信息来确定预构图。
由此,无人飞行器100可以确定主被摄体在要航拍的航拍图像中被配置在期望位置上的构图,可以有吸引力地对主被摄体进行航拍。此外,构图确定部113可以自动地确定候选构图,并且可以限定性地从各种构图的样本信息中提示出候选构图。由于构图确定部113基于选择信息来确定预构图,所以可以在预构图的选择中反映出用户的意愿。
另外,构图确定部113也可以不执行候选构图的提示,而是根据主被摄体的形状来确定构图。由此,无人飞行器100可以添加主被摄体的形状,从而使用具有良好平衡性的预构图来进行航拍,可以有吸引力地对主被摄体进行航拍。
构图确定部113可以不使用构图的选择信息而确定构图。
例如,构图确定部113可以通过场景识别算法来识别实时取景图像G1的场景。构图确定部113可以根据场景识别结果确定适于该场景的构图或提示候选构图。例如,在认识到实时取景图像G1是日出(太阳升起)的场景时,构图确定部113可以确定适于对此场景进行摄像的中心构图、二分法构图等构图,或者提示这些候选构图。在场景识别中,例如可以使用深度学习,也可以使用卷积神经网络。
由此,无人飞行器100可以结合实时取景图像G1被航拍的场景,确定能有吸引力地对此场景进行航拍的构图。
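The embodiment leaves the scene-recognition model open (deep learning and convolutional neural networks are named only as options). Purely to show where such a classifier would slot into composition selection, the placeholder below maps a scene label to preferred compositions; the warm-colour heuristic is a stand-in for a trained CNN and is not part of the disclosure.

```python
# Placeholder for scene recognition feeding composition selection.
# A real system would use a trained CNN; the warm-colour heuristic below only
# stands in for it so the control flow can be shown end to end.
import numpy as np

SCENE_TO_COMPOSITIONS = {
    "sunrise": ["center", "bisection"],      # e.g. horizon splitting the frame
    "generic": ["rule_of_thirds"],
}

def classify_scene(image_rgb):
    r = image_rgb[..., 0].mean()
    b = image_rgb[..., 2].mean()
    return "sunrise" if r > 1.5 * b else "generic"

def compositions_for(image_rgb):
    return SCENE_TO_COMPOSITIONS[classify_scene(image_rgb)]

img = np.zeros((64, 64, 3), dtype=np.float32)
img[..., 0] = 200.0    # strongly warm frame
img[..., 2] = 60.0
print(compositions_for(img))   # -> ['center', 'bisection']
```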
构图确定部113可以将例如基于上述任意一种方法的构图的确定信息与实时取景图像G1等的摄像范围的信息一起存储到存储器160中。构图确定部113可以基于存储在存储器160中的构图的确定信息(即,过去具有实绩的构图的确定信息)来确定考虑了主被摄体的构图。例如,构图确定部113可以在相同的(例如同一个)摄像范围中,将过去被使用预定次数以上(例如最多)的构图、过去被使用 预定频度以上(例如最高频度)的构图,确定为要航拍的构图。
由此,无人飞行器100可以根据过去的实绩,换言之,根据用户的选择倾向和无人飞行器100的确定倾向,以机器学习的方式来确定构图。无人飞行器100可以通过以机器学习的方式确定的构图来有吸引力地对主被摄体进行航拍。
此外,构图确定部113可以通过通信接口150从外部服务器等获取与使用了构图的航拍图像相关的评价信息。主被摄体确定部112可以将该评价信息在预定基准以上(例如最高评价)的构图确定为预构图。
由此,无人飞行器100可以将被其他人高度评价的构图确定为预构图。因此,无人飞行器100能够以客观上优选的构图获取配置有被摄体的航拍图像。
接着,对动作信息的生成示例进行说明。
动作信息生成部114确定摄像范围,以便按照确定的构图进行航拍。摄像范围可以由无人飞行器100的位置、无人飞行器100的朝向、摄像部220的朝向、摄像部220或摄像部230的视角等确定。因此,动作信息生成部114可以生成用于从实时取景图像G1被航拍的无人飞行器100的动作状态变更为用于实现所确定的构图的无人飞行器100的动作状态的信息,作为动作信息。
例如,动作信息生成部114可以包括用于从移动前(实时取景图像G1被航拍时)的无人飞行器100的位置向移动后的(用于实现所确定的构图的)无人飞行器100的位置移动的无人飞行器100的移动信息,作为动作信息。
动作信息生成部114可以包括用于从变更前(实时取景图像G1被航拍时)的无人飞行器100的朝向变更为变更后的(用于实现所确定的构图的)无人飞行器100的朝向的无人飞行器100的旋转信息, 作为动作信息。
动作信息生成部114可以包括用于从变更前的万向支架200的旋转状态(例如旋转角度)(相当于摄像部220的朝向)变更为变更后的万向支架200的旋转状态的万向支架200的旋转信息,作为动作信息。
动作信息生成部114可以包括用于从变更前的摄像部220或摄像部230的视角变更为变更后的摄像部220或摄像部230的视角的摄像部220或摄像部230的视角变更信息，作为动作信息。摄像部220或摄像部230的视角可以与摄像部220或摄像部230的变焦倍率对应。即，动作信息生成部114可以包括用于从变更前的摄像部220或摄像部230的变焦倍率变更为变更后的摄像部220或摄像部230的变焦倍率的变焦倍率变更信息，作为动作信息。
图8A是示出用于以所确定的构图进行航拍的摄像范围的旋转示例的图。图8B是示出用于以所确定的构图进行航拍的无人飞行器100的移动示例的图。图8C是用于对从水平方向观察到的无人飞行器100的移动进行说明的图。
在图8A中,示出了作为简化表现了实时取景图像的构图的当前构图C1和预构图C2。在图8A中,与图7相同,在河流RV11的两侧存在山M11、山M12。使预构图C2为对角线构图。
动作信息生成部114将当前构图C1中的被摄体的大小与预构图C2中的被摄体的大小进行比较。动作信息生成部114可以基于当前构图C1中的被摄体的大小与预构图C2中的被摄体的大小来计算无人飞行器100的高度的变化量(即重力方向上的移动量)。例如,在预构图C2中的被摄体的大小是当前构图C1中的被摄体的大小的两倍时,动作信息生成部114可以计算出移动方向和移动量,使无人飞行器100的高度变为1/2。例如,在预构图C2中的被摄体的大小是当前构图C1中的被摄体的大小的1/2时,动作信息生成部114可以 计算出移动方向和移动量,使无人飞行器100的高度变为两倍。另外,当前构图C1中的无人飞行器100的高度信息可以是实时取景图像G1被航拍时的航拍高度,可以由气压高度计270等获取。
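The altitude rule stated above (a subject that should appear twice as large calls for half the height, and vice versa) amounts to scaling the current height by the ratio of the current subject size to the planned subject size, assuming the apparent size varies inversely with height. A minimal sketch of exactly that rule, with illustrative pixel sizes:

```python
# Sketch of the altitude rule: required height = current height x
# (current subject size / planned subject size).
def required_height(current_height_m, current_size_px, planned_size_px):
    return current_height_m * (current_size_px / planned_size_px)

print(required_height(100.0, 120, 240))   # subject should appear 2x larger -> 50.0 m
print(required_height(100.0, 120, 60))    # subject should appear half size -> 200.0 m
print(required_height(100.0, 120, 120))   # size unchanged -> 100.0 m (no move along gravity)
```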
在图8A中,河流RV11的大小、山M11、M12的大小在当前构图C1和预构图C2中是相同的。即,示出了到作为被摄体的河流RV11、山M11、M12的距离(被摄体距离)未被变更。在这种情况下,动作信息生成部114可以判断为无人飞行器100的高度未被改变,并且重力方向上的移动量为值0。
由此,通过当前构图C1与预构图C2的比较,无人飞行器100可以容易地计算出高度的变化量(即重力方向上的移动量)。因此,无人飞行器100不仅可以在二维空间(水平方向)移动,还可以在三维空间移动。
另外,还可以变更摄像部220或摄像部230的变焦倍率,来代替变更无人飞行器100的高度。
动作信息生成部114可以将当前构图C1中的各被摄体的位置关系与预构图C2中的各被摄体的位置关系进行比较。动作信息生成部114可以基于当前构图C1中的各被摄体的位置关系与预构图C2中的各被摄体的位置关系,计算构图的旋转量和旋转方向,即无人飞行器100的旋转量和旋转方向、或者摄像部220或摄像部230的旋转量和旋转方向。这里的旋转方向可以是例如沿着水平方向的方向。另外,可以通过基于数学坐标变换的映射来计算并获取各被摄体的位置关系的信息。
在图8A中,成为预构图C2中的各被摄体相对于当前构图C1中的各被摄体逆时针旋转30度的位置关系。在这种情况下,动作信息生成部114计算出旋转方向为逆时针旋转、并且是以摄像部220或摄像部230的光轴为中心的30度的旋转量。
由此,通过当前构图C1与预构图C2的比较,无人飞行器100可以容易地计算出构图的旋转量。
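One way the "mapping based on a mathematical coordinate transform" mentioned above could be realised is to compare the angle of the subject's position vector relative to the image centre before and after, which directly yields the 30-degree counter-clockwise rotation of the example. The atan2 formulation and the x-right / y-up image frame below are assumptions for illustration.

```python
# Sketch: in-plane rotation (about the optical axis) needed so that a subject at
# `current_xy` ends up at `planned_xy`, both given relative to the image centre
# in an x-right / y-up frame.
import math

def rotation_to_planned(current_xy, planned_xy):
    a_cur = math.atan2(current_xy[1], current_xy[0])
    a_pla = math.atan2(planned_xy[1], planned_xy[0])
    deg = math.degrees(a_pla - a_cur)
    deg = (deg + 180.0) % 360.0 - 180.0          # wrap to [-180, 180)
    direction = "counter-clockwise" if deg > 0 else "clockwise"
    return deg, direction

print(rotation_to_planned((1.0, 0.0),
                          (math.cos(math.radians(30)), math.sin(math.radians(30)))))
# -> (approximately 30.0, 'counter-clockwise'), matching the 30-degree example above
```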
动作信息生成部114可以生成万向支架200的旋转信息。例如动作信息生成部114可以基于摄像部220的视角信息、实时取景图像G1中的便携式终端80的显示部88上的被摄体的画面位置以及预构图中的显示部88上的相同被摄体的画面位置,来计算万向支架200的旋转信息。画面上相同被摄体的移动距离与万向支架200的旋转角度的变化量(旋转量)成正比。
在图8B中,在当前构图C1和预构图C2中,假设同一被摄体M21的画面位置的距离w1(相当于画面上的移动距离)为显示部88的画面的一边w的1/6的长度。此外,假设由摄像部220的视角信息所示的摄像视角为90度。在这种情况下,用于实现预构图C2的万向支架200的旋转角度(角度θ2)为15度。此外,动作信息生成部114可以基于真实空间中的移动方向与显示部88的画面上的移动方向之间的对应关系,导出(例如计算出)万向支架200的旋转方向。另外,真实空间中的移动方向与显示部88的画面上的移动方向可以相反。例如,在对图6A进行航拍期间,当沿重力方向(向下)旋转万向支架200时,航拍图像中的岛屿的位置向图6A中的上方移动。
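The worked example above (an on-screen shift of one sixth of the screen width with a 90-degree angle of view giving a 15-degree gimbal rotation) follows from the stated proportionality between on-screen displacement and rotation angle. A one-line sketch, with illustrative pixel values:

```python
# Sketch of the proportional rule: gimbal rotation = angle of view x
# (on-screen shift of the subject / screen size along the same axis).
def gimbal_rotation_deg(shift_px, screen_px, view_angle_deg):
    return view_angle_deg * (shift_px / screen_px)

print(gimbal_rotation_deg(shift_px=180, screen_px=1080, view_angle_deg=90.0))  # -> 15.0
```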
动作信息生成部114可以生成无人飞行器100的移动信息。例如动作信息生成部114指示动作控制部115使得无人飞行器100飞行预定距离(例如预定的短距离)。无人飞行器100在动作控制部115的控制下飞行预定距离。动作信息生成部114可以与便携式终端80的终端控制部81协作来判定真实空间中的飞行带来的移动距离与显示部88的画面上的移动距离之间的对应关系。
具体地,动作信息生成部114可以通过通信接口150向便携式终端80通知与该飞行有关的预定距离的信息。便携式终端80的终端控制部81可以在该飞行期间的航拍图像中,检测出在显示部88的画面 上的相同被摄体伴随着无人飞行器100移动预定距离而移动的移动距离的信息。终端控制部81可以通过无线通信部85将移动距离信息发送到无人飞行器100。动作信息生成部114可以通过通信接口150接收移动距离的信息。这样,动作信息生成部114可以判定真实空间中的飞行带来的移动距离与画面上的移动距离之间的对应关系,并且将此对应关系的信息预先保存在存储器87等中。例如,存储器160可以预先保存有真实空间中的移动距离为画面上的移动距离的α倍这一信息。
此外,终端控制部81可以将真实空间中的移动方向与画面上的移动方向之间的对应关系的信息也保存于存储器160。另外,例如可以通过GPS接收器240获取真实空间中的移动前后的位置。
在图8B中，由摄像部220摄像的相同被摄体M21从位置p1移动到位置p2。可以基于位置p1与位置p2之间的距离d1（相当于移动距离）和保存于存储器160的对应关系的信息（例如α倍），计算用于实现预构图的无人飞行器100的移动距离。
由此,即使在不包含大量的传感器的情况下,无人飞行器100也可以判定其为实现预构图C2而应该位于的位置,并获取用于实现预构图C2的移动量和移动方向的信息。
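The calibration procedure described above (fly a short, known distance and observe how far the same subject moves on screen) fixes the ratio between real-world and on-screen movement; the on-screen displacement required by the planned composition is then multiplied by that ratio. A minimal sketch with illustrative names and values:

```python
# Sketch of the calibration-based move estimate: one short calibration flight
# yields metres-per-pixel, which converts the planned on-screen shift to metres.
def calibrate_alpha(real_move_m, screen_move_px):
    """Metres of real-world movement per pixel of on-screen movement."""
    return real_move_m / screen_move_px

def required_move_m(alpha, planned_screen_shift_px):
    return alpha * planned_screen_shift_px

alpha = calibrate_alpha(real_move_m=2.0, screen_move_px=50)    # calibration flight
print(required_move_m(alpha, planned_screen_shift_px=300))     # -> 12.0 m
```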
此外,动作信息生成部114可以通过其他方法生成移动信息。图8C是示出移动前后的摄像部220的视角的变化与水平方向上的移动距离之间的关系的一个示例的图。表示图8C中各位置的点示出了从侧方观察的情况。
例如,动作信息生成部114可以获取在实时取景图像G1的航拍时万向支架200相对于重力方向的角度θ1。角度θ1可以基于实时取景图像G1的航拍时的无人飞行器100相对于重力方向的倾斜度和万向支架200相对于无人飞行器100的倾斜度来计算。动作信息生成部114可以基于无人飞行器100的高度和万向支架200的角度θ1,计算 无人飞行器100的水平方向上的位置p11与无人飞行器100的摄像范围的中心部所处的水平方向上的位置p12之间的距离d11。
然后,动作信息生成部114可以基于摄像部220的视角信息、当前构图C1中的被摄体的画面位置以及预构图C2中的显示部88上的相同被摄体的画面位置,来计算万向支架200的旋转信息(例如相当于角度θ2的旋转角度)。此时,动作信息生成部114可以参考保存于存储器160中的真实空间中的飞行带来的移动距离与画面上的移动距离之间的对应关系的信息。动作信息生成部114可以基于无人飞行器100的高度h和相对于重力方向的万向支架200的旋转后的角度(θ1+θ2),来计算无人飞行器100的水平方向上的位置p11与旋转后(预构图C2实现时)的无人飞行器100的摄像范围的中心部所处的水平方向上的位置p13之间的距离d1+d2。因此,动作信息生成部114可以计算出旋转前(实时取景图像G1航拍时)的无人飞行器100的摄像范围的中心部所处的水平方向上的位置p12与旋转后(预构图C2实现时)的无人飞行器100的摄像范围的中心部所处的水平方向上的位置p13之间的差、即与角度θ2对应的移动距离d12。
由此,即使在没有事先获得真实空间中的移动距离与画面上的移动距离之间的对应关系的信息、此对应关系未知的情况下,无人飞行器100也可以计算出用于实现预构图的移动量和移动方向。
动作信息生成部114通过上述方法计算出在水平方向上的一个轴方向(例如图8C所示的x方向)上的移动距离d12。同样,动作信息生成部114可以计算出在与水平方向上的一个轴正交的另一方向(例如y方向)上的移动距离d22(未图示)。动作信息生成部114可以通过合成移动距离d12和移动距离d22来计算水平方向(xy方向)上的移动距离。此外,动作信息生成部114可以通过合成x方向分量和y方向分量的移动距离来计算移动方向。
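Assuming the gimbal angle is measured from the gravity (straight-down) direction as in the explanation above, the centre of the imaging range lies height × tan(angle) ahead of the aircraft, so the move needed along one horizontal axis is the difference between the offsets after and before the rotation, and the two horizontal axes can then be combined into a single move. A sketch under that assumption, with illustrative angles:

```python
# Sketch of the figure-8C geometry: per-axis horizontal move =
# h * tan(theta1 + theta2) - h * tan(theta1); the x and y components are then
# combined into one move vector.
import math

def axis_move_m(height_m, theta1_deg, theta2_deg):
    before = height_m * math.tan(math.radians(theta1_deg))             # d11
    after = height_m * math.tan(math.radians(theta1_deg + theta2_deg))
    return after - before                                              # d12

dx = axis_move_m(height_m=100.0, theta1_deg=30.0, theta2_deg=15.0)     # one horizontal axis
dy = axis_move_m(height_m=100.0, theta1_deg=10.0, theta2_deg=5.0)      # the orthogonal axis
print(round(dx, 1), round(dy, 1), round(math.hypot(dx, dy), 1))        # combined move length
```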
这样,无人飞行器100可以通过生成动作信息来掌握由当前构图 C1变为预构图C2所需的无人飞行器100的动作。此外,无人飞行器100可以通过生成万向支架200的旋转信息作为动作信息,而通过调整万向支架200来变更摄像范围,提供用于成为期望的预构图C2的动作信息。在这种情况下,无人飞行器100可以通过万向支架200的旋转来变更摄像范围,因此不需要移动无人飞行器100,可以节省无人飞行器100飞行所需的电力消耗,实现低成本。
此外,无人飞行器100可以基于当前构图C1和预构图C2中的被摄体的位置,通过比较简单的运算,计算出万向支架200的旋转角度。因此,无人飞行器100可以将运算处理负荷抑制得较低。
此外,无人飞行器100可以通过生成无人飞行器100的移动信息作为动作信息,而通过无人飞行器100的地理位置调整来变更摄像范围,提供用于成为期望的预构图C2的动作信息。即,可以通过无人飞行器100在空间上的移动来变更摄像范围,实现适当的预构图C2。
接着,对摄像辅助系统10的动作示例进行说明。
图9是示出摄像辅助系统10的动作示例的流程图。
首先,在无人飞行器100中,摄像部220摄像航拍图像(S101)。通信接口150将所摄像的航拍图像(例如实时取景图像)发送到便携式终端80(S102)。这里,作为一个示例,将S101的航拍图像作为实时取景图像进行说明。
在便携式终端80中,无线通信部85接收实时取景图像(S151)。显示部88可以显示例如实时取景图像。用户可以在确认实时取景图像的显示并希望调整构图时,通过操作部83进行调整构图的操作。当通过操作部83接受构图的调整操作时,终端控制部81启动摄像辅助应用程序,通过无线通信部85向无人飞行器100发送构图调整命令(S152)。
在无人飞行器100中,当通过通信接口150获取构图调整命令时 (S103),UAV控制部110启动摄像辅助应用程序。主被摄体确定部112确定实时取景图像中的主被摄体(S104)。在S104中,主被摄体确定部112可以通过通信接口150向便携式终端80提供用于选择主被摄体的画面信息。在便携式终端80中,无线通信部85可以从无人飞行器100接收用于选择主被摄体的画面信息,操作部83接受主被摄体的选择操作,并且无线通信部85将基于主被摄体的选择操作的选择信息发送到无人飞行器100(S153)。主被摄体确定部112可以通过通信接口150获取主被摄体的选择信息,并且基于主被摄体的选择信息确定主被摄体。
构图确定部113基于所确定的主被摄体来确定构图（S105）。在S105中，构图确定部113可以通过通信接口150向便携式终端80提供用于选择构图的画面信息。在便携式终端80中，无线通信部85从无人飞行器100接收用于选择构图的画面信息，操作部83接受构图的选择操作，并且无线通信部85将基于构图的选择操作的选择信息发送到无人飞行器100（S154）。构图确定部113可以通过通信接口150获取构图的选择信息，并且基于构图的选择信息来确定构图。
动作信息生成部114基于所确定的构图,生成(例如计算出)无人飞行器100的动作信息(例如无人飞行器100的移动方向和移动量的信息)(S106)。这里的无人飞行器100的移动方向和移动量例如可以是S101中的实时取景图像从航拍位置向用于实现预构图的航拍位置的移动方向和移动量。
动作控制部115进行向基于计算出的移动方向和移动量的目的地的飞行控制,并移动(S107)。此目的地是从移动前的实时取景图像的位置开始移动了移动方向和移动量的量的位置。动作控制部115判定向目的地的移动是否完成(S108)。在向目的地的移动尚未完成的情况下,转至S107。
在向目的地的移动已完成的情况下,通信接口150向便携式终端 80发送移动完成通知(S109)。在便携式终端80中,无线通信部85接收来自无人飞行器100的移动完成通知(S155)。操作部83可以接受用于由移动后的无人飞行器100所包含的摄像部220或摄像部230进行摄像的操作。当通过操作部83接受摄像操作时,终端控制部81通过无线通信部85向无人飞行器100发送摄像命令(S156)。
在无人飞行器100中,通信接口150接收来自便携式终端80的摄像命令(S110)。摄像部220或摄像部230按照摄像命令摄像航拍图像(S111)。此航拍图像是在无人飞行器100为实现预构图而移动到的位置上航拍的图像,是遵照预构图的图像。
另外,也可以省略摄像命令的发送和接收。在这种情况下,无人飞行器100可以在基于动作信息的移动完成时或完成后,在移动后的位置上进行航拍。
另外,S101至S106的处理可以在无人飞行器100不移动的期间(不移动并且位于预定位置时)进行,也可以在无人飞行器100的移动期间进行。
另外,动作信息也可以是万向支架200的旋转信息,而不是无人飞行器100的移动信息。在这种情况下,在S106中,动作信息生成部114基于所确定的构图,生成(例如计算出)无人飞行器100的动作信息(例如万向支架200的旋转方向和旋转量的信息)。这里的万向支架200的旋转方向和旋转量例如可以是S101中的实时取景图像用于从航拍位置实现预构图的万向支架200的旋转方向和旋转量。在S107中,动作控制部115进行向基于所计算出的旋转方向和旋转量的目标旋转位置的旋转控制,并使万向支架200旋转。此目标旋转位置是从移动前的实时取景图像的摄像时的万向支架200的角度开始移动了旋转方向和旋转量的量的位置。在S108中,动作控制部115判定向目标旋转位置的旋转是否完成。在向目标旋转位置的旋转尚未完成的情况下,转至S107,动作控制部115继续旋转动作。
另外，动作控制部115可以进行万向支架200的旋转控制和无人飞行器100的飞行控制中的任一个，也可以进行两者。另外，由于由万向支架200的旋转引起的摄像范围的变更不伴随无人飞行器100的移动，所以摄像范围的变更程度较小。另一方面，由于由无人飞行器100的移动引起的摄像范围的变更伴随无人飞行器100的移动，所以摄像范围的变更程度较大。因此，当在万向支架200的旋转控制之后进行无人飞行器100的飞行控制时，即使在通过万向支架200的旋转不能实现期望的构图的情况下，也可以通过无人飞行器100的飞行控制来辅助实现期望的构图。即，无人飞行器100能够通过万向支架200的旋转控制来节省能量，同时通过无人飞行器100的飞行控制可靠地实现期望的构图。
这样,根据无人飞行器100和摄像辅助系统10,可以添加期望的被摄体来确定用于有吸引力地对该期望的被摄体进行航拍的构图。即,无人飞行器100和摄像辅助系统10不仅可以将期望的被摄体收入到摄像图像内,还可以考虑改善摄像图像的构图来辅助图像的摄像。因此,即使在用户对照片摄像没有足够的专门技术的情况下,也可以通过无人飞行器100辅助构图的确定,并且可以辅助期望的被摄体的航拍。此外,无人飞行器100可以进行与构图相匹配的动作(例如无人飞行器100的移动、万向支架200的旋转角度的调整),因此可以将预构图用于将来的航拍。
此外,无人飞行器100可以通过进行与摄像辅助相关的主被摄体的确定、构图的确定以及动作信息的生成,来快速地实施基于该动作信息的无人飞行器100的动作。此外,无人飞行器100可以通过进行与摄像辅助相关的主被摄体的确定、构图的确定以及动作信息的生成,来降低便携式终端80的处理负荷,还可以实现与便携式终端80之间的通信负荷的降低。因此,便携式终端80可以在降低便携式终端80自身的处理负荷的同时,与无人飞行器100协作对与摄像辅助相关的处理作出贡献。
此外,无人飞行器100可以通过基于S101和S102的实时取景图像进行主被摄体的确定和构图的确定等,来生成以实时取景图像中包含的期望的被摄体为基准的预构图。因此,无人飞行器100可以在进行一系列摄像的流程中,以期望的构图对期望的被摄体进行航拍。此外,无人飞行器100可以以适于在正式摄像前的临时摄像中映入的被摄体中包含的主被摄体的构图进行正式摄像。
另外,显示部88也可以显示动作信息。与动作信息相关的显示信息可以是“请向东移动10米”、“请在重力方向上将万向支架200旋转20度”等。也可以由其他的提示部提示动作信息,来代替由显示部88进行的动作信息的显示。例如,也可以由声音输出部(未图示)输出与动作信息相关的声音信息,还可以由振动部(未图示)进行表示动作信息的振动。
通过无人飞行器100提示动作信息,用户可以确认动作信息的内容。因此,通过确认了动作信息的用户操作发送器50,可以从发送器50向无人飞行器100发送动作指示。无人飞行器100也可以通过通信接口150获取动作指示,使无人飞行器100移动到用于实现预构图的航拍位置上。在这种情况下,无人飞行器100即使不具有动作控制部115的功能,也可以实施用于实现预构图的无人飞行器100的动作。
(第二实施方式)
在第一实施方式中,例示了无人飞行器进行主被摄体的确定、构图的确定和动作信息的生成。在第二实施方式中,例示便携式终端进行主被摄体的确定、构图的确定和动作信息的生成。另外,在第二实施方式中,对于与第一实施方式相同的构成、动作,省略或简化其说明。
图10是示出第二实施方式中的摄像辅助系统10A的构成示例的示意图。摄像辅助系统10A包含无人飞行器100A、发送器50和便 携式终端80A。无人飞行器100A、发送器50和便携式终端80A互相之间可以通过有线通信或无线通信(例如无线LAN(Local Area Network,局域网))进行通信。
便携式终端80A确定无人飞行器100A进行航拍的构图,并生成无人飞行器100A的动作信息,以成为所确定的构图。无人飞行器100A按照动作信息控制无人飞行器100A的动作。便携式终端80A可以与发送器50一起,被预定了使用无人飞行器100A进行航拍的用户所携带。便携式终端80A对由无人飞行器100A进行的航拍进行辅助。
图11是示出无人飞行器100A的硬件构成的一个示例的框图。无人飞行器100A与第一实施方式中的无人飞行器100相比,代替UAV控制部110而具有UAV控制部110A。另外,在图11的无人飞行器100A中,对与图2所示的无人飞行器100的构成相同的构成赋予相同的符号,并省略或简化其说明。另外,存储器160也可以不保存与摄像辅助相关的信息(例如构图的样本信息、真实空间中的距离与画面上的距离之间的对应关系的信息)。
图12是示出UAV控制部110A的功能构成的一个示例的框图。UAV控制部110A包含动作控制部115和动作信息获取部116。另外,在图12的UAV控制部110A中,对与图3所示的UAV控制部110的构成相同的构成赋予相同的符号,并省略或简化其说明。
动作信息获取部116例如通过通信接口150从便携式终端80A获取无人飞行器100A的动作信息。动作控制部115按照所获取的动作信息控制无人飞行器100A的动作。无人飞行器100A的动作控制的内容可以与第一实施方式中的相同。
图13是示出便携式终端80A的硬件构成的一个示例的框图。便携式终端80A与第一实施方式中的便携式终端80相比,代替终端控制部81而具有终端控制部81A。另外,在图13的便携式终端80A 中,对与图4所示的便携式终端80的构成相同的构成赋予相同的符号,并省略或简化其说明。另外,与第一实施方式中的无人飞行器100所包含的存储器160相同,存储器87可以保存有与摄像辅助相关的信息(例如构图的样本信息、真实空间中的距离与画面上的距离之间的对应关系的信息、与机器学习相关的信息)。
图14是示出终端控制部81A的功能构成的一个示例的框图。终端控制部81A包含图像获取部811、主被摄体确定部812、构图确定部813和动作信息生成部814。主被摄体确定部812和构图确定部813是信息获取部的一个示例。动作信息生成部814是生成部的一个示例。
图像获取部811可以获取保存于存储器87的图像(例如由无人飞行器100A的摄像部220或摄像部230航拍的航拍图像)。图像获取部811例如可以通过通信接口150获取摄像部220或摄像部230航拍中的航拍图像。航拍图像可以是动态图像,也可以是静止图像。航拍中的航拍动态图像也被称为实时取景图像。由图像获取部811获取的航拍图像主要以实时取景图像为例。
主被摄体确定部812在由图像获取部811获取的实时取景图像中包含的一个以上的被摄体中确定(决定)主被摄体。主被摄体的确定是主被摄体的信息获取的一个示例。主被摄体确定部812的主被摄体确定方法可以与第一实施方式中的无人飞行器100所包含的主被摄体确定部112的主被摄体确定方法相同。
构图确定部813确定用于摄像所确定的主被摄体的构图。用于摄像主被摄体的构图的确定是用于摄像主被摄体的构图的信息获取的一个示例。构图确定部813的构图确定方法可以与第一实施方式中的无人飞行器100所包含的构图确定部113的构图确定方法相同。
动作信息生成部814生成用于实现按照所确定的构图进行的航拍的无人飞行器100A的动作信息。动作信息生成部814的动作信息生成方法可以与第一实施方式中的无人飞行器100所包含的动作信息生成部114的动作信息生成方法相同。生成的动作信息例如可以由无线通信部85发送到无人飞行器100A。
接着,对摄像辅助系统10A的动作示例进行说明。
图15是示出摄像辅助系统10A的动作示例的流程图。
首先,无人飞行器100A执行S101和S102的处理。便携式终端80A执行S151的处理。
在便携式终端80A中,显示部88可以显示例如实时取景图像。用户可以在确认实时取景图像的显示并希望调整构图时,通过操作部83进行用于调整构图的操作。此操作是构图调整开始指示的一个示例。当通过操作部83接受构图的调整操作时,终端控制部81启动摄像辅助应用程序(S161)。
主被摄体确定部812确定实时取景图像中的主被摄体(S162)。在S162中,主被摄体确定部812可以使显示部88显示用于选择主被摄体的选择画面。操作部83可以通过接受主被摄体的选择操作,获取主被摄体的选择信息。主被摄体确定部812可以基于此主被摄体的选择信息确定主被摄体。
构图确定部813基于所确定的主被摄体来确定构图(S163)。在S163中,构图确定部813可以使显示部88显示用于选择预构图的选择画面。操作部83可以通过接受构图的选择操作,获取预构图的选择信息。构图确定部813可以基于此构图的选择信息确定构图。
动作信息生成部814基于所确定的构图生成无人飞行器100A的动作信息(S164)。例如无线通信部85将所生成的无人飞行器100A的动作信息发送到无人飞行器100A(S165)。
在无人飞行器100A中,通信接口150接收来自便携式终端80A 的无人飞行器100A的动作信息(S121)。
然后,无人飞行器100A实施S107至S111的处理,便携式终端80A实施S155和S156的处理。
这样,根据便携式终端80A和摄像辅助系统10A,可以添加期望的被摄体来确定用于有吸引力地对该期望的被摄体进行航拍的构图。即,便携式终端80A和摄像辅助系统10A不仅可以将期望的被摄体收入到摄像图像内,还可以考虑改善摄像图像的构图来辅助图像的摄像。因此,即使在用户对照片摄像没有足够的专门技术的情况下,也可以通过便携式终端80A辅助构图的确定,并且可以辅助期望的被摄体的航拍。此外,无人飞行器100A可以进行与构图相匹配的动作(例如无人飞行器100A的移动、万向支架200的旋转角度的调整),因此可以将预构图用于将来的航拍。
此外，便携式终端80A可以通过进行与摄像辅助相关的主被摄体的确定、构图的确定以及动作信息的生成，来降低无人飞行器100A的处理负荷，可以将无人飞行器100A集中到航拍图像的处理、飞行控制等处理中。此外，由于无人飞行器100A可以按照由作为其他装置的便携式终端80A等生成的动作信息来动作，所以即使降低无人飞行器100A的处理负荷，也可以实施用于实现期望的构图的期望的动作。
此外,便携式终端80A之外的信息处理装置(例如发送器50、PC、其他的信息处理装置)也可以具有便携式终端80A所具有的摄像辅助功能(例如主被摄体确定功能、构图确定功能、动作信息生成功能)。
(第三实施方式)
在第一、第二实施方式中,例示了对无人飞行器的航拍进行辅助。在第三实施方式中,例示对安装于万向支架装置的摄像装置的摄像进 行辅助。另外,在第三实施方式中,对于与第一、第二实施方式相同的构成、动作,省略或简化其说明。
图16是示出第三实施方式中的摄像辅助系统10B的构成示例的立体图。摄像辅助系统10B包含万向支架装置300和便携式终端80B。万向支架装置300和便携式终端80B互相之间可以通过有线通信(例如USB通信)或无线通信(例如无线LAN、Bluetooth(注册商标)、短距离通信、公共无线线路)进行通信。万向支架装置300是支持装置的一个示例。
便携式终端80B可以确定用于由安装于万向支架装置300的便携式终端80B所包含的摄像部820进行摄像的构图,并且可以生成万向支架装置300的动作信息,以成为所确定的构图。或者,万向支架装置300可以确定用于由安装于万向支架装置300的便携式终端80B所包含的摄像部820进行摄像的构图,并且可以生成万向支架装置300的动作信息,以成为所确定的构图。万向支架装置300按照动作信息控制万向支架装置300的动作。万向支架装置300可以被预定了使用便携式终端80B进行摄像的用户所携带。便携式终端80B或万向支架装置300对安装于万向支架装置300的便携式终端80B的摄像进行辅助。另外,摄像部820是摄像装置的一个示例。
如图16所示,万向支架装置300包含万向支架310、安装部315和握持部330。安装部315将便携式终端80B安装到万向支架装置300上,并固定便携式终端80B相对于万向支架装置300的位置、朝向。
万向支架310可以以偏航轴、俯仰轴以及横滚轴为中心可旋转地支持便携式终端80B。万向支架310可以通过使便携式终端80B以偏航轴、俯仰轴以及横滚轴中的至少一个为中心旋转,从而变更便携式终端80B所包含的摄像部820的摄像方向。由于便携式终端80B中的摄像部820的位置是固定的,因此可以说便携式终端80B的旋转对应于摄像部820的旋转。
在使用时(摄像时),握持部330可以由用户握持。图16所示的握持部330是一个示例,也可以是与图16不同的握持部330的形状、握持部330相对于万向支架装置300的位置以及握持部330相对于万向支架装置300的大小。另外,在图16中,表示万向支架310的虚线在便携式终端80B附近断开,这表示万向支架310位于比便携式终端80B更靠后侧。
接着,对万向支架装置300和便携式终端80B的构成示例进行说明。
尽管未图示,但是万向支架装置300具有无人飞行器100或无人飞行器100A的硬件构成的至少一部分。尽管未图示,但是万向支架装置300具有无人飞行器100或无人飞行器100A的功能构成的至少一部分。
尽管未图示,但是便携式终端80B可以与便携式终端80或便携式终端80A的硬件构成相同。尽管未图示,但是便携式终端80B可以与便携式终端80或便携式终端80A的功能构成相同。
在摄像辅助系统10B中,在便携式终端80B具有第一实施方式的便携式终端80的功能时,万向支架装置300可以具有第一实施方式的无人飞行器100的功能。即,万向支架装置300可以具有由摄像部820摄像的图像(例如实时取景图像)中的主被摄体的确定功能、构图的确定功能和万向支架装置300的动作信息的生成功能。此外,万向支架装置300可以具有基于动作信息来控制万向支架装置300的动作的功能。
在摄像辅助系统10B中,在便携式终端80B具有第二实施方式的便携式终端80A的功能时,万向支架装置300可以具有第二实施方式的无人飞行器100A的功能。即,便携式终端80B可以具有由摄像部820摄像的图像(例如实时取景图像)中的主被摄体的确定功能、构图的确定功能和万向支架装置300的动作信息的生成功能。万向支 架装置300可以具有万向支架装置300的动作信息的获取功能。此外,万向支架装置300可以具有基于动作信息来控制万向支架装置300的动作的功能。
另外,与无人飞行器100、100A不同,万向支架装置300不考虑飞行。因此,动作信息可以是万向支架310的旋转信息。因此,万向支架装置300可以基于动作信息来控制万向支架310的旋转。
这样,根据万向支架装置300、便携式终端80B和摄像辅助系统10B,可以添加期望的被摄体来确定用于有吸引力地对该期望的被摄体进行航拍的构图。即,万向支架装置300、便携式终端80B和摄像辅助系统10B不仅可以将期望的被摄体收入到摄像图像内,还可以考虑改善摄像图像的构图来辅助图像的摄像。因此,即使在用户对照片摄像没有足够的专门技术的情况下,也可以通过万向支架装置300或便携式终端80B辅助构图的确定,并且可以辅助期望的被摄体的摄像。此外,万向支架装置300可以进行与构图相匹配的动作(例如万向支架310的旋转角度的调整),因此可以将预构图用于将来的摄像。
此外,万向支架装置300可以通过进行与摄像辅助相关的主被摄体的确定、构图的确定以及动作信息的生成,来快速地实施基于该动作信息的万向支架装置300的动作。此外,万向支架装置300可以通过进行与摄像辅助相关的主被摄体的确定、构图的确定以及动作信息的生成,来降低便携式终端80B的处理负荷,还可以实现与便携式终端80B之间的通信负荷的降低。因此,便携式终端80B可以在降低便携式终端80B自身的处理负荷的同时,与万向支架装置300协作对与摄像辅助相关的处理作出贡献。此外,万向支架装置300可以将由摄像辅助生成的动作信息用于万向支架310的旋转控制。即,不仅为了第一、第二实施方式那样的无人飞行器100的航拍,还可以为了使用万向支架装置300的摄像,而实施基于摄像辅助的摄像支援。
此外,便携式终端80B可以通过进行与摄像辅助相关的主被摄体 的确定、构图的确定以及动作信息的生成,来降低万向支架装置300的处理负荷,可以将万向支架装置300集中到摄像图像的处理等中。此外,由于万向支架装置300可以按照由作为其他装置的便携式终端80B等生成的动作信息来动作,所以即使降低万向支架装置300的处理负荷,也可以实施用于实现期望的构图的期望的动作。
(第四实施方式)
在第三实施方式中,例示了对安装于万向支架装置的摄像装置的摄像进行辅助。在第四实施方式中,例示对万向支架装置所具有的摄像部的摄像进行辅助。另外,在第四实施方式中,对于与第一至第三实施方式相同的构成、动作,省略或简化其说明。
图17A是示出第四实施方式中的万向支架装置300C的构成示例的正面立体图。图17B是示出第四实施方式中的摄像辅助系统10C的构成示例的背面立体图。摄像辅助系统10C包含万向支架装置300C和便携式终端80C。万向支架装置300C和便携式终端80C互相之间可以通过有线通信(例如USB通信)或无线通信(例如无线LAN、Bluetooth(注册商标)、短距离通信、公共无线线路)进行通信。万向支架装置300C是支持装置的一个示例。
便携式终端80C可以确定用于由内设于万向支架装置300C的摄像部320进行摄像的构图,并且可以生成万向支架装置300C的动作信息,以成为所确定的构图。或者,万向支架装置300C可以确定用于由摄像部320进行摄像的构图,并且可以生成万向支架装置300C的动作信息,以成为所确定的构图。万向支架装置300C按照动作信息控制万向支架装置300C的动作。万向支架装置300C可以被预定了使用万向支架装置300C进行摄像的用户所携带。便携式终端80C或万向支架装置300C对万向支架装置300C的摄像进行辅助。
如图17A及图17B所示,万向支架装置300C包含万向支架310C、摄像部320和握持部330。摄像部320是摄像装置的一个示例。在图 17A及图17B的万向支架装置300C和摄像辅助系统10C中,对与图16所示的万向支架装置300和摄像辅助系统10B相同的构成赋予相同的符号,并省略或简化其说明。
万向支架310C可以以偏航轴、俯仰轴以及横滚轴为中心可旋转地支持摄像部320。万向支架310C可以通过使摄像部320以偏航轴、俯仰轴以及横滚轴中的至少一个为中心旋转,从而变更摄像部320的摄像方向。例如摄像部320可以在纸张表面上对进深方向进行摄像。摄像部320可以变更摄像方向。握持部330可以由例如用户的手HD1握持。
接着，对万向支架装置300C和便携式终端80C的构成示例进行说明。
尽管未图示,但是万向支架装置300C具有无人飞行器100或无人飞行器100A的硬件构成的至少一部分。尽管未图示,但是万向支架装置300C具有无人飞行器100或无人飞行器100A的功能构成的至少一部分。
尽管未图示,但是便携式终端80C可以与便携式终端80或便携式终端80A的硬件构成相同。尽管未图示,但是便携式终端80C可以与便携式终端80或便携式终端80A的功能构成相同。
在摄像辅助系统10C中,在便携式终端80C具有第一实施方式的便携式终端80的功能时,万向支架装置300C可以具有第一实施方式的无人飞行器100的功能。即,万向支架装置300C可以具有由摄像部320摄像的图像(例如实时取景图像)中的主被摄体的确定功能、构图的确定功能和万向支架装置300C的动作信息的生成功能。此外,万向支架装置300C可以具有基于动作信息来控制万向支架装置300C的动作的功能。
在摄像辅助系统10C中,在便携式终端80C具有第二实施方式 的便携式终端80A的功能时,万向支架装置300C可以具有第二实施方式的无人飞行器100A的功能。即,便携式终端80C可以具有由摄像部320摄像的图像(例如实时取景图像)中的主被摄体的确定功能、构图的确定功能和万向支架装置300C的动作信息的生成功能。万向支架装置300C可以具有万向支架装置300C的动作信息的获取功能。此外,万向支架装置300C可以具有基于动作信息来控制万向支架装置300C的动作的功能。
另外,与无人飞行器100、100A不同,万向支架装置300C不考虑飞行。因此,动作信息可以是万向支架310C的旋转信息。因此,万向支架装置300C可以基于动作信息来控制万向支架310C的旋转。
这样,根据万向支架装置300C、便携式终端80C和摄像辅助系统10C,可以添加期望的被摄体来确定用于有吸引力地对该期望的被摄体进行航拍的构图。即,万向支架装置300C、便携式终端80C和摄像辅助系统10C不仅可以将期望的被摄体收入到摄像图像内,还可以考虑改善摄像图像的构图来辅助图像的摄像。因此,即使在用户对照片摄像没有足够的专门技术的情况下,也可以通过万向支架装置300C或便携式终端80C辅助构图的确定,并且可以辅助期望的被摄体的摄像。此外,万向支架装置300C可以进行与构图相匹配的动作(例如万向支架310C的旋转角度的调整),因此可以将预构图用于将来的摄像。
此外,万向支架装置300C可以通过进行与摄像辅助相关的主被摄体的确定、构图的确定以及动作信息的生成,来快速地实施基于该动作信息的万向支架装置300C的动作。此外,万向支架装置300C可以通过进行与摄像辅助相关的主被摄体的确定、构图的确定以及动作信息的生成,来降低便携式终端80C的处理负荷,还可以实现与便携式终端80C之间的通信负荷的降低。因此,便携式终端80C可以在降低便携式终端80C自身的处理负荷的同时,与万向支架装置300C协作对与摄像辅助相关的处理作出贡献。此外,万向支架装置 300C可以将由摄像辅助生成的动作信息用于万向支架310C的旋转控制。即,不仅为了第一、第二实施方式那样的无人飞行器100的航拍,还可以为了使用万向支架装置300C的摄像,而实施基于摄像辅助的摄像支援。
此外,便携式终端80C可以通过进行与摄像辅助相关的主被摄体的确定、构图的确定以及动作信息的生成,来降低万向支架装置300C的处理负荷,可以将万向支架装置300C集中到摄像图像的处理等中。此外,由于万向支架装置300C可以按照由作为其他装置的便携式终端80C等生成的动作信息来动作,所以即使降低万向支架装置300C的处理负荷,也可以实施用于实现期望的构图的期望的动作。
以上通过实施方式对本公开进行了说明,但是本公开的技术范围并不限于上述实施方式所记载的范围。对本领域普通技术人员来说,显然可以对上述实施方式加以各种变更或改良。从权利要求书的记载即可明白,加以了这样的变更或改良的方式都可包含在本公开的技术范围之内。
权利要求书、说明书、以及说明书附图中所示的装置、系统、程序以及方法中的动作、过程、步骤、以及阶段等各项处理的执行顺序,只要没有特别明示“在……之前”、“事先”等,只要前面处理的输出并不用在后面的处理中,则可以以任意顺序实现。关于权利要求书、说明书以及说明书附图中的动作流程,为方便起见而使用“首先”、“接着”等进行了说明,但并不意味着必须按照这样的顺序实施。
符号说明
10、10A 摄像辅助系统
50 发送器
80、80A、80B、80C 便携式终端
81、81A 终端控制部
82 接口部
83 操作部
85 无线通信部
87 存储器
88 显示部
100、100A 无人飞行器
110、110A UAV 控制部
111 图像获取部
112 主被摄体确定部
113 构图确定部
114 动作信息生成部
115 动作控制部
116 动作信息获取部
150 通信接口
160 存储器
200 万向支架
210 旋翼机构
220、230 摄像部
240 GPS接收器
250 惯性测量装置
260 磁罗盘
270 气压高度计
280 超声波传感器
290 激光测量仪
300、300C 万向支架装置
310、310C 万向支架
315 安装部
320 摄像部
330 握持部
811 图像获取部
812 主被摄体确定部
813 构图确定部
814 动作信息生成部
820 摄像部

Claims (42)

  1. 一种移动平台,其是对由摄像装置进行的第二图像的摄像进行辅助的移动平台,其包含:
    图像获取部,其获取第一图像;
    信息获取部,其在所述第一图像所包含的一个以上的被摄体中获取第一被摄体的信息,并在对所述第二图像中的包括所述第一被摄体的一个以上的被摄体的位置进行规定的一个以上的构图中获取第一构图的信息;以及
    生成部,其根据所述第一构图,生成与用于摄像所述第二图像的所述摄像装置的动作相关的动作信息。
  2. 如权利要求1所述的移动平台,其中,
    所述信息获取部从所述第一图像所包含的多个被摄体中选择并获取所述第一被摄体。
  3. 如权利要求1或2所述的移动平台,其中,
    所述信息获取部根据所述第一图像所包含的被摄体的颜色成分来获取所述第一被摄体的信息。
  4. 如权利要求1或2所述的移动平台,其中,
    所述信息获取部根据所述第一图像所包含的被摄体的空间频率来获取所述第一被摄体的信息。
  5. 如权利要求1所述的移动平台,其中,
    所述信息获取部获取所述摄像装置的位置信息,并且根据所述摄 像装置的位置信息来获取所述第一被摄体的信息。
  6. 如权利要求1所述的移动平台,其中,
    所述信息获取部根据由所述摄像装置进行的所述第二图像的摄像时的摄像模式来获取所述第一被摄体的信息。
  7. 如权利要求1至6中任一项所述的移动平台,其中,
    所述信息获取部从多个所述构图中选择并获取所述第一构图。
  8. 如权利要求1至7中任一项所述的移动平台,其还包含:
    识别部,其识别所述第一被摄体的形状,
    所述信息获取部根据所述第一被摄体的形状来获取所述第一构图的信息。
  9. 如权利要求1至7中任一项所述的移动平台,其还包含:
    识别部,其识别所述第二图像被摄像时的场景,
    所述信息获取部根据所述场景来获取所述第一构图的信息。
  10. 如权利要求1至9中任一项所述的移动平台,其中,
    所述生成部生成与可旋转地支持所述摄像装置的支持部件的旋转相关的旋转信息作为所述动作信息。
  11. 如权利要求10所述的移动平台,其中,
    所述生成部根据所述第一图像中的所述第一被摄体的位置和所述第一构图中的所述第一被摄体的位置来确定所述支持部件的旋转量和旋转方向。
  12. 如权利要求1至11中任一项所述的移动平台,其中,
    所述生成部生成与所述摄像装置的移动相关的移动信息作为所述动作信息。
  13. 如权利要求12所述的移动平台,其中,
    所述生成部根据所述第一图像中的所述第一被摄体的大小和所述第一构图中的所述第一被摄体的大小来确定沿重力方向的所述摄像装置的移动量。
  14. 如权利要求12所述的移动平台,其中,
    所述生成部根据所述第一图像中的所述第一被摄体的位置、所述第一构图中的所述第一被摄体的位置、以及所述第一图像中的移动距离与真实空间中的移动距离的对应关系来确定所述摄像装置的移动量和移动方向。
  15. 如权利要求1至14中任一项所述的移动平台,其还包含:
    提示部,其提示所述动作信息。
  16. 如权利要求1至15中任一项所述的移动平台,其中,
    所述第一图像是由所述摄像装置摄像的图像。
  17. 如权利要求1、3至6、8至16中任一项所述的移动平台,其中,
    所述移动平台是包含所述摄像装置以及可旋转地支持所述摄像装置的支持部件的飞行体,并且还包含:
    控制部,其根据所述动作信息来控制所述飞行体的飞行或所述支持部件的旋转。
  18. 如权利要求1、3至6、8至16中任一项所述的移动平台,其中,
    所述移动平台是在使用时由用户握持的、包含可旋转地支持所述摄像装置的支持部件的支持装置,并且还包含:
    控制部,其根据所述动作信息来控制所述支持部件的旋转。
  19. 如权利要求1至4、7至16中任一项所述的移动平台,其中,
    所述移动平台是便携式终端,并且还包含:
    通信部,其将所述动作信息发送到飞行体或支持装置。
  20. 一种飞行体,其包含:
    摄像装置;
    支持部件,其可旋转地支持所述摄像装置;
    动作信息获取部,其获取由如权利要求1至4、7至16中任一项所述的移动平台生成的所述动作信息;以及
    控制部,其根据所述动作信息来控制所述飞行体的飞行或所述支持部件的旋转。
  21. 一种支持装置,其包含:
    支持部件,其可旋转地支持摄像装置;
    动作信息获取部,其获取由如权利要求1至4、7至16中任一项所述的移动平台生成的所述动作信息;以及
    控制部,其根据所述动作信息来控制所述支持部件的旋转。
  22. 一种摄像辅助方法,其是对由摄像装置进行的第二图像的摄像进行辅助的移动平台中的摄像辅助方法,其具有以下步骤:
    获取第一图像的步骤;
    在所述第一图像所包含的一个以上的被摄体中获取第一被摄体的信息的步骤;
    在对所述第二图像中的包括所述第一被摄体的一个以上的被摄体的位置进行规定的一个以上的构图中获取第一构图的信息的步骤;以及
    根据所述第一构图,生成与用于摄像所述第二图像的所述摄像装置的动作相关的动作信息的步骤。
  23. 如权利要求22所述的摄像辅助方法,其中,
    所述获取第一被摄体的信息的步骤包括从所述第一图像所包含的多个被摄体中选择并获取所述第一被摄体的步骤。
  24. 如权利要求22或23所述的摄像辅助方法,其中,
    所述获取第一被摄体的信息的步骤包括根据所述第一图像所包含的被摄体的颜色成分来获取所述第一被摄体的信息的步骤。
  25. 如权利要求22或23所述的摄像辅助方法,其中,
    所述获取第一被摄体的信息的步骤包括根据所述第一图像所包含的被摄体的空间频率来获取所述第一被摄体的信息的步骤。
  26. 如权利要求22所述的摄像辅助方法,其还包括获取所述摄像装置的位置信息的步骤,所述获取第一被摄体的信息的步骤包括根据所述摄像装置的位置信息来获取所述第一被摄体的信息的步骤。
  27. 如权利要求22所述的摄像辅助方法,其中,
    所述获取第一被摄体的信息的步骤包括根据由所述摄像装置进行的所述第二图像的摄像时的摄像模式来获取所述第一被摄体的信息的步骤。
  28. 如权利要求22至27中任一项所述的摄像辅助方法,其中,
    所述获取第一构图的信息的步骤包括从所述多个构图中选择并获取所述第一构图的步骤。
  29. 如权利要求28所述的摄像辅助方法,其还包括识别所述第一被摄体的形状的步骤,所述获取第一构图的信息的步骤包括根据所述第一被摄体的形状来获取所述第一构图的信息的步骤。
  30. 如权利要求28所述的摄像辅助方法,其还包括识别所述第二图像被摄像时的场景的步骤,所述获取第一构图的信息的步骤包括根据所述场景来获取所述第一构图的信息的步骤。
  31. 如权利要求22至30中任一项所述的摄像辅助方法,其中,
    所述生成动作信息的步骤包括生成与可旋转地支持所述摄像装置的支持部件的旋转相关的旋转信息作为所述动作信息的步骤。
  32. 如权利要求31所述的摄像辅助方法,其中,
    所述生成动作信息的步骤包括根据所述第一图像中的所述第一被摄体的位置和所述第一构图中的所述第一被摄体的位置来确定所述支持部件的旋转量和旋转方向的步骤。
  33. 如权利要求22至32中任一项所述的摄像辅助方法,其中,
    所述生成动作信息的步骤包括生成与所述摄像装置的移动相关的移动信息作为所述动作信息的步骤。
  34. 如权利要求33所述的摄像辅助方法,其中,
    所述生成动作信息的步骤包括根据所述第一图像中的所述第一被摄体的大小和所述第一构图中的所述第一被摄体的大小来确定沿重力方向的所述摄像装置的移动量的步骤。
  35. 如权利要求33所述的摄像辅助方法,其中,
    所述生成动作信息的步骤包括根据所述第一图像中的所述第一被摄体的位置、所述第一构图中的所述第一被摄体的位置、以及所述第一图像中的移动距离与真实空间中的移动距离的对应关系来确定所述摄像装置的移动量和移动方向的步骤。
  36. 如权利要求22至35中任一项所述的摄像辅助方法,其还包括:
    在提示部提示所述动作信息的步骤。
  37. 如权利要求22至36中任一项所述的摄像辅助方法,其中,
    所述第一图像是由所述摄像装置摄像的图像。
  38. 如权利要求22、24至27、29至37中任一项所述的摄像辅助方法,其中,
    所述移动平台是包含所述摄像装置以及可旋转地支持所述摄像装置的支持部件的飞行体,
    还包括:根据所述动作信息来控制所述飞行体的飞行或所述支持部件的旋转的步骤。
  39. 如权利要求22、24至27、29至37中任一项所述的摄像辅助方法,其中,
    所述移动平台是在使用时由用户握持的、包含可旋转地支持所述摄像装置的支持部件的支持装置,
    还包括:根据所述动作信息来控制所述支持部件的旋转的步骤。
  40. 如权利要求22至37中任一项所述的摄像辅助方法,其中,
    所述移动平台是便携式终端,
    还包括:将所述动作信息发送到飞行体或支持装置的步骤。
  41. 一种程序,其是用于使对由摄像装置进行的第二图像的摄像进行辅助的移动平台执行以下步骤的程序:
    获取第一图像的步骤;
    在所述第一图像所包含的一个以上的被摄体中获取第一被摄体的信息的步骤;
    在对所述第二图像中的包括所述第一被摄体的一个以上的被摄体的位置进行规定的一个以上的构图中获取第一构图的信息的步骤;以及
    根据所述第一构图,生成与用于摄像所述第二图像的所述摄像装置的动作相关的动作信息的步骤。
  42. 一种记录介质,其是记录有用于使对由摄像装置进行的第二图像的摄像进行辅助的移动平台执行以下步骤的程序的计算机可读记录介质:
    获取第一图像的步骤;
    在所述第一图像所包含的一个以上的被摄体中获取第一被摄体的信息的步骤;
    在对所述第二图像中的包括所述第一被摄体的一个以上的被摄体的位置进行规定的一个以上的构图中获取第一构图的信息的步骤;以及
    根据所述第一构图,生成与用于摄像所述第二图像的所述摄像装置的动作相关的动作信息的步骤。
PCT/CN2017/108413 2017-05-26 2017-10-30 移动平台、飞行体、支持装置、便携式终端、摄像辅助方法、程序以及记录介质 WO2018214401A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201780064135.6A CN109863745A (zh) 2017-05-26 2017-10-30 移动平台、飞行体、支持装置、便携式终端、摄像辅助方法、程序以及记录介质

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-104737 2017-05-26
JP2017104737A JP6875196B2 (ja) 2017-05-26 2017-05-26 モバイルプラットフォーム、飛行体、支持装置、携帯端末、撮像補助方法、プログラム、及び記録媒体

Publications (1)

Publication Number Publication Date
WO2018214401A1 true WO2018214401A1 (zh) 2018-11-29

Family

ID=64396168

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/108413 WO2018214401A1 (zh) 2017-05-26 2017-10-30 移动平台、飞行体、支持装置、便携式终端、摄像辅助方法、程序以及记录介质

Country Status (3)

Country Link
JP (1) JP6875196B2 (zh)
CN (1) CN109863745A (zh)
WO (1) WO2018214401A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022188151A1 (zh) * 2021-03-12 2022-09-15 深圳市大疆创新科技有限公司 影像拍摄方法、控制装置、可移动平台和计算机存储介质
CN115835013A (zh) * 2021-09-16 2023-03-21 腾讯科技(深圳)有限公司 多媒体互动方法、系统、装置、设备、介质及计算机程序

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020209167A1 (ja) * 2019-04-08 2020-10-15 ソニー株式会社 情報処理装置、情報処理方法、及びプログラム
US20220283584A1 (en) 2019-07-19 2022-09-08 Sony Group Corporation Information processing device, information processing method, and information processing program

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006027448A (ja) * 2004-07-16 2006-02-02 Chugoku Electric Power Co Inc:The 無人飛行体を利用した空撮方法及びその装置
US7773116B1 (en) * 2006-02-08 2010-08-10 Lockheed Martin Corporation Digital imaging stabilization
CN103426282A (zh) * 2013-07-31 2013-12-04 深圳市大疆创新科技有限公司 遥控方法及终端
WO2016029169A1 (en) * 2014-08-22 2016-02-25 Cape Productions Inc. Methods and apparatus for unmanned aerial vehicle autonomous aviation
CN106131411A (zh) * 2016-07-14 2016-11-16 纳恩博(北京)科技有限公司 一种拍摄图像的方法和装置
CN106331508A (zh) * 2016-10-19 2017-01-11 深圳市道通智能航空技术有限公司 拍摄构图的方法及装置
US20170108877A1 (en) * 2014-07-30 2017-04-20 SZ DJI Technology Co., Ltd. Systems and methods for target tracking
CN106586011A (zh) * 2016-12-12 2017-04-26 高域(北京)智能科技研究院有限公司 航拍无人飞行器的对准方法及其航拍无人飞行器
CN106708089A (zh) * 2016-12-20 2017-05-24 北京小米移动软件有限公司 跟随式的飞行控制方法及装置、无人机

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000098456A (ja) * 1998-09-28 2000-04-07 Minolta Co Ltd オート構図機能を有するカメラ
JP3833486B2 (ja) * 2000-04-19 2006-10-11 富士写真フイルム株式会社 撮像装置
JP2008061209A (ja) * 2006-09-04 2008-03-13 Canon Inc 画像処理方法
JP4894712B2 (ja) * 2007-10-17 2012-03-14 ソニー株式会社 構図判定装置、構図判定方法、プログラム
JP5200780B2 (ja) * 2008-09-08 2013-06-05 ソニー株式会社 撮影装置および方法、並びにプログラム
JP5310076B2 (ja) * 2009-02-23 2013-10-09 株式会社ニコン 画像処理装置、および画像処理プログラム
JP5287465B2 (ja) * 2009-04-21 2013-09-11 ソニー株式会社 撮像装置、撮影設定方法及びそのプログラム
JP5858754B2 (ja) * 2011-11-29 2016-02-10 キヤノン株式会社 撮影装置、表示方法及びプログラム
JP2013207357A (ja) * 2012-03-27 2013-10-07 Sony Corp サーバ、クライアント端末、システムおよびプログラム
JP5880263B2 (ja) * 2012-05-02 2016-03-08 ソニー株式会社 表示制御装置、表示制御方法、プログラムおよび記録媒体
JP6000780B2 (ja) * 2012-09-21 2016-10-05 オリンパス株式会社 撮像装置
CN103870138B (zh) * 2012-12-11 2017-04-19 联想(北京)有限公司 一种信息处理方法及电子设备
KR102045957B1 (ko) * 2013-01-18 2019-11-18 삼성전자 주식회사 휴대단말의 촬영 방법 및 장치
JP2014236334A (ja) * 2013-05-31 2014-12-15 株式会社ニコン 撮像装置
CN103533245B (zh) * 2013-10-21 2018-01-09 努比亚技术有限公司 拍摄装置及辅助拍摄方法
US9667860B2 (en) * 2014-02-13 2017-05-30 Google Inc. Photo composition and position guidance in a camera or augmented reality system
US10235587B2 (en) * 2014-03-04 2019-03-19 Samsung Electronics Co., Ltd. Method and system for optimizing an image capturing boundary in a proposed image
CN104301613B (zh) * 2014-10-16 2016-03-02 深圳市中兴移动通信有限公司 移动终端及其拍摄方法
US9407815B2 (en) * 2014-11-17 2016-08-02 International Business Machines Corporation Location aware photograph recommendation notification
CN104935810A (zh) * 2015-05-29 2015-09-23 努比亚技术有限公司 引导拍摄方法及装置
EP3101889A3 (en) * 2015-06-02 2017-03-08 LG Electronics Inc. Mobile terminal and controlling method thereof
US9876951B2 (en) * 2015-09-25 2018-01-23 International Business Machines Corporation Image subject and composition demand
CN105578043A (zh) * 2015-12-18 2016-05-11 Tcl集团股份有限公司 一种相机拍照的构图方法和装置
CN105611164A (zh) * 2015-12-29 2016-05-25 太仓美宅姬娱乐传媒有限公司 一种摄像机的辅助拍摄方法

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006027448A (ja) * 2004-07-16 2006-02-02 Chugoku Electric Power Co Inc:The 無人飛行体を利用した空撮方法及びその装置
US7773116B1 (en) * 2006-02-08 2010-08-10 Lockheed Martin Corporation Digital imaging stabilization
CN103426282A (zh) * 2013-07-31 2013-12-04 深圳市大疆创新科技有限公司 遥控方法及终端
US20170108877A1 (en) * 2014-07-30 2017-04-20 SZ DJI Technology Co., Ltd. Systems and methods for target tracking
WO2016029169A1 (en) * 2014-08-22 2016-02-25 Cape Productions Inc. Methods and apparatus for unmanned aerial vehicle autonomous aviation
CN106131411A (zh) * 2016-07-14 2016-11-16 纳恩博(北京)科技有限公司 一种拍摄图像的方法和装置
CN106331508A (zh) * 2016-10-19 2017-01-11 深圳市道通智能航空技术有限公司 拍摄构图的方法及装置
CN106586011A (zh) * 2016-12-12 2017-04-26 高域(北京)智能科技研究院有限公司 航拍无人飞行器的对准方法及其航拍无人飞行器
CN106708089A (zh) * 2016-12-20 2017-05-24 北京小米移动软件有限公司 跟随式的飞行控制方法及装置、无人机

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022188151A1 (zh) * 2021-03-12 2022-09-15 深圳市大疆创新科技有限公司 影像拍摄方法、控制装置、可移动平台和计算机存储介质
CN115835013A (zh) * 2021-09-16 2023-03-21 腾讯科技(深圳)有限公司 多媒体互动方法、系统、装置、设备、介质及计算机程序
CN115835013B (zh) * 2021-09-16 2024-05-17 腾讯科技(深圳)有限公司 多媒体互动方法、系统、装置、设备、介质及计算机程序

Also Published As

Publication number Publication date
CN109863745A (zh) 2019-06-07
JP2018201119A (ja) 2018-12-20
JP6875196B2 (ja) 2021-05-19

Similar Documents

Publication Publication Date Title
JP6803919B2 (ja) 飛行経路生成方法、飛行経路生成システム、飛行体、プログラム、及び記録媒体
JP6878567B2 (ja) 3次元形状推定方法、飛行体、モバイルプラットフォーム、プログラム及び記録媒体
JP6765512B2 (ja) 飛行経路生成方法、情報処理装置、飛行経路生成システム、プログラム及び記録媒体
WO2018214401A1 (zh) 移动平台、飞行体、支持装置、便携式终端、摄像辅助方法、程序以及记录介质
JP7251474B2 (ja) 情報処理装置、情報処理方法、情報処理プログラム、画像処理装置および画像処理システム
WO2017208424A1 (ja) 姿勢推定装置、姿勢推定方法及び観測システム
US11122209B2 (en) Three-dimensional shape estimation method, three-dimensional shape estimation system, flying object, program and recording medium
WO2019230604A1 (ja) 検査システム
KR20170094030A (ko) 실내 내비게이션 및 파노라마 사진 맵핑 제공 시스템 및 그 방법
WO2019080768A1 (zh) 信息处理装置、空中摄像路径生成方法、程序、及记录介质
US20230032219A1 (en) Display control method, display control apparatus, program, and recording medium
JP2023100642A (ja) 検査システム
JP3860945B2 (ja) 撮影指示装置、撮影指示方法及び記録媒体
US20120026324A1 (en) Image capturing terminal, data processing terminal, image capturing method, and data processing method
JP2019028560A (ja) モバイルプラットフォーム、画像合成方法、プログラム、及び記録媒体
CN109891188B (zh) 移动平台、摄像路径生成方法、程序、以及记录介质
JP6681101B2 (ja) 検査システム
CN111213107B (zh) 信息处理装置、拍摄控制方法、程序以及记录介质
WO2021115192A1 (zh) 图像处理装置、图像处理方法、程序及记录介质
JP6329219B2 (ja) 操作端末、及び移動体
WO2020119572A1 (zh) 形状推断装置、形状推断方法、程序以及记录介质
WO2019228337A1 (zh) 移动体、图像生成方法、程序以及记录介质
WO2018179312A1 (ja) 画像生成装置及び画像生成方法
JP6803960B1 (ja) 画像処理装置、画像処理方法、プログラム、及び記録媒体
JP6681102B2 (ja) 検査システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17910838

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 17910838

Country of ref document: EP

Kind code of ref document: A1