CN107547793B - Flying device, method, and storage medium storing program


Info

Publication number
CN107547793B
CN107547793B (application CN201710342364.1A)
Authority
CN
China
Prior art keywords
imaging
unit
time point
mode
holder
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710342364.1A
Other languages
Chinese (zh)
Other versions
CN107547793A (en)
Inventor
山田俊介
水品隆广
松田英明
高桥智洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Casio Computer Co Ltd
Original Assignee
Casio Computer Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Casio Computer Co Ltd filed Critical Casio Computer Co Ltd
Publication of CN107547793A publication Critical patent/CN107547793A/en
Application granted granted Critical
Publication of CN107547793B publication Critical patent/CN107547793B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The present invention relates to a flying device that is thrown from the hand of a thrower and flies unmanned to perform aerial photography, and enables the thrower to easily set the photographing mode, as intended, by the way the device is thrown. Each angular velocity value obtained from an angular velocity sensor, and each initial velocity value calculated from the output values of an acceleration sensor, based on the time point at which the device is thrown by the thrower, is compared with a predetermined threshold value (S403, S405, S407, S409). Based on the comparison results, one of circling photography, spin photography, self-timer photography, auto-tracking photography, and normal photography is determined as the photographing mode (S404, S406, S408, S410, S411), and photography is performed in the determined photographing mode (S412).

Description

Flying device, method, and storage medium storing program
Technical Field
The present invention relates to a flying apparatus that leaves the hand of a holder or the like and flies unmanned to perform aerial photography.
Background
Flying devices of the following kind have begun to spread (see, for example, patent documents 1 to 4): a digital camera is mounted on a small unmanned aerial vehicle, commonly called a "drone", equipped with, for example, four drive propulsion units each consisting of a rotary blade driven by a motor, and by operating the digital camera and the flight device with a timer or by remote operation over radio or the like, photography can be performed from high positions that the camera could not otherwise reach.
Documents of the prior art
Patent document
Patent document 1: JP 2004-118087 A
Patent document 2: JP 2005-269413 A
Patent document 3: JP 2012-156683 A
Patent document 4: JP 2008-120294 A
However, a conventional small unmanned aerial vehicle must either be controlled by remote operation or have its photographing position and flight path set in advance from a smartphone or the like, which makes the photographing operation difficult.
Disclosure of Invention
Therefore, an object of the present invention is to enable the user to easily set the photographing mode as intended.
In one embodiment, the flying device includes an imaging unit, and further includes: an acquisition unit that acquires a state at a time point of departure from a holder; a determination unit that determines a photographing mode of the imaging unit after the time point of departure, based on the state acquired by the acquisition unit; and an imaging control unit that controls the imaging unit in the photographing mode determined by the determination unit.
ADVANTAGEOUS EFFECTS OF INVENTION
According to the present invention, the holder can easily set the photographing mode as intended.
Drawings
Fig. 1 is a diagram showing an example of the external appearance of the flight device according to the present embodiment.
Fig. 2 is a diagram showing an example of a system configuration of the flight device according to the present embodiment.
Fig. 3 is an explanatory view of the throwing direction.
Fig. 4 is a flowchart showing an example of the imaging mode control processing of the flight device according to the present embodiment.
Fig. 5 is a flowchart showing an example of threshold setting processing for each shooting mode in the present embodiment.
Fig. 6 is a flowchart showing an example of the imaging condition control processing of the flight device according to the present embodiment.
Fig. 7 is a flowchart showing a process of creating an initial velocity-shutter velocity correspondence table in another embodiment of the imaging condition control process of the flight device.
Fig. 8 is a diagram showing an example of a relationship table of exposure, shutter speed, and aperture.
Fig. 9 is a diagram showing an example of the initial velocity-shutter velocity correspondence table generated in the other embodiment.
Description of the reference numerals
100 flying device
101 main frame
102 motor frame
103 rotating blade
104 electric motor
105 circuit box
106 camera
201 controller
202 camera system
203 flight sensor
204 touch sensor
205 motor driver
206 power sensor
207 battery
Detailed Description
Embodiments for carrying out the present invention will be described in detail below with reference to the drawings. In the present embodiment, the imaging mode of the imaging unit after a given time point is determined based on the state of the flying apparatus at the time point when it departs from the holder, for example the state at the time point when it is thrown by the thrower, and the imaging unit is controlled in the determined imaging mode. More specifically, after the time point of the throw, the drive propulsion unit of the flight device is driven to perform flight, and the imaging mode of the imaging unit in the flight device is controlled. Concretely, the state at the time of the throw is calculated and acquired from sensor data, each parameter is compared with a threshold value to estimate what kind of throw the user performed, the device shifts to the imaging mode corresponding to the estimated throw, and shooting corresponding to that imaging mode is performed.
Fig. 1 is a diagram showing an example of an external appearance of a flight device 100 according to the present embodiment.
Four circular motor frames 102 (support portions) are provided on the main frame 101. Each motor frame 102 supports a motor 104, and a rotary blade 103 is fixed to the motor shaft of each motor 104. The four sets of motor 104 and rotary blade 103 constitute the drive propulsion unit.
A circuit box 105 in the main frame 101 houses a motor driver, a controller, sensors, and the like for driving the motor 104. A camera 106 as an imaging unit is mounted on a lower portion of the main frame 101.
Fig. 2 is a diagram showing an example of the system configuration of the flight device 100 having the configuration of fig. 1. The following are connected to the controller 201: a camera system 202 including the camera 106 (see fig. 1); a flight sensor 203 composed of, for example, an acceleration sensor, a gyroscope, a GPS (global positioning system) sensor, and the like; a touch sensor 204 (contact detection sensor unit); motor drivers 205 of #1 to #4 that drive the respective motors 104 of #1 to #4 (see fig. 1); and a power sensor 206 that supplies electric power to each motor driver 205 while monitoring the voltage of the battery 207. The touch sensor 204 may be a push button or the like, as long as it can detect contact. Although not shown, the electric power of the battery 207 is also supplied to the components 201 to 206. The controller 201 acquires information on the attitude of the body of the flying apparatus 100 from the flight sensor 203 in real time. The controller 201 also monitors the voltage of the battery 207 via the power sensor 206, and transmits to each of the motor drivers 205 of #1 to #4 a power instruction signal expressed as a duty ratio based on pulse width modulation; the motor drivers 205 of #1 to #4 thereby control the rotation speeds of the motors 104 of #1 to #4, respectively. The controller 201 further controls the camera system 202 to control the shooting operation of the camera 106 (fig. 1). In the present embodiment, the controller 201 functions as: an acquisition unit that acquires the state at the time point of departure from the holder, that is, the time point at which the holder throws the device; a determination unit that determines, based on the state acquired by the acquisition unit, the imaging mode of the camera system 202 and the camera 106 as the imaging unit after the time point at which the flying apparatus 100 departs from (is thrown by) the holder; and an imaging control unit that controls the camera 106 via the camera system 202 in the imaging mode determined by the determination unit.
The controller 201, the camera system 202, the flight sensor 203, the motor driver 205, the electric power sensor 206, and the storage battery 207 of fig. 2 are accommodated in the circuit box 105 in the main frame 101 of fig. 1. Although not explicitly shown in fig. 1, the touch sensor 204 is attached to the main frame 101 and/or the motor frame 102 in fig. 1, and detects a difference in electrical physical quantity between when a finger of a thrower or the like touches the main frame 101 or the motor frame 102 and when the finger does not touch the main frame 101 or the motor frame 102.
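As a concrete illustration of the data flow just described, the following is a minimal control-loop sketch in Python. Every name here (flight_sensor.read, set_pwm_duty, and so on) is a hypothetical interface invented for illustration; the patent describes only the architecture, not an API.

```python
# Hypothetical control-loop sketch for the fig. 2 architecture. All names
# are illustrative assumptions, not APIs from the patent.
def control_step(flight_sensor, power_sensor, motor_drivers, attitude_ctrl):
    attitude = flight_sensor.read()            # acceleration/gyro/GPS state, in real time
    vbat = power_sensor.battery_voltage()      # battery 207 voltage via power sensor 206
    duties = attitude_ctrl.update(attitude, vbat)  # four PWM duty ratios in [0, 1]
    for driver, duty in zip(motor_drivers, duties):
        driver.set_pwm_duty(duty)              # power instruction to motor drivers #1-#4
```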
The operation of the flight device 100 having the above configuration will be described below. First, the photographing modes of the present embodiment and the corresponding throwing methods are described with examples. The photographing mode is one of circling photography, spin photography, self-timer photography, auto-tracking photography, and normal photography. The photographing modes may further include photography prohibition.
● Determination example of the circling photography mode:
This mode is a mode in which shooting is performed while circling around the user who performed the throw. The throwing method is as follows: with a three-dimensional space formed by the x-, y-, and z-axes as shown in fig. 3, where the x- and y-axes lie in a plane parallel to the ground and the z-axis is perpendicular to the ground and points toward the sky, the device is thrown while being rotated around the x-axis, the y-axis, or both, as shown at 301 in fig. 3.
● Determination example of the spin photography mode:
This mode is a mode in which the flight device 100 performs shooting while itself rotating around the z-axis. The throwing method is to throw the device while rotating it around the z-axis, as shown at 302 in fig. 3.
● Determination example of the self-timer photography mode:
This mode is a mode in which shooting is performed with a self-timer after the start of flight. The throwing method is to release the device from the hand without throwing it, or to toss it slightly upward, without any rotation exceeding the thresholds. This mode basically photographs the thrower, performing face detection, autofocus, and the like. In this mode, the device begins to fall under gravity after release, so it hovers to counteract the fall and is held at a fixed position.
● Determination example of the auto-tracking photography mode:
This is a mode in which the user who performed the throw is automatically tracked and photographed. The throwing method is to throw the device straight up from the hand, that is, as shown at 304 in fig. 3, to throw it forcefully upward parallel to the z-axis, without any rotation exceeding the thresholds.
● Determination example of the normal photography mode:
This mode is a mode in which the device becomes stationary at the thrown-to position and performs shooting. The throwing method is any throw other than those of the photographing modes described above; for example, though not specifically illustrated, the device may be thrown in the horizontal direction without rotation. The continuous-shooting interval may also be changed according to the throwing speed.
Various shooting conditions, such as the shutter speed, the aperture, the imaging interval, and the imaging timing of still images or moving images, may be set appropriately for each mode or may be determined fully automatically. The shooting conditions may also be treated as one kind of photographing mode and determined from the state at the time point when the device is thrown and leaves the holder.
Fig. 4 is a flowchart showing an example of the shooting mode control processing of the flight device 100 according to the present embodiment for instructing any one of the above-described 5 types of shooting modes according to the throw method. This processing can be realized as processing in which a CPU (central processing unit) incorporated in the controller 201 of fig. 2 executes a control program stored in a memory (not particularly shown) incorporated in the controller.
The controller 201 first monitors whether the flying apparatus 100 is separated from the hand of the user (thrown) by monitoring a voltage change of the touch sensor 204 or the like (repetition of determination "no" in step S401).
When the determination in step S401 is yes, the controller 201 acquires and calculates the state at the time of the throw based on the outputs of the flight sensor 203 (step S402). Specifically, the controller 201 first obtains the angular velocities ωx, ωy, and ωz [rad/s] as the output values in the respective coordinate-axis directions of the gyro sensor constituting the flight sensor 203. Then, the controller 201 calculates the angular velocity ωini_hor around the x-axis, the y-axis, or both, i.e., in the direction 301 of fig. 3, and the angular velocity ωini_vert around the z-axis, i.e., in the direction 302 of fig. 3, by arithmetic processing equivalent to the following expressions 1 and 2.
[ equation 1 ]
ωini_hor = √(ωx² + ωy²)
[ equation 2 ]
ωini_vert = ωz
Next, the controller 201 calculates the velocities Vx, Vy, and Vz [m/s]. Here, the controller 201 calculates these velocities from the acceleration values in the coordinate-axis directions output, at the time point of the throw, by the acceleration sensor constituting the flight sensor 203 of fig. 2. Let ax, ay, and az [m/s²] be the accelerations in the x-, y-, and z-axis directions in the xyz absolute coordinate system output by the acceleration sensor. The controller 201 then calculates the velocities Vx, Vy, and Vz in the respective coordinate-axis directions at the time point of the throwing motion by integrating each of ax, ay, and az, as in the following expressions 3, 4, and 5, from the time point ts at which any of these acceleration values exceeds a predetermined threshold to the release time point tr at which departure of the flying apparatus 100 from the thrower's body is detected from the output of the touch sensor 204 of fig. 2.
[ equation 3 ]
Vx = ∫ts→tr ax dt
[ equation 4 ]
Vy = ∫ts→tr ay dt
[ equation 5 ]
Vz = ∫ts→tr az dt
Next, the controller 201 calculates the initial velocity Vini_hor in the horizontal direction, i.e., the direction 303 of fig. 3 within the xy-plane, and the initial velocity Vini_vert in the vertical direction, i.e., the direction 304 of fig. 3 along the z-axis, by arithmetic processing equivalent to the following expressions 6 and 7.
[ equation 6 ]
Vini_hor = √(Vx² + Vy²)
[ equation 7 ]
Vini_vert = Vz
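To make expressions 1 to 7 concrete, the following is a minimal Python sketch of the computation, assuming the gyro output is sampled at the release time point and the accelerometer samples are buffered from ts to tr. The function name, the rectangle-rule integration, and the quadrature forms reconstructed for expressions 1 and 6 are assumptions; the patent specifies only the mathematics.

```python
import numpy as np

def throw_state(gyro_at_release, accel_samples_ts_to_tr, dt):
    """Sketch of expressions 1-7 (names and integration scheme assumed).

    gyro_at_release        -- (wx, wy, wz) [rad/s] at the release time point tr
    accel_samples_ts_to_tr -- iterable of (ax, ay, az) [m/s^2] from ts to tr
    dt                     -- accelerometer sampling period [s]
    """
    wx, wy, wz = gyro_at_release
    w_ini_hor = np.hypot(wx, wy)              # expression 1 (assumed quadrature form)
    w_ini_vert = wz                           # expression 2

    a = np.asarray(list(accel_samples_ts_to_tr), dtype=float)
    vx, vy, vz = a.sum(axis=0) * dt           # expressions 3-5: integrate ax, ay, az

    v_ini_hor = np.hypot(vx, vy)              # expression 6 (assumed quadrature form)
    v_ini_vert = vz                           # expression 7
    return w_ini_hor, w_ini_vert, (vx, vy, vz), v_ini_hor, v_ini_vert
```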
After the above processing of step S402, the controller 201 determines whether the angular velocity ωini_hor in the direction 301 of fig. 3 calculated in step S402 is larger than a threshold ωTHini_hor set in advance by the threshold setting processing of the flowchart of fig. 5 described later (step S403).
When the determination in step S403 is yes, the controller 201 sets the shooting mode to the aforementioned circling photography mode (step S404), and then proceeds to the shooting process of step S412.
When the determination in step S403 is no, the controller 201 next determines whether the angular velocity ωini_vert in the direction 302 of fig. 3 calculated in step S402 is greater than a threshold ωTHini_vert set in advance by the threshold setting processing of the flowchart of fig. 5 described later (step S405).
If the determination in step S405 is yes, the controller 201 sets the shooting mode to the above-described spin photography mode (step S406), and then proceeds to the shooting process of step S412.
If the determination in step S405 is no, the controller 201 next determines whether the initial velocity Vini_hor in the horizontal direction 303 of fig. 3 calculated in step S402 is greater than a threshold VTHini_hor set in advance by the threshold setting processing of the flowchart of fig. 5 described later (step S407).
If the determination in step S407 is yes, the controller 201 sets the shooting mode to the normal photography mode described above (step S408), and then proceeds to the shooting process of step S412. In the normal photography mode, still images, continuous shots, or moving images are captured.
If the determination in step S407 is no, the controller 201 next determines whether the initial velocity Vini_vert in the vertical direction 304 of fig. 3 calculated in step S402 is greater than 0 (or a threshold slightly greater than 0) (step S409).
If the determination in step S409 is yes, the controller 201 sets the shooting mode to the self-timer photography mode described above (step S410), and then proceeds to the shooting process of step S412.
If the determination in step S409 is no, the controller 201 sets the shooting mode to the above-described auto-tracking photography mode (step S411), and then proceeds to the shooting process of step S412.
In the shooting process of step S412, the controller 201 controls the motor drivers 205 of #1 to #4 to perform the flight operation of the set shooting mode, and then controls the camera system 202 to perform shooting. The decision cascade above is summarized in the sketch below.
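A minimal sketch of the threshold cascade of steps S403 to S411; the threshold dictionary keys and mode labels are illustrative assumptions.

```python
def decide_photographing_mode(w_ini_hor, w_ini_vert, v_ini_hor, v_ini_vert, th):
    """Threshold cascade of fig. 4, steps S403-S411 (a sketch, not the
    patent's code; 'th' holds the four thresholds set by the fig. 5 process)."""
    if w_ini_hor > th["w_ini_hor"]:       # S403 -> S404
        return "circling"
    if w_ini_vert > th["w_ini_vert"]:     # S405 -> S406
        return "spin"
    if v_ini_hor > th["v_ini_hor"]:       # S407 -> S408
        return "normal"
    if v_ini_vert > 0.0:                  # S409 -> S410 (0 or a small epsilon)
        return "self_timer"
    return "auto_tracking"                # S411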
After that, although not specifically illustrated, when shooting ends because a predetermined time has elapsed, a predetermined number of shots has been taken, or the user has so instructed, the controller 201 searches for the position of the user (owner) who performed the throw. An existing technique can be used for this search. When the owner's position is found, the controller 201 controls the motor drivers 205 of #1 to #4 to fly toward the owner until it is determined from GPS data or the like that the distance to the owner is equal to or less than a predetermined distance. The controller 201 then controls the motor drivers 205 of #1 to #4 to hover near the owner or to land on the thrower's hand, and in the case of landing stops the motors 104 of #1 to #4 to end the control operation. A sketch of this return sequence follows.
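A hedged sketch of the return behavior; find_owner, gps, and motors are hypothetical interfaces standing in for the existing search and flight-control techniques the text refers to, and the distance value is an assumption.

```python
def return_to_owner(find_owner, gps, motors, near_m=2.0):
    """Sketch of the post-shoot return: search for the owner, fly toward
    them until within a predetermined distance, then hover or land."""
    target = find_owner()                      # existing search technique
    while gps.distance_to(target) > near_m:    # GPS-based distance check
        motors.fly_toward(target)
    motors.hover_or_land()                     # hover nearby or land on the hand
```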
Fig. 5 is a flowchart showing an example of the threshold setting processing for each shooting mode in the present embodiment. The controller 201 first accepts a given switch operation or the like from the user (step S501) and shifts to the threshold setting mode (step S502).
Next, the controller 201 selects, from among the aforementioned photographing modes, a mode for which the thresholds have not yet been set (step S503).
Next, the controller 201 has the user throw the device using the throwing method corresponding to the shooting mode selected in step S503 (step S504).
Based on the result of the throw in step S504, the controller 201 calculates the angular velocity ωini_hor in the direction 301 of fig. 3, the angular velocity ωini_vert in the direction 302 of fig. 3, the initial velocity Vini_hor in the horizontal direction 303 of fig. 3, and the initial velocity Vini_vert in the vertical direction 304 of fig. 3, by the same processing as in step S402 of fig. 4 described above (arithmetic processing equivalent to expressions 1 to 7). Then, the controller 201 automatically sets values obtained by changing each of these values by a predetermined amount as the respective thresholds ωTHini_hor, ωTHini_vert, VTHini_hor, and VTHini_vert (step S505).
The controller 201 then determines whether or not the series of processing of steps S503 to S505 described above has ended for all the shooting modes (step S506).
If the determination in step S506 is "no", the controller 201 returns to the processing in step S503 and shifts to the processing for the next unprocessed shooting mode.
If the determination in step S506 is yes, the controller 201 ends the threshold setting process for each shooting mode shown in the flowchart of fig. 5.
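One plausible reading of this process, sketched below, is that each mode's sample throw fixes the threshold for the quantity that characterizes that mode; the multiplicative margin stands in for the "predetermined amount" the text leaves unspecified.

```python
def set_thresholds(sample_throws, margin=0.8):
    """Sketch of the fig. 5 threshold setting (S501-S506), under the
    assumption that each mode's sample throw sets its characteristic
    threshold. sample_throws maps a mode name to the measured tuple
    (w_ini_hor, w_ini_vert, v_ini_hor, v_ini_vert) from expressions 1-7."""
    return {
        "w_ini_hor": sample_throws["circling"][0] * margin,    # omega_TH_ini_hor
        "w_ini_vert": sample_throws["spin"][1] * margin,       # omega_TH_ini_vert
        "v_ini_hor": sample_throws["normal"][2] * margin,      # V_TH_ini_hor
        "v_ini_vert": sample_throws["self_timer"][3] * margin, # V_TH_ini_vert
    }
```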
With the embodiment described above, the user can easily set the shooting mode as intended simply by the way the device is thrown.
Next, an embodiment in which the shooting conditions are determined from the throwing method will be described. Here, the shooting conditions include the shutter speed, the aperture, the imaging interval, and the imaging timing of still images or moving images. In the embodiment described above, all the shooting conditions were assumed to be determined automatically; in the present embodiment, the state at the time point when the flying apparatus 100 departs from the holder is acquired from the various sensors of the flight sensor 203 of fig. 2, and based on the acquired state, the shooting conditions after the time point when the flying apparatus 100 leaves the holder's hand (after the time point when the thrower throws the flying apparatus 100) are determined.
In the present embodiment, as in the embodiment described above, with a three-dimensional space defined by the x-, y-, and z-axes as shown in fig. 3 (the x- and y-axes lying in a plane parallel to the ground, and the z-axis perpendicular to the ground and pointing toward the sky), the controller 201 calculates the velocities Vx, Vy, and Vz [m/s]. Here, the controller 201 calculates Vx, Vy, and Vz by integrating the accelerations ax, ay, and az as in expressions 3, 4, and 5 above, from the time point ts at which the throwing motion starts (when any of the acceleration values ax, ay, and az in the coordinate-axis directions output by the acceleration sensor constituting the flight sensor 203 of fig. 2 exceeds a predetermined threshold) to the release time point tr at which separation of the flying device 100 from the thrower's body is detected from the output of the touch sensor 204 of fig. 2. In the present embodiment, the shooting conditions are determined from these velocities as follows.
● determination example of shooting conditions relating to shutter speed:
For example, when it is desired to capture as blur-free an image as possible after the throw, the shutter speed should be increased. Conversely, when an image that conveys motion is desired, the shutter speed should be decreased. As the throwing method for controlling this, the shutter speed is set faster as the sum of the velocities Vx, Vy, and Vz is larger. That is, the faster (more forceful) the throw in any direction, the faster the shutter speed. The shutter speed may be controlled in conjunction with the aperture described next.
● determination example of imaging conditions relating to aperture:
For example, when it is desired to capture as sharp an image as possible after the throw, the aperture should be narrowed. Conversely, when a soft image is desired, the aperture should be opened up. As the throwing method for controlling this, the aperture is narrowed as the average of the velocities Vx, Vy, and Vz is larger. That is, the faster (more forceful) the throw, regardless of direction, the more the aperture is narrowed. The aperture control may be linked to the shutter speed described above.
● determination example of imaging conditions relating to imaging interval:
The imaging interval, that is, whether shooting is performed every given time or every given distance, is determined as follows. As the throwing method for controlling this, the device is thrown while being rotated around the z-axis as shown at 302 in fig. 3, as in the spin photography mode described above, and the imaging interval is set longer as the product of the velocities Vx and Vy is larger. That is, the slower the throw, the more images are captured, and the faster the throw, the fewer. The velocity in the z-axis direction is not considered. Of course, the relationship between throwing speed and imaging interval may be reversed. The three rules above are gathered into a single sketch following this item.
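The three scalar rules (shutter speed from the sum, aperture from the average, interval from the product) can be sketched together as follows; every constant and clamping range is an assumption, since the patent fixes only the monotonic directions.

```python
def scalar_shooting_conditions(vx, vy, vz):
    """Sketch of the speed-to-condition rules (all constants assumed)."""
    # Faster (more forceful) throw -> faster shutter, from the sum Vx+Vy+Vz.
    shutter_s = 1.0 / min(2000.0, max(8.0, 100.0 * (vx + vy + vz)))
    # Larger average speed -> smaller aperture opening (larger f-number).
    f_number = min(32.0, max(2.0, 4.0 * (vx + vy + vz) / 3.0))
    # Larger product Vx*Vy -> longer interval between shots (Vz ignored).
    interval_s = 0.2 * vx * vy
    return shutter_s, f_number, interval_s
```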
● determination example of imaging conditions relating to imaging timing:
when it is desired to perform imaging at the highest position, the image is gradually thrown upward in the z-axis direction, i.e., directly above.
When the user wants to be photographed upon entering the angle of view, the device is thrown while being rotated around the x-axis, the y-axis, or both, as shown at 301 in fig. 3, as in the circling photography mode described above.
When it is desired to photograph a desired subject upon its entering the angle of view, the device is thrown so as to draw a parabola. In this case, it is not known in advance in which of the x-, y-, and z-axis directions the device will travel, but when the flight trajectory is detected to be at least parabolic, the controller 201 determines a main subject within the angle of view in the direction of the parabola's trajectory, and performs one or more shots focused on that main subject.
The controller 201 calculates the trajectory of the parabola by, for example, operations equivalent to the following expressions, and determines a main subject within the angle of view in the direction of the calculated trajectory.
First, let the initial velocity be V0 [m/s], the gravitational acceleration g [m/s²], the elevation angle of the initial velocity (for an oblique parabola) θ [rad], and let t be the time from the throw start time point. In this case, the velocity Vxy and the displacement xy in the horizontal xy direction are calculated by the following expressions 8 and 9.
[ equation 8 ]
Vxy = V0 cosθ
[ equation 9 ]
xy = V0 cosθ · t
The velocity Vz and displacement z in the vertical direction are calculated by the following expressions 10 and 11.
[ equation 10 ]
Vz = V0 sinθ − gt
[ equation 11 ]
z = V0 sinθ · t − (1/2)gt²
In the present embodiment, when the elevation angle θ of the initial velocity V0 of the throw is within a predetermined range, the controller 201 determines that the device has been thrown so as to draw a parabola, calculates the trajectory of the parabola by expressions 8 to 11 above, and thereby determines the main subject within the angle of view in the direction of the calculated trajectory.
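A direct transcription of expressions 8 to 11 follows; only the gravity constant is added.

```python
import math

def parabola_at(v0, theta, t, g=9.8):
    """Expressions 8-11 for an oblique throw: initial speed v0 [m/s],
    elevation angle theta [rad], time t [s] from the throw start point."""
    vxy = v0 * math.cos(theta)                      # expression 8
    xy = v0 * math.cos(theta) * t                   # expression 9
    vz = v0 * math.sin(theta) - g * t               # expression 10
    z = v0 * math.sin(theta) * t - 0.5 * g * t**2   # expression 11
    return vxy, xy, vz, z

# The apex (Vz = 0 in expression 10) is reached at t = v0*sin(theta)/g, which
# is one way the controller could schedule a highest-point shot.
```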
Fig. 6 is a flowchart showing an example of the shooting condition control processing of the flight device 100 according to the present embodiment, for instructing any one of the above-described four types of shooting conditions by the throwing method. This processing can be realized as processing in which the CPU built into the controller 201 of fig. 2 executes a control program stored in a built-in memory (not specifically shown).
The controller 201 first monitors whether the flying apparatus 100 is separated from the hand of the user (thrown) by monitoring a voltage change of the touch sensor 204 or the like (repetition of determination "no" in step S601).
When the determination in step S601 is yes, the controller 201 acquires and calculates the state at the time of the throw based on the outputs of the flight sensor 203 (step S602). Specifically, the controller 201 first obtains the angular velocities ωx, ωy, and ωz [rad/s] as the output values in the respective coordinate-axis directions of the gyro sensor constituting the flight sensor 203. Then, the controller 201 calculates the angular velocity ωini_hor around the x-axis, the y-axis, or both, i.e., in the direction 301 of fig. 3, and the angular velocity ωini_vert around the z-axis, i.e., in the direction 302 of fig. 3, by arithmetic processing equivalent to expressions 1 and 2 above.
Next, the controller 201 calculates the velocities Vx, Vy, and Vz [m/s] in the x-, y-, and z-axis directions in the xyz absolute coordinate system at the time point of the throw by executing integration operations equivalent to expressions 3, 4, and 5 above, and calculates their sum Vx + Vy + Vz.
Then, the controller 201 calculates the initial velocity Vini_vert in the vertical direction 304 of fig. 3 along the z-axis by arithmetic processing equivalent to expression 7 above.
After the above processing of step S602, the controller 201 sets the shutter speed and the aperture, or one of the two, determined in advance in correspondence with the summed velocity value Vx + Vy + Vz calculated in step S602 (step S603).
Next, the controller 201 determines whether the angular velocity ωini_vert in the direction 302 of fig. 3 calculated in step S602 is larger than the threshold ωTHini_vert set in advance by the threshold setting processing of the flowchart of fig. 5 (step S604).
If the determination in step S604 is yes, the controller 201 sets an imaging interval with a length corresponding to the product of the velocities Vx and Vy calculated in step S602 (step S605). If the determination in step S604 is no, the controller 201 skips the processing of step S605.
After that, the controller 201 determines whether the angular velocity ωini_hor in the direction 301 of fig. 3 calculated in step S602 is larger than the threshold ωTHini_hor set in advance by the threshold setting processing of the flowchart of fig. 5 described above (step S606).
If the determination in step S606 is yes, the controller 201 sets the shooting timing so that shooting is performed when the user enters the angle of view (step S607). Whether the user has entered the angle of view is determined, for example, from the result of face recognition processing on the image information obtained from the camera system 202 of fig. 2. Alternatively, the user may carry a remote controller having a beacon transmission function, and the beacon may be captured to determine whether the user has entered the angle of view. The controller 201 then ends the shooting condition control processing shown in the flowchart of fig. 6.
If the determination in step S606 is no, the controller 201 determines whether the elevation angle θ of the initial velocity V0 at the time of the throw is within a given range, thereby determining whether the flight trajectory is parabolic (step S608).
If the determination in step S608 is yes, the controller 201 sets the shooting timing so that shooting is performed when the desired subject enters the angle of view (step S609). In this case, the controller 201 calculates the trajectory of the parabola by, for example, expressions 8 to 11 above, and determines a main subject within the angle of view in the direction of the calculated trajectory. The main subject is determined by, for example, image recognition processing, within the angle of view, of the image information obtained from the camera system 202 of fig. 2. The controller 201 then ends the shooting condition control processing shown in the flowchart of fig. 6.
If the determination in step S608 is no, the controller 201 determines whether the initial velocity Vini_vert in the vertical direction 304 of fig. 3 calculated in step S602 is greater than 0 (or a threshold slightly greater than 0) (step S610).
If the determination in step S610 is yes, the controller 201 sets the shooting timing at which shooting is performed at the highest point. The controller 201 then ends the photographing condition control process shown in the flowchart of fig. 6.
If the determination in step S610 is "no", the controller 201 ends the imaging condition control process shown in the flowchart of fig. 6.
After the imaging condition control processing shown in the flowchart of fig. 6 is finished, the controller 201 executes the imaging mode control processing shown in the flowchart of fig. 4, and can then shift to the imaging processing in step S412 of fig. 4.
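Putting the fig. 6 branches together, a condensed sketch of steps S603 to S611 might look like this; the ThrowState fields, the elevation-angle range taken as "parabolic", and all constants are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ThrowState:
    vx: float
    vy: float
    vz: float
    w_ini_hor: float
    w_ini_vert: float
    v_ini_vert: float
    theta: float  # elevation angle of the initial velocity [rad]

def imaging_condition_control(s: ThrowState, th, theta_range=(0.3, 1.2)):
    """Condensed sketch of fig. 6 (steps S603-S611); names/constants assumed."""
    shutter_s = 1.0 / max(8.0, 100.0 * (s.vx + s.vy + s.vz))  # S603 (rule assumed)
    interval_s = None
    if s.w_ini_vert > th["w_ini_vert"]:                # S604
        interval_s = 0.2 * s.vx * s.vy                 # S605
    if s.w_ini_hor > th["w_ini_hor"]:                  # S606
        timing = "when_user_enters_view"               # S607
    elif theta_range[0] <= s.theta <= theta_range[1]:  # S608 (parabolic throw)
        timing = "when_subject_enters_view"            # S609
    elif s.v_ini_vert > 0.0:                           # S610
        timing = "at_highest_point"
    else:
        timing = None
    return shutter_s, interval_s, timing
```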
Another embodiment of the shooting condition control processing will be described below. In this other embodiment, the user throws the device in advance at various initial velocities (strengths) so as to draw a parabola, and the relationship between each initial velocity (strength) and the shutter speed chosen by the user, together with the aperture automatically set to give a suitable exposure for that shutter speed, is stored as an initial velocity-shutter speed correspondence table. Thereafter, by throwing at a preferred initial velocity so as to draw a parabola, the user can perform shooting at the preferred shutter speed and the aperture automatically set to match it.
Fig. 7 is a flowchart showing the process of creating the initial velocity-shutter speed correspondence table in the other embodiment of the shooting condition control processing of the flight device 100. As in the case of fig. 6, this processing can be realized as processing in which the CPU built into the controller 201 of fig. 2 executes a control program stored in a built-in memory (not specifically shown).
First, the controller 201 accepts a user operation (step S701), and shifts to a threshold setting mode (step S702).
Next, the controller 201 has the user throw the device so as to draw a parabola (step S703).
The controller 201 obtains the initial velocity V0 at the time of the throw (step S704).
During the flight realized by the current throw, the controller 201 causes the camera system 202 to shoot at every changeable shutter speed while adjusting the aperture so that the EV (exposure value) remains the same, and records the images in the memory in the controller 201 (step S705). Fig. 8 is a diagram showing an example of the relationship table of EV values, shutter speeds, and apertures stored in advance in a ROM (read-only memory) in the controller 201. For example, at an EV value of 13, as the shutter speed changes from 1/8 second to 1/2000 second, the aperture changes from 32 to 2.0. In step S705, for example, after the EV value is automatically set to 13, the controller 201 determines the aperture corresponding to each shutter speed while changing the shutter speed stepwise from 1/8 second to 1/2000 second with reference to the above relationship table stored in the ROM, causes the camera system 202 to shoot at each combination of the determined shutter speed and aperture, and records the obtained image data in a RAM (random access memory) in the controller 201.
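The fig. 8 table follows directly from the exposure-value identity EV = log2(N²/t), where N is the f-number and t the shutter time; the sketch below reproduces the EV 13 row endpoints (1/8 s ↔ f/32, 1/2000 s ↔ f/2.0). The list of shutter speeds is an assumption, and a real camera would round each result to its nearest available stop.

```python
import math

def aperture_for(ev, shutter_s):
    """Invert EV = log2(N^2 / t): the f-number keeping exposure constant."""
    return math.sqrt(shutter_s * 2.0 ** ev)

# Sweep of step S705 at EV 13: one shot per available shutter speed.
for denom in (8, 15, 30, 60, 125, 250, 500, 1000, 2000):
    print(f"1/{denom} s -> f/{aperture_for(13, 1.0 / denom):.1f}")
# 1/8 s -> f/32.0 ... 1/2000 s -> f/2.0, matching fig. 8.
```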
The controller 201 then determines whether or not the predetermined number of throws is ended (step S706).
If the determination in step S706 is no, the user is made to throw again, at an initial velocity (strength) different from the previous throw, so as to draw a parabola (step S707). The controller 201 then executes the processing of steps S704 and S705 again.
When the determination in step S706 becomes yes after the above operations are repeated, the controller 201 shifts to a user selection state (step S708).
The controller 201 transfers all the photos recorded in the RAM in the controller 201 in step S705 to the display of a smartphone or a remote controller (not shown) and displays them (step S709).
For each throw, the controller 201 has the user select the preferred photo (step S710).
The controller 201 stores in the RAM the relationship between the initial velocity V0 of each throw and the shutter speed of the photo selected by the user for that throw (step S711), creates from these relationships an initial velocity-shutter speed correspondence table such as that shown in fig. 9, and stores it in the RAM. The controller 201 then ends the threshold setting processing shown in the flowchart of fig. 7.
After the threshold setting processing is completed, the user can throw the device at a preferred initial velocity so as to draw a parabola, and shoot at the preferred shutter speed and the aperture automatically set to match it, as sketched below.
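At shooting time the stored table is only consulted, for example by nearest match on the measured initial velocity; the matching strategy and the sample values below are assumptions.

```python
def shutter_for_throw(v0, table):
    """Pick the shutter speed whose recorded initial velocity is closest to
    the measured v0. 'table' is a list of (initial velocity [m/s], shutter
    time [s]) pairs built by the fig. 7 process."""
    return min(table, key=lambda row: abs(row[0] - v0))[1]

# Hypothetical table from three setup throws:
table = [(3.0, 1/60), (5.5, 1/250), (8.0, 1/1000)]
print(shutter_for_throw(5.2, table))  # -> 0.004 (1/250 s)
```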
In the above-described embodiment, the shooting mode is determined based on the angular velocity and the velocity, but the shooting mode may be determined based on the acceleration.
In the above-described embodiment, the number of still images captured by the flight device 100 is arbitrary. The flying apparatus 100 may photograph not only a still image but also a moving image. In this case, the shooting time of the moving image is arbitrary.
The flight device 100 may, for example, communicate with a terminal held by the thrower and transmit the captured video, so that shooting can be performed while the video is viewed.
The timing of the shooting performed by the flying apparatus 100, and the like, can also be operated wirelessly from, for example, a terminal held by the thrower.
When a folding mechanism of the motor frames 102 for transport is employed in the flying device 100, processing to unfold the motor frames 102 into the flight-ready state may be executed immediately after the start of the throw.
In the above description of the embodiments, the drive propulsion unit is described as comprising the motors 104 and the rotary blades 103, but the drive propulsion unit may instead be realized by a mechanism propelled by air pressure or by engine output. The device may also simply fall naturally without any drive propulsion unit. Depending on the state, shooting may be suppressed. The device may also merely be released from the hand rather than thrown.

Claims (12)

1. A flying device provided with an imaging unit, the flying device comprising:
an acquisition unit that acquires a state at a time point of departure from a holder;
a determination unit configured to determine a photographing mode of the image pickup unit after the time point of the departure based on the state acquired by the acquisition unit;
an imaging control unit that controls the imaging unit in the imaging mode determined by the determination unit; and
a sensor unit that detects an angular velocity, acceleration, or velocity of the flying device at the time point of departure from the holder,
wherein the determination unit compares each of the output values of the sensor unit based on the time point of departure from the holder acquired by the acquisition unit with a predetermined threshold value, and determines the imaging mode of the imaging unit based on the comparison result.
2. The flying device of claim 1,
the determination unit determines the imaging mode of the imaging unit after the time point at which the holder performs a throw, based on the state at the time point at which the holder performs the throw.
3. The flying device of claim 1,
the imaging mode includes an imaging mode or an imaging condition.
4. A flying device according to claim 3,
the photography mode is any one or more of circling photography, spin photography, self-timer photography, auto-tracking photography, normal photography, and photography prohibition.
5. A flying device according to claim 3,
the shooting condition is any 1 or more of shutter speed, aperture, shooting interval, and shooting timing of a still image or a moving image.
6. The flying device of claim 1,
the determination unit determines the self-timer photography mode when the state at the time point of departure from the holder is such that only acceleration in the direction of gravity is detected from the output of the sensor unit.
7. The flying device of claim 1,
the sensor unit calculates an angular velocity, acceleration, or velocity in each coordinate-axis direction of a predetermined absolute coordinate system at the time point of departure from the holder, and
the determination unit calculates the angular velocity, acceleration, or velocity in the coordinate-axis direction horizontal or perpendicular to the ground, from the respective output values of the sensor unit at the time point of departure from the holder or by arithmetic processing based on those output values, compares the calculated values with predetermined threshold values, and determines the imaging mode of the imaging unit based on the comparison result.
8. A flying device according to any one of claims 2 to 7, characterised in that,
the holder is caused to perform a throw in advance for each of the photographing modes, and values obtained by changing the respective output values of the sensor unit based on the time point of the throw by predetermined amounts are automatically set as the predetermined threshold values.
9. A flying device according to any one of claims 1 to 7, characterised in that,
the flying device is provided with:
a drive propulsion section for flying in the air after a point in time of departure from the holder.
10. The flying device of claim 9,
when the state acquired by the acquisition unit is a state of remaining in place, the drive propulsion unit hovers in place.
11. An imaging method for a flying device provided with an imaging unit, the imaging method comprising:
a step of acquiring a state at a time point of departure from a holder;
determining a photographing mode of the image pickup unit after the time point of departure based on the acquired state;
controlling the image pickup unit in the determined image pickup mode; and
a step of detecting an angular velocity, acceleration, or velocity of the flying device at the time point of departure from the holder,
wherein the imaging mode of the imaging unit is determined by comparing each value detected based on the acquired state at the time point of departure from the holder with a predetermined threshold value, and using the comparison result.
12. A storage medium storing a program for causing a computer that controls a flying device provided with an imaging unit to execute:
a step of acquiring a state at a time point of departure from a holder;
determining a photographing mode of the image pickup unit after the time point of departure based on the acquired state;
controlling the image pickup unit in the determined image pickup mode; and
a step of detecting an angular velocity, acceleration, or velocity of the flying device at the time point of departure from the holder,
wherein the imaging mode of the imaging unit is determined by comparing each value detected based on the acquired state at the time point of departure from the holder with a predetermined threshold value, and using the comparison result.
CN201710342364.1A 2016-06-23 2017-05-15 Flying device, method, and storage medium storing program Active CN107547793B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016124929 2016-06-23
JP2016-124929 2016-06-23
JP2017032100A JP6347299B2 (en) 2016-06-23 2017-02-23 Flight apparatus, method, and program
JP2017-032100 2017-02-23

Publications (2)

Publication Number Publication Date
CN107547793A CN107547793A (en) 2018-01-05
CN107547793B true CN107547793B (en) 2020-07-03

Family

ID=60947479

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710342364.1A Active CN107547793B (en) 2016-06-23 2017-05-15 Flying device, method, and storage medium storing program

Country Status (2)

Country Link
JP (1) JP6347299B2 (en)
CN (1) CN107547793B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109218664B (en) * 2018-08-03 2021-09-21 北京润科通用技术有限公司 Video shooting method and system


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015200209A1 (en) * 2014-06-23 2015-12-30 Nixie Labs, Inc. Wearable unmanned aerial vehicles, launch- controlled unmanned aerial vehicles, and associated systems and methods

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105283816A (en) * 2013-07-31 2016-01-27 深圳市大疆创新科技有限公司 Remote control method and terminal
CN104685436A (en) * 2013-12-13 2015-06-03 深圳市大疆创新科技有限公司 Methods for launching and landing an unmanned aerial vehicle
CN105388901A (en) * 2014-08-26 2016-03-09 鹦鹉股份有限公司 Method of dynamic control of a rotary- wing drone in throw start
CN105517666A (en) * 2014-09-05 2016-04-20 深圳市大疆创新科技有限公司 Context-based flight mode selection
CN204287973U (en) * 2014-12-30 2015-04-22 览意科技(上海)有限公司 flight camera
CN105527972A (en) * 2016-01-13 2016-04-27 深圳一电航空技术有限公司 Unmanned aerial vehicle (UAV) flight control method and device
CN105589466A (en) * 2016-02-24 2016-05-18 谭圆圆 Flight control device of unmanned aircraft and flight control method thereof

Also Published As

Publication number Publication date
JP2018002131A (en) 2018-01-11
CN107547793A (en) 2018-01-05
JP6347299B2 (en) 2018-06-27

Similar Documents

Publication Publication Date Title
US11188101B2 (en) Method for controlling aircraft, device, and aircraft
US11649052B2 (en) System and method for providing autonomous photography and videography
CN110692027B (en) System and method for providing easy-to-use release and automatic positioning of drone applications
CN109074168B (en) Unmanned aerial vehicle control method and device and unmanned aerial vehicle
JP2017065467A (en) Drone and control method thereof
CN111356954B (en) Control device, mobile body, control method, and program
CN111567032B (en) Specifying device, moving body, specifying method, and computer-readable recording medium
CN109814588A (en) Aircraft and object tracing system and method applied to aircraft
US20210014427A1 (en) Control device, imaging device, mobile object, control method and program
JP6642502B2 (en) Flight device, method, and program
CN109076101B (en) Holder control method, device and computer readable storage medium
US10308359B2 (en) Moving device, method of controlling moving device and storage medium
JP6910785B2 (en) Mobile imager and its control method, as well as imager and its control method, unmanned aerial vehicle, program, storage medium
CN107547793B (en) Flying device, method, and storage medium storing program
CN111630838B (en) Specifying device, imaging system, moving object, specifying method, and program
WO2021014752A1 (en) Information processing device, information processing method, and information processing program
CN111357271B (en) Control device, mobile body, and control method
CN111602385B (en) Specifying device, moving body, specifying method, and computer-readable recording medium
JP6471272B1 (en) Long image generation system, method and program
CN111226170A (en) Control device, mobile body, control method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant