WO2018133388A1 - Photographing method for intelligent flying device, and intelligent flying device - Google Patents

Photographing method for intelligent flying device, and intelligent flying device (Download PDF)

Info

Publication number
WO2018133388A1
WO2018133388A1 (PCT/CN2017/096530, CN2017096530W)
Authority
WO
WIPO (PCT)
Prior art keywords
angle
projection
light source
orientation
determining
Prior art date
Application number
PCT/CN2017/096530
Other languages
English (en)
French (fr)
Inventor
陈涛
吴珂
韩晋
刘华一君
Original Assignee
北京小米移动软件有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京小米移动软件有限公司
Priority to RU2017134749A (patent RU2668609C1)
Priority to JP2017552164A (patent JP6532958B2)
Publication of WO2018133388A1

Links

Images

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C1/00 - Measuring angles
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/64 - Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 - Special procedures for taking photographs; Apparatus therefor
    • G03B15/006 - Apparatus mounted on flying objects
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00 - Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08 - Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08 - Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808 - Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/66 - Remote control of cameras or camera parts, e.g. by remote control devices
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/66 - Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661 - Transmitting camera control signals through networks, e.g. control via the Internet
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/695 - Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 - Circuitry for compensating brightness variation in the scene
    • H04N23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 - Camera processing pipelines; Components thereof
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 - UAVs specially adapted for particular uses or applications
    • B64U2101/30 - UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 - UAVs characterised by their flight controls
    • B64U2201/10 - UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]

Definitions

  • the present disclosure relates to the field of electronic device technologies, and in particular, to a method for photographing an intelligent flying device and an intelligent flying device.
  • With the rapid development of electronic device technology, various intelligent flying devices have emerged, such as unmanned cameras.
  • Under the control of equipment such as a remote controller, an unmanned camera can fly high into the air to photograph scenes on the ground.
  • However, when shooting in an environment with a light source, the light emitted by the light source readily produces a projection after passing an intelligent flying device of the above type.
  • For example, when shooting with an unmanned camera in clear weather, the camera may cast a projection on the ground under sunlight, and in that case the projection is easily captured into the photo or video.
  • the present disclosure provides a shooting method of an intelligent flying device and an intelligent flying device.
  • a method for photographing an intelligent flying device comprising:
  • determining a light source angle, the light source angle being an angle between a current orientation of a target light source and a vertical direction;
  • the target light source is a light source capable of casting a projection onto the smart flying device, and the vertical direction is a direction perpendicular to the horizontal plane;
  • determining, according to the light source angle, an orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the smart flying device; and
  • the shooting is performed based on the current shooting angle and the orientation of the projection.
  • determining the light source angle comprises any one of the following implementations:
  • determining, according to the light source angle, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the smart flying device includes:
  • the flying height is the current height of the smart flying device above the horizontal plane.
  • the photographing is performed based on the current shooting angle and the projected orientation, including:
  • the method further includes:
  • the photographing is performed based on the current shooting angle and the projected orientation, including:
  • the photographing is performed based on the target rotation direction and a preset rotation angle corresponding to the preset projection range at which the projection distance is located.
  • an intelligent flight device comprising:
  • a first determining module configured to determine a light source angle, where the light source angle is an angle between a current orientation of a target light source and a vertical direction, the target light source is a light source capable of casting a projection onto the smart flying device, and the vertical direction is a direction perpendicular to the horizontal plane;
  • a second determining module configured to determine, according to the angle of the light source determined by the first determining module, an orientation of a projection generated by the target light source on a horizontal plane after passing through the smart flight device;
  • a photographing module configured to perform photographing based on a current photographing angle and an orientation of the projection determined by the second determining module.
  • the first determining module includes:
  • a first determining submodule configured to determine a plurality of light intensities through the configured light sensor based on a plurality of first preset angles, and to determine a first preset angle corresponding to the maximum light intensity as the light source angle, the plurality of first preset angles being in one-to-one correspondence with the plurality of light intensities;
  • a second determining submodule configured to determine a plurality of exposures based on a plurality of second preset angles, and to determine a second preset angle corresponding to the maximum exposure as the light source angle, the plurality of second preset angles being in one-to-one correspondence with the plurality of exposures.
  • the second determining module includes:
  • a third determining submodule configured to determine, when the light source angle is zero, that the projection produced on the horizontal plane by the light emitted by the target light source after passing the smart flying device lies directly below the current position of the smart flying device;
  • a fourth determining submodule configured to determine a flying height when the light source angle is not zero, and to determine, according to the light source angle and the flying height, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the smart flying device, the flying height being the current height of the smart flying device above the horizontal plane.
  • the shooting module includes:
  • a determining submodule configured to determine, according to a current shooting angle and an orientation of the projection, whether the projected orientation is within a shooting range
  • a first shooting sub-module configured to shoot vertically downward to obtain a captured picture when the orientation of the projection is within the shooting range and directly below the current location of the smart flying device;
  • a fifth determining submodule configured to determine a size of the smart flying device in the captured picture by performing preset image processing on the captured picture
  • a sixth determining submodule configured to determine a rotation angle based on the size of the smart flying device in the captured picture and the flying height, the rotation angle being the angle by which rotation is needed to avoid the projection;
  • a second shooting sub-module for taking a picture based on the rotation angle.
  • the shooting module further includes:
  • a seventh determining submodule configured to determine a target direction according to the projected orientation when the orientation of the projection is within the shooting range, and the angle of the light source is not zero, the target direction being Any direction other than the direction in which the projection is located;
  • An eighth determining submodule configured to determine, from a plurality of preset projection ranges, a preset projection range in which the projection distance is located, the projection distance being a horizontal distance between the projected orientation and the smart flight device;
  • a ninth determining sub-module configured to determine, from a plurality of preset rotation angles, a preset rotation angle corresponding to the preset projection range in which the projection distance falls, the plurality of preset rotation angles being in one-to-one correspondence with the plurality of preset projection ranges;
  • the second shooting submodule is used for:
  • the photographing is performed based on the target rotation direction and a preset rotation angle corresponding to the preset projection range at which the projection distance is located.
  • an intelligent flying device comprising:
  • a processor, and a memory for storing instructions executable by the processor;
  • wherein the processor is configured to:
  • determine a light source angle, the light source angle being an angle between a current orientation of a target light source and a vertical direction;
  • the target light source is a light source capable of casting a projection onto the smart flying device, and the vertical direction is a direction perpendicular to the horizontal plane;
  • determine, according to the light source angle, an orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the smart flying device; and perform the shooting based on the current shooting angle and the orientation of the projection.
  • the shooting according to the current shooting angle of the intelligent flying device and the orientation of the projection can avoid capturing the projection together into a photo or video, thereby improving the shooting quality.
  • FIG. 1 is a flow chart showing a method of photographing an intelligent flying device according to an exemplary embodiment.
  • FIG. 2A is a flowchart of a method of photographing an intelligent flying device, according to another exemplary embodiment.
  • FIG. 2B is a schematic diagram of an implementation environment of a method for photographing an intelligent flying device according to the embodiment of FIG. 2A.
  • FIG. 2C is a schematic diagram of an implementation environment of a shooting method of another intelligent flying device according to the embodiment of FIG. 2A.
  • FIG. 3 is a block diagram of an intelligent flight device, according to an exemplary embodiment.
  • FIG. 4 is a block diagram of a smart flight device 400, according to an exemplary embodiment.
  • the application scenarios of the embodiments of the present disclosure will be described.
  • the light emitted by the light source passes through the intelligent flying device, and a projection is generated.
  • a shooting method for an intelligent flying device is provided, which can prevent the projection from being captured into a picture or video, thereby achieving the effect of improving shooting quality.
  • the shooting method for an intelligent flying device provided by the embodiments of the present disclosure is performed by an intelligent flying device, which may be a device such as an unmanned camera.
  • FIG. 1 is a flowchart of a method for photographing an intelligent flying device according to an exemplary embodiment. As shown in FIG. 1 , a shooting method of the smart flying device may include the following steps.
  • in step 101, a light source angle is determined, the light source angle being an angle between a current orientation of a target light source and a vertical direction, the target light source being a light source capable of casting a projection onto the smart flying device, and the vertical direction being the direction perpendicular to the horizontal plane.
  • in step 102, based on the light source angle, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the smart flying device is determined.
  • in step 103, shooting is performed based on the current shooting angle and the orientation of the projection.
  • in the embodiments of the present disclosure, the angle between the current orientation of a target light source capable of casting a projection onto the smart flying device and the vertical direction is determined; according to the determined angle, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the smart flying device can be determined. Since the orientation of the projection has been determined, shooting according to the current shooting angle of the smart flying device and the orientation of the projection avoids capturing the projection into the photo or video, improving shooting quality.
  • determining the angle of the light source comprises any of the following implementations:
  • determining, according to the angle of the light source, an orientation of a projection generated by the target light source on a horizontal plane after passing through the smart flight device including:
  • Determining a flying height when the light source angle is not zero, and determining, according to the light source angle and the flying height, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the smart flying device, the flying height being the current height of the smart flying device above the horizontal plane.
  • shooting based on the current shooting angle and the orientation of the projection including:
  • the camera shoots vertically downward to obtain a captured picture
  • the rotation angle is the angle by which the camera needs to be rotated to avoid the projection
  • the method further includes:
  • shooting based on the current shooting angle and the orientation of the projection includes:
  • the shooting is performed based on the target direction and a preset rotation angle corresponding to the preset projection range at which the projection distance is located.
  • FIG. 2A is a flowchart of a method for photographing an intelligent flying device according to another exemplary embodiment. As shown in FIG. 2A, the method is applied in an intelligent control device and includes the following steps:
  • in step 201, a light source angle is determined, the light source angle being an angle between a current orientation of a target light source and a vertical direction, the target light source being a light source capable of casting a projection onto the intelligent flying device, and the vertical direction being the direction perpendicular to the horizontal plane.
  • the above-mentioned target light source may include the sun, a lamp, and the like, which is not limited in the embodiments of the present disclosure.
  • for example, the target light source in FIG. 2B is the sun 21, and the angle between the current orientation of the target light source 21 and the vertical direction is α; that is, the light source angle is α.
  • the implementation process of determining the angle of the light source may include any one of the following implementation manners:
  • the first method is: determining a plurality of light intensities through the configured light sensor based on a plurality of first preset angles, and determining the first preset angle corresponding to the maximum light intensity as the light source angle, the plurality of first preset angles being in one-to-one correspondence with the plurality of light intensities.
  • that is, the smart flying device may be configured with a light sensor. It is not difficult to understand that the target light source can cast a projection of the smart flying device only when the target light source is higher above the horizontal plane than the smart flying device; therefore, in a possible implementation, the light sensor may be disposed at the top of the smart flying device, and the light sensor may be rotated to collect the light emitted by the light source at the plurality of first preset angles.
  • the first preset angle of the plurality of first preset angles may be set by the user according to actual requirements, or may be set by default by the smart flight device, which is not limited by the embodiment of the disclosure.
  • for example, the plurality of first preset angles may be preset angles of 30 degrees, 60 degrees, 90 degrees, -30 degrees and -60 degrees from the vertical direction.
  • that is, the smart flying device performs light collection every 30 degrees through the light sensor to obtain the plurality of light intensities corresponding to the plurality of first preset angles.
  • the stronger the light intensity, the more directly the corresponding first preset angle faces the target light source; that is, the closer that first preset angle is to the angle between the orientation of the target light source and the vertical direction.
  • therefore, after the smart flying device obtains the plurality of light intensities, it determines the maximum light intensity among them and determines the first preset angle corresponding to the maximum light intensity as the light source angle.
  • the second manner is: determining a plurality of exposures based on a plurality of second preset angles, and determining the second preset angle corresponding to the maximum exposure as the light source angle, the plurality of second preset angles being in one-to-one correspondence with the plurality of exposures.
  • the smart flight device may collect light based on the plurality of second preset angles through its own camera to determine a plurality of exposures.
  • the greater the exposure degree the closer the corresponding second preset angle is to the angle between the orientation of the target light source and the vertical direction. Therefore, the second preset angle corresponding to the maximum exposure degree is determined as the light source angle.
  • the second preset angle of each of the plurality of second preset angles may be set by the user according to actual requirements, or may be set by default by the smart flight device, which is not limited by the embodiment of the disclosure.
  • the angle of the light source may also be determined by other methods.
  • the method may also include any of the following implementation manners.
  • in a third manner, when the target light source is the sun, the smart flying device determines the location information of its current position through its positioning function, acquires the system time point, obtains from a designated server the sun elevation angle corresponding to the location information and the system time point, the sun elevation angle being the angle between the sun and the horizontal plane, and determines the difference between 90 degrees and the sun elevation angle as the light source angle,
  • wherein the designated server is configured to store the correspondence between system time points, location information and sun elevation angles.
  • the smart flight device can determine the location information of the current location and the system time point by using a positioning technology such as a GPS (Global Positioning System) or the like.
  • then, the smart flying device may send a light source angle acquisition request carrying the location information and the system time point to the designated server; on receiving the request, the designated server extracts the location information and the system time point from it, obtains the corresponding sun elevation angle from the pre-stored correspondence, and sends the sun elevation angle to the smart flying device.
  • since the sun elevation angle can be approximately regarded as the angle, at the smart flying device, between the sunlight and the horizontal ground, the smart flying device determines the difference between 90 degrees and the sun elevation angle as the light source angle.
  • in a fourth way, the light source angle is determined by capturing the projection of an object at the same height.
  • in this implementation, the smart flying device may photograph an object that is in the same space and at the same height as itself, together with the projection of that object, and determine the light source angle from the projection of that object and the current flying height of the device.
  • the angle can be determined from the right-triangle (Pythagorean) relation and is not described in detail here.
  • the angle of the light source may be obtained by the smart terminal associated with the smart flight device, and then the light source angle is sent to the smart flight device, which is not limited in the embodiment of the disclosure.
  • step 202 based on the angle of the light source, the orientation of the projection generated on the horizontal plane after the light emitted by the target light source passes through the smart flight device is determined.
  • determining the orientation of the projection produced by the target light source on the horizontal surface after passing through the smart flight device may include the following implementations:
  • the first way when the angle of the light source is zero, it is determined that the light generated by the target light source passes through the intelligent flying device and the projection generated on the horizontal surface is directly below the current location of the intelligent flying device.
  • the light generated by the target light source 21 passes through the smart flight device 23 and the projection 22 generated on the horizontal surface is directly below the current position of the smart flight device 23.
  • the second way when the angle of the light source is not zero, determining the flying height, and determining the orientation of the projection generated by the target light source on the horizontal plane after passing through the intelligent flying device according to the light source angle and the flying height.
  • the flying height is the height of the current flying device from the horizontal plane.
  • referring to FIG. 2B, if the flying height is H, then, since the light source angle α is known, the orientation d of the projection produced on the horizontal plane by the light emitted by the target light source after passing the smart flying device can be determined from the formula tanα = d / H.
  • the smart flying device may determine the flying height by using an infrared ray sensor or the like that can be used for measuring the distance, which is not limited by the embodiment of the present disclosure.
  • step 203 a photograph is taken based on the current photographing angle and the orientation of the projection.
  • in one possible implementation, the smart flying device can determine the current shooting angle through an angle sensor configured on itself; that is, when its camera device rotates, the angle through which the camera device has rotated can be acquired through the angle sensor, thereby giving the current shooting angle.
  • the smart flight device determines whether the projected orientation is within the shooting range based on the current shooting angle and the orientation of the projection.
  • determining whether the orientation of the projection is within the shooting range may include: determining whether the shooting direction corresponding to the current shooting angle is the same as the direction in which the projection lies; if the shooting direction corresponding to the current shooting angle differs from the direction in which the projection lies, determining that the orientation of the projection is not within the shooting range; and if the shooting direction corresponding to the current shooting angle is the same as the direction in which the projection lies, determining whether the angle between the shooting angle and the vertical direction is smaller than the light source angle.
  • if the angle between the shooting angle and the vertical direction is smaller than the light source angle, it is determined that the orientation of the projection is within the shooting range; if the angle between the shooting angle and the vertical direction is greater than or equal to the light source angle, it is determined that the orientation of the projection is not within the shooting range.
  • for example, referring to FIG. 2B, if the current shooting angle points to the right, then, since the orientation of the projection is opposite to the shooting direction corresponding to that shooting angle (the projection lies in the horizontally leftward direction), it is not difficult to see that the orientation of the projection is not within the shooting range.
  • conversely, if the shooting direction corresponding to the current shooting angle is the same as the direction of the projection and the angle between the current shooting angle and the vertical direction is smaller than the light source angle, for example if the current shooting angle points to the left at 20 degrees from the vertical and the light source angle is 60 degrees, then the smart flying device determines that the orientation of the projection is within the shooting range.
  • it should be noted that the above description takes, as an example, the smart flying device itself determining, based on the current shooting angle and the orientation of the projection, whether the orientation of the projection is within the shooting range.
  • in another embodiment, the smart flying device may instead transmit the current shooting angle and the orientation of the projection to a smart device associated with it, such as a mobile phone or a smart remote controller, and that smart device determines, based on the current shooting angle and the orientation of the projection, whether the orientation of the projection is within the shooting range; this is not limited in the embodiments of the present disclosure.
  • the specific implementation may include any of the following implementation manners:
  • in the first way, when the orientation of the projection is within the shooting range and directly below the current position of the smart flying device, the device shoots vertically downward to obtain a captured picture, determines the size of the smart flying device in the captured picture by performing preset image processing on it, determines a rotation angle based on that size and the flying height, the rotation angle being the angle by which rotation is needed to avoid the projection, and shoots based on the rotation angle.
  • if the projection lies directly below the current position of the smart flying device, the camera device needs to be rotated by a certain angle to avoid the projection while shooting.
  • referring to FIG. 2C, if the camera device is rotated by an angle β in any direction, capturing the projection into the picture or video can be avoided; therefore, the magnitude of the angle β needs to be determined.
  • the embodiment of the present disclosure captures the projection to obtain a captured picture.
  • since the smart flying device generally includes a plurality of arms that assist flight, the circular region in which the projection lies can in fact be obtained from the projection, as shown in FIG. 2C.
  • the smart flying device can determine its size in the captured picture by performing preset image processing on the captured picture, thereby obtaining the radius r of the circular region that the smart flying device occupies in the captured picture.
  • the smart flying device then determines the required rotation angle β from tanβ = r / H, based on the radius r and the above-mentioned flying height H.
  • the process of the preset image processing may include an image normalization process, a pixel point scan process, and the like, which are not limited in the embodiment of the present disclosure.
  • in the second mode, when the orientation of the projection is within the shooting range and the light source angle is not zero, a target direction is determined according to the orientation of the projection, the target direction being any direction other than the direction in which the projection lies; the preset projection range in which the projection distance falls is determined from a plurality of preset projection ranges, the projection distance being the horizontal distance between the orientation of the projection and the smart flying device; the preset rotation angle corresponding to that preset projection range is determined from a plurality of preset rotation angles, the plurality of preset rotation angles being in one-to-one correspondence with the plurality of preset projection ranges; and shooting is performed based on the target direction and the preset rotation angle corresponding to the preset projection range in which the projection distance falls.
  • the preset projection range of the plurality of preset projection ranges may be customized by the user according to actual needs, or may be set by default by the smart flight device, which is not limited by the embodiment of the disclosure.
  • similarly, each of the plurality of preset rotation angles may be customized by the user according to actual needs or set by default by the smart flying device, which is not limited in the embodiments of the present disclosure.
  • when the orientation of the projection is within the shooting range and the light source angle is not zero, shooting directly would capture the projection into the picture or video; therefore, the shooting angle needs to be adjusted.
  • referring to FIG. 2B, if the orientation of the projection is as shown by projection 22, the camera device needs to be rotated by a preset rotation angle in any direction other than the direction in which the projection lies.
  • in this case, the preset rotation angle is related to the horizontal distance between the orientation of the projection and the smart flying device.
  • the larger that horizontal distance, the smaller the projection that the target light source forms on the horizontal ground after passing the smart flying device, so the preset rotation angle can be set smaller; only a small rotation is needed to avoid capturing the projection into the picture or video.
  • conversely, the smaller the horizontal distance between the orientation of the projection and the smart flying device, the larger the projection formed on the horizontal ground after the target light source passes the smart flying device, so a larger preset rotation angle is needed to avoid the projection while shooting.
  • in one possible implementation, the user may determine the one-to-one correspondence between the plurality of preset projection ranges and the plurality of preset rotation angles through extensive data testing, and then store that correspondence in the smart flying device.
  • because the target direction is any direction other than the direction in which the projection lies, the user can set the target direction according to personal preference and store it in correspondence with the user account.
  • then, based on the correspondence between the target direction and the user account, the smart flying device can select the target direction corresponding to that user account, thereby improving the user experience.
  • the shooting path can be planned according to the orientation of the projection and the target direction. For example, referring to FIG. 2B, if the smart flight device rotates the preset rotation angle in a horizontal right direction, the smart flight device can fly horizontally to the left direction. At this time, since the projection of the smart flight device will follow The intelligent flight device moves together, that is, the orientation of the projection also moves horizontally to the left, so that the smart flight device can shoot the scene in the projected orientation.
  • the shooting path may also be sent to a smart device such as a mobile phone, a remote control device, or the like, and the smart device performs a path demonstration after receiving the shooting path. In this way, the user can be informed of the next shooting path of the smart flying device, thereby improving the user experience.
  • in the embodiments of the present disclosure, the angle between the current orientation of a target light source capable of casting a projection onto the smart flying device and the vertical direction is determined; according to the determined angle, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the smart flying device can be determined. Since the orientation of the projection has been determined, shooting according to the current shooting angle of the smart flying device and the orientation of the projection avoids capturing the projection into the photo or video, improving shooting quality.
  • FIG. 3 is a block diagram of an intelligent flight device, according to an exemplary embodiment.
  • the smart flight device includes a first determining module 310, a second determining module 320, and a shooting module 330.
  • the first determining module 310 is configured to determine a light source angle, where the light source angle is an angle between a current orientation of the target light source and a vertical direction, and the target light source is a light source capable of generating a projection on the smart flight device, where the vertical direction is a direction perpendicular to the horizontal plane;
  • a second determining module 320 configured to determine, according to the light source angle determined by the first determining module 310, an orientation of a projection generated by the target light source on a horizontal plane after passing through the smart flying device;
  • the shooting module 330 is configured to perform shooting based on the current shooting angle and the orientation of the projection determined by the second determining module 320.
  • the first determining module 310 includes:
  • a first determining submodule configured to determine a plurality of light intensities through the configured light sensor based on a plurality of first preset angles, and to determine the first preset angle corresponding to the maximum light intensity as the light source angle, the plurality of first preset angles being in one-to-one correspondence with the plurality of light intensities;
  • a second determining submodule configured to determine a plurality of exposures based on a plurality of second preset angles, and to determine the second preset angle corresponding to the maximum exposure as the light source angle, the plurality of second preset angles being in one-to-one correspondence with the plurality of exposures.
  • the second determining module 320 includes:
  • a third determining sub-module configured to determine, when the light source angle is zero, that the projection produced on the horizontal plane by the light emitted by the target light source after passing the smart flying device lies directly below the current position of the smart flying device;
  • a fourth determining submodule configured to determine a flying height when the light source angle is not zero, and to determine, according to the light source angle and the flying height, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the smart flying device, the flying height being the current height of the smart flying device above the horizontal plane.
  • the shooting module 330 includes:
  • a determining sub-module configured to determine, according to the current shooting angle and the orientation of the projection, whether the orientation of the projection is within a shooting range
  • a first shooting sub-module configured to shoot vertically downward to obtain a captured picture when the orientation of the projection is within the shooting range and directly below the current location of the smart flying device;
  • a fifth determining submodule configured to determine a size of the smart flying device in the captured picture by performing preset image processing on the captured picture
  • a sixth determining submodule configured to determine a rotation angle based on the size of the smart flying device in the captured picture and the flying height, the rotation angle being the angle by which rotation is needed to avoid the projection;
  • the second shooting sub-module is configured to perform shooting based on the rotation angle.
  • the shooting module 330 further includes:
  • a seventh determining submodule configured to determine a target direction according to the orientation of the projection when the orientation of the projection is within the shooting range and the light source angle is not zero, the target direction being any direction other than the direction in which the projection lies;
  • An eighth determining submodule configured to determine, from a plurality of preset projection ranges, a preset projection range in which the projection distance is located, the projection distance being a horizontal distance between the projected orientation and the smart flight device;
  • a ninth determining sub-module configured to determine, from a plurality of preset rotation angles, the preset rotation angle corresponding to the preset projection range in which the projection distance falls, the plurality of preset rotation angles being in one-to-one correspondence with the plurality of preset projection ranges;
  • the second shooting sub-module is used for:
  • the shooting is performed based on the target direction and a preset rotation angle corresponding to the preset projection range at which the projection distance is located.
  • in the embodiments of the present disclosure, the angle between the current orientation of a target light source capable of casting a projection onto the smart flying device and the vertical direction is determined; according to the determined angle, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the smart flying device can be determined. Since the orientation of the projection has been determined, shooting according to the current shooting angle of the smart flying device and the orientation of the projection avoids capturing the projection into the photo or video, improving shooting quality.
  • FIG. 4 is a block diagram of a smart flight device 400, according to an exemplary embodiment.
  • the smart flight device 400 can be an unmanned camera or the like.
  • the smart flight device 400 can include one or more of the following components: a processing component 402, a memory 404, a power component 406, a multimedia component 408, an audio component 410, an input/output (I/O) interface 412, and a sensor component. 414, and a communication component 416.
  • Processing component 402 typically controls the overall operation of smart flying device 400, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
  • Processing component 402 can include one or more processors 420 to execute instructions to perform all or part of the steps of the methods described above.
  • processing component 402 can include one or more modules to facilitate interaction between component 402 and other components.
  • processing component 402 can include a multimedia module to facilitate interaction between multimedia component 408 and processing component 402.
  • the memory 404 is configured to store various types of data to support operation at the smart flight device 400. Examples of such data include instructions for any application or method operating on smart flight device 400, contact data, phone book data, messages, pictures, videos, and the like.
  • Memory 404 can be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
  • Power component 406 provides power to various components of smart flight device 400.
  • Power component 406 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for smart flight device 400.
  • the multimedia component 408 includes a screen between the smart flight device 400 and the user that provides an output interface.
  • the screen can include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen can be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may sense not only the boundary of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
  • the multimedia component 408 includes a front camera and/or a rear camera. When the smart flight device 400 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front and rear camera can be a fixed optical lens system or have focal length and optical zoom capabilities.
  • the audio component 410 is configured to output and/or input an audio signal.
  • the audio component 410 includes a microphone (MIC) that is configured to receive an external audio signal when the smart flight device 400 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode.
  • the received audio signal may be further stored in memory 404 or transmitted via communication component 416.
  • audio component 410 also includes a speaker for outputting an audio signal.
  • the I/O interface 412 provides an interface between the processing component 402 and the peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
  • Sensor assembly 414 includes one or more sensors for providing status assessment of various aspects to smart flight device 400.
  • for example, sensor component 414 can detect an open/closed state of the smart flying device 400 and the relative positioning of components such as the display and keypad of the smart flying device 400; sensor component 414 can also detect a change in position of the smart flying device 400 or of one of its components, the presence or absence of user contact with the smart flying device 400, the orientation or acceleration/deceleration of the smart flying device 400, and temperature changes of the smart flying device 400.
  • Sensor assembly 414 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
  • Sensor assembly 414 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor assembly 414 can also include an acceleration sensor, a gyro sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 416 is configured to facilitate wired or wireless communication between smart flight device 400 and other devices.
  • the intelligent flight device 400 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof.
  • communication component 416 receives broadcast signals or broadcast associated information from an external broadcast management system via a broadcast channel.
  • the communication component 416 also includes a near field communication (NFC) module to facilitate short range communication.
  • the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
  • in an exemplary embodiment, smart flight device 400 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above methods.
  • in an exemplary embodiment, there is also provided a non-transitory computer readable storage medium comprising instructions, such as the memory 404 comprising instructions, which are executable by the processor 420 of the smart flight device 400 to perform the above method.
  • the non-transitory computer readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device.
  • a non-transitory computer readable storage medium when instructions in the storage medium are executed by a processor of the smart flight device 400, enabling the smart flight device 400 to perform the smart flight device of the embodiment of FIG. 1 or FIG. 2A The method of shooting.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Studio Devices (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present disclosure relates to a photographing method for an intelligent flying device and to an intelligent flying device, and belongs to the technical field of electronic devices. The method includes: determining a light source angle, the light source angle being the angle between the current orientation of a target light source and a vertical direction, the target light source being a light source capable of casting a projection onto the intelligent flying device, and the vertical direction being the direction perpendicular to the horizontal plane; determining, according to the light source angle, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device; and shooting based on the current shooting angle of the intelligent flying device and the orientation of the projection, thereby avoiding capturing the projection into the photo or video and improving shooting quality.

Description

Photographing method for intelligent flying device, and intelligent flying device
This application is filed on the basis of, and claims priority to, Chinese patent application No. 201710049939.0 filed on January 23, 2017, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the technical field of electronic devices, and in particular to a photographing method for an intelligent flying device and to an intelligent flying device.
Background
With the rapid development of electronic device technology, a wide variety of intelligent flying devices have appeared, for example unmanned cameras. Under the control of equipment such as a remote controller, an unmanned camera can fly high into the air to photograph scenes on the ground.
However, in practical use, when shooting in an environment with a light source, the light emitted by the light source readily produces a projection after passing an intelligent flying device of the above type. For example, when shooting with an unmanned camera in clear weather, the camera may cast a projection on the ground under sunlight; in that case, the projection is easily captured into the photo or video.
Summary
To overcome the problems in the related art, the present disclosure provides a photographing method for an intelligent flying device and an intelligent flying device.
In a first aspect, a photographing method for an intelligent flying device is provided, the method comprising:
determining a light source angle, the light source angle being the angle between the current orientation of a target light source and a vertical direction, the target light source being a light source capable of casting a projection onto the intelligent flying device, and the vertical direction being the direction perpendicular to the horizontal plane;
determining, according to the light source angle, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device; and
shooting based on the current shooting angle and the orientation of the projection.
Optionally, determining the light source angle includes either of the following implementations:
determining a plurality of light intensities through a configured light sensor based on a plurality of first preset angles, and determining the first preset angle corresponding to the maximum light intensity as the light source angle, the plurality of first preset angles being in one-to-one correspondence with the plurality of light intensities; or
determining a plurality of exposures based on a plurality of second preset angles, and determining the second preset angle corresponding to the maximum exposure as the light source angle, the plurality of second preset angles being in one-to-one correspondence with the plurality of exposures.
Optionally, determining, according to the light source angle, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device includes:
when the light source angle is zero, determining that the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device lies directly below the current position of the intelligent flying device; and
when the light source angle is not zero, determining a flying height, and determining, according to the light source angle and the flying height, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device, the flying height being the current height of the intelligent flying device above the horizontal plane.
Optionally, shooting based on the current shooting angle and the orientation of the projection includes:
judging, based on the current shooting angle and the orientation of the projection, whether the orientation of the projection is within the shooting range;
when the orientation of the projection is within the shooting range and lies directly below the current position of the intelligent flying device, shooting vertically downward to obtain a captured picture;
determining the size of the intelligent flying device in the captured picture by performing preset image processing on the captured picture;
determining a rotation angle based on the size of the intelligent flying device in the captured picture and the flying height, the rotation angle being the angle through which rotation is needed to avoid the projection; and
shooting based on the rotation angle.
Optionally, after judging, based on the current shooting angle and the orientation of the projection, whether the orientation of the projection is within the shooting range, the method further includes:
when the orientation of the projection is within the shooting range and the light source angle is not zero, determining a target direction according to the orientation of the projection, the target direction being any direction other than the direction in which the projection lies;
determining, from a plurality of preset projection ranges, the preset projection range in which a projection distance falls, the projection distance being the horizontal distance between the orientation of the projection and the intelligent flying device; and
determining, from a plurality of preset rotation angles, the preset rotation angle corresponding to the preset projection range in which the projection distance falls, the plurality of preset rotation angles being in one-to-one correspondence with the plurality of preset projection ranges;
correspondingly, shooting based on the current shooting angle and the orientation of the projection includes:
shooting based on the target direction and the preset rotation angle corresponding to the preset projection range in which the projection distance falls.
In another aspect, an intelligent flying device is provided, the intelligent flying device comprising:
a first determining module configured to determine a light source angle, the light source angle being the angle between the current orientation of a target light source and a vertical direction, the target light source being a light source capable of casting a projection onto the intelligent flying device, and the vertical direction being the direction perpendicular to the horizontal plane;
a second determining module configured to determine, according to the light source angle determined by the first determining module, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device; and
a shooting module configured to shoot based on the current shooting angle and the orientation of the projection determined by the second determining module.
Optionally, the first determining module includes:
a first determining submodule configured to determine a plurality of light intensities through a configured light sensor based on a plurality of first preset angles, and to determine the first preset angle corresponding to the maximum light intensity as the light source angle, the plurality of first preset angles being in one-to-one correspondence with the plurality of light intensities; and
a second determining submodule configured to determine a plurality of exposures based on a plurality of second preset angles, and to determine the second preset angle corresponding to the maximum exposure as the light source angle, the plurality of second preset angles being in one-to-one correspondence with the plurality of exposures.
Optionally, the second determining module includes:
a third determining submodule configured to determine, when the light source angle is zero, that the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device lies directly below the current position of the intelligent flying device; and
a fourth determining submodule configured to determine a flying height when the light source angle is not zero, and to determine, according to the light source angle and the flying height, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device, the flying height being the current height of the intelligent flying device above the horizontal plane.
Optionally, the shooting module includes:
a judging submodule configured to judge, based on the current shooting angle and the orientation of the projection, whether the orientation of the projection is within the shooting range;
a first shooting submodule configured to shoot vertically downward to obtain a captured picture when the orientation of the projection is within the shooting range and lies directly below the current position of the intelligent flying device;
a fifth determining submodule configured to determine the size of the intelligent flying device in the captured picture by performing preset image processing on the captured picture;
a sixth determining submodule configured to determine a rotation angle based on the size of the intelligent flying device in the captured picture and the flying height, the rotation angle being the angle through which rotation is needed to avoid the projection; and
a second shooting submodule configured to shoot based on the rotation angle.
Optionally, the shooting module further includes:
a seventh determining submodule configured to determine a target direction according to the orientation of the projection when the orientation of the projection is within the shooting range and the light source angle is not zero, the target direction being any direction other than the direction in which the projection lies;
an eighth determining submodule configured to determine, from a plurality of preset projection ranges, the preset projection range in which a projection distance falls, the projection distance being the horizontal distance between the orientation of the projection and the intelligent flying device; and
a ninth determining submodule configured to determine, from a plurality of preset rotation angles, the preset rotation angle corresponding to the preset projection range in which the projection distance falls, the plurality of preset rotation angles being in one-to-one correspondence with the plurality of preset projection ranges;
correspondingly, the second shooting submodule is configured to:
shoot based on the target direction and the preset rotation angle corresponding to the preset projection range in which the projection distance falls.
In a third aspect, an intelligent flying device is provided, the intelligent flying device comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
determine a light source angle, the light source angle being the angle between the current orientation of a target light source and a vertical direction, the target light source being a light source capable of casting a projection onto the intelligent flying device, and the vertical direction being the direction perpendicular to the horizontal plane;
determine, according to the light source angle, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device; and
shoot based on the current shooting angle and the orientation of the projection.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
The angle between the current orientation of a target light source capable of casting a projection onto the intelligent flying device and the vertical direction is determined; according to the determined angle, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device can be determined. Since the orientation of the projection has been determined, shooting according to the current shooting angle of the intelligent flying device and the orientation of the projection avoids capturing the projection into the photo or video, improving shooting quality.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory and do not limit the present disclosure.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
FIG. 1 is a flowchart of a photographing method for an intelligent flying device according to an exemplary embodiment.
FIG. 2A is a flowchart of a photographing method for an intelligent flying device according to another exemplary embodiment.
FIG. 2B is a schematic diagram of an implementation environment of a photographing method for an intelligent flying device involved in the embodiment of FIG. 2A.
FIG. 2C is a schematic diagram of an implementation environment of another photographing method for an intelligent flying device involved in the embodiment of FIG. 2A.
FIG. 3 is a block diagram of an intelligent flying device according to an exemplary embodiment.
FIG. 4 is a block diagram of an intelligent flying device 400 according to an exemplary embodiment.
Detailed Description
Exemplary embodiments will be described in detail here, examples of which are shown in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of devices and methods consistent with some aspects of the present disclosure as recited in the appended claims.
Before the embodiments of the present disclosure are explained in detail, their application scenario is described. In the related art, when shooting in an environment with a light source, the light emitted by the light source produces a projection after passing the intelligent flying device; in that case, the projection is easily captured into the photo or video, affecting shooting quality. For this reason, the embodiments of the present disclosure provide a photographing method for an intelligent flying device that can avoid capturing the projection into the picture or video, thereby achieving the effect of improving shooting quality. The photographing method for an intelligent flying device provided by the embodiments of the present disclosure is performed by an intelligent flying device, which may be a device such as an unmanned camera.
FIG. 1 is a flowchart of a photographing method for an intelligent flying device according to an exemplary embodiment. As shown in FIG. 1, the photographing method for an intelligent flying device may include the following steps.
In step 101, a light source angle is determined, the light source angle being the angle between the current orientation of a target light source and a vertical direction, the target light source being a light source capable of casting a projection onto the intelligent flying device, and the vertical direction being the direction perpendicular to the horizontal plane.
In step 102, based on the light source angle, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device is determined.
In step 103, shooting is performed based on the current shooting angle and the orientation of the projection.
In the embodiments of the present disclosure, the angle between the current orientation of a target light source capable of casting a projection onto the intelligent flying device and the vertical direction is determined; according to the determined angle, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device can be determined. Since the orientation of the projection has been determined, shooting according to the current shooting angle of the intelligent flying device and the orientation of the projection avoids capturing the projection into the photo or video, improving shooting quality.
Optionally, determining the light source angle includes either of the following implementations:
determining a plurality of light intensities through a configured light sensor based on a plurality of first preset angles, and determining the first preset angle corresponding to the maximum light intensity as the light source angle, the plurality of first preset angles being in one-to-one correspondence with the plurality of light intensities; or
determining a plurality of exposures based on a plurality of second preset angles, and determining the second preset angle corresponding to the maximum exposure as the light source angle, the plurality of second preset angles being in one-to-one correspondence with the plurality of exposures.
Optionally, determining, according to the light source angle, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device includes:
when the light source angle is zero, determining that the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device lies directly below the current position of the intelligent flying device; and
when the light source angle is not zero, determining a flying height, and determining, according to the light source angle and the flying height, the orientation of the projection produced on the horizontal plane by the light emitted by the target light source after passing the intelligent flying device, the flying height being the current height of the intelligent flying device above the horizontal plane.
Optionally, shooting based on the current shooting angle and the orientation of the projection includes:
judging, based on the current shooting angle and the orientation of the projection, whether the orientation of the projection is within the shooting range;
when the orientation of the projection is within the shooting range and lies directly below the current position of the intelligent flying device, shooting vertically downward to obtain a captured picture;
determining the size of the intelligent flying device in the captured picture by performing preset image processing on the captured picture;
determining a rotation angle based on the size of the intelligent flying device in the captured picture and the flying height, the rotation angle being the angle through which rotation is needed to avoid the projection; and
shooting based on the rotation angle.
Optionally, after judging, based on the current shooting angle and the orientation of the projection, whether the orientation of the projection is within the shooting range, the method further includes:
when the orientation of the projection is within the shooting range and the light source angle is not zero, determining a target direction according to the orientation of the projection, the target direction being any direction other than the direction in which the projection lies;
determining, from a plurality of preset projection ranges, the preset projection range in which a projection distance falls, the projection distance being the horizontal distance between the orientation of the projection and the intelligent flying device; and
determining, from a plurality of preset rotation angles, the preset rotation angle corresponding to the preset projection range in which the projection distance falls, the plurality of preset rotation angles being in one-to-one correspondence with the plurality of preset projection ranges;
correspondingly, shooting based on the current shooting angle and the orientation of the projection includes:
shooting based on the target direction and the preset rotation angle corresponding to the preset projection range in which the projection distance falls.
All of the above optional technical solutions may be combined in any manner to form optional embodiments of the present disclosure, which are not described one by one here.
FIG. 2A is a flowchart of a photographing method for an intelligent flying device according to another exemplary embodiment. As shown in FIG. 2A, the method is applied in an intelligent control device and includes the following steps:
In step 201, a light source angle is determined, the light source angle being the angle between the current orientation of a target light source and a vertical direction, the target light source being a light source capable of casting a projection onto the intelligent flying device, and the vertical direction being the direction perpendicular to the horizontal plane.
The target light source may include the sun, a lamp, and the like, which is not limited in the embodiments of the present disclosure. For example, referring to FIG. 2B, the target light source in FIG. 2B is the sun 21, and the angle between the current orientation of the target light source 21 and the vertical direction is α; that is, the light source angle is α. Determining the light source angle may be implemented in either of the following ways:
First way: a plurality of light intensities are determined through a configured light sensor based on a plurality of first preset angles, and the first preset angle corresponding to the maximum light intensity is determined as the light source angle, the plurality of first preset angles being in one-to-one correspondence with the plurality of light intensities.
That is, the intelligent flying device may be configured with a light sensor. It is not difficult to understand that the target light source can cast a projection of the intelligent flying device only when the target light source is higher above the horizontal plane than the intelligent flying device; therefore, in one possible implementation, the light sensor may be disposed at the top of the intelligent flying device, and the light sensor may rotate so as to collect the light emitted by the light source at the plurality of first preset angles.
Each of the plurality of first preset angles may be customized by the user according to actual requirements or set by default by the intelligent flying device, which is not limited in the embodiments of the present disclosure.
For example, the plurality of first preset angles may be preset angles of 30 degrees, 60 degrees, 90 degrees, -30 degrees and -60 degrees from the vertical direction; that is, the intelligent flying device collects light through the light sensor every 30 degrees, obtaining the plurality of light intensities corresponding to the plurality of first preset angles.
The stronger a light intensity is, the more directly the corresponding first preset angle faces the target light source, that is, the closer that first preset angle is to the angle between the orientation of the target light source and the vertical direction. Therefore, after obtaining the plurality of light intensities, the intelligent flying device determines the maximum light intensity among them and determines the first preset angle corresponding to the maximum light intensity as the light source angle.
Second way: a plurality of exposures are determined based on a plurality of second preset angles, and the second preset angle corresponding to the maximum exposure is determined as the light source angle, the plurality of second preset angles being in one-to-one correspondence with the plurality of exposures.
In practice, the intelligent flying device may collect light through its own camera device at the plurality of second preset angles to determine the plurality of exposures. The greater the exposure, the closer the corresponding second preset angle is to the angle between the orientation of the target light source and the vertical direction; therefore, the second preset angle corresponding to the maximum exposure is determined as the light source angle.
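Both of the above ways reduce to the same pattern: sweep a set of preset angles, take one reading per angle (light intensity from the sensor, or exposure from the camera), and keep the angle with the largest reading. The following is a minimal sketch of that pattern; the `measure` callback is a hypothetical placeholder for the sensor or camera interface and is not part of the disclosure.

```python
import math

# Example preset angles from the vertical, e.g. one reading every 30 degrees.
PRESET_ANGLES_DEG = [-60, -30, 0, 30, 60, 90]

def light_source_angle(measure):
    """Return the preset angle (in radians) whose reading is largest.

    `measure(angle_rad)` is a hypothetical callback that points the light
    sensor (first way) or the camera (second way) at `angle_rad` from the
    vertical and returns the measured light intensity or exposure.
    """
    readings = {math.radians(a): measure(math.radians(a))
                for a in PRESET_ANGLES_DEG}
    # The largest reading corresponds to the angle closest to the light source.
    return max(readings, key=readings.get)
```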
Each of the plurality of second preset angles may be customized by the user according to actual requirements or set by default by the intelligent flying device, which is not limited in the embodiments of the present disclosure.
It should be noted that the two ways of determining the light source angle described above are merely exemplary; in other embodiments, the light source angle may also be determined by other methods, for example either of the following:
Third way: when the target light source is the sun, the intelligent flying device determines the location information of its current position through its positioning function, acquires the system time point, obtains from a designated server, based on the location information and the system time point, the sun elevation angle corresponding to that location information and system time point, the sun elevation angle being the angle between the sun and the horizontal plane, and determines the difference between 90 degrees and the sun elevation angle as the light source angle, wherein the designated server is configured to store the correspondence between system time points, location information and sun elevation angles.
That is, in this implementation, the intelligent flying device may determine the location information of its current position and the system time point through a positioning technology such as GPS (Global Positioning System); it may then send a light source angle acquisition request carrying the location information and the system time point to the designated server. On receiving the request, the designated server extracts the location information and the system time point from it, obtains the corresponding sun elevation angle from the pre-stored correspondence between system time points, location information and sun elevation angles, and sends the sun elevation angle to the intelligent flying device. Since the sun elevation angle can be approximately regarded as the angle, at the intelligent flying device, between the sunlight and the horizontal ground, the intelligent flying device determines the difference between 90 degrees and the sun elevation angle as the light source angle.
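The conversion in this third way is a single subtraction; the sketch below assumes the sun elevation angle has already been returned by the designated server (the server interface itself is not specified in the disclosure, so it is left out).

```python
import math

def light_source_angle_from_sun(sun_elevation_deg):
    """Third way (sketch): the designated server returns the sun elevation
    angle for the drone's location and system time point; the light source
    angle is then 90 degrees minus that elevation."""
    return math.radians(90.0 - sun_elevation_deg)

# Example: a sun elevation of 30 degrees gives a light source angle of 60 degrees.
print(math.degrees(light_source_angle_from_sun(30.0)))   # 60.0
```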
Fourth way: the light source angle is determined by capturing the projection of an object at the same altitude.
In this implementation, the smart flying device may also photograph an object that is in the same space and at the same altitude as itself, together with that object's projection, and determine the light source angle from the object's projection and the current flight height of the smart flying device; specifically, it can be determined from the geometry of the resulting right triangle, which is not described in detail here.
It should be noted that the above description takes determining the light source angle by the smart flying device as an example; in another embodiment, a smart terminal associated with the smart flying device may also obtain the light source angle and then send it to the smart flying device, which is not limited in the embodiments of the present disclosure.
In step 202, based on the light source angle, the orientation of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device is determined.
Depending on the light source angle, determining the orientation of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device may include the following cases.
First case: when the light source angle is zero, it is determined that the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device is directly below the current position of the smart flying device.
Referring to FIG. 2C, when the light source angle is zero, the projection 22 produced on the horizontal plane by the light emitted from the target light source 21 after passing the smart flying device 23 is directly below the current position of the smart flying device 23.
Second case: when the light source angle is not zero, a flight height is determined, and the orientation of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device is determined based on the light source angle and the flight height, the flight height being the current height of the smart flying device above the horizontal plane.
Referring to FIG. 2B, if the flight height is H, then since the light source angle α is known, the horizontal offset d of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device can be determined according to the formula tan α = d/H.
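Rearranging tan α = d/H gives d = H·tan α, which the following lines illustrate; the height and angle values used here are arbitrary examples, not values taken from the disclosure.

```python
import math

# d = H * tan(alpha): horizontal offset of the projection from the point
# directly below the drone. H and alpha are example values only.
H = 10.0                    # flight height in meters (example)
alpha = math.radians(30.0)  # light source angle (example)
d = H * math.tan(alpha)
print(round(d, 2))          # -> 5.77 meters from the point directly below the drone
```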
In one possible implementation, the smart flying device may determine the flight height through a distance-measuring component configured on itself, such as an infrared sensor, which is not limited in the embodiments of the present disclosure.
In step 203, photographing is performed based on the current shooting angle and the orientation of the projection.
In one possible implementation, the smart flying device may determine the current shooting angle through an angle sensor configured on itself; that is, when its camera rotates, the angle through which the camera has rotated can be obtained via the angle sensor, thereby obtaining the current shooting angle.
Then, based on the current shooting angle and the orientation of the projection, the smart flying device judges whether the orientation of the projection is within the shooting range.
In one possible implementation, judging whether the orientation of the projection is within the shooting range may specifically include: judging whether the shooting direction corresponding to the current shooting angle and the direction in which the orientation of the projection lies are the same. If they are not the same, the orientation of the projection is determined not to be within the shooting range. If they are the same, it is further judged whether the angle between the shooting angle and the vertical direction is smaller than the light source angle: if it is smaller, the orientation of the projection is determined to be within the shooting range; if it is greater than or equal to the light source angle, the orientation of the projection is determined not to be within the shooting range.
For example, referring to FIG. 2B, if the current shooting angle points to the right, then the orientation of the projection is opposite to the shooting direction corresponding to the shooting angle, i.e., the orientation of the projection lies in the horizontally leftward direction, so it is easy to see that the orientation of the projection is not within the shooting range. Conversely, if the shooting direction corresponding to the current shooting angle is the same as the orientation of the projection and the angle between the current shooting angle and the vertical direction is smaller than the light source angle (for example, the current shooting angle points to the left at 20 degrees from the vertical direction and the light source angle is 60 degrees), the smart flying device determines that the orientation of the projection is within the shooting range.
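The two-part test described above (same direction first, then compare the shooting angle with the light source angle) can be sketched as follows; encoding directions as plain strings is an assumption made for this example.

```python
# Sketch of the in-range test: the projection falls inside the shooting
# range only if the camera points toward the projection AND the shooting
# angle measured from the vertical is smaller than the light source angle.
def projection_in_shooting_range(shooting_dir, projection_dir,
                                 shooting_angle_deg, light_source_angle_deg):
    if shooting_dir != projection_dir:   # e.g. "left" vs "right"
        return False
    return shooting_angle_deg < light_source_angle_deg

print(projection_in_shooting_range("left", "left", 20, 60))   # True
print(projection_in_shooting_range("right", "left", 20, 60))  # False
```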
It should be noted that the above description takes the smart flying device judging, based on the current shooting angle and the orientation of the projection, whether the orientation of the projection is within the shooting range as an example. In another embodiment, the smart flying device may also send the current shooting angle and the orientation of the projection to an associated smart device, such as a mobile phone or a smart remote controller, and that smart device judges, based on the current shooting angle and the orientation of the projection, whether the orientation of the projection is within the shooting range, which is not limited in the embodiments of the present disclosure.
When the orientation of the projection is within the shooting range, photographing needs to avoid the projection, which may be achieved in either of the following ways.
First way: when the orientation of the projection is within the shooting range and is directly below the current position of the smart flying device, shoot vertically downward along the vertical direction to obtain a captured picture; determine the size of the smart flying device in the captured picture by performing preset image processing on the captured picture; determine a rotation angle based on the size of the smart flying device in the captured picture and the flight height, the rotation angle being the angle by which rotation is required to avoid the projection; and photograph based on the rotation angle.
If the orientation of the projection is directly below the current position of the smart flying device, the camera needs to be rotated by a certain angle so that photographing avoids the projection. Referring to FIG. 2C, if the camera is rotated by an angle β in any direction, the projection can be kept out of the picture or video. Therefore, the size of the angle β needs to be determined.
To this end, in the embodiments of the present disclosure the projection is photographed to obtain a captured picture. Since the smart flying device usually includes multiple arms used for flight, a circular region in which the projection lies can in fact be obtained from the projection, as shown in FIG. 2C. The smart flying device may determine its own size in the captured picture by performing preset image processing on the captured picture, thereby obtaining the radius r of the circular region occupied by the smart flying device in the captured picture. Then, based on the radius r and the above flight height H, the required rotation angle β can be determined from tan β = r/H.
It should be noted that since the radius r and the flight height may be expressed in different units, unit conversion is required in the actual implementation, which is not described in detail here.
The preset image processing may include processes such as image normalization and pixel scanning, which is not limited in the embodiments of the present disclosure.
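A sketch of this first way's angle computation follows, including the unit conversion the text mentions; the pixel-to-meter calibration factor is a hypothetical parameter introduced for the example.

```python
import math

# Sketch of the "first way": beta = atan(r / H), where r is the radius of
# the circular region the drone occupies in the downward-looking picture.
# meters_per_pixel is a hypothetical calibration factor that puts r and H
# into the same unit, as the text notes is required.
def rotation_angle_deg(radius_px, meters_per_pixel, flight_height_m):
    r_m = radius_px * meters_per_pixel
    return math.degrees(math.atan2(r_m, flight_height_m))

print(round(rotation_angle_deg(radius_px=120, meters_per_pixel=0.01,
                               flight_height_m=10.0), 2))  # -> about 6.84 degrees
```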
Second way: when the orientation of the projection is within the shooting range and the light source angle is not zero, a target direction is determined based on the orientation of the projection, the target direction being any direction other than the direction in which the orientation of the projection lies; the preset projection range within which the projection distance falls is determined from multiple preset projection ranges, the projection distance being the horizontal distance between the orientation of the projection and the smart flying device; the preset rotation angle corresponding to the preset projection range within which the projection distance falls is determined from multiple preset rotation angles, the multiple preset rotation angles corresponding one-to-one to the multiple preset projection ranges; and photographing is performed based on the target direction and the preset rotation angle corresponding to the preset projection range within which the projection distance falls.
Each of the multiple preset projection ranges may be customized by the user according to actual needs or set by default by the smart flying device, which is not limited in the embodiments of the present disclosure.
Each of the multiple preset rotation angles may be customized by the user according to actual needs or set by default by the smart flying device, which is not limited in the embodiments of the present disclosure.
When the orientation of the projection is within the shooting range and the light source angle is not zero, photographing directly would capture the projection in the picture or video, so the shooting angle needs to be adjusted. Referring to FIG. 2B, if the orientation of the projection is as shown at 22, the camera needs to be rotated by a preset rotation angle in any direction other than the direction in which the orientation of the projection lies.
In this case, the preset rotation angle is related to the horizontal distance between the orientation of the projection and the smart flying device. The larger this horizontal distance, the smaller the projection formed on the horizontal ground by the target light source after passing the smart flying device, so the preset rotation angle can be set smaller: only a small rotation is needed to keep the projection out of the picture or video. Conversely, the smaller the horizontal distance between the orientation of the projection and the smart flying device, the larger the projection formed on the horizontal ground, so a larger preset rotation angle is required to avoid the projection when photographing.
In one possible implementation, the one-to-one correspondence between the multiple preset projection ranges and the multiple preset rotation angles may be determined by the user through extensive data testing and then stored in the smart flying device.
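The range-to-angle lookup might be stored and queried as in the sketch below. The specific ranges and angles are invented for illustration, since the disclosure leaves the actual values to user testing or device defaults.

```python
# Sketch of the "second way": map the horizontal projection distance to a
# preset rotation angle. Ranges and angles below are illustrative
# placeholders only.
PRESET_ROTATION_TABLE = [
    ((0.0, 2.0), 30.0),           # projection close to the drone -> larger turn
    ((2.0, 5.0), 15.0),
    ((5.0, float("inf")), 5.0),   # far projection -> a small turn suffices
]

def preset_rotation_angle(projection_distance_m):
    for (low, high), angle_deg in PRESET_ROTATION_TABLE:
        if low <= projection_distance_m < high:
            return angle_deg
    raise ValueError("distance outside all preset projection ranges")

print(preset_rotation_angle(3.2))  # -> 15.0
```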
It should be noted that since the target direction is any direction other than the direction in which the orientation of the projection lies, the user may set the target direction according to his or her own preference and store the set target direction in correspondence with the user account. Then, based on the correspondence between the target direction and the user account, the smart flying device can select the target direction corresponding to the user account, thereby improving the user experience.
In addition, in the above process, after the smart flying device rotates by the preset rotation angle toward the target direction, the scenery lying in the direction of the projection can no longer be photographed. To continue photographing that scenery, after rotating by the preset rotation angle toward the target direction, a shooting path may be planned based on the orientation of the projection and the target direction. For example, referring to FIG. 2B, if the smart flying device rotates by the preset rotation angle toward the horizontal rightward direction, it may fly toward the horizontal leftward direction. Since the projection of the smart flying device moves together with the smart flying device, the orientation of the projection also moves horizontally to the left, so the smart flying device can then photograph the scenery lying in the direction of the projection.
It should be noted that after planning the shooting path, the smart flying device may also send the shooting path to a smart device such as a mobile phone or a remote control device, and the smart device demonstrates the path after receiving it. In this way, the user learns the upcoming shooting path of the smart flying device, which improves the user experience.
In the embodiments of the present disclosure, the angle between the current orientation of the target light source capable of casting a projection of the smart flying device and the vertical direction is determined; based on the determined angle, the orientation of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device can be determined. Since the orientation of the projection has been determined, photographing based on the current shooting angle of the smart flying device and the orientation of the projection prevents the projection from being captured in the photo or video, thereby improving photographing quality.
FIG. 3 is a block diagram of a smart flying device according to an exemplary embodiment. Referring to FIG. 3, the smart flying device includes a first determination module 310, a second determination module 320 and a photographing module 330.
The first determination module 310 is configured to determine a light source angle, where the light source angle is the angle between the current orientation of a target light source and the vertical direction, the target light source is a light source capable of casting a projection of the smart flying device, and the vertical direction is the direction perpendicular to the horizontal plane.
The second determination module 320 is configured to determine, based on the light source angle determined by the first determination module 310, the orientation of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device.
The photographing module 330 is configured to photograph based on the current shooting angle and the orientation of the projection determined by the second determination module 320.
Optionally, the first determination module 310 includes:
a first determination submodule configured to determine multiple light intensities via a configured light sensor based on multiple first preset angles and determine the first preset angle corresponding to the maximum light intensity as the light source angle, the multiple first preset angles corresponding one-to-one to the multiple light intensities;
a second determination submodule configured to determine multiple exposure values based on multiple second preset angles and determine the second preset angle corresponding to the maximum exposure as the light source angle, the multiple second preset angles corresponding one-to-one to the multiple exposure values.
Optionally, the second determination module 320 includes:
a third determination submodule configured to determine, when the light source angle is zero, that the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device is directly below the current position of the smart flying device;
a fourth determination submodule configured to determine a flight height when the light source angle is not zero, and determine, based on the light source angle and the flight height, the orientation of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device, the flight height being the current height of the smart flying device above the horizontal plane.
Optionally, the photographing module 330 includes:
a judging submodule configured to judge, based on the current shooting angle and the orientation of the projection, whether the orientation of the projection is within the shooting range;
a first photographing submodule configured to shoot vertically downward along the vertical direction to obtain a captured picture when the orientation of the projection is within the shooting range and is directly below the current position of the smart flying device;
a fifth determination submodule configured to determine the size of the smart flying device in the captured picture by performing preset image processing on the captured picture;
a sixth determination submodule configured to determine a rotation angle based on the size of the smart flying device in the captured picture and the flight height, the rotation angle being the angle by which rotation is required to avoid the projection;
a second photographing submodule configured to photograph based on the rotation angle.
Optionally, the photographing module 330 further includes:
a seventh determination submodule configured to determine, when the orientation of the projection is within the shooting range and the light source angle is not zero, a target direction based on the orientation of the projection, the target direction being any direction other than the direction in which the orientation of the projection lies;
an eighth determination submodule configured to determine, from multiple preset projection ranges, the preset projection range within which the projection distance falls, the projection distance being the horizontal distance between the orientation of the projection and the smart flying device;
a ninth determination submodule configured to determine, from multiple preset rotation angles, the preset rotation angle corresponding to the preset projection range within which the projection distance falls, the multiple preset rotation angles corresponding one-to-one to the multiple preset projection ranges.
Correspondingly, the second photographing submodule is configured to:
photograph based on the target direction and the preset rotation angle corresponding to the preset projection range within which the projection distance falls.
In the embodiments of the present disclosure, the angle between the current orientation of the target light source capable of casting a projection of the smart flying device and the vertical direction is determined; based on the determined angle, the orientation of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device can be determined. Since the orientation of the projection has been determined, photographing based on the current shooting angle of the smart flying device and the orientation of the projection prevents the projection from being captured in the photo or video, thereby improving photographing quality.
With regard to the device in the above embodiment, the specific manner in which each module performs operations has been described in detail in the embodiments relating to the method and will not be elaborated here.
FIG. 4 is a block diagram of a smart flying device 400 according to an exemplary embodiment. For example, the smart flying device 400 may be an unmanned camera drone or the like.
Referring to FIG. 4, the smart flying device 400 may include one or more of the following components: a processing component 402, a memory 404, a power component 406, a multimedia component 408, an audio component 410, an input/output (I/O) interface 412, a sensor component 414, and a communication component 416.
The processing component 402 generally controls the overall operation of the smart flying device 400, such as operations associated with display, telephone calls, data communication, camera operations and recording operations. The processing component 402 may include one or more processors 420 to execute instructions so as to complete all or part of the steps of the above method. In addition, the processing component 402 may include one or more modules to facilitate interaction between the processing component 402 and other components. For example, the processing component 402 may include a multimedia module to facilitate interaction between the multimedia component 408 and the processing component 402.
The memory 404 is configured to store various types of data to support operation of the smart flying device 400. Examples of such data include instructions for any application or method operating on the smart flying device 400, contact data, phonebook data, messages, pictures, videos, and the like. The memory 404 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
The power component 406 provides power for the various components of the smart flying device 400. The power component 406 may include a power management system, one or more power supplies, and other components associated with generating, managing and distributing power for the smart flying device 400.
The multimedia component 408 includes a screen that provides an output interface between the smart flying device 400 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or swipe action but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 408 includes a front camera and/or a rear camera. When the smart flying device 400 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 410 is configured to output and/or input audio signals. For example, the audio component 410 includes a microphone (MIC), which is configured to receive external audio signals when the smart flying device 400 is in an operating mode, such as a call mode, a recording mode or a speech recognition mode. The received audio signals may be further stored in the memory 404 or transmitted via the communication component 416. In some embodiments, the audio component 410 further includes a speaker for outputting audio signals.
The I/O interface 412 provides an interface between the processing component 402 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include but are not limited to a home button, a volume button, a start button and a lock button.
The sensor component 414 includes one or more sensors for providing state assessments of various aspects of the smart flying device 400. For example, the sensor component 414 may detect the open/closed state of the smart flying device 400 and the relative positioning of components, such as the display and keypad of the smart flying device 400; the sensor component 414 may also detect a change in position of the smart flying device 400 or of one of its components, the presence or absence of user contact with the smart flying device 400, the orientation or acceleration/deceleration of the smart flying device 400, and temperature changes of the smart flying device 400. The sensor component 414 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 414 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 414 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor or a temperature sensor.
The communication component 416 is configured to facilitate wired or wireless communication between the smart flying device 400 and other devices. The smart flying device 400 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 416 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 416 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
In an exemplary embodiment, the smart flying device 400 may be implemented by one or more application-specific integrated circuits (ASIC), digital signal processors (DSP), digital signal processing devices (DSPD), programmable logic devices (PLD), field programmable gate arrays (FPGA), controllers, microcontrollers, microprocessors or other electronic elements for performing the above method.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 404 including instructions, which may be executed by the processor 420 of the smart flying device 400 to complete the above method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
A non-transitory computer-readable storage medium: when the instructions in the storage medium are executed by the processor of the smart flying device 400, the smart flying device 400 is enabled to perform the photographing method for a smart flying device involved in the embodiment of FIG. 1 or FIG. 2A.
Other embodiments of the present disclosure will readily occur to those skilled in the art upon consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses or adaptations of the present disclosure that follow the general principles of the present disclosure and include common knowledge or customary technical means in the art not disclosed by the present disclosure. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the present disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (11)

  1. A photographing method for a smart flying device, characterized in that the method comprises:
    determining a light source angle, wherein the light source angle is the angle between the current orientation of a target light source and the vertical direction, the target light source is a light source capable of casting a projection of the smart flying device, and the vertical direction is the direction perpendicular to the horizontal plane;
    determining, based on the light source angle, the orientation of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device;
    photographing based on the current shooting angle of the smart flying device and the orientation of the projection.
  2. The method according to claim 1, characterized in that determining the light source angle comprises either of the following implementations:
    determining multiple light intensities via a configured light sensor based on multiple first preset angles, and determining the first preset angle corresponding to the maximum light intensity as the light source angle, the multiple first preset angles corresponding one-to-one to the multiple light intensities;
    determining multiple exposure values based on multiple second preset angles, and determining the second preset angle corresponding to the maximum exposure as the light source angle, the multiple second preset angles corresponding one-to-one to the multiple exposure values.
  3. The method according to claim 1, characterized in that determining, based on the light source angle, the orientation of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device comprises:
    when the light source angle is zero, determining that the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device is directly below the current position of the smart flying device;
    when the light source angle is not zero, determining a flight height, and determining, based on the light source angle and the flight height, the orientation of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device, the flight height being the current height of the smart flying device above the horizontal plane.
  4. The method according to claim 3, characterized in that photographing based on the current shooting angle and the orientation of the projection comprises:
    judging, based on the current shooting angle and the orientation of the projection, whether the orientation of the projection is within the shooting range;
    when the orientation of the projection is within the shooting range and is directly below the current position of the smart flying device, shooting vertically downward along the vertical direction to obtain a captured picture;
    determining the size of the smart flying device in the captured picture by performing preset image processing on the captured picture;
    determining a rotation angle based on the size of the smart flying device in the captured picture and the flight height, the rotation angle being the angle by which rotation is required to avoid the projection;
    photographing based on the rotation angle.
  5. The method according to claim 4, characterized in that after judging, based on the current shooting angle and the orientation of the projection, whether the orientation of the projection is within the shooting range, the method further comprises:
    when the orientation of the projection is within the shooting range and the light source angle is not zero, determining a target direction based on the orientation of the projection, the target direction being any direction other than the direction in which the orientation of the projection lies;
    determining, from multiple preset projection ranges, the preset projection range within which the projection distance falls, the projection distance being the horizontal distance between the orientation of the projection and the smart flying device;
    determining, from multiple preset rotation angles, the preset rotation angle corresponding to the preset projection range within which the projection distance falls, the multiple preset rotation angles corresponding one-to-one to the multiple preset projection ranges;
    correspondingly, photographing based on the current shooting angle and the orientation of the projection comprises:
    photographing based on the target direction and the preset rotation angle corresponding to the preset projection range within which the projection distance falls.
  6. A smart flying device, characterized in that the smart flying device comprises:
    a first determination module configured to determine a light source angle, wherein the light source angle is the angle between the current orientation of a target light source and the vertical direction, the target light source is a light source capable of casting a projection of the smart flying device, and the vertical direction is the direction perpendicular to the horizontal plane;
    a second determination module configured to determine, based on the light source angle determined by the first determination module, the orientation of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device;
    a photographing module configured to photograph based on the current shooting angle and the orientation of the projection determined by the second determination module.
  7. The smart flying device according to claim 6, characterized in that the first determination module comprises:
    a first determination submodule configured to determine multiple light intensities via a configured light sensor based on multiple first preset angles and determine the first preset angle corresponding to the maximum light intensity as the light source angle, the multiple first preset angles corresponding one-to-one to the multiple light intensities;
    a second determination submodule configured to determine multiple exposure values based on multiple second preset angles and determine the second preset angle corresponding to the maximum exposure as the light source angle, the multiple second preset angles corresponding one-to-one to the multiple exposure values.
  8. The smart flying device according to claim 6, characterized in that the second determination module comprises:
    a third determination submodule configured to determine, when the light source angle is zero, that the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device is directly below the current position of the smart flying device;
    a fourth determination submodule configured to determine a flight height when the light source angle is not zero, and determine, based on the light source angle and the flight height, the orientation of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device, the flight height being the current height of the smart flying device above the horizontal plane.
  9. The smart flying device according to claim 8, characterized in that the photographing module comprises:
    a judging submodule configured to judge, based on the current shooting angle and the orientation of the projection, whether the orientation of the projection is within the shooting range;
    a first photographing submodule configured to shoot vertically downward along the vertical direction to obtain a captured picture when the orientation of the projection is within the shooting range and is directly below the current position of the smart flying device;
    a fifth determination submodule configured to determine the size of the smart flying device in the captured picture by performing preset image processing on the captured picture;
    a sixth determination submodule configured to determine a rotation angle based on the size of the smart flying device in the captured picture and the flight height, the rotation angle being the angle by which rotation is required to avoid the projection;
    a second photographing submodule configured to photograph based on the rotation angle.
  10. The smart flying device according to claim 9, characterized in that the photographing module further comprises:
    a seventh determination submodule configured to determine, when the orientation of the projection is within the shooting range and the light source angle is not zero, a target direction based on the orientation of the projection, the target direction being any direction other than the direction in which the orientation of the projection lies;
    an eighth determination submodule configured to determine, from multiple preset projection ranges, the preset projection range within which the projection distance falls, the projection distance being the horizontal distance between the orientation of the projection and the smart flying device;
    a ninth determination submodule configured to determine, from multiple preset rotation angles, the preset rotation angle corresponding to the preset projection range within which the projection distance falls, the multiple preset rotation angles corresponding one-to-one to the multiple preset projection ranges;
    correspondingly, the second photographing submodule is configured to:
    photograph based on the target direction and the preset rotation angle corresponding to the preset projection range within which the projection distance falls.
  11. A smart flying device, characterized in that the smart flying device comprises:
    a processor;
    a memory for storing processor-executable instructions;
    wherein the processor is configured to:
    determine a light source angle, wherein the light source angle is the angle between the current orientation of a target light source and the vertical direction, the target light source is a light source capable of casting a projection of the smart flying device, and the vertical direction is the direction perpendicular to the horizontal plane;
    determine, based on the light source angle, the orientation of the projection produced on the horizontal plane by the light emitted from the target light source after passing the smart flying device;
    photograph based on the current shooting angle and the orientation of the projection.
PCT/CN2017/096530 2017-01-23 2017-08-09 Photographing method for smart flying device and smart flying device WO2018133388A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
RU2017134749A RU2668609C1 (ru) 2017-01-23 2017-08-09 Photographing method for an intelligent flying device and intelligent flying device
JP2017552164A JP6532958B2 (ja) 2017-01-23 2017-08-09 Photographing method for smart flying device, smart flying device, program, and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710049939.0 2017-01-23
CN201710049939.0A CN106973218B (zh) Photographing method for smart flying device and smart flying device

Publications (1)

Publication Number Publication Date
WO2018133388A1 true WO2018133388A1 (zh) 2018-07-26

Family

ID=59334914

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/096530 WO2018133388A1 (zh) 2017-01-23 2017-08-09 Photographing method for smart flying device and smart flying device

Country Status (6)

Country Link
US (1) US10419662B2 (zh)
EP (1) EP3352453B1 (zh)
JP (1) JP6532958B2 (zh)
CN (1) CN106973218B (zh)
RU (1) RU2668609C1 (zh)
WO (1) WO2018133388A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106973218B (zh) 2017-01-23 2019-09-27 北京小米移动软件有限公司 Photographing method for smart flying device and smart flying device
CN109933083B (zh) * 2017-12-15 2022-04-05 翔升(上海)电子技术有限公司 Unmanned-aerial-vehicle-based grazing method, apparatus and system
EP3742248A1 (en) * 2019-05-20 2020-11-25 Sony Corporation Controlling a group of drones for image capture
CN111724440B (zh) * 2020-05-27 2024-02-02 杭州数梦工场科技有限公司 Method and apparatus for determining orientation information of a monitoring device, and electronic device
WO2022141231A1 (en) * 2020-12-30 2022-07-07 SZ DJI Technology Co., Ltd. Systems and methods for determining the position of an object using an unmanned aerial vehicle
CN112843678B (zh) * 2020-12-31 2023-05-23 上海米哈游天命科技有限公司 Image capturing method and apparatus, electronic device and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001016559A * 1999-06-29 2001-01-19 Canon Inc Image capturing apparatus, image capturing system, image capturing method, and computer-readable recording medium
JP2008199525A * 2007-02-15 2008-08-28 Toyota Motor Corp Vehicle imaging device
CN102694963A * 2012-04-27 2012-09-26 南京航空航天大学 Method for acquiring a shadow-free target image
CN205186521U * 2015-04-10 2016-04-27 萧文昌 Aircraft capable of autonomously blocking light
US20160280397A1 (en) * 2015-03-27 2016-09-29 Konica Minolta Laboratory U.S.A., Inc. Method and system to avoid plant shadows for vegetation and soil imaging
CN106125767A * 2016-08-31 2016-11-16 北京小米移动软件有限公司 Aircraft control method and apparatus, and aircraft
CN106973218A * 2017-01-23 2017-07-21 北京小米移动软件有限公司 Photographing method for smart flying device and smart flying device

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5211172A (en) * 1991-03-29 1993-05-18 Mcguane Joseph B Solar controlled sun tracker for a sunbather
RU2098797C1 (ru) * 1994-11-30 1997-12-10 Алексей Владиславович Курбатов Method for obtaining a projection of an object by means of penetrating radiation, and device for implementing it
US6996527B2 (en) * 2001-07-26 2006-02-07 Matsushita Electric Industrial Co., Ltd. Linear discriminant based sound class similarities with unit value normalization
US6861633B2 (en) * 2002-06-20 2005-03-01 The Aerospace Corporation Microelectromechanical system optical sensor providing bit image data of a viewed image
IL180223A0 (en) * 2006-12-20 2007-05-15 Elbit Sys Electro Optics Elop Airborne photogrammetric imaging system and method
CN104363845B (zh) * 2012-04-24 2017-09-01 于罗吉尼有限责任公司 Filler applicator for treating female urinary incontinence
RU2498378C1 (ru) * 2012-06-21 2013-11-10 Александр Николаевич Барышников Method for obtaining an image of the earth's surface from a moving carrier, and device for implementing it
RU2584368C1 (ru) * 2015-02-13 2016-05-20 Открытое акционерное общество "Лётно-исследовательский институт имени М.М. Громова" Method for determining reference values of the spatial-angular orientation parameters of an aircraft on routes and in near-aerodrome zones during flight tests of flight and navigation equipment, and system for implementing it
WO2017203646A1 (ja) * 2016-05-26 2017-11-30 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Imaging control device, shadow position specifying device, imaging system, moving body, imaging control method, shadow position specifying method, and program
US9639960B1 (en) * 2016-11-04 2017-05-02 Loveland Innovations, LLC Systems and methods for UAV property assessment, data capture and reporting
US10429857B2 (en) * 2017-01-20 2019-10-01 The Boeing Company Aircraft refueling with sun glare prevention

Also Published As

Publication number Publication date
JP6532958B2 (ja) 2019-06-19
EP3352453B1 (en) 2019-12-18
CN106973218A (zh) 2017-07-21
JP2019506012A (ja) 2019-02-28
EP3352453A1 (en) 2018-07-25
RU2668609C1 (ru) 2018-10-02
US20180213146A1 (en) 2018-07-26
US10419662B2 (en) 2019-09-17
CN106973218B (zh) 2019-09-27

Legal Events

Date Code Title Description
ENP Entry into the national phase: Ref document number: 2017552164; Country of ref document: JP; Kind code of ref document: A
WWE WIPO information: entry into national phase: Ref document number: 2017134749; Country of ref document: RU
121 EP: the EPO has been informed by WIPO that EP was designated in this application: Ref document number: 17893486; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase: Ref country code: DE
122 EP: PCT application non-entry in European phase: Ref document number: 17893486; Country of ref document: EP; Kind code of ref document: A1