CN112154654A - Match shooting method, electronic equipment, unmanned aerial vehicle and storage medium - Google Patents

Match shooting method, electronic equipment, unmanned aerial vehicle and storage medium Download PDF

Info

Publication number
CN112154654A
CN112154654A CN201980033571.6A
Authority
CN
China
Prior art keywords
area
aerial vehicle
unmanned aerial
camera
target object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980033571.6A
Other languages
Chinese (zh)
Inventor
刘畅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN112154654A publication Critical patent/CN112154654A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Abstract

A match shooting method, an electronic device, and an unmanned aerial vehicle are provided. The method includes: obtaining a preset match area and one or more target objects; controlling the camera to generate fixed-focus frames adapted to the number of target objects; and, when a target object is detected within the match area, adjusting parameters of at least one of the unmanned aerial vehicle (110), the camera, and the gimbal (120) so that the target object lies within the fixed-focus frame of the camera. With this method, no cameras need to be erected inside or around the match area when tracking and shooting targets there; instead, the target is kept within the camera's fixed-focus frame by adjusting at least one parameter of the unmanned aerial vehicle, the camera, and the gimbal. Tracking cost is low, flexibility is high, the focus of the match can be captured in time, and the efficiency of tracking and shooting targets in the match area is improved.

Description

Match shooting method, electronic equipment, unmanned aerial vehicle and storage medium
Technical Field
The embodiment of the invention relates to the technical field of unmanned aerial vehicle shooting, in particular to a match shooting method, electronic equipment, an unmanned aerial vehicle and a storage medium.
Background
As sports grow in popularity, more and more communities, work units, and companies organize multi-person sporting events such as soccer and basketball. Small-scale matches among small groups do not warrant dedicated cameras and photographers; shooting is instead done by setting up small cameras at the side of the field.
However, cameras set up at the side of the field sit low, so their field of view is incomplete and the focus of the event cannot be captured.
Disclosure of Invention
The embodiment of the invention provides a match shooting method, electronic equipment, an unmanned aerial vehicle and a storage medium.
In a first aspect, an embodiment of the present application provides a match shooting method, applied to an unmanned aerial vehicle carrying a camera and a gimbal, the method including:
acquiring a preset competition area and a target object;
controlling the camera to generate fixed focus frames which are adaptive to the number of the target objects according to the number of the target objects;
when the target object is detected to be located in the competition area, adjusting parameters of at least one of the unmanned aerial vehicle, the camera, and the gimbal so that the target object is located in a fixed-focus frame of the camera.
In a second aspect, an embodiment of the present application provides an electronic device, applied to an unmanned aerial vehicle carrying a camera and a gimbal, including:
a memory for storing a computer program;
a processor for executing the computer program, in particular for:
acquiring a preset competition area and a target object;
controlling the camera to generate fixed focus frames which are adaptive to the number of the target objects according to the number of the target objects;
when the target object is detected to be located in the competition area, adjusting parameters of at least one of the unmanned aerial vehicle, the camera, and the gimbal so that the target object is located in a fixed-focus frame of the camera.
In a third aspect, an embodiment of the application provides an unmanned aerial vehicle, which includes a body, and a power supply battery, a power system, a camera, a gimbal, and a processor arranged on the body; the battery supplies power to the power system, and the power system provides flight power for the unmanned aerial vehicle;
the processor is used for acquiring a preset competition area and a target object; controlling the camera to generate fixed-focus frames adapted to the number of target objects; and, when the target object is detected to be located in the competition area, adjusting at least one parameter of the power system, the camera, and the gimbal so that the target object is located in a fixed-focus frame of the camera.
In a fourth aspect, an embodiment of the present application provides a computer storage medium, in which a computer program is stored, and the computer program, when executed, implements the race shooting method according to the first aspect.
According to the match shooting method, the electronic device, the unmanned aerial vehicle, and the storage medium, a preset match area and target objects are obtained, the camera is controlled to generate fixed-focus frames adapted to the number of target objects, and, when a target object is detected within the match area, at least one parameter of the unmanned aerial vehicle, the camera, and the gimbal is adjusted so that the target object lies within the fixed-focus frame of the camera. Thus, when tracking and shooting targets in the match area, no cameras need to be erected inside or around it: fixed-focus frames adapted to the number of targets are generated, and parameters of at least one of the unmanned aerial vehicle, the camera, and the gimbal are then adjusted to keep the targets within those frames, achieving real-time tracking and shooting of targets in the match area.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a schematic diagram of a movable platform according to an embodiment of the present application;
fig. 2 is a schematic configuration diagram of an unmanned aerial vehicle system according to an embodiment of the present application;
FIG. 3 is a flowchart of a race shooting method provided in an embodiment of the present application;
FIG. 4a is a schematic view of an object according to an embodiment of the present disclosure;
FIG. 4b is another schematic view of an object according to an embodiment of the present disclosure;
FIG. 5 is a flowchart of a race shooting method provided in an embodiment of the present application;
FIG. 6a is a top view of a playing area according to an embodiment of the present application;
FIG. 6b is a side view of the playing area shown in FIG. 6 a;
FIG. 7 is a flowchart of a race shooting method provided in an embodiment of the present application;
FIG. 8 is a flowchart of a race shooting method provided in an embodiment of the present application;
FIG. 9 is a flowchart of a race shooting method provided in an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 11 is a schematic structural diagram of an unmanned aerial vehicle provided in an embodiment of the present application;
fig. 12 is a schematic structural diagram of an unmanned aerial vehicle provided in the embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a schematic structural diagram of a movable platform according to an embodiment of the present application, where the movable platform 100 includes: a movable body 101, a gimbal 120 arranged on the movable body, and a camera 123 arranged on the gimbal 120.
The movable body 101 can move with an arbitrary attitude in space, for example, translation in the horizontal direction, translation in the vertical direction, and rotational movement. The gimbal can rotate in the horizontal direction (i.e., within the XY plane) as well as in the vertical direction.
In an example, as shown in fig. 2, the movable platform 100 may be an unmanned aerial vehicle system, and the corresponding movable body 101 may be an unmanned aerial vehicle 110. Optionally, the drone may be of various types such as multi-rotor, fixed-wing, etc.; a multi-rotor drone may have four rotors, six rotors, eight rotors, or another number of rotors. The present embodiment is described taking a rotary-wing drone as an example.
The unmanned aerial vehicle system 100 can include a drone 110, a display device 130, and a control terminal 140. The drone 110 may include, among other things, a power system 150, a flight control system 160, a frame, and a gimbal 120 carried on the frame. The drone 110 may be in wireless communication with the control terminal 140 and the display device 130.
The drone 110 includes a frame, which may include a fuselage and a foot rest (also referred to as landing gear). The fuselage may include a central frame and one or more arms connected to the central frame, the one or more arms extending radially from the central frame. The foot rest is connected to the fuselage and supports the drone 110 when it lands.
The power system 150 may include one or more electronic governors (abbreviated as electric governors) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, wherein the motors 152 are connected between the electronic governors 151 and the propellers 153, the motors 152 and the propellers 153 are disposed on the horn of the drone 110; the electronic governor 151 is configured to receive a drive signal generated by the flight control system 160 and provide a drive current to the motor 152 based on the drive signal to control the rotational speed of the motor 152. The motor 152 is used to drive the propeller in rotation, thereby providing power for the flight of the drone 110, which power enables the drone 110 to achieve one or more degrees of freedom of motion. In certain embodiments, the drone 110 may rotate about one or more axes of rotation. For example, the above-mentioned rotation axes may include a Roll axis (Roll), a Yaw axis (Yaw) and a pitch axis (pitch). It should be understood that the motor 152 may be a dc motor or an ac motor. The motor 152 may be a brushless motor or a brush motor.
Flight control system 160 may include a flight controller 161 and a sensing system 162. The sensing system 162 is used to measure attitude information of the drone, i.e., position information and status information of the drone 110 in space, such as three-dimensional position, three-dimensional angle, three-dimensional velocity, three-dimensional acceleration, three-dimensional angular velocity, and the like. The sensing system 162 may include, for example, at least one of a gyroscope, an ultrasonic sensor, an electronic compass, an Inertial Measurement Unit (IMU), a vision sensor, a global navigation satellite system, and a barometer. For example, the Global navigation satellite System may be a Global Positioning System (GPS). The flight controller 161 is used to control the flight of the drone 110, for example, the flight of the drone 110 may be controlled according to attitude information measured by the sensing system 162. It should be understood that the flight controller 161 may control the drone 110 according to preprogrammed instructions, or may control the drone 110 in response to one or more control instructions from the control terminal 140.
The gimbal 120 may include a motor 122 and is used to carry the camera 123. Flight controller 161 may control the movement of the gimbal 120 via motor 122. Optionally, as another embodiment, the gimbal 120 may further include a controller for controlling the movement of the gimbal 120 by controlling the motor 122. It should be understood that the gimbal 120 may be separate from the drone 110, or may be part of the drone 110. It should be understood that the motor 122 may be a DC motor or an AC motor, and may be a brushless motor or a brushed motor. It should also be understood that the gimbal may be located at the top of the drone or at the bottom of the drone.
The camera 123 may be, for example, a device for capturing an image such as a camera or a video camera, and the camera 123 may communicate with the flight controller 161 and perform shooting under the control of the flight controller 161. The camera 123 of the present embodiment at least includes a photosensitive element, such as a Complementary Metal Oxide Semiconductor (CMOS) sensor or a Charge-coupled Device (CCD) sensor. It can be understood that the camera 123 may also be directly fixed to the drone 110, such that the pan/tilt head 120 may be omitted.
The display device 130 is located at the ground end of the unmanned aerial vehicle system 100, can communicate with the unmanned aerial vehicle 110 in a wireless manner, and can be used for displaying attitude information of the unmanned aerial vehicle 110. In addition, an image taken by the imaging device may also be displayed on the display apparatus 130. It should be understood that the display device 130 may be a stand-alone device or may be integrated into the control terminal 140.
The control terminal 140 is located at the ground end of the unmanned aerial vehicle system 100, and can communicate with the unmanned aerial vehicle 110 in a wireless manner, so as to remotely control the unmanned aerial vehicle 110.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 3 is a flowchart of a race shooting method provided in an embodiment of the present application, and as shown in fig. 3, the control method in the embodiment of the present application may include:
s101, acquiring a preset competition area and a target object.
The method is executed by the unmanned aerial vehicle; for example, it may be executed by the flight controller of the unmanned aerial vehicle shown in fig. 2, which can control the unmanned aerial vehicle, the gimbal, and the camera.
The competition area is the range tracked by the unmanned aerial vehicle, namely the unmanned aerial vehicle tracks and shoots the target object in the competition area, and the target object is positioned in the fixed focus frame of the camera.
Illustratively, the playing area may be a particular area within the playing field, such as the playing field area plus a peripheral extension area around it. The size of the peripheral extension area is set according to actual needs; for example, the extension area may extend 10 meters outward from the perimeter of the playing field area, so that the playing field area together with the 10-meter outward extension forms the playing area.
The embodiment of the application does not limit the mode of obtaining the competition area by the unmanned aerial vehicle.
In one example, the drone may obtain the playing area by image processing, such as the drone controlling a camera to take a picture of the playing field, and then the user specifies the size of the playing area on the picture of the playing field, for example, the user performs a box selection on the picture of the playing field, and sets the area in the selection box as the playing area.
In another example, the drone may obtain the playing area through information input by the user, for example, the user inputs information of the size, the boundary, etc. of the playing area to the drone, and the drone obtains the playing area according to the information of the size, the boundary, etc. of the playing area input by the user.
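The user-input route above can be sketched as follows. This is a minimal illustration only, not the patent's implementation: the class, function names, and the default 10 m margin (taken from the peripheral-extension example in the description) are all assumptions.

```python
from dataclasses import dataclass


@dataclass
class PlayingArea:
    """Axis-aligned rectangular playing area in field coordinates (meters)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        """True if a ground point lies inside the playing area."""
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


def area_from_user_input(length: float, width: float, margin: float = 10.0) -> PlayingArea:
    """Build the playing area from a user-entered field size plus a
    peripheral extension margin (e.g. the 10 m extension in the description)."""
    return PlayingArea(-margin, -margin, length + margin, width + margin)


# Hypothetical usage: a 100 m x 60 m pitch with the default 10 m extension.
area = area_from_user_input(100.0, 60.0)
```

A box selection on a field photo would work the same way, with the rectangle corners converted from image to field coordinates first.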
The playing area may include one or more of a ball playing area, a shooting playing area, a track and field playing area, a water playing area, and an ice and snow playing area.
The target object in this embodiment is the object tracked by the unmanned aerial vehicle.
Optionally, the target object of the embodiment of the present application includes one or more target objects. For example, fig. 4a shows a ball as a target, or, as shown in fig. 4b, 6 players and a ball as targets.
Optionally, the target object is a basketball, a soccer ball, a football, or the like.
Optionally, the target is one or more athletes.
Optionally, the target is one or more referees.
In the embodiment of the present application, the number and the type of the target objects are determined according to actual needs, and the embodiment of the present application does not limit this.
And S102, controlling the camera to generate a fixed focus frame according to the number of the target objects.
The fixed-focus frame is a field-of-view frame whose focal length is held constant, so its field of view likewise remains constant.
As shown in fig. 4a and 4b, the camera is controlled to generate a fixed-focus frame according to the number of target objects, and the size of the frame is adapted to that number, i.e., such that the target objects lie within the fixed-focus frame and the camera can track and shoot them there.
In the first example of S102, as shown in fig. 4a, when the number of the target objects is one, the focal length of the camera is adjusted to generate a first focus frame so that the one target object is located in the first focus frame.
In the second example of S102, as shown in fig. 4b, when the number of the targets is multiple, the focal length of the camera is adjusted to generate a second fixed focus frame. The size of the second fixed focus frame is adapted to the size and the number of the target objects.
The imaging range of the second fixed-focus frame is larger than that of the first fixed-focus frame.
In an implementation manner of the second example of the S102, when the number of the target objects is multiple, adjusting the focal length of the camera to generate a second fixed-focus frame may specifically include:
and step A, when the number of the target objects is multiple, adjusting the focal length of the camera to generate the second fixed focus frame according to the preset mapping relation between the number of the target objects and the focal length value.
In the embodiment of the present application, focus values corresponding to different numbers of target objects are preset, for example, as shown in table 1:
TABLE 1
Number of targets    Focal length value
2 A1
3 A2
...... ......
n An
As shown in table 1, the focal length value is a1 when the number of objects is 2, a2 when the number of objects is 3, and An when the number of objects is n. Therefore, the target focal length value corresponding to the number of the target objects can be obtained according to the mapping relation between the number of the target objects and the focal length value shown in the table 1, and then the focal length value of the camera is adjusted to be the target focal length value, so that a second fixed focus frame is formed.
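The Table 1 lookup can be sketched as a simple dictionary. The concrete focal length values below are placeholders standing in for A1, A2, ..., An (the patent does not give numbers); the fallback for counts beyond the table is likewise an assumption.

```python
# Hypothetical mapping from number of targets to focal length (mm),
# standing in for Table 1's A1, A2, ..., An.
FOCAL_LENGTH_MM = {2: 50.0, 3: 35.0, 4: 28.0, 5: 24.0}


def target_focal_length(num_targets: int) -> float:
    """Return the preset focal length for the given number of targets.
    More targets -> shorter focal length, i.e. a wider fixed-focus frame.
    Counts beyond the table fall back to the widest preset."""
    if num_targets in FOCAL_LENGTH_MM:
        return FOCAL_LENGTH_MM[num_targets]
    return min(FOCAL_LENGTH_MM.values())
```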
In some embodiments, as shown in fig. 5, the controlling the camera to generate the fixed focus frame according to the number of the target objects in the above S102 may include:
and S1021, if the target object switching instruction is received, determining the number of the switched target objects.
And S1022, when the difference between the number of the switched target objects and the number of the target objects before switching is greater than a preset threshold value, adjusting the focal length of the camera to generate a fixed focus frame which is suitable for the number of the switched target objects.
For example, suppose 3 target objects were being shot at the previous moment (i.e., before switching) and 5 are to be shot at the current moment (i.e., after switching). On finding that the current number of targets differs from the previous one, the user inputs a target-switching instruction to the unmanned aerial vehicle, e.g., one indicating an increase of 2. From the pre-switch count and the instruction, the unmanned aerial vehicle determines that the post-switch count is 5. It then computes the difference of 2 between the post-switch count (5) and the pre-switch count (3); since this difference of 2 exceeds the preset threshold (for example, 0), the camera is controlled to adjust its focal length and generate a fixed-focus frame matching the 5 post-switch targets.
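The S1021-S1022 decision reduces to a threshold test on the change in target count. A minimal sketch (function name and the threshold default of 0, matching the worked example, are assumptions):

```python
def should_regenerate_frame(num_before: int, num_after: int, threshold: int = 0) -> bool:
    """Regenerate the fixed-focus frame only when the change in the
    number of targets exceeds the preset threshold."""
    return abs(num_after - num_before) > threshold
```

With 3 targets before and 5 after, the difference of 2 exceeds a threshold of 0, so the focal length would be adjusted.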
S103, when the target object is detected to be located in the competition area, adjusting parameters of at least one of the unmanned aerial vehicle, the camera, and the gimbal so that the target object is located in the fixed-focus frame of the camera.
As shown in fig. 4a and 4b, when it is detected that the target object is located in the playing area, the unmanned aerial vehicle adjusts parameters of at least one of the unmanned aerial vehicle, the camera and the pan/tilt head, so that the target object is located in the fixed focus frame of the camera, and tracking shooting of the target object in the playing area is achieved.
It should be noted that, when the unmanned aerial vehicle tracks and shoots a target in the competition area, the fixed-focus frame of the camera also stays within the competition area; this avoids wasting resources on tracking and shooting non-target objects outside the area.
In some examples, when the number of the targets is multiple, adjusting parameters of at least one of the drone, the camera, and the pan-tilt so that the maximum number of targets are located within the fixed focus frame. For example, as shown in fig. 4b, when the target objects include 6 players and one ball, if all the target objects cannot be located in the focus fixing frame, parameters of at least one of the unmanned aerial vehicle, the camera and the pan-tilt head are adjusted to enable as many target objects as possible to be located in the focus fixing frame, so as to achieve tracking shooting of as many target objects as possible.
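The "as many targets as possible" criterion can be sketched as a count over candidate frame placements. This is an illustration under assumed conventions: targets are (x, y) image-plane points and frames are (left, top, right, bottom) rectangles; neither representation comes from the patent.

```python
def count_in_frame(targets, frame):
    """Number of targets (x, y points) inside a fixed-focus frame
    given as a (left, top, right, bottom) rectangle."""
    left, top, right, bottom = frame
    return sum(1 for (x, y) in targets if left <= x <= right and top <= y <= bottom)


def best_frame(targets, candidate_frames):
    """Pick the candidate frame placement covering the most targets,
    i.e. 'as many targets as possible in the fixed-focus frame'."""
    return max(candidate_frames, key=lambda f: count_in_frame(targets, f))
```

In practice the candidates would be generated by varying drone, camera, and gimbal parameters rather than enumerated directly.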
In a possible implementation manner, adjusting the parameters of the unmanned aerial vehicle in S103 to enable the target object to be located in the fixed-focus frame of the camera may specifically include step B:
and B, adjusting parameters of the unmanned aerial vehicle to enable the unmanned aerial vehicle and the target object to be in a preset angle range.
Specifically, referring to fig. 6b, parameters of the unmanned aerial vehicle are adjusted so that an angle α between the unmanned aerial vehicle and the target object is within a preset angle range, and the target object is ensured to be located in a fixed focus frame of the camera.
Optionally, the preset angle range may be the pitch angle range of the unmanned aerial vehicle; that is, when the angle between the unmanned aerial vehicle and the target lies within the pitch angle range, the target is guaranteed to be within the fixed-focus frame of the camera, so that the unmanned aerial vehicle can track and shoot the target.
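The angle α of fig. 6b can be computed from the drone's altitude and its horizontal distance to the target, then tested against the pitch range. A sketch under simple geometric assumptions (flat ground, target at ground level; all names are hypothetical):

```python
import math


def angle_to_target(drone_alt: float, horiz_dist: float) -> float:
    """Depression angle (degrees) from the drone down to a ground target,
    given drone altitude and horizontal distance, both in meters."""
    return math.degrees(math.atan2(drone_alt, horiz_dist))


def within_pitch_range(angle_deg: float, pitch_min: float, pitch_max: float) -> bool:
    """True if the drone-to-target angle lies inside the preset pitch range,
    i.e. the target can be kept inside the fixed-focus frame."""
    return pitch_min <= angle_deg <= pitch_max
```

If the angle falls outside the range, the drone would move (e.g. change altitude or horizontal position) until it re-enters the range.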
Optionally, in the embodiment of the present application, when there are multiple targets, the shot composition may be determined according to the distribution of the targets.
Optionally, the unmanned aerial vehicle of the embodiment of the application can also live-stream the captured footage through a communication interface.
According to the match shooting method provided by the embodiment of the application, the preset match area and the preset target objects are obtained, the camera is controlled to generate the fixed focus frames corresponding to the number of the target objects according to the number of the target objects, and when the target objects are detected to be located in the match area, at least one parameter of the unmanned aerial vehicle, the camera and the cloud deck is adjusted, so that the target objects are located in the fixed focus frames of the camera. That is, in the embodiment of the present application, when tracking shooting is performed on a target object in a match area, a camera does not need to be erected in or around the match area, but a fixed focus frame adapted to the number of the target object is generated according to the number of the target object, and then parameters of at least one of the unmanned aerial vehicle, the camera and the pan-tilt head are adjusted to enable the target object to be located in the fixed focus frame, so that real-time tracking shooting of the target object in the match area is realized.
On the basis of the above embodiment, the following describes in detail a specific process of adjusting parameters of at least one of the unmanned aerial vehicle, the camera, and the pan/tilt head, with reference to a specific example.
As shown in fig. 2, the unmanned aerial vehicle is equipped with a pan-tilt head, and the pan-tilt head is equipped with a camera. At this moment, adjust in above-mentioned S103 in unmanned aerial vehicle, camera and the cloud platform parameter of at least one, include:
and C, adjusting at least one of flight parameters of the unmanned aerial vehicle, moving parameters of the holder and shooting parameters of the camera.
The flight parameters of the drone include at least one of the following: the direction of motion of the drone and/or the attitude of the drone, where the direction of motion has horizontal and vertical components.
As shown in fig. 2, the flight controller 161 adjusts the flight parameters of the drone by controlling the power system 150, specifically, the power system 150 may include one or more electronic governors (abbreviated as electric governors) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, where the electronic governors 151 are configured to receive a driving signal generated by the flight controller 161 and provide a driving current to the motors 152 according to the driving signal to control the rotation speed of the motors 152. The motor 152 is used to drive the propeller in rotation to provide power for the flight of the drone, which enables the drone to achieve motion in one or more degrees of freedom.
The movement parameters of the gimbal include at least one of the following: the horizontal movement parameter of the gimbal, the vertical pitch parameter, and the rotational angular velocity of the gimbal. As shown in fig. 2, the gimbal 120 may include a motor 122, and flight controller 161 may control the movement of the gimbal 120 via motor 122. For example, the gimbal 120 may rotate about one or more axes of rotation, which may include a Roll axis (Roll), a Yaw axis (Yaw), and a pitch axis (pitch).
The shooting parameters of the camera include at least one of the following: the field of view of the camera, the aperture, the exposure time, the sensitivity, the illuminance, and the like. The field of view determines how much of the scene is imaged; the smaller the field of view, the smaller the image formed on the photosensitive element. The wider the aperture is opened, the more light passes through the lens and the brighter the image; the narrower the aperture, the less light passes and the darker the image. The exposure time is the time during which the shutter stays open to project light onto the photosensitive surface; a long exposure time admits more light and suits poor lighting conditions, while a short exposure time suits good lighting. Sensitivity measures how responsive the sensor (or film) is to light: a less sensitive medium needs a longer exposure to produce the same image as a more sensitive one, hence the traditional terms "slow" and "fast" film. The illuminance rating, expressed in lux, characterizes the lighting level under which the camera can still capture images; the lower the minimum lux value, the more clearly the camera can shoot in dim conditions.
According to the embodiments of the application, at least one of the flight parameters of the unmanned aerial vehicle, the movement parameters of the pan-tilt head, and the shooting parameters of the camera can be adjusted so that the target object is located in the fixed focus frame of the camera, enabling the unmanned aerial vehicle to track and shoot a preset target object in the competition area.
Optionally, in one adjustment mode of the embodiments of the application, the movement parameters of the pan-tilt head may be adjusted first; if the target object still cannot be located in the fixed focus frame of the camera after the movement parameters of the pan-tilt head are adjusted, the shooting parameters of the camera are adjusted next; and if the target object still cannot be located in the fixed focus frame after the shooting parameters of the camera are adjusted, the flight parameters of the unmanned aerial vehicle are adjusted.
Optionally, in another adjustment mode of the embodiments of the application, two or all three of the flight parameters of the unmanned aerial vehicle, the movement parameters of the pan-tilt head, and the shooting parameters of the camera may be adjusted simultaneously so that the target object is located in the fixed focus frame of the camera.
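The first adjustment mode above can be sketched as a simple priority cascade: try the pan-tilt head first, then the camera, then the flight parameters, stopping once the target object sits in the fixed focus frame. The callables and their names are hypothetical stand-ins for the actual pan-tilt, camera, and flight controllers.

```python
def adjust_until_in_frame(in_frame, adjust_pan_tilt, adjust_camera, adjust_flight):
    """Apply adjustments in the priority order described in the text:
    pan-tilt head first, then camera, then flight parameters.

    in_frame() reports whether the target object is currently inside
    the fixed focus frame; each adjust_* callable attempts one
    adjustment of the corresponding subsystem.
    """
    for adjust in (adjust_pan_tilt, adjust_camera, adjust_flight):
        if in_frame():
            return True  # target already framed; stop escalating
        adjust()
    return in_frame()
```

In this sketch a later, more disruptive adjustment (moving the whole drone) is only attempted when the cheaper ones did not suffice, which is the rationale the text gives for the priority order.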
In an example, the adjusting the flight parameters of the drone in step C above may include:
Step C1, adjusting the movement parameters of the unmanned aerial vehicle in the first horizontal direction or the second horizontal direction.
Fig. 6a is a top view of a playing area according to an embodiment of the present application, and fig. 6b is a side view of the playing area shown in fig. 6a. As shown in figs. 6a and 6b, assume the playing area is rectangular: the first horizontal direction is the direction in which the long side of the playing area lies, i.e., the x-axis direction in fig. 6a, and the second horizontal direction is the direction in which the short side lies, i.e., the y-axis direction in figs. 6a and 6b. The first horizontal direction and the second horizontal direction thus correspond to two adjacent sides of the playing area.
As shown in fig. 6a, tracking and shooting the target object in the first horizontal direction may be achieved by adjusting the movement parameters of the drone in the first horizontal direction, so that the drone moves back and forth along the first horizontal direction and the target object moving along the first horizontal direction (or whose motion component lies along it) is located in the fixed focus frame of the camera. When the drone moves along the first horizontal direction, its horizontal viewing range shifts along that direction, so the target object moving along the first horizontal direction stays in the fixed focus frame of the camera and the drone tracks and shoots it in the first horizontal direction.
As shown in fig. 6b, tracking and shooting the target object in the second horizontal direction may likewise be achieved by adjusting the movement parameters of the drone in the second horizontal direction, so that the drone moves along the second horizontal direction and the target object moving along the second horizontal direction (or whose motion component lies along it) is located in the fixed focus frame of the camera. When the drone moves along the second horizontal direction, its pitch-angle viewing range shifts along that direction, so the target object moving along the second horizontal direction stays in the fixed focus frame of the camera and the drone can track and shoot it in the second horizontal direction.
In a preferred embodiment, when the target object has a motion component in both the first horizontal direction and the second horizontal direction, the drone may automatically decide its own direction of motion according to the proportional relationship between the two motion components.
It should be noted that the moving speed of the drone along the first or second horizontal direction may be determined according to the moving speed of the target object and the current flying speed of the drone. For example, when the target object moves slowly, the drone can be controlled to reduce its flying speed, and when the target object moves quickly, the drone is controlled to increase its flying speed, so that the target object remains in the fixed focus frame of the camera and can be tracked in time.
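The proportional direction decision and speed matching described above can be sketched as follows. The function name and the cap on speed are illustrative assumptions, not part of the patent; the target's motion components are taken in the first (x) and second (y) horizontal directions.

```python
import math

def drone_velocity_command(vx_target, vy_target, max_speed):
    """Return a horizontal velocity command (vx, vy) for the drone.

    The direction preserves the proportional relationship between the
    target's motion components along the two horizontal directions;
    the commanded speed follows the target's speed but is capped at
    the drone's maximum flying speed.
    """
    speed = math.hypot(vx_target, vy_target)
    if speed == 0:
        return (0.0, 0.0)  # target stationary: hold position
    scale = min(speed, max_speed) / speed
    return (vx_target * scale, vy_target * scale)
```

A slow target yields a slow command, a fast target a fast one, and a target faster than the drone's limit yields the limit in the same direction, matching the speed-matching behaviour the text describes.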
In another preferred embodiment, in a sports event, in order to capture the event in real time, the drone may not be limited to the first horizontal direction and the second horizontal direction, and the moving direction of the drone in the horizontal direction may be consistent with the moving direction of the target object in the horizontal direction.
In the match shooting method provided by the embodiments of the application, at least one of the flight parameters of the unmanned aerial vehicle, the movement parameters of the pan-tilt head, and the shooting parameters of the camera is adjusted so that the target object is located in the fixed focus frame of the camera, realizing real-time tracking and shooting of the target object in the competition area. The tracking and shooting cost is low, the flexibility is high, the target object can be captured in time, and the efficiency of tracking and shooting the target object in the competition area is improved.
On the basis of the above embodiments, in some embodiments the drone is limited to moving within a preset flight range. As shown in fig. 7, the method of the embodiment of the present application further includes:
s201, acquiring the flight range of the unmanned aerial vehicle according to the position information of the competition area and the attribute information of the unmanned aerial vehicle.
Optionally, the attributes of the unmanned aerial vehicle involved in the embodiments of the present application include at least one of: the offset angle of the unmanned aerial vehicle, the rotation range of each axis of the pan-tilt head, and the zoom range of the camera.
In the embodiments of the application, before the unmanned aerial vehicle acquires its flight range according to the position information of the competition area and the attribute information of the unmanned aerial vehicle, the position information of the competition area is first acquired.
The unmanned aerial vehicle can obtain the position information of the competition area through two modes, namely automatic generation and user input, wherein the automatic generation mode comprises a first mode and a second mode, and the user input mode comprises a third mode.
The first method comprises the following steps D1 to D3:
Step D1, controlling the unmanned aerial vehicle to acquire the image information of the area to be divided.
The area to be divided includes the above-mentioned playing area, for example, the area to be divided is a playing place (e.g., a stadium), and the playing area is a playing field area (e.g., a court) in the playing place.
Optionally, the parameters of the unmanned aerial vehicle are adjusted, for example, the flying height of the unmanned aerial vehicle is adjusted to enlarge its field of vision, so that the unmanned aerial vehicle collects the image information of the entire area to be divided.
Step D2, acquiring the position information of the area to be divided according to the image information of the area to be divided; wherein the position information of the region to be divided includes boundary position information of the region to be divided.
The boundary of the area to be divided comprises the corner points and/or the side lines of the area to be divided.
Step D3, dividing the area to be divided to generate the competition area, and acquiring the position information of the competition area according to the position information of the area to be divided.
The playing area is a part of the area to be divided, so a portion can be divided out of the area to be divided as the playing area. For example, if the playing area is set to include the playing field and the region extending 10 meters outward from the playing field, the unmanned aerial vehicle can, according to this setting, divide out the playing field and its 10-meter surrounding region from the area to be divided as the playing area. Optionally, the boundary lines of the court can be divided automatically by the system, for example by directly identifying the boundary lines of the court; optionally, the setting may be a parameter entered by the user, or may be the result of a frame selection performed by the user on the image information of the area to be divided.
Thus, after the playing area is divided out of the area to be divided, the position information of the playing area can be obtained according to the position information of the area to be divided. For example, the position information of the playing area is obtained based on the position information of the area to be divided and the position of the playing area within the area to be divided.
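The 10-meter outward expansion mentioned above can be sketched for an axis-aligned rectangular field in a local ground frame. The coordinate convention, function name, and default margin are assumptions made for illustration only.

```python
def expand_rectangle(x_min, y_min, x_max, y_max, margin_m=10.0):
    """Playing area = field rectangle expanded outward by margin_m
    metres on every side (the 10 m example in the text).

    Coordinates are in a local ground frame in metres; returns the
    expanded rectangle as (x_min, y_min, x_max, y_max).
    """
    return (x_min - margin_m, y_min - margin_m,
            x_max + margin_m, y_max + margin_m)
```

For a 28 m by 15 m basketball court anchored at the origin, the expanded playing area runs from (-10, -10) to (38, 25).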
The second method comprises the following steps E1 to E3:
Step E1, controlling the unmanned aerial vehicle to acquire the image information of a plurality of partitions in the area to be divided.
In this mode, the unmanned aerial vehicle can acquire the image information of only one partition of the area to be divided at a time, so the unmanned aerial vehicle can be controlled to perform multiple acquisitions, for example, to fly one circle around the area to be divided, so that it acquires the image information of a plurality of partitions in the area to be divided.
Step E2, respectively acquiring the position information of the plurality of partitions according to the image information of the plurality of partitions.
Referring to the manner above, the position information of each partition in the area to be divided can be obtained.
Step E3, acquiring the position information of the area to be divided according to the position information of the plurality of partitions.
Specifically, the plurality of partitions are stitched together according to their position information, and the position information of the stitched area to be divided is thereby obtained.
Step E4, dividing the area to be divided to generate the competition area, and acquiring the position information of the competition area according to the position information of the area to be divided.
The specific process of this step can refer to step D3 described above, and will not be described herein again.
In both of the above modes, the unmanned aerial vehicle automatically generates the position information of the competition area based on image processing; a mode in which the position information of the competition area is obtained through user input is introduced below.
The third mode comprises steps F1 and F2:
Step F1, receiving boundary information input by a user.
Illustratively, the boundary information includes: corner point information of the playing area and/or side line information of the playing area. For example, the boundary information input by the user includes two corner points on a diagonal of the playing area plus any one side line, or the 4 corner points of the playing area, or two adjacent side lines of the playing area.
Step F2, determining the position information of the playing area according to the boundary information.
The position information of the playing area can be determined from the boundary information input by the user. For example, if the user inputs the coordinates of the 4 corner points of the playing area, the position information of the playing area can be determined from those coordinates.
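Deriving the playing area's extent from the 4 user-entered corner coordinates (mode three) might look like the following sketch. It assumes an axis-aligned rectangular field in a local ground frame; the function name is hypothetical.

```python
def area_from_corners(corners):
    """Bounding extent of the playing area from its corner coordinates.

    corners is a sequence of (x, y) points entered by the user; for an
    axis-aligned rectangular field the bounding box coincides with the
    field itself. Returns (x_min, y_min, x_max, y_max).
    """
    xs = [p[0] for p in corners]
    ys = [p[1] for p in corners]
    return (min(xs), min(ys), max(xs), max(ys))
```

A tilted field would need the full polygon rather than a bounding box, but the bounding box suffices to illustrate how corner input becomes position information.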
In the embodiments of the application, the position information of the competition area is obtained through the three modes above. The first and second modes are highly automated, reduce user involvement, and can generate the position information of the competition area automatically. The third mode can determine the position information of the competition area according to the user's needs, and the whole process is simple.
According to the above steps, after the position information of the competition area is obtained, the unmanned aerial vehicle can obtain its flight range according to the position information of the competition area and the attribute information of the unmanned aerial vehicle. Preferably, the flight range of the unmanned aerial vehicle comprises a flight range in the horizontal direction and a flight range in the vertical direction.
Assuming the playing area is rectangular, the flight range of the drone in the horizontal direction includes its movement range in the x-axis direction shown in fig. 6a and its movement range in the y-axis direction shown in fig. 6b. The movement range in the x-axis direction can be determined from the horizontal viewing angle of the drone and the length of the long side of the playing area; the movement range in the y-axis direction can be determined from the pitch angle of the drone and the length of the short side of the playing area.
The flight range of the drone in the vertical direction should be greater than or equal to the minimum safe flying height of the drone and less than or equal to its maximum safe flying height, as shown in fig. 6b.
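The horizontal movement range described above can be illustrated with a simplified nadir-camera model: the drone may translate along one side of the field as long as its ground footprint, whose width is 2·h·tan(fov/2), stays within that side. Real limits also depend on the gimbal pitch and the safety margins the text mentions, so this formula is an illustrative assumption, not the patent's method.

```python
import math

def movement_range_m(field_side_m, altitude_m, fov_deg):
    """Span over which the drone can translate along one side of a
    rectangular playing area while its camera footprint stays on the
    field: side length minus the ground footprint 2*h*tan(fov/2).

    Simplified model with a downward-facing camera; returns 0 when the
    footprint already covers the whole side.
    """
    footprint = 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)
    return max(field_side_m - footprint, 0.0)
```

For a 100 m side viewed from 25 m with a 90-degree field of view, the footprint is 50 m and the drone can translate over the remaining 50 m; over a short 10 m side the same camera already covers everything and no translation is needed.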
Optionally, when the flight range of the unmanned aerial vehicle is obtained according to the position information of the competition area and the attribute information of the unmanned aerial vehicle, the limitation of the area to be divided on the flight range of the unmanned aerial vehicle needs to be considered. For example, when one side of the area to be divided has a fence, the flight range of the drone should avoid the fence.
S202, controlling the unmanned aerial vehicle to move in the flight range of the unmanned aerial vehicle according to the flight range of the unmanned aerial vehicle, so that the target object is located in a fixed focus frame of the camera.
In the embodiments of the application, after the flight range of the unmanned aerial vehicle is obtained according to the above steps, the unmanned aerial vehicle is controlled to move within this flight range. This accurately bounds the flight range of the unmanned aerial vehicle and improves its flight safety, and while the unmanned aerial vehicle moves within this range, the target object is kept in the fixed focus frame of the camera, so that the target object in the competition area is tracked and shot.
According to the match shooting method provided by the embodiments of the application, the flight range of the unmanned aerial vehicle is obtained according to the position information of the competition area and the attribute information of the unmanned aerial vehicle, and the unmanned aerial vehicle is controlled to move within this flight range so that the target object is located in the fixed focus frame of the camera. That is, the flight range of the unmanned aerial vehicle is obtained accurately, the unmanned aerial vehicle is controlled to move within this range, precise control of the unmanned aerial vehicle is realized, the target object is kept in the fixed focus frame of the camera, and accurate tracking of the target object in the competition area by the unmanned aerial vehicle is realized.
On the basis of the embodiment shown in fig. 7, in some embodiments, as shown in fig. 8, adjusting the parameters of at least one of the drone, the camera, and the pan-tilt head in S103 may include:
S301, controlling the unmanned aerial vehicle to fly at a first preset height, the first preset height belonging to the flight range of the unmanned aerial vehicle.
S302, when the unmanned aerial vehicle flies at the first preset height, adjusting the movement parameters of the unmanned aerial vehicle in the horizontal direction, and the movement parameters of the pan-tilt head and/or the shooting parameters of the camera.
In the embodiments of the application, the unmanned aerial vehicle tracks and shoots the target object in the competition area at the first preset height. Thus, when adjusting the parameters of at least one of the unmanned aerial vehicle, the camera, and the pan-tilt head, the unmanned aerial vehicle is first controlled to fly at the first preset height, and then its movement parameters in the horizontal direction are adjusted. After the flight parameters of the unmanned aerial vehicle are adjusted, the movement parameters of the pan-tilt head and/or the shooting parameters of the camera are adjusted so that the target object is located in the fixed focus frame of the camera.
The first preset height is specifically set according to actual needs, for example, the first preset height corresponding to a basketball game is smaller than the first preset height corresponding to a football game.
In one possible implementation of the embodiments of the application, the method further includes:
S303, controlling the flying height of the unmanned aerial vehicle to rise from the first preset height to a second preset height so that the target object is located in the fixed focus frame of the camera, the second preset height belonging to the flight range of the unmanned aerial vehicle.
Specifically, in S301 and S302 above, the unmanned aerial vehicle is controlled to fly at the first preset height, and its movement parameters in the horizontal direction, the movement parameters of the pan-tilt head and/or the shooting parameters of the camera are adjusted. If the target object still cannot be located in the fixed focus frame of the camera, the flying height of the unmanned aerial vehicle can be raised from the first preset height to the second preset height to enlarge the field of view of the fixed focus frame, so that the target object in the competition area is located in the fixed focus frame of the camera.
Optionally, the second preset height is set according to actual needs. Optionally, the second preset height is less than or equal to the maximum safe flying height of the unmanned aerial vehicle and greater than the first preset height, and both preset heights lie within the flight range of the unmanned aerial vehicle.
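Raising the flying height toward the second preset height to enlarge the footprint of the fixed focus frame can be sketched with the same simplified nadir-camera geometry used above. The clamping follows the constraints just stated (at least the first preset height, at most the maximum safe height); the formula itself is an illustrative assumption.

```python
import math

def raised_altitude_m(required_span_m, fov_deg, h_first, h_max_safe):
    """Altitude at which the camera's ground footprint covers
    required_span_m, clamped to [h_first, h_max_safe] so the chosen
    height stays at or above the first preset height and within the
    safe flight range (S303). Simplified downward-facing camera model.
    """
    h_needed = required_span_m / (2 * math.tan(math.radians(fov_deg) / 2))
    return min(max(h_needed, h_first), h_max_safe)
```

If covering the target requires more altitude than the maximum safe flying height allows, the clamp returns the maximum safe height, and the remaining adjustment would have to come from the camera or the pan-tilt head instead.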
According to the above method, when the parameters of at least one of the unmanned aerial vehicle, the camera, and the pan-tilt head are adjusted, the unmanned aerial vehicle is first controlled to fly at the first preset height; while it flies at the first preset height, its movement parameters in the horizontal direction, the movement parameters of the pan-tilt head and/or the shooting parameters of the camera are adjusted. The parameters are thus adjusted quickly and accurately, so that the target object is tracked quickly and accurately.
On the basis of the foregoing embodiment, in some embodiments, as shown in fig. 9, the adjusting the parameters of the drone in S103 may include:
S401, obtaining the position information of the target object in the competition area.
Wherein, the obtaining of the position information of the target object in the competition area at least comprises the following two modes:
in the first mode, a preset target object in the game area is identified, and the position information of the target object in the game area is obtained.
Specifically, the unmanned aerial vehicle identifies a target object in the competition area. For example, the unmanned aerial vehicle photographs the competition area to obtain image information of the competition area, identifies the target object from that image information based on an image identification method, and determines whether the identified target object matches the preset target object, thereby obtaining an identification result for the target object.
Optionally, when the recognition result of the target object is a match, the position information of the target object in the game area may be obtained.
In a possible implementation manner of the application, the unmanned aerial vehicle can adopt an image processing manner to obtain the position information of the target object. The method comprises the following steps: acquiring current image information through a camera of the unmanned aerial vehicle; and when the current image information comprises the image of the target object, acquiring the position information of the target object in the competition area according to the image information of the competition area and the current image information.
In the first mode, a preset target object in the competition area is identified by image identification to generate an identification result; if the identification result is a match, the position information of the target object in the competition area is acquired based on an image processing method.
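Deciding whether the target object's estimated ground position falls inside the competition area can be done with a standard ray-casting point-in-polygon test. This is a generic stand-in for the image-processing localisation described above; representing the competition area as a polygon of ground coordinates is an assumption.

```python
def point_in_polygon(px, py, polygon):
    """Ray-casting test: is point (px, py) inside the polygon?

    polygon is a list of (x, y) vertices of the competition area in a
    ground frame. A horizontal ray from the point is cast to the left;
    an odd number of edge crossings means the point is inside.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > py) != (y2 > py):  # edge straddles the ray's height
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside
```

With the competition area's boundary known from the earlier steps, the same test works for any convex or concave field shape, not only rectangles.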
In a second mode, a sensing signal sent by sensing equipment on the target object is obtained; and acquiring the position information of the target object according to the sensing signal.
In this implementation, a sensing device, such as an RFID (Radio Frequency Identification) sensor or an infrared sensor, is provided on the target object. These sensing devices can send sensing signals to the surroundings in real time. Therefore, after the unmanned aerial vehicle receives the sensing signal sent by the sensing device, the position information of the target object can be obtained according to the sensing signal.
In the mode, the sensing signal sent by the sensing equipment on the target object is obtained, and the position information of the target object is obtained according to the sensing signal, so that the mode of obtaining the position information of the target object is simple.
S402, adjusting parameters of the unmanned aerial vehicle according to the position information of the target object.
Specifically, after the position information of the target object is obtained according to the above steps, the parameters of the unmanned aerial vehicle are adjusted according to that position information. For example, at least one of the flight parameters of the unmanned aerial vehicle, the movement parameters of the pan-tilt head, and the shooting parameters of the camera is adjusted according to the position information of the target object in the competition area, so that the target object is located in the fixed focus frame of the camera and the unmanned aerial vehicle tracks and shoots the target object in the competition area.
According to the embodiment of the application, the position information of the target object in the competition area is obtained, and the accurate adjustment of the parameters of the unmanned aerial vehicle is realized based on the obtained position information of the target object, so that the target object is located in the fixed focus frame of the camera, and the accurate tracking shooting of the unmanned aerial vehicle for the target object in the competition area is realized.
Fig. 10 is a schematic structural diagram of an electronic device provided in an embodiment of the present application. As shown in fig. 10, the electronic device 200 is applied to an unmanned aerial vehicle equipped with a camera and a pan-tilt head, and includes: a memory 220 and a processor 230.
A memory 220 for storing a computer program;
a processor 230 for executing said computer program, in particular configured to:
acquiring a preset competition area and a target object;
controlling the camera to generate fixed focus frames which are adaptive to the number of the target objects according to the number of the target objects;
when the target object is detected to be located in the competition area, adjusting the parameters of at least one of the unmanned aerial vehicle, the camera, and the pan-tilt head so that the target object is located in a fixed focus frame of the camera.
The electronic device of the embodiment of the present application may be configured to execute the technical solutions in the above method embodiments; the implementation principles and technical effects are similar and are not repeated here.
In one possible implementation, the processor 230 is specifically configured to:
when the number of the target objects is one, adjusting the focal length of the camera to generate a first fixed focal frame;
when the number of the target objects is multiple, adjusting the focal length of the camera to generate a second fixed-focus frame;
the shooting range of the second fixed focus frame is larger than that of the first fixed focus frame.
In one possible implementation, the processor 230 is specifically configured to:
and when the number of the target objects is multiple, adjusting the focal length of the camera to generate the second fixed-focus frame according to a preset mapping relation between the number of the target objects and the focal length value.
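The preset mapping between the number of target objects and the focal length value might be represented as a simple lookup table. The focal-length values below are purely illustrative assumptions, chosen only so that more targets map to a shorter focal length and hence a wider second fixed-focus frame, as the text requires.

```python
def focal_length_for_targets(n_targets, mapping=None):
    """Preset mapping from target count to focal length (mm).

    More targets -> shorter focal length -> wider fixed focus frame,
    so the second fixed-focus frame's shooting range exceeds the
    first's. The default table values are illustrative only.
    """
    mapping = mapping or {1: 50.0, 2: 35.0, 3: 28.0}
    if n_targets in mapping:
        return mapping[n_targets]
    return min(mapping.values())  # fall back to the widest preset
```

Counts beyond the table fall back to the widest preset, which keeps the maximum number of target objects inside the frame, in line with the behaviour described for multiple targets.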
Optionally, the playing area comprises one or more of a ball playing area, a shooting playing area, a track and field playing area, a water playing area, and an ice and snow playing area.
Optionally, the target object is a basketball, a football or a soccer ball.
Optionally, the target object is one or more players.
In one possible implementation, the processor 230 is specifically configured to:
adjusting the parameters of the unmanned aerial vehicle, so that the unmanned aerial vehicle and the target object are in a preset angle range.
In one possible implementation, the processor 230 is further configured to:
and when the target objects are multiple, performing composition according to the distribution of the target objects.
In one possible implementation, the processor 230 is further configured to:
when there are a plurality of target objects, adjusting the parameters of at least one of the unmanned aerial vehicle, the camera, and the pan-tilt head so that the maximum number of target objects are located in the fixed focus frame.
In one possible implementation, the processor 230 is further configured to:
live-broadcasting the shot pictures through a communication interface.
In one possible implementation, the processor 230 is specifically configured to:
if a target object switching instruction is received, determining the number of the switched target objects;
and when the difference between the number of the switched target objects and the number of the target objects before switching is larger than a preset threshold value, adjusting the focal length of the camera to generate a fixed focus frame which is adaptive to the number of the switched target objects.
In one possible implementation, the processor 230 is specifically configured to:
adjusting at least one of the flight parameters of the unmanned aerial vehicle, the movement parameters of the pan-tilt head, and the shooting parameters of the camera.
Optionally, the flight parameters of the drone include: the direction of motion of the drone and/or the pose of the drone.
In one possible implementation, the processor 230 is specifically configured to:
adjusting the movement parameters of the unmanned aerial vehicle in a first horizontal direction or in a second horizontal direction, wherein, when the competition area is rectangular, the first horizontal direction and the second horizontal direction respectively correspond to two adjacent sides of the competition area.
Optionally, the moving direction of the unmanned aerial vehicle in the horizontal direction is consistent with the moving direction of the target object in the horizontal direction.
In one possible implementation, the processor 230 is further configured to:
acquiring the flight range of the unmanned aerial vehicle according to the position information of the competition area and the attribute information of the unmanned aerial vehicle;
according to the flight range of the unmanned aerial vehicle, controlling the unmanned aerial vehicle to move in the flight range of the unmanned aerial vehicle so that the target object is located in the fixed focus frame of the camera.
In a possible implementation manner, the processor 230 is further configured to obtain the position information of the competition area before obtaining the flight range of the unmanned aerial vehicle according to the position information of the competition area and the attribute information of the unmanned aerial vehicle.
In one possible implementation, the processor 230 is specifically configured to:
controlling the unmanned aerial vehicle to acquire image information of an area to be divided;
acquiring the position information of the area to be divided according to the image information of the area to be divided; the position information of the region to be divided comprises boundary position information of the region to be divided;
and dividing the area to be divided to generate the competition area, and acquiring the position information of the competition area according to the position information of the area to be divided.
In one possible implementation, the processor 230 is specifically configured to:
controlling the unmanned aerial vehicle to acquire image information of a plurality of partitions in an area to be partitioned;
respectively acquiring the position information of the plurality of partitions according to the image information of the plurality of partitions;
acquiring the position information of the area to be divided according to the position information of the plurality of partitions;
and dividing the area to be divided to generate the competition area, and acquiring the position information of the competition area according to the position information of the area to be divided.
In one possible implementation, the processor 230 is specifically configured to:
receiving boundary information input by a user;
and determining the position information of the competition area according to the boundary information.
Optionally, the boundary information includes: corner information of the playing area and/or edge information of the playing area.
Optionally, the attributes of the drone include at least one of: the offset angle of the unmanned aerial vehicle, the rotation range of each axis of the pan-tilt head, and the zoom range of the camera.
In one possible implementation, the processor 230 is specifically configured to:
controlling the unmanned aerial vehicle to fly at a first preset height, wherein the first preset height belongs to the flight range of the unmanned aerial vehicle;
when the unmanned aerial vehicle flies at the first preset height, adjusting the movement parameters of the unmanned aerial vehicle in the horizontal direction, and the movement parameters of the pan-tilt head and/or the shooting parameters of the camera.
In one possible implementation, the processor 230 is further configured to:
and controlling the flight height of the unmanned aerial vehicle to rise from the first preset height to a second preset height so that the target object is located in the fixed focus frame of the camera, wherein the second preset height belongs to the flight range of the unmanned aerial vehicle.
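The rise from the first to the second preset height can be motivated geometrically: a higher altitude enlarges the camera's ground footprint until the target object fits inside the fixed focus frame. The sketch below assumes a pinhole camera pointing straight down; `half_fov_deg` and the safety margin are illustrative parameters, not part of the embodiment:

```python
import math

# Assumption: camera points straight down; the footprint half-width at
# altitude h is h * tan(half_fov), so the altitude needed to cover a
# target of a given ground extent follows by inverting that relation.

def height_for_frame(target_extent, half_fov_deg, margin=1.2):
    """Minimum altitude (same units as target_extent) at which a target of
    the given ground extent fits in the footprint, with a safety margin."""
    return margin * target_extent / (2 * math.tan(math.radians(half_fov_deg)))
```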
In one possible implementation, the processor 230 is specifically configured to:
acquiring the position information of the target object in the competition area;
and adjusting the flight parameters of the unmanned aerial vehicle according to the position information of the target object.
In one possible implementation, the processor 230 is specifically configured to:
and identifying the target object in the game area to obtain the position information of the target object in the game area.
In one possible implementation, the processor 230 is specifically configured to:
acquiring current image information through a camera of the unmanned aerial vehicle;
and when the current image information comprises the image of the target object, acquiring the position information of the target object in the competition area according to the image information of the competition area and the current image information.
In one possible implementation, the processor 230 is specifically configured to:
obtaining a sensing signal sent by sensing equipment on the target object;
and acquiring the position information of the target object in the competition area according to the sensing signal.
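One simple realization of the sensing-signal variant, assuming the sensing device reports a position fix directly (e.g. a positioning tag on the ball); the dictionary format and field names are assumptions:

```python
# Assumed signal format: {"x": ..., "y": ...} in competition-area coordinates.

def position_from_signal(signal, area):
    """Return the target position if the reported fix lies inside the
    competition area (x_min, y_min, x_max, y_max), else None."""
    x, y = signal["x"], signal["y"]
    x0, y0, x1, y1 = area
    return (x, y) if (x0 <= x <= x1 and y0 <= y <= y1) else None
```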
Optionally, the competition area includes a playing field and a peripheral extension area of the playing field.
The electronic device of the embodiment of the present invention may be configured to execute the technical solutions in the method embodiments, and the implementation principles and technical effects are similar, which are not described herein again.
Fig. 11 is a schematic structural diagram of the unmanned aerial vehicle provided in the embodiment of the present application. On the basis of the above embodiment, as shown in Fig. 11, the unmanned aerial vehicle 300 includes: a fuselage 310, a power supply battery 320 arranged on the fuselage 310, a power system 330, a camera 340, a pan-tilt 360 and a processor 350; the power supply battery 320 supplies power to the power system 330, and the power system 330 provides power for the unmanned aerial vehicle;
the processor 350 is configured to: acquire a preset competition area and a target object; control the camera 340 to generate a fixed focus frame adapted to the number of the target objects according to the number of the target objects; and, when the target object is detected to be located in the competition area, adjust parameters of at least one of the power system 330, the camera 340 and the pan-tilt 360 so that the target object is located in the fixed focus frame of the camera 340.
The unmanned aerial vehicle of the embodiment of the invention can be used for executing the technical scheme in the embodiment of the method, the realization principle and the technical effect are similar, and the details are not repeated here.
In one possible implementation, the processor 350 is specifically configured to:
when the number of the target objects is one, adjusting the focal length of the camera 340 to generate a first fixed focal frame;
when the number of the target objects is multiple, adjusting the focal length of the camera 340 to generate a second fixed-focus frame;
the shooting range of the second fixed focus frame is larger than that of the first fixed focus frame.
In one possible implementation, the processor 350 is specifically configured to:
when the number of the target objects is multiple, the focal length of the camera 340 is adjusted to generate the second fixed-focus frame according to a preset mapping relationship between the number of the target objects and the focal length value.
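The preset mapping between the number of target objects and the focal length value can be sketched as a lookup table with a wide-angle fallback; the specific focal values below are illustrative assumptions:

```python
# Illustrative preset mapping from number of targets to focal length (mm);
# more targets -> shorter focal length -> wider second fixed-focus frame.
FOCAL_BY_COUNT = {1: 50.0, 2: 35.0, 3: 28.0}
WIDE_DEFAULT = 24.0  # fallback for larger groups

def focal_for_targets(n):
    """Look up the focal length for n targets, falling back to wide angle."""
    return FOCAL_BY_COUNT.get(n, WIDE_DEFAULT)
```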
Optionally, the playing area comprises one or more of a ball playing area, a shooting playing area, a track and field playing area, a water playing area, and an ice and snow playing area.
Optionally, the target object is a basketball, a football or a soccer ball.
Optionally, the target object is one or more players.
In one possible implementation, the processor 350 is specifically configured to adjust parameters of the power system 330 so that the unmanned aerial vehicle and the target object are within a preset angle range.
In one possible implementation, the processor 350 is further configured to:
and when the target objects are multiple, performing composition according to the distribution of the target objects.
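The composition rule for multiple target objects is left open by the embodiment; a minimal stand-in, assumed for illustration, is to center the shot on the bounding box of the target positions:

```python
def composition_center(targets):
    """Center of the axis-aligned bounding box of the target positions --
    one simple composition rule, assumed for illustration."""
    xs = [t[0] for t in targets]
    ys = [t[1] for t in targets]
    return ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2)
```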
In one possible implementation, the processor 350 is further configured to: when there are multiple target objects, adjust parameters of at least one of the power system 330, the pan-tilt 360 and the camera 340 so that a maximum number of the target objects are located in the fixed focus frame of the camera 340.
In a possible implementation manner, the processor 350 is further configured to broadcast the captured picture live through a communication interface.
In one possible implementation, the processor 350 is specifically configured to:
if a target object switching instruction is received, determining the number of the switched target objects;
and when the difference between the number of the switched target objects and the number of the target objects before switching is greater than a preset threshold value, adjusting the focal length of the camera 340 to generate a fixed focus frame which is suitable for the number of the switched target objects.
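The switching logic above amounts to a hysteresis check: the fixed focus frame is regenerated only when the target count changes by more than the preset threshold. A sketch (the threshold value is an assumption):

```python
def should_refocus(n_before, n_after, threshold=2):
    """True when the change in target count exceeds the preset threshold,
    i.e. when a new fixed-focus frame should be generated."""
    return abs(n_after - n_before) > threshold
```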
Fig. 12 is a schematic structural diagram of the unmanned aerial vehicle according to an embodiment of the present application, and based on the foregoing embodiment, as shown in fig. 12, the power system 330 includes a fuselage power device 331 and a pan-tilt power device 332;
the processor 350 is specifically configured to control the fuselage power device 331 to adjust flight parameters of the fuselage 310, and/or control the pan/tilt power device 332 to adjust movement parameters of the pan/tilt head 360, and/or control the camera 340 to adjust shooting parameters.
Optionally, the flight parameters of the fuselage 310 include: a direction of movement of the body 310 and/or a pose of the body 310.
In a possible implementation manner, the processor 350 is specifically configured to control the body power device 331 to adjust a movement parameter of the body 310 in a first horizontal direction, or control the body power device 331 to adjust a movement parameter of the body 310 in a second horizontal direction, where the first horizontal direction and the second horizontal direction respectively correspond to two adjacent sides of the playing area when the playing area is square.
Optionally, the moving direction of the body 310 in the horizontal direction is the same as the moving direction of the target object in the horizontal direction.
In a possible implementation manner, the processor 350 is further configured to obtain a flight range of the unmanned aerial vehicle according to the position information of the competition area and the attribute information of the unmanned aerial vehicle;
according to the flight range of the unmanned aerial vehicle, controlling the unmanned aerial vehicle to move in the flight range of the unmanned aerial vehicle, so that the visual field range of the unmanned aerial vehicle is in the competition area.
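The flight range can be sketched as the competition area inset by the camera's ground footprint, so that the visual field stays inside the area; the downward-pointing camera model and parameter names are assumptions:

```python
import math

def flight_range(area, altitude, half_fov_deg):
    """Inset the competition area (x_min, y_min, x_max, y_max) by the
    footprint half-width so the camera's view stays inside the area."""
    x0, y0, x1, y1 = area
    m = altitude * math.tan(math.radians(half_fov_deg))
    return (x0 + m, y0 + m, x1 - m, y1 - m)

def clamp_to_range(pos, rng):
    """Keep the unmanned aerial vehicle's horizontal position in range."""
    x0, y0, x1, y1 = rng
    x, y = pos
    return (min(max(x, x0), x1), min(max(y, y0), y1))
```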
In a possible implementation manner, the processor 350 is configured to obtain the position information of the competition area before obtaining the flight range of the unmanned aerial vehicle according to the position information of the competition area and the attribute information of the unmanned aerial vehicle.
In a possible implementation manner, the processor 350 is specifically configured to control the unmanned aerial vehicle to acquire image information of an area to be divided;
acquiring the position information of the area to be divided according to the image information of the area to be divided; the position information of the region to be divided comprises boundary position information of the region to be divided;
and dividing the area to be divided to generate the competition area, and acquiring the position information of the competition area according to the position information of the area to be divided.
In a possible implementation manner, the processor 350 is specifically configured to control the unmanned aerial vehicle to acquire image information of a plurality of partitions in an area to be partitioned;
respectively acquiring the position information of the plurality of subareas according to the image information of the plurality of subareas;
acquiring the position information of the area to be divided according to the position information of the plurality of partitions;
and dividing the area to be divided to generate the competition area, and acquiring the position information of the competition area according to the position information of the area to be divided.
In a possible implementation manner, the processor 350 is specifically configured to receive boundary information input by a user;
and determining the position information of the competition area according to the boundary information.
Optionally, the boundary information includes: corner information of the playing area and/or edge information of the playing area.
Optionally, the attribute of the drone includes at least one of: the offset angle of the unmanned aerial vehicle, the rotation range of each axis of the pan/tilt head 360 and the zoom range of the camera 340.
In a possible implementation manner, the processor 350 is specifically configured to control the fuselage 310 to fly at a first preset height by controlling the fuselage power device 331, where the first preset height is within a flight range of the unmanned aerial vehicle;
when the fuselage 310 flies at the first preset height, the movement parameters of the fuselage 310 in the horizontal direction are adjusted by controlling the fuselage power device 331, and/or the movement parameters of the pan-tilt 360 are adjusted by controlling the pan-tilt power device 332, and/or the shooting parameters are adjusted by controlling the camera 340.
In a possible implementation manner, the processor 350 is further configured to control the flight height of the fuselage 310 to rise from the first preset height to a second preset height by controlling the fuselage power device 331, so that the target object is located in the fixed focus frame of the camera 340, wherein the second preset height belongs to the flight range of the unmanned aerial vehicle.
In one possible implementation, the processor 350 is specifically configured to:
acquiring the position information of the target object in the competition area;
and controlling the power system 330 to adjust the flight parameters of the unmanned aerial vehicle according to the position information of the target object.
In one possible implementation, the processor 350 is specifically configured to:
and identifying the target object in the game area to obtain the position information of the target object in the game area.
In one possible implementation, the processor 350 is specifically configured to:
controlling a camera 340 of the unmanned aerial vehicle to acquire current image information;
and when the current image information comprises the image of the target object, acquiring the position information of the target object in the competition area according to the image information of the competition area and the current image information.
In a possible implementation manner, the processor 350 is further configured to obtain a sensing signal sent by a sensing device on the target object;
and acquiring the position information of the target object in the competition area according to the sensing signal.
Optionally, the competition area includes a playing field and a peripheral extension area of the playing field.
The unmanned aerial vehicle of the embodiment of the invention can be used for executing the technical scheme in the embodiment of the method, the realization principle and the technical effect are similar, and the details are not repeated here.
An embodiment of the present invention further provides a computer storage medium, wherein the computer storage medium stores program instructions which, when executed, perform part or all of the steps of the match shooting method in the above embodiments.
Those of ordinary skill in the art will understand that: all or part of the steps for implementing the method embodiments may be implemented by hardware related to program instructions, and the program may be stored in a computer readable storage medium, and when executed, the program performs the steps including the method embodiments; and the aforementioned storage medium includes: various media capable of storing program codes, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, and an optical disk.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (88)

1. A match shooting method, applied to an unmanned aerial vehicle carrying a camera and a pan-tilt, the method comprising:
acquiring a preset competition area and a target object;
controlling the camera to generate fixed focus frames which are adaptive to the number of the target objects according to the number of the target objects;
when the target object is detected to be located in the competition area, adjusting parameters of at least one of the unmanned aerial vehicle, the camera and the pan-tilt so that the target object is located in a fixed focus frame of the camera.
2. The method according to claim 1, wherein the generating a fixed focus frame corresponding to the number of the target objects according to the number of the target objects comprises:
when the number of the target objects is one, adjusting the focal length of the camera to generate a first fixed focal frame;
when the number of the target objects is multiple, adjusting the focal length of the camera to generate a second fixed-focus frame;
the shooting range of the second fixed focus frame is larger than that of the first fixed focus frame.
3. The method of claim 2, wherein, when the number of the target objects is multiple, the adjusting the focal length of the camera to generate the second fixed-focus frame comprises:
and when the number of the target objects is multiple, adjusting the focal length of the camera to generate the second fixed-focus frame according to a preset mapping relation between the number of the target objects and the focal length value.
4. The method of any one of claims 1-3, wherein the playing area comprises one or more of a ball playing area, a shoot playing area, a track and field playing area, a water playing area, an ice and snow playing area.
5. The method of any one of claims 1-3, wherein the target is a basketball, a football, or a soccer ball.
6. A method as claimed in any one of claims 1 to 3, wherein the target is one or more players.
7. The method according to any one of claims 1 to 3, wherein the adjusting the parameters of the unmanned aerial vehicle so that the target object is located in the fixed focus frame of the camera comprises:
adjusting the parameters of the unmanned aerial vehicle, so that the unmanned aerial vehicle and the target object are in a preset angle range.
8. The method of claim 7, further comprising:
and when the target objects are multiple, performing composition according to the distribution of the target objects.
9. The method of claim 7, further comprising:
when the target object is a plurality of, adjust the parameter of at least one in unmanned aerial vehicle, camera and the cloud platform to make the most number of target object be located in the frame of focusing.
10. The method according to any one of claims 1-3, further comprising:
and carrying out live broadcast on the shot picture through a communication interface.
11. The method according to claim 1, wherein the generating of the fixed focus frame corresponding to the number of the target objects according to the number of the target objects comprises:
if a target object switching instruction is received, determining the number of the switched target objects;
and when the difference between the number of the switched target objects and the number of the target objects before switching is larger than a preset threshold value, adjusting the focal length of the camera to generate a fixed focus frame which is adaptive to the number of the switched target objects.
12. The method of claim 1, wherein said adjusting parameters of at least one of said drone, said camera, and said pan-tilt comprises:
adjusting at least one of flight parameters of the unmanned aerial vehicle, movement parameters of the pan-tilt and shooting parameters of the camera.
13. The method of claim 12, wherein the flight parameters of the drone include: the direction of motion of the drone and/or the pose of the drone.
14. The method of claim 13, wherein said adjusting flight parameters of said drone comprises:
and adjusting the movement parameters of the unmanned aerial vehicle in a first horizontal direction or in a second horizontal direction, wherein when the competition area is square, the first horizontal direction and the second horizontal direction respectively correspond to two adjacent sides of the competition area.
15. The method of claim 14, wherein the direction of movement of the drone in the horizontal direction is coincident with the direction of movement of the target object in the horizontal direction.
16. The method according to any one of claims 12-15, further comprising:
acquiring the flight range of the unmanned aerial vehicle according to the position information of the competition area and the attribute information of the unmanned aerial vehicle;
according to the flight range of the unmanned aerial vehicle, controlling the unmanned aerial vehicle to move in the flight range of the unmanned aerial vehicle so that the target object is located in the fixed focus frame of the camera.
17. The method of claim 16, wherein, before obtaining the flight range of the unmanned aerial vehicle according to the location information of the playing area and the attribute information of the unmanned aerial vehicle, the method further comprises: obtaining the location information of the playing area.
18. The method of claim 17, wherein obtaining the location information of the playing area comprises:
controlling the unmanned aerial vehicle to acquire image information of an area to be divided;
acquiring the position information of the area to be divided according to the image information of the area to be divided; the position information of the region to be divided comprises boundary position information of the region to be divided;
and dividing the area to be divided to generate the competition area, and acquiring the position information of the competition area according to the position information of the area to be divided.
19. The method of claim 17, wherein obtaining the location information of the playing area comprises:
controlling the unmanned aerial vehicle to acquire image information of a plurality of partitions in an area to be partitioned;
respectively acquiring the position information of the plurality of subareas according to the image information of the plurality of subareas;
acquiring the position information of the area to be divided according to the position information of the plurality of partitions;
and dividing the area to be divided to generate the competition area, and acquiring the position information of the competition area according to the position information of the area to be divided.
20. The method of claim 17, wherein obtaining the location information of the playing area comprises:
receiving boundary information input by a user;
and determining the position information of the competition area according to the boundary information.
21. The method of claim 20, wherein the boundary information comprises: corner information of the playing area and/or edge information of the playing area.
22. The method of claim 17, wherein the attributes of the drone include at least one of: the offset angle of the unmanned aerial vehicle, the rotation range of each axis of the pan-tilt, and the zoom range of the camera.
23. The method of claim 16, wherein the adjusting the parameters of the at least one of the drone, the camera, and the pan-tilt comprises:
controlling the unmanned aerial vehicle to fly at a first preset height, wherein the first preset height belongs to the flight range of the unmanned aerial vehicle;
and when the unmanned aerial vehicle flies at the first preset height, adjusting the movement parameter of the unmanned aerial vehicle in the horizontal direction, and the movement parameter of the pan-tilt and/or the shooting parameter of the camera.
24. The method of claim 23, further comprising:
and controlling the flight height of the unmanned aerial vehicle to rise from the first preset height to a second preset height so that the target object is located in the fixed focus frame of the camera, wherein the second preset height belongs to the flight range of the unmanned aerial vehicle.
25. The method of claim 1, wherein said adjusting flight parameters of said drone comprises:
acquiring the position information of the target object in the competition area;
and adjusting the flight parameters of the unmanned aerial vehicle according to the position information of the target object.
26. The method according to claim 25, wherein the obtaining the location information of the object within the playing area comprises:
and identifying the target object in the game area to obtain the position information of the target object in the game area.
27. The method according to claim 26, wherein the obtaining the location information of the object within the playing area comprises:
acquiring current image information through a camera of the unmanned aerial vehicle;
and when the current image information comprises the image of the target object, acquiring the position information of the target object in the competition area according to the image information of the competition area and the current image information.
28. The method according to claim 25, wherein the obtaining the location information of the object within the playing area comprises:
obtaining a sensing signal sent by sensing equipment on the target object;
and acquiring the position information of the target object in the competition area according to the sensing signal.
29. The method of claim 1, wherein the playing area comprises a playing field and a peripheral extension area of the playing field.
30. An electronic device, applied to an unmanned aerial vehicle carrying a camera and a pan-tilt, comprising:
a memory for storing a computer program;
a processor for executing the computer program, in particular for:
acquiring a preset competition area and a target object;
controlling the camera to generate fixed focus frames which are adaptive to the number of the target objects according to the number of the target objects;
when the target object is detected to be located in the competition area, adjusting parameters of at least one of the unmanned aerial vehicle, the camera and the pan-tilt so that the target object is located in a fixed focus frame of the camera.
31. The electronic device of claim 30, wherein the processor is specifically configured to:
when the number of the target objects is one, adjusting the focal length of the camera to generate a first fixed focal frame;
when the number of the target objects is multiple, adjusting the focal length of the camera to generate a second fixed-focus frame;
the shooting range of the second fixed focus frame is larger than that of the first fixed focus frame.
32. The electronic device of claim 31, wherein the processor is specifically configured to:
and when the number of the target objects is multiple, adjusting the focal length of the camera to generate the second fixed-focus frame according to a preset mapping relation between the number of the target objects and the focal length value.
33. The electronic device of any of claims 30-32, wherein the playing area comprises one or more of a ball playing area, a shoot playing area, a track and field playing area, a water playing area, and an ice and snow playing area.
34. The electronic device of any of claims 30-32, wherein the object is a basketball, a football, or a soccer ball.
35. The electronic device of any one of claims 30-32, wherein the target is one or more players.
36. The electronic device of any one of claims 30-32, wherein the processor is specifically configured to:
adjusting the parameters of the unmanned aerial vehicle, so that the unmanned aerial vehicle and the target object are in a preset angle range.
37. The electronic device of claim 36, wherein the processor is further configured to:
and when the target objects are multiple, performing composition according to the distribution of the target objects.
38. The electronic device of claim 36, wherein the processor is further configured to: when there are multiple target objects, adjust parameters of at least one of the unmanned aerial vehicle, the camera and the pan-tilt so that a maximum number of the target objects are located in the fixed focus frame.
39. The electronic device of any of claims 30-32, wherein the processor is further configured to broadcast the captured picture live through a communication interface.
40. The electronic device of claim 30, wherein the processor is specifically configured to:
if a target object switching instruction is received, determining the number of the switched target objects;
and when the difference between the number of the switched target objects and the number of the target objects before switching is larger than a preset threshold value, adjusting the focal length of the camera to generate a fixed focus frame which is adaptive to the number of the switched target objects.
41. The electronic device of claim 30, wherein the processor is specifically configured to:
adjusting at least one of flight parameters of the unmanned aerial vehicle, movement parameters of the pan-tilt and shooting parameters of the camera.
42. The electronic device of claim 41, wherein the flight parameters of the drone include: the direction of motion of the drone and/or the pose of the drone.
43. The electronic device of claim 42, wherein the processor is specifically configured to:
and adjusting the movement parameters of the unmanned aerial vehicle in a first horizontal direction or in a second horizontal direction, wherein when the competition area is square, the first horizontal direction and the second horizontal direction respectively correspond to two adjacent sides of the competition area.
44. The electronic device of claim 43, wherein a direction of movement of the drone in the horizontal direction is coincident with a direction of movement of the target object in the horizontal direction.
45. The electronic device of any of claims 41-44, wherein the processor is further configured to:
acquiring the flight range of the unmanned aerial vehicle according to the position information of the competition area and the attribute information of the unmanned aerial vehicle;
according to the flight range of the unmanned aerial vehicle, controlling the unmanned aerial vehicle to move in the flight range of the unmanned aerial vehicle so that the target object is located in the fixed focus frame of the camera.
46. The electronic device of claim 45, wherein the processor is further configured to obtain the position information of the competition area before obtaining the flight range of the unmanned aerial vehicle according to the position information of the competition area and the attribute information of the unmanned aerial vehicle.
47. The electronic device of claim 46, wherein the processor is specifically configured to:
controlling the unmanned aerial vehicle to acquire image information of an area to be divided;
acquiring the position information of the area to be divided according to the image information of the area to be divided; the position information of the region to be divided comprises boundary position information of the region to be divided;
and dividing the area to be divided to generate the competition area, and acquiring the position information of the competition area according to the position information of the area to be divided.
48. The electronic device of claim 46, wherein the processor is specifically configured to:
controlling the unmanned aerial vehicle to acquire image information of a plurality of partitions in an area to be partitioned;
respectively acquiring the position information of the plurality of subareas according to the image information of the plurality of subareas;
acquiring the position information of the area to be divided according to the position information of the plurality of partitions;
and dividing the area to be divided to generate the competition area, and acquiring the position information of the competition area according to the position information of the area to be divided.
49. The electronic device of claim 46, wherein the processor is specifically configured to:
receiving boundary information input by a user;
and determining the position information of the competition area according to the boundary information.
50. The electronic device of claim 49, wherein the boundary information comprises: corner information of the playing area and/or edge information of the playing area.
51. The electronic device of claim 46, wherein the attributes of the drone include at least one of: the offset angle of the unmanned aerial vehicle, the rotation range of each axis of the pan-tilt, and the zoom range of the camera.
52. The electronic device of claim 45, wherein the processor is specifically configured to:
controlling the unmanned aerial vehicle to fly at a first preset height, wherein the first preset height belongs to the flight range of the unmanned aerial vehicle;
and when the unmanned aerial vehicle flies at the first preset height, adjusting the movement parameter of the unmanned aerial vehicle in the horizontal direction, and the movement parameter of the pan-tilt and/or the shooting parameter of the camera.
53. The electronic device of claim 52, wherein the processor is further configured to:
and controlling the flight height of the unmanned aerial vehicle to rise from the first preset height to a second preset height so that the target object is located in the fixed focus frame of the camera, wherein the second preset height belongs to the flight range of the unmanned aerial vehicle.
54. The electronic device of claim 30, wherein the processor is specifically configured to:
acquiring the position information of the target object in the competition area;
and adjusting the flight parameters of the unmanned aerial vehicle according to the position information of the target object.
55. The electronic device of claim 54, wherein the processor is further configured to:
and identifying the target object in the game area to obtain the position information of the target object in the game area.
56. The electronic device of claim 55, wherein the processor is specifically configured to:
acquiring current image information through a camera of the unmanned aerial vehicle;
and when the current image information comprises the image of the target object, acquiring the position information of the target object in the competition area according to the image information of the competition area and the current image information.
57. The electronic device of claim 54, wherein the processor is further configured to:
obtaining a sensing signal sent by sensing equipment on the target object;
and acquiring the position information of the target object in the competition area according to the sensing signal.
58. The electronic device of claim 30, wherein the playing area comprises a playing area and a peripheral extension of the playing area.
59. An unmanned aerial vehicle, characterized by comprising a body, and a power supply battery, a power system, a camera, a holder and a processor arranged on the body; the power supply battery supplies power to the power system, and the power system provides flight power for the unmanned aerial vehicle;
the processor is used for acquiring a preset competition area and a target object; controlling the camera to generate fixed focus frames which are adaptive to the number of the target objects according to the number of the target objects; and when the target object is detected to be located in the competition area, adjusting at least one parameter of the power system, the camera and the holder so that the target object is located in a fixed focus frame of the camera.
60. A drone as claimed in claim 59, wherein the processor is specifically configured to:
when the number of the target objects is one, adjusting the focal length of the camera to generate a first fixed focal frame;
when the number of the target objects is multiple, adjusting the focal length of the camera to generate a second fixed-focus frame;
the shooting range of the second fixed focus frame is larger than that of the first fixed focus frame.
61. A drone as claimed in claim 60, wherein the processor is specifically configured to:
and when the number of the target objects is multiple, adjusting the focal length of the camera to generate the second fixed-focus frame according to a preset mapping relation between the number of the target objects and the focal length value.
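Claims 59-61 describe generating a fixed focus frame adapted to the number of targets via a preset mapping between target count and focal length value. A minimal sketch of such a mapping follows; the table values, the fallback focal length, and all names are illustrative assumptions, not values from the patent.

```python
# Sketch of claim 61: choose a focal length from the number of tracked
# targets via a preset mapping. All numeric values are hypothetical.
FOCAL_BY_TARGET_COUNT = {1: 50.0, 2: 35.0, 3: 28.0}  # mm, illustrative
WIDE_FOCAL = 24.0  # fallback for larger groups, illustrative

def focal_for_targets(n_targets: int) -> float:
    """Return the preset focal length for n_targets tracked objects."""
    if n_targets < 1:
        raise ValueError("need at least one target")
    # More targets -> shorter focal length -> wider fixed-focus frame,
    # consistent with claim 60's requirement that the multi-target
    # (second) frame cover a larger shooting range than the
    # single-target (first) frame.
    return FOCAL_BY_TARGET_COUNT.get(n_targets, WIDE_FOCAL)
```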
62. A drone as claimed in any one of claims 59 to 61, wherein the playing area includes one or more of a ball playing area, a shooting playing area, a track and field playing area, a water playing area, and a snow and ice playing area.
63. A drone according to any of claims 59 to 61, wherein the target is a basketball, a football or a soccer ball.
64. A drone as claimed in any of claims 59 to 61, wherein the target is one or more players.
65. A drone according to any of claims 59 to 61,
the processor is specifically configured to adjust parameters of the power system so that the unmanned aerial vehicle is within a preset angle range relative to the target object.
66. A drone according to claim 65, wherein the processor is further configured to:
and when the target objects are multiple, performing composition according to the distribution of the target objects.
67. The drone of claim 65, wherein the processor is further configured to:
when there are multiple target objects, adjust parameters of at least one of the power system, the camera and the holder so that the target objects are located in the fixed focus frame of the camera.
68. A drone as in any of claims 59-61, wherein the processor is further configured to live-stream the captured picture through the communication interface.
69. A drone as claimed in claim 59, wherein the processor is specifically configured to:
if a target object switching instruction is received, determining the number of the switched target objects;
and when the difference between the number of the switched target objects and the number of the target objects before switching is larger than a preset threshold value, adjusting the focal length of the camera to generate a fixed focus frame which is adaptive to the number of the switched target objects.
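Claim 69's re-framing rule (regenerate the fixed focus frame only when the target count changes by more than a preset threshold) could be sketched as a single predicate. The default threshold here is an illustrative assumption.

```python
# Sketch of claim 69: decide whether a target-switch instruction
# requires regenerating the fixed-focus frame, based on how much the
# target count changed. Threshold value is hypothetical.
def should_refocus(n_before: int, n_after: int, threshold: int = 1) -> bool:
    """True if |n_after - n_before| exceeds the preset threshold."""
    return abs(n_after - n_before) > threshold
```

This gate avoids re-zooming the camera on every minor change in the tracked set, which would otherwise produce visible focal-length jitter in the footage.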
70. A drone according to claim 59, wherein the power system includes a fuselage power plant and a pan-tilt power plant;
the processor is specifically configured to control the fuselage power device to adjust flight parameters of the fuselage, and/or control the pan-tilt power device to adjust movement parameters of the pan-tilt and/or control the camera to adjust shooting parameters.
71. A drone according to claim 70, wherein the flight parameters of the fuselage include: a direction of movement of the fuselage and/or a pose of the fuselage.
72. The drone of claim 71, wherein the processor is specifically configured to control the body power device to adjust a parameter of movement of the body in a first horizontal direction, or to control the body power device to adjust a parameter of movement of the body in a second horizontal direction, wherein the first horizontal direction and the second horizontal direction correspond to two adjacent sides of the playing area when the playing area is square.
73. A drone according to claim 72, wherein the direction of movement of the fuselage in the horizontal direction coincides with the direction of movement of the target object in the horizontal direction.
74. A drone according to any of claims 70-73,
the processor is further configured to acquire a flight range of the unmanned aerial vehicle according to the position information of the competition area and the attribute information of the unmanned aerial vehicle;
according to the flight range of the unmanned aerial vehicle, controlling the unmanned aerial vehicle to move in the flight range of the unmanned aerial vehicle, so that the visual field range of the unmanned aerial vehicle is in the competition area.
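Claim 74 derives a flight range from the playing area's position information and the drone's attributes so that the camera's field of view stays inside the area. One way to sketch this, under the simplifying assumptions of a rectangular area and a nadir-pointing camera with a pinhole footprint model (neither stated in the patent), is to shrink the area by the half-extent of the ground footprint:

```python
import math

# Sketch of claim 74: a horizontal flight range that keeps a
# straight-down camera's ground footprint inside a rectangular
# playing area. The footprint model is an illustrative assumption.
def flight_range(area_w: float, area_h: float,
                 altitude: float, vertical_fov_deg: float):
    """Return (w, h) of the rectangle the drone may fly over so that
    the camera footprint stays within the area at the given altitude."""
    # Half-extent of the ground footprint for a nadir-pointing camera.
    half_footprint = altitude * math.tan(math.radians(vertical_fov_deg) / 2)
    w = max(0.0, area_w - 2 * half_footprint)
    h = max(0.0, area_h - 2 * half_footprint)
    return w, h
```

A fuller version would fold in the gimbal's per-axis rotation range and the camera's zoom range (the attribute information of claims 51/80), which enlarge the admissible flight range.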
75. A drone according to claim 74, wherein the processor is configured to obtain the location information of the playing area before obtaining the flight range of the drone according to the location information of the playing area and the attribute information of the drone.
76. A drone according to claim 75,
the processor is used for controlling the unmanned aerial vehicle to acquire image information of an area to be divided;
acquiring the position information of the area to be divided according to the image information of the area to be divided; the position information of the area to be divided comprises boundary position information of the area to be divided;
and dividing the area to be divided to generate the competition area, and acquiring the position information of the competition area according to the position information of the area to be divided.
77. A drone according to claim 75,
the processor is specifically used for controlling the unmanned aerial vehicle to acquire image information of a plurality of subareas in an area to be divided;
respectively acquiring the position information of the plurality of subareas according to the image information of the plurality of subareas;
acquiring the position information of the area to be divided according to the position information of the plurality of subareas;
and dividing the area to be divided to generate the competition area, and acquiring the position information of the competition area according to the position information of the area to be divided.
78. A drone according to claim 75,
the processor is specifically configured to receive boundary information input by a user;
and determining the position information of the competition area according to the boundary information.
79. A drone as claimed in claim 78, wherein the boundary information includes: corner information of the playing area and/or edge information of the playing area.
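Claims 78-79 determine the playing area's position from user-entered boundary information (corner points and/or edges). A sketch of building an area description from corner points follows; the dictionary layout and the bounding-box derivation are illustrative assumptions.

```python
# Sketch of claims 78-79: construct the playing area's position
# information from user-supplied corner points (boundary information).
def area_from_corners(corners):
    """corners: ordered list of (x, y) points in field coordinates.
    Returns the corner list plus an axis-aligned bounding box."""
    if len(corners) < 3:
        raise ValueError("need at least three corner points")
    xs = [p[0] for p in corners]
    ys = [p[1] for p in corners]
    bbox = (min(xs), min(ys), max(xs), max(ys))
    return {"corners": list(corners), "bounding_box": bbox}
```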
80. A drone as claimed in claim 75, wherein the attributes of the drone comprise at least one of: an offset angle of the drone, a rotation range of each axis of the holder, and a zoom range of the camera.
81. A drone according to claim 74,
the processor is specifically configured to control the fuselage power device so that the fuselage flies at a first preset height, wherein the first preset height belongs to the flight range of the unmanned aerial vehicle;
and when the fuselage flies at the first preset height, adjust a movement parameter of the fuselage in the horizontal direction by controlling the fuselage power device, and/or adjust a movement parameter of the pan-tilt by controlling the pan-tilt power device, and/or adjust a shooting parameter by controlling the camera.
82. A drone according to claim 81,
the processor is further configured to control, through the fuselage power device, the flying height of the fuselage to rise from the first preset height to a second preset height, so that the target object is located in the fixed focus frame of the camera, wherein the second preset height belongs to the flight range of the unmanned aerial vehicle.
83. A drone as claimed in claim 59, wherein the processor is specifically configured to:
acquiring the position information of the target object in the competition area;
and adjusting, through the power system, the flight parameters of the drone according to the position information of the target object.
84. A drone as claimed in claim 83, wherein the processor is specifically configured to:
and identifying the target object in the game area to obtain the position information of the target object in the game area.
85. A drone as claimed in claim 84, wherein the processor is specifically configured to:
controlling a camera of the unmanned aerial vehicle to acquire current image information;
and when the current image information comprises the image of the target object, acquiring the position information of the target object in the competition area according to the image information of the competition area and the current image information.
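Claims 56 and 85 locate the target in the playing area from the current image and the area's image information. A real system would estimate a homography between image and field; the flat linear scale model below is a simplifying assumption used only to illustrate the pixel-to-field mapping.

```python
# Sketch of claims 56/85: map a target's pixel position into playing
# area coordinates. Linear scaling assumes a top-down view covering
# the whole field; this model is an illustrative assumption.
def target_field_position(pixel_xy, image_size, field_size):
    """Linearly map pixel (x, y) into field coordinates (meters)."""
    px, py = pixel_xy
    iw, ih = image_size
    fw, fh = field_size
    return (px / iw * fw, py / ih * fh)
```

The resulting field position would then drive the flight-parameter adjustment of claims 54/83.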
86. A drone according to claim 83,
the processor is further used for obtaining a sensing signal sent by sensing equipment on the target object;
and acquiring the position information of the target object in the competition area according to the sensing signal.
87. A drone as in claim 59, wherein the playing area includes a playing area and a peripheral extension of the playing area.
88. A computer storage medium, characterized in that the storage medium has stored therein a computer program which, when executed, implements a race shooting method according to any one of claims 1-29.
CN201980033571.6A 2019-08-21 2019-08-21 Match shooting method, electronic equipment, unmanned aerial vehicle and storage medium Pending CN112154654A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/101818 WO2021031159A1 (en) 2019-08-21 2019-08-21 Match photographing method, electronic device, unmanned aerial vehicle and storage medium

Publications (1)

Publication Number Publication Date
CN112154654A 2020-12-29

Family

ID=73891517

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980033571.6A Pending CN112154654A (en) 2019-08-21 2019-08-21 Match shooting method, electronic equipment, unmanned aerial vehicle and storage medium

Country Status (2)

Country Link
CN (1) CN112154654A (en)
WO (1) WO2021031159A1 (en)


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108473201B (en) * 2015-12-29 2021-11-05 乐天集团股份有限公司 Unmanned aerial vehicle retraction system, unmanned aerial vehicle retraction method, and recording medium
CN106584516A (en) * 2016-11-01 2017-04-26 河池学院 Intelligent photographing robot for tracing specified object
US20180139374A1 (en) * 2016-11-14 2018-05-17 Hai Yu Smart and connected object view presentation system and apparatus
CN110109469A (en) * 2019-03-19 2019-08-09 南京理工大学泰州科技学院 It is a kind of with color, identification, positioning, following function quadrotor drone control system
CN110141845A (en) * 2019-06-10 2019-08-20 湖南大狗科技有限公司 A kind of cycle racing rail safety management monitoring system based on unmanned plane

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101014097A (en) * 2006-10-17 2007-08-08 马涛 Active infrared tracking system
CN102158650A (en) * 2007-10-17 2011-08-17 索尼株式会社 Image composition determining apparatus, image composition determining method
CN102014248A (en) * 2009-09-04 2011-04-13 华晶科技股份有限公司 Auto-focusing method and module and image pick-up device employing the method
CN103261939A (en) * 2010-12-09 2013-08-21 富士胶片株式会社 Image capture device and primary photographic subject recognition method
CN109069903A (en) * 2016-02-19 2018-12-21 沛勒尔维珍公司 System and method for monitoring the object in sport event
CN109792478A (en) * 2016-09-01 2019-05-21 迪尤莱特公司 System and method based on focus target information adjustment focus
CN109479088A (en) * 2017-06-02 2019-03-15 深圳市大疆创新科技有限公司 The system and method for carrying out multiple target tracking based on depth machine learning and laser radar and focusing automatically
CN208874651U (en) * 2018-11-07 2019-05-17 杭州晨安科技股份有限公司 Double holder intelligent cameras

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112799422A (en) * 2021-04-06 2021-05-14 众芯汉创(北京)科技有限公司 Unmanned aerial vehicle flight control method and device for power inspection
CN112799422B (en) * 2021-04-06 2021-07-13 国网江苏省电力有限公司泰州供电分公司 Unmanned aerial vehicle flight control method and device for power inspection
CN113163254A (en) * 2021-04-06 2021-07-23 广州津虹网络传媒有限公司 Live image processing method and device and electronic equipment
CN113784046A (en) * 2021-08-31 2021-12-10 北京安博盛赢教育科技有限责任公司 Follow-up shooting method, device, medium and electronic equipment
CN113810625A (en) * 2021-10-15 2021-12-17 江苏泰扬金属制品有限公司 Cloud service system for resource allocation

Also Published As

Publication number Publication date
WO2021031159A1 (en) 2021-02-25

Similar Documents

Publication Publication Date Title
CN112154654A (en) Match shooting method, electronic equipment, unmanned aerial vehicle and storage medium
CN105242685B (en) A kind of accompanying flying unmanned plane system and method
CN110799921A (en) Shooting method and device and unmanned aerial vehicle
CN105242684A (en) Unmanned plane aerial photographing system and method of photographing accompanying aircraft
CN104125372B (en) Target photoelectric search and detection method
CN205353774U (en) Accompany unmanned aerial vehicle system of taking photo by plane of shooing aircraft
CN108521864B (en) Imaging control method, imaging device and unmanned aerial vehicle
JP2017072986A (en) Autonomous flying device, control method and program of autonomous flying device
CN110651466A (en) Shooting control method and device for movable platform
US11798172B2 (en) Maximum temperature point tracking method, device and unmanned aerial vehicle
WO2020233682A1 (en) Autonomous circling photographing method and apparatus and unmanned aerial vehicle
WO2020172800A1 (en) Patrol control method for movable platform, and movable platform
WO2019227333A1 (en) Group photograph photographing method and apparatus
WO2020227998A1 (en) Image stability augmentation control method, photography device and movable platform
WO2020136632A1 (en) A compact interval sweeping imaging system and method
JP2019216343A (en) Determination device, moving body, determination method, and program
CN110351483A (en) A kind of adaptive zoom monitoring unmanned platform of more camera lenses and control method
CN113271409B (en) Combined camera, image acquisition method and aircraft
CN111630838B (en) Specifying device, imaging system, moving object, specifying method, and program
CN113597754A (en) Method and device for acquiring match picture and method and device for controlling shooting device
CN112334853A (en) Course adjustment method, ground end equipment, unmanned aerial vehicle, system and storage medium
CN110392891A (en) Mobile's detection device, control device, moving body, movable body detecting method and program
CN214776631U (en) Aircraft combined camera and aircraft
CN111433819A (en) Target scene three-dimensional reconstruction method and system and unmanned aerial vehicle
CN112166597A (en) Image processing method, device and movable platform

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20201229