CN112492215B - Shooting control method and device and electronic equipment - Google Patents

Shooting control method and device and electronic equipment

Info

Publication number
CN112492215B
CN112492215B (application CN202011449662.9A)
Authority
CN
China
Prior art keywords
input
image
track
micro
shooting
Prior art date
Legal status
Active
Application number
CN202011449662.9A
Other languages
Chinese (zh)
Other versions
CN112492215A (en)
Inventor
周煜泽
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202011449662.9A
Publication of CN112492215A
Application granted
Publication of CN112492215B
Legal status: Active

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/62 - Control of parameters via user interfaces
    • H04N23/63 - Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 - Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 - Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters

Abstract

The application discloses a shooting control method and device and electronic equipment, and belongs to the technical field of communication. The method can solve the problem that, in the prior art, the operation process of shooting a panoramic image or video whose composition meets the user's requirement is cumbersome, and includes the following steps: receiving a first input; and in response to the first input, controlling a micro pan-tilt in the electronic equipment to move along a first motion path, and controlling a camera disposed on the micro pan-tilt to capture images while the micro pan-tilt moves along the first motion path. The first motion path is a motion path preset by a user and is composed of at least two shooting points; different shooting points correspond to different position information, and the position information corresponding to one shooting point indicates a moving position of the micro pan-tilt in the electronic equipment. The method and the device are suitable for scenes of shooting panoramic images or videos.

Description

Shooting control method and device and electronic equipment
Technical Field
The application belongs to the technical field of communication, and particularly relates to a shooting control method and device and electronic equipment.
Background
With the development of electronic devices, users use the shooting function of electronic devices more and more frequently. Currently, when using the shooting function of an electronic device, a user can move the electronic device so that it shoots a panoramic image or video whose composition meets the user's requirement.
However, in the above method, the process of moving the electronic device by hand involves a large operation error. For example, when the user wants the movement track of the electronic device to be circular, the actual movement track may turn out to be elliptical because of the user's operation error. As a result, the user may need to trigger the electronic device to shoot multiple times before obtaining a panoramic image or video whose composition meets the requirement, so the operation process of shooting such a panoramic image or video is cumbersome.
Disclosure of Invention
The embodiment of the application aims to provide a shooting control method, a shooting control device and electronic equipment, which can solve the current problem that the operation process of shooting a panoramic image or video whose composition meets the user's requirement is cumbersome.
In order to solve the technical problem, the present application is implemented as follows:
In a first aspect, an embodiment of the present application provides a shooting control method, including: receiving a first input; and in response to the first input, controlling a micro pan-tilt in the electronic equipment to move along a first motion path, and controlling a camera disposed on the micro pan-tilt to capture images while the micro pan-tilt moves along the first motion path. The first motion path is a motion path set in advance by a user trigger and is composed of at least two shooting points; different shooting points correspond to different position information, and the position information corresponding to one shooting point indicates a moving position of the micro pan-tilt in the electronic equipment.
In a second aspect, an embodiment of the present application provides a shooting control apparatus, including a receiving module and a control module. The receiving module is configured to receive a first input. The control module is configured to, in response to the first input received by the receiving module, control a micro pan-tilt in the electronic equipment to move along a first motion path, and control a camera disposed on the micro pan-tilt to capture images while the micro pan-tilt moves along the first motion path. The first motion path is a motion path set in advance by a user trigger and is composed of at least two shooting points; different shooting points correspond to different position information, and the position information corresponding to one shooting point indicates a moving position of the micro pan-tilt in the electronic equipment.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium on which a program or instructions are stored, which when executed by a processor, implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, and the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In an embodiment of the present application, a first input may be received; in response to the first input, a micro pan-tilt in the electronic equipment is controlled to move along a first motion path, and a camera disposed on the micro pan-tilt is controlled to capture images while the micro pan-tilt moves along the first motion path. The first motion path is a motion path set in advance by a user trigger and is composed of at least two shooting points; different shooting points correspond to different position information, and the position information corresponding to one shooting point indicates a moving position of the micro pan-tilt in the electronic equipment. With this scheme, after the first input of the user is received, the micro pan-tilt in the electronic equipment can be controlled to move along the first motion path preset by the user, and the camera disposed on the micro pan-tilt can be controlled to capture images during that movement, so that the camera captures images along a path that meets the user's requirement. This ensures that the camera can shoot a panoramic image or video whose composition meets the user's requirement. Therefore, the shooting control method provided by the embodiment of the application can obtain a panoramic image or video meeting the user's requirement without triggering shooting multiple times, which simplifies the operation process of shooting such a panoramic image or video.
Drawings
Fig. 1 is a schematic diagram of a shooting control method provided in an embodiment of the present application;
fig. 2 is one of schematic interfaces applied to a shooting control method according to an embodiment of the present application;
fig. 3 is a second schematic interface diagram of an application of the photographing control method according to the embodiment of the present application;
fig. 4 is a third schematic interface diagram of an application of the shooting control method according to the embodiment of the present application;
fig. 5 is a fourth schematic interface diagram of an application of the shooting control method according to the embodiment of the present application;
fig. 6 is a fifth schematic interface diagram of an application of the shooting control method according to the embodiment of the present application;
fig. 7 is a sixth schematic interface diagram of an application of the shooting control method according to the embodiment of the present application;
fig. 8 is a seventh schematic interface diagram of an application of the shooting control method according to the embodiment of the present application;
fig. 9 is a schematic diagram of a photographing control apparatus in an embodiment of the present application;
FIG. 10 is a schematic diagram of an electronic device in an embodiment of the application;
fig. 11 is a hardware schematic diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and the like in the description and claims of the present application are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that the data so used are interchangeable where appropriate, so that the embodiments of the application can be implemented in orders other than those illustrated or described herein. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
The embodiment of the application provides a shooting control method, a shooting control device and electronic equipment, which can receive a first input; in response to the first input, control a micro pan-tilt in the electronic equipment to move along a first motion path, and control a camera disposed on the micro pan-tilt to capture images while the micro pan-tilt moves along the first motion path. The first motion path is a motion path set in advance by a user trigger and is composed of at least two shooting points; different shooting points correspond to different position information, and the position information corresponding to one shooting point indicates a moving position of the micro pan-tilt in the electronic equipment. With this scheme, after the first input of the user is received, the micro pan-tilt in the electronic equipment can be controlled to move along the first motion path preset by the user, and the camera disposed on the micro pan-tilt can be controlled to capture images during that movement, so that the camera captures images along a path that meets the user's requirement, and a panoramic image or video whose composition meets the user's requirement can be shot. Therefore, the shooting control method provided by the embodiment of the application can obtain a panoramic image or video meeting the user's requirement without triggering shooting multiple times, which simplifies the operation process of shooting such a panoramic image or video.
Some of the nouns or terms referred to in the claims and the specification of the present application will be explained first.
Micro pan-tilt: a component in the electronic equipment used to mount and support the camera. The micro pan-tilt can drive the camera mounted on it (including the lens group and the photosensitive element) to move within a certain travel range.
Slow motion video: a video whose playback frame rate is lower than its capture frame rate. For example, if the capture frame rate when recording a slow motion video is 60 frames/second, the playback frame rate of the slow motion video may be 20 frames/second.
Delayed (time-lapse) video: a video whose playback frame rate is higher than its capture frame rate. For example, if the capture frame rate when recording a delayed video is 60 frames/second, the playback frame rate of the delayed video may be 120 frames/second.
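As a purely illustrative sketch (not part of the patent text; the function name is an assumption), the relationship between capture and playback frame rates above can be expressed as follows in Python:

def playback_effect(capture_fps: float, playback_fps: float) -> str:
    """Classify the effect of playing frames captured at capture_fps back at playback_fps."""
    if playback_fps < capture_fps:
        return "slow motion"   # e.g. captured at 60 fps, played at 20 fps: 3x slower
    if playback_fps > capture_fps:
        return "time-lapse"    # e.g. captured at 60 fps, played at 120 fps: 2x faster
    return "real time"

print(playback_effect(60, 20))    # slow motion
print(playback_effect(60, 120))   # time-lapse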
The shooting control method, the shooting control device, and the electronic device provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
The shooting control method provided by the embodiment of the application can be applied to scenes of shooting panoramic images or videos through the electronic equipment with the micro-holder.
In the embodiment of the application, it is assumed that a camera in the electronic device is mounted on the micro pan-tilt. If a user wants to shoot a panoramic image or video with a certain composition through the camera, the user can set a motion path of the micro pan-tilt in advance by a trigger input. The motion path includes at least two shooting points, different shooting points correspond to different position information, and the position information corresponding to one shooting point indicates a moving position of the micro pan-tilt in the electronic device. Then, the user can trigger the shooting control device to control the micro pan-tilt to move along the motion path and, during the movement, to control the camera to capture images, obtaining at least two images. Since the at least two images are taken at different shooting points, the shooting subject may be located at different positions in different images. Therefore, after the shooting control device synthesizes the at least two images, a panoramic image or video with a certain composition can be obtained. In this way, a panoramic image or video with a specific composition can be shot accurately without the user manually moving the electronic device, which simplifies the operation process of shooting a panoramic image or video that meets the user's requirement.
In the embodiment of the present application, the composition of a video (or panoramic image) may refer to the layout of the shooting subject within the video (or panoramic image).
It should be noted that, in an actual implementation of the embodiment of the present application, the shooting control method provided in the embodiment of the present application may be executed before triggering to acquire an image, or the shooting control method provided in the embodiment of the present application may also be executed in the process of acquiring an image, which may specifically be determined according to an actual use requirement, and the embodiment of the present application is not limited.
As shown in fig. 1, an embodiment of the present application provides a shooting control method, which may include steps 101 and 102 described below.
Step 101, the shooting control device receives a first input.
Optionally, in this embodiment of the present application, it is assumed that a micro pan-tilt is disposed in the electronic device and a camera is disposed on the micro pan-tilt. The shooting control device may receive the first input before controlling the camera disposed on the micro pan-tilt to capture images (scene 1); or the shooting control device may receive the first input while controlling the camera disposed on the micro pan-tilt to capture images (scene 2). This may be determined according to actual use requirements, and the embodiment of the present application is not limited.
For example, in scene 1, the first input may be an input to a panoramic image control or a video capture control in a shooting preview interface. In scene 2, the first input may be an input to a shortcut control displayed on the shooting interface, where the shortcut control is used to trigger continuing to shoot the panoramic image or video according to a preset path.
It should be noted that, in an actual implementation, the first input may also be a voice input or a preset gesture input, which may be specifically determined according to an actual use requirement, and the embodiment of the present application is not limited.
And 102, in response to the first input, the shooting control device controls the micro pan-tilt in the electronic equipment to move along the first motion path, and controls the camera disposed on the micro pan-tilt to capture images while the micro pan-tilt moves along the first motion path.
The first motion path is a motion path set in advance by a user trigger and is composed of at least two shooting points; different shooting points correspond to different position information, and the position information corresponding to one shooting point indicates a moving position of the micro pan-tilt in the electronic equipment.
It is understood that, when the electronic device and the photographed object are kept relatively still, for each of the at least two photographing points, the position information corresponding to one photographing point is also used to indicate the position of the photographing subject in the image captured when the camera is at the one photographing point.
For example, assume that the electronic device and the monkey (i.e., the subject to be photographed) remain relatively stationary, that the first motion path includes 3 shooting points, namely the shooting point 20, the shooting point 21 and the shooting point 22, that the shooting subject is the image of the monkey 23 shown in (a) of fig. 2, and that the angle of view of the camera disposed on the micro pan-tilt is shown as the filled area 24. Then, as shown in (b) of fig. 2, the images captured by the camera disposed on the micro pan-tilt at the shooting points 20, 21 and 22 are the image 25, the image 26 and the image 27, respectively, and it can be seen that the image of the monkey 23 is located at different positions in the image 25, the image 26 and the image 27.
In the embodiment of the application, when the electronic equipment is kept still, the motion of the micro pan-tilt is very smooth and stable, so the exposure time and sharpness of the images shot by the camera disposed on the micro pan-tilt while the micro pan-tilt moves along the first motion path can be guaranteed.
In the embodiment of the present application, the first motion path may be any motion path within the stroke range of the micro pan-tilt (the stroke range 28 shown in (b) of fig. 2), for example the motion path formed by the shooting point 20 → the shooting point 21 → the shooting point 22 shown in (b) of fig. 2.
It should be noted that, in this embodiment of the application, when the first input is received, if the current position of the micro cloud platform is different from the moving position indicated by the first shooting point in the first motion path, the micro cloud platform may be controlled to move to the moving position first, and then the micro cloud platform is controlled to move along the first motion path.
For convenience of description, in the following embodiments, the camera disposed on the micro pan-tilt is referred to as the target camera; the two terms have the same meaning and may be used interchangeably.
Optionally, in this embodiment of the application, the shooting control device may control the target camera to acquire images at all or part of the shooting points of the first motion path.
Optionally, in this embodiment of the application, after the shooting control device controls the micro-cloud platform to move along the first movement path, the target camera may be controlled to stop collecting images, or the target camera may be controlled to continue collecting images, which may specifically be determined according to actual use requirements, and this embodiment of the application is not limited.
Optionally, in this embodiment of the application, the shooting control device may synthesize the images captured by the target camera while the micro pan-tilt moves along the first motion path into a panoramic image or a video.
According to the shooting control method provided by the embodiment of the application, after the first input of the user is received, the micro cloud platform in the electronic equipment is controlled to move along the first motion path preset by the user, and in the process of controlling the micro cloud platform to move along the first motion path, the camera arranged on the micro cloud platform is controlled to collect images, so that the camera can collect images on the path meeting the user requirements, and therefore a panoramic image or a video with a composition meeting the user requirements can be shot. Therefore, the shooting control method provided by the embodiment of the application can obtain the panoramic image and the video meeting the user requirements without triggering multiple times of shooting, so that the operation process of shooting the panoramic image and the video meeting the user requirements can be simplified.
Optionally, in this embodiment of the application, the user may preset one or more motion paths. In a case where a plurality of movement paths are set in advance, the photographing control means may take a movement path used last time as a first movement path after receiving the first input; alternatively, the photographing control apparatus may display a selection frame for the user to trigger selection of the first movement path from the plurality of movement paths. The method can be determined according to actual use requirements, and the embodiment of the application is not limited.
For example, in the embodiment of the present application, before the step 101, the shooting control method provided in the embodiment of the present application may further include the following steps 103 to 105.
Step 103, the shooting control device receives a second input.
The second input may be a sliding input (one possible implementation manner), or an input to at least two preset images of M preset images in the shooting preview interface, where each preset image indicates a position within the stroke range of the micro-pan-tilt, and M is an integer greater than 1 (another possible implementation manner).
Optionally, in this embodiment of the application, in the one possible implementation manner described above, the second input may be a sliding input performed by the user in the air, or a sliding input performed by the user on the shooting preview interface. This may be determined according to actual use requirements, and the embodiment of the present application is not limited.
Optionally, in this embodiment of the application, when the second input is a sliding input of the user on the shooting preview interface, the sliding input may specifically be a dragging input of the user to a path setting control in the shooting preview interface, where the path setting control is used to indicate the micro pan-tilt.
For example, the shooting control device may display the path setting control within a range identifier in the shooting preview interface, where the range identifier is used to indicate the stroke range of the micro pan-tilt. The user may drag the path setting control within the range identifier.
It should be noted that an identifier in the embodiments of the present application is used to indicate information such as text, a symbol or an image, and a control or another container may serve as a carrier for displaying the information; identifiers include, but are not limited to, text identifiers, symbol identifiers and image identifiers.
In the embodiment of the application, the shape of the range identifier is the same as the shape of the stroke range of the micro pan-tilt.
In the embodiment of the application, a mapping relationship between the range identifier and the stroke range of the micro pan-tilt may be preset. Assuming that the area of the stroke range of the micro pan-tilt is a and the area of the range identifier is b, the mapping proportionality coefficient between the area a and the area b may be k, where a, b and k are all greater than 0.
For example, assuming that the stroke range of the micro pan-tilt is a square with an area of 4 square centimeters and k is 2, the range identifier may be a square with an area of 2 square centimeters.
It can be understood that, in the embodiment of the present application, the position in the range identifier corresponds to the moving position in the stroke range of the micro-pan-tilt one-to-one, and different positions in the range identifier correspond to different moving positions in the stroke range of the micro-pan-tilt.
For example, as shown in fig. 3, assuming that the stroke range 30 of the micro pan-tilt includes 4 moving positions, namely a moving position A, a moving position B, a moving position C and a moving position D, the range identifier 31 also includes 4 positions, namely a position a1 corresponding to the moving position A, a position b1 corresponding to the moving position B, a position c1 corresponding to the moving position C, and a position d1 corresponding to the moving position D. In this way, when the shooting control device determines a position in the range identifier, it can determine, based on that position, the corresponding moving position within the stroke range of the micro pan-tilt.
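A rough illustration of this one-to-one correspondence is sketched below in Python; the rectangle representation and the names (Rect, map_to_travel_range) are assumptions introduced for the example, not taken from the patent. The mapping simply places a point at the same relative position inside the travel range as it occupies inside the range identifier.

from dataclasses import dataclass

@dataclass
class Rect:
    x: float       # top-left corner
    y: float
    width: float
    height: float

def map_to_travel_range(point: tuple[float, float],
                        identifier: Rect,
                        travel_range: Rect) -> tuple[float, float]:
    """Convert a point inside the range identifier to a pan-tilt moving position."""
    px, py = point
    # Normalize the point inside the identifier to [0, 1] x [0, 1] ...
    u = (px - identifier.x) / identifier.width
    v = (py - identifier.y) / identifier.height
    # ... then place it at the same relative position inside the travel range.
    return (travel_range.x + u * travel_range.width,
            travel_range.y + v * travel_range.height)

# Example with assumed dimensions: identifier 20 mm wide, travel range 4 mm wide.
identifier = Rect(0, 0, 20.0, 20.0)
travel = Rect(0, 0, 4.0, 4.0)
print(map_to_travel_range((10.0, 5.0), identifier, travel))  # (2.0, 1.0)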
Optionally, in this embodiment of the application, the stroke range of the micro-pan-tilt refers to a stroke range of the micro-pan-tilt in a plane perpendicular to the optical axis direction of the target camera.
Optionally, in this embodiment of the application, the M preset images may be images pre-stored in the electronic device, or may be captured images of the current environment, for example images captured by the target camera at M positions within the stroke range of the micro pan-tilt upon receiving a user input that triggers display of the shooting preview interface. This may be determined according to actual use requirements, and the embodiment of the present application is not limited.
Optionally, in this embodiment of the application, if the path setting control and/or the M preset images are not displayed in the shooting preview interface, before the user performs the second input, the user may trigger the display of the path setting control and/or the M preset images through the first target input.
For example, the first target input may be an input by which the user enables a path setting function in a setting interface of the camera application; after the shooting control device receives the first target input, the path setting control and/or the M preset images may be displayed in the shooting preview interface. Of course, after receiving the first target input, the shooting control device may first display, in the shooting preview interface, an entry identifier for triggering display of the M preset images, and after the user inputs on the entry identifier, the shooting control device may then display the M preset images in the shooting preview interface.
Optionally, in this embodiment of the application, after the shooting control device displays the path setting control, the user may long-press the path setting control to trigger the shooting control device to display the range identifier, and then trigger setting of the motion path of the micro pan-tilt based on the path setting control and the range identifier.
For example, as shown in (a) of fig. 4, the path setting control 40 is displayed on the photographing preview interface, the user can long-press on the path setting control 40, and then as shown in (b) of fig. 4, the photographing controlling apparatus can display the range identification 41 on the photographing preview interface.
And 104, responding to the second input by the shooting control device, and acquiring target track information.
The target track information may be input track information of the second input, and the input track information may include at least one of: (1) track direction information of the second input, (2) track size information of the second input, and (3) track position information of the second input.
And 105, determining a first motion path corresponding to the target track information by the shooting control device according to the target track information.
Step 105 is described in detail below.
Optionally, in this embodiment of the application, in the above one possible implementation manner, the target track information is different, and the first motion path corresponding to the target track information may also be different.
In one example, when the target trajectory information includes trajectory direction information of the second input, a direction of the first motion path is the same as a trajectory direction of the second input.
In one example, when the target trajectory information includes trajectory size information of the second input, the shape of the first motion path is the same as the trajectory shape of the second input, and the area of the first motion path is L times the trajectory area of the second input, where L is greater than 0. For example, L may be the same as the proportionality coefficient k between the area of the range of travel of the micro-cloud stage and the area of the range identification.
In one example, when the target track information includes second input track position information, if the target track information includes second input track start position information, the start position of the first motion path is a position corresponding to the second input track start position within the stroke range of the micro-pan/tilt head. And if the target track information comprises second input track end position information, the end position of the first motion path is a position corresponding to the second input track end position within the stroke range of the micro-holder.
In the embodiment of the present application, the description above takes, as an example, the case where the target track information includes one of the above (1), (2) and (3). In practical implementation, the target track information may include two or more of the above (1), (2) and (3), which may be determined according to practical use requirements, and the embodiment of the present application is not limited.
Exemplarily, taking the target track information including (1), (2) and (3) as an example, fig. 5 (a) is a schematic diagram of the range identifier 50, and the input track of the second input is the input track shown by the arrow 51. Fig. 5 (b) is a schematic view of the stroke range 52 of the micro pan-tilt. If the mapping scale factor k is 0.5, the first motion path corresponding to the target track information is the motion path indicated by the arrow 53 shown in (b) of fig. 5. It is to be understood that each cell in fig. 5 represents a position (or a moving position). It can be seen that the direction of the first motion path is the same as the direction of the track of the second input; the shape of the first motion path is the same as the shape of the track of the second input; the length of the first motion path is 0.5 times the length of the track of the second input; the start position of the first motion path is the position within the stroke range 52 corresponding to the start position of the track of the second input; and the end position of the first motion path is the position within the stroke range 52 corresponding to the end position of the track of the second input.
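The example above can be summarized by the following sketch (assumed names, not from the patent): each sampled point of the second input's track is scaled by the mapping coefficient and anchored at the mapped start position, which preserves the track's direction and shape while scaling its size.

def trajectory_to_motion_path(trajectory, scale, start_position):
    """trajectory: ordered (x, y) samples of the drag input;
    scale: mapping coefficient between input track and motion path (e.g. 0.5);
    start_position: moving position in the travel range for the track's start."""
    if not trajectory:
        return []
    x0, y0 = trajectory[0]
    sx, sy = start_position
    path = []
    for x, y in trajectory:
        # Keep the shape and direction of the input track, scale its size,
        # and anchor it at the mapped start position.
        path.append((sx + (x - x0) * scale, sy + (y - y0) * scale))
    return path

drag = [(0, 0), (2, 0), (4, 0), (4, 2)]          # sampled second-input track
print(trajectory_to_motion_path(drag, 0.5, (1, 1)))
# [(1.0, 1.0), (2.0, 1.0), (3.0, 1.0), (3.0, 2.0)]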
Optionally, in this embodiment of the application, in the other possible implementation manner described above, the second input may include at least two sub-inputs, and one sub-input is used to trigger selection of one preset image of the M preset images, so that at least two preset images are selected. The target track information may specifically include track position information of the at least two sub-inputs, where the track position information of each sub-input indicates the preset image that the sub-input selects.
In the embodiment of the application, since one preset image indicates one moving position of the micro pan-tilt, selecting at least two preset images through the second input is equivalent to selecting at least two moving positions within the stroke range of the micro pan-tilt. Furthermore, after the moving positions corresponding to the at least two preset images are connected in the order in which the preset images were selected, the resulting motion path is the first motion path.
It should be noted that, in this embodiment of the application, assuming that the moving positions indicated by the preset image 1, the preset image 2, and the preset image 3 in the M preset images are on the same straight line, if the user triggers and selects the preset image 1 and the preset image 3, the first motion path includes the moving positions indicated by the preset image 1, the preset image 2, and the preset image 3.
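A minimal sketch of this selection-based path construction is given below; the helper name path_from_selected_presets and the example coordinates are assumptions introduced for illustration.

def path_from_selected_presets(selected, position_of):
    """selected: preset-image ids in the order the user tapped them;
    position_of: dict mapping a preset-image id to its (x, y) moving position."""
    return [position_of[image_id] for image_id in selected]

positions = {1: (0.0, 0.0), 2: (1.0, 0.0), 3: (2.0, 0.0)}   # assumed layout
print(path_from_selected_presets([1, 3], positions))
# [(0.0, 0.0), (2.0, 0.0)]; the pan-tilt passes through (1.0, 0.0) on the way.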
In the embodiment of the application, after the shooting control device determines the first motion path, the first motion path may be recorded automatically or under a user trigger. In this way, the user can later trigger the shooting control device, through an input (for example, the first input), to control the micro pan-tilt to move along the first motion path and, while the micro pan-tilt moves along the first motion path, to control the camera (i.e., the target camera) disposed on the micro pan-tilt to capture images.
In the embodiment of the application, the input track entered by the user reflects the user's requirement, so the first motion path that the shooting control device determines according to the input track information entered by the user is a motion path that meets the user's actual use requirement. Consequently, the composition of the panoramic image or video shot by the camera disposed on the micro pan-tilt while the micro pan-tilt moves along the first motion path meets the user's actual use requirement, which can improve user satisfaction.
Alternatively, in this embodiment of the application, after receiving the second input of the user, the shooting control device may display a track image (for example, a first track image described below) indicating an input track of the second input, and display a reference image corresponding to the track image, so that the user can confirm whether a motion path corresponding to the input track information of the second input meets an actual use requirement of the user through the track image and the reference image. When the user confirms that the motion path corresponding to the second input track information does not meet the use requirement of the user, the user can trigger the shooting control device to adjust the track image, and then the motion path corresponding to the input track indicated by the adjusted track image is determined as the first motion path.
For example, in the embodiment of the present application, in the case of the above-mentioned one possible implementation manner, that is, in the case that the second input is a sliding input, after the step 103, the shooting control method provided in the embodiment of the present application may further include the following steps 106 to 108; the step 103 can be specifically realized by the step 103a described below.
And 106, responding to the second input by the shooting control device, displaying the first track image, and displaying N reference images corresponding to the first track image.
Wherein the first track image is used to indicate an input track of the second input, and N may be an integer greater than 1.
It is understood that, in the embodiment of the present application, the shooting control apparatus may specifically display the first trajectory image on the input trajectory of the second input.
Exemplarily, assuming that the second input is a drag input to the path setting control, the photographing preview interface includes the path setting control 40 and the range identifier 41 as shown in (b) of fig. 4, and when the user drags the path setting control 40 to move within the range identifier 41, the photographing controlling apparatus may display the first trajectory image 42 on a drag trajectory of the drag input and display the path setting control 40 at an end position of the first trajectory image 42 and display the N reference images 43 as shown in (a) of fig. 6.
In the embodiment of the present application, the N reference images corresponding to the first track image are substantially N reference images corresponding to the motion path corresponding to the second input track information.
Specifically, after receiving the second input, the shooting control device may respond to the second input to obtain input trajectory information of the second input; determining a second motion path corresponding to the input track information according to the input track information; and then acquiring the N reference images according to a second motion path, and finally displaying the N reference images by the shooting control device. The N reference images are used for indicating N shooting points on the second motion path, and different reference images indicate different shooting points in the second motion path.
Optionally, in this embodiment of the application, the N reference images may be images stored in the electronic device in advance, or may also be images acquired by the shooting control device controlling the control target camera at N shooting points on the second motion path after determining the second motion path, which may be specifically determined according to actual use requirements, and this embodiment of the application is not limited.
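One possible way to obtain such reference images, sketched with assumed callback names standing in for the pan-tilt driver and the camera, is to sample N shooting points along the second motion path and capture one frame at each:

def choose_shooting_points(path, n):
    """Pick n roughly evenly spaced positions from a densely sampled path."""
    if n <= 1 or len(path) <= 1:
        return path[:1]
    step = (len(path) - 1) / (n - 1)
    return [path[round(i * step)] for i in range(n)]

def capture_reference_images(path, n, move_pan_tilt, capture_frame):
    """move_pan_tilt(position) and capture_frame() are hypothetical callbacks
    for moving the micro pan-tilt and reading a frame from the target camera."""
    references = []
    for point in choose_shooting_points(path, n):
        move_pan_tilt(point)                 # move the micro pan-tilt to the shooting point
        references.append(capture_frame())   # capture the reference image there
    return references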
In step 107, the photographing control apparatus receives a third input for the first reference image of the N reference images.
Optionally, in this embodiment of the application, the first reference image may be one or more reference images.
Optionally, in this embodiment of the application, the third input may be any possible form of input such as a click input, a long-press input, or a drag input to the first reference image, which may be determined specifically according to actual use requirements, and this embodiment of the application is not limited.
And step 108, the shooting control device responds to a third input, deletes at least one reference image and updates the first track image into a second track image.
Wherein the third input may be used to trigger deletion of at least one of the N reference pictures; the at least one reference picture may include a first reference picture and/or a second reference picture, the second reference picture being a reference picture arranged after the first reference picture.
It is to be understood that, in the embodiment of the present application, the second trajectory image indicates a part of the input trajectory of the second input.
Illustratively, as shown in fig. 6 (a), the user may click on the first reference image 44 of the N reference images (i.e., the third input), and then as shown in fig. 6 (b), the photographing controlling means may delete the first reference image 44 of the N reference images 43 and the reference image (i.e., at least one reference image) located after the first reference image 44, and update the track image shown at 42 (i.e., the first track image) to the track image shown at 45 (i.e., the second track image). Alternatively, as shown in (c) in fig. 6, the photographing control means may delete the first reference image 44 of the N reference images 43 and update the trajectory image shown at 42 to the trajectory image shown at 46 (i.e., the second trajectory image); the track image segment between the point a and the point B is a track image segment that does not meet the user requirement, and the user can redraw the track image segment between the point a and the point B (e.g., a dashed semicircular track image segment shown in fig. 6 (c)) to obtain a track image meeting the user requirement.
In this embodiment of the application, the shooting control device may further delete the shooting point corresponding to the at least one reference image from the second motion path, that is, delete a part of the second motion path. It can be understood that, in the embodiment of the present application, the second motion path after the user triggers and deletes is a motion path that meets the user requirement.
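Sketched below (assumed names) are the two deletion behaviours described above: removing the selected reference image together with all later ones, or removing only the selected one, with the corresponding shooting points dropped from the second motion path in either case.

def delete_from(points, references, index, delete_following=True):
    """Remove the reference image at `index` (and optionally all later ones)
    plus the corresponding shooting point(s); points and references are
    parallel lists, so the same indices are dropped from both."""
    if delete_following:
        return points[:index], references[:index]            # fig. 6 (b) behaviour
    keep = [i for i in range(len(points)) if i != index]      # fig. 6 (c) behaviour
    return [points[i] for i in keep], [references[i] for i in keep]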
It should be noted that, in this embodiment of the application, when the N reference images are images pre-stored in the electronic device, deleting at least one reference image specifically means deleting the at least one reference image displayed on the shooting preview interface while keeping the at least one reference image stored in the electronic device unchanged; that is, deleting the at least one reference image essentially cancels its display.
In the embodiment of the application, the user can trigger the shooting control device to delete a reference image and update the track image so as to adjust the motion path corresponding to the input track information of the user's input (for example, the second input). In this way, the user can both set the motion path of the micro pan-tilt in a custom manner and locally adjust the motion path during the setting process, so that the finally set motion path can meet the actual use requirement.
Further, in the process of triggering and setting the motion path, when a user is unsatisfied with a part of the motion path corresponding to the input track information, the user can directly trigger and delete the part of the motion path through another input without re-triggering and setting by the user, so that the flexibility and the operation convenience for setting the motion path can be improved.
Optionally, in this embodiment of the application, when the second input is a sliding input, and the sliding input is a dragging input to a path setting control in a shooting preview interface; after the shooting control device updates the first track image into the second track image, the display position of the path setting control can be automatically updated, so that the user can continue to drag the path setting control to trigger the update of the track image.
For example, in the embodiment of the present application, after the step 108, the shooting control method provided in the embodiment of the present application may further include a step 109 described below.
And step 109, displaying a path setting control at the target position by the shooting control device.
Wherein the target position is determined from at least one reference image. Specifically, when at least one reference image is a first reference image, the target position is a position corresponding to the first reference image in the second track image; and when the at least one reference image is the first reference image and the second reference image or the at least one reference image is the second reference image, the target position is the end position of the second track image.
For example, assuming that a black circle in fig. 6 is a path setting control, as shown in (a) in fig. 6, before receiving the third input, the photographing control apparatus displays the path setting control at the end position of the first trajectory image 42; after receiving the third input, if at least one of the reference images is the first reference image 44 and a reference image subsequent to the first reference image 44, then, as shown in (b) of fig. 6, the photographing control apparatus may update the first trajectory image to the second trajectory image 45 and display a path setting control at an end position of the trajectory image 45; if the at least one reference image is the first reference image 44, the photographing control apparatus may update the first trajectory image to the trajectory image 46 and display the path setting control at a point a of the trajectory image 46, as shown in fig. 6 (c), where the position between the point a and the point B is the position in the second trajectory image corresponding to the first reference image.
Optionally, in this embodiment of the application, after the shooting control device displays the path setting control at the target position, the user may continue to drag the path setting control to trigger the shooting control device to continue to update the track image until the track image meets the user requirement.
Illustratively, in the embodiment of the present application, after step 109 described above, the shooting control method provided in the embodiment of the present application may further include step 110 and step 111 described below.
And step 110, the shooting control device receives a fourth input of the path setting control.
For the description of the fourth input, reference may be specifically made to the description related to the second input in the foregoing embodiment, and details are not repeated here to avoid repetition.
And step 111, the shooting control device responds to the fourth input, updates the second track image into a third track image, and displays U reference images corresponding to the third track image.
The U reference images comprise other reference images except at least one reference image in the N reference images, the third track image comprises a second track image and a track image indicating a fourth input track, and U is an integer larger than N.
For example, referring to fig. 6, as shown in (b) of fig. 6, the reference images 43' remaining after at least one of the N reference images 43 is deleted and the second track image 45 are displayed in the shooting preview interface, and the path setting control is displayed at the end position of the second track image 45 so that the user can continue to drag the path setting control (i.e., the fourth input). Then, as shown in (d) of fig. 6, the shooting control device may update the second track image 45 to the third track image 47 and display U reference images 48 corresponding to the third track image 47, where the U reference images 48 include the reference images 43' and reference images 49 indicating the input track of the fourth input.
In the embodiment of the present application, the third trace image may indicate an undeleted trace portion (hereinafter, referred to as a first trace portion) in the input trace of the second input and the input trace of the fourth input.
In this embodiment of the application, the first motion path may specifically be a motion path corresponding to the input trajectory information of the first trajectory part and the input trajectory information of the fourth input.
Optionally, in this embodiment of the application, after the shooting control device receives the second input of the user, it may also record the input duration of the second input or the input speed of the second input, so that the manner in which the shooting control device controls the micro pan-tilt to move along the first motion path is related to the input duration or the input speed of the second input.
Optionally, in this embodiment of the application, assuming that the range identifier includes a plurality of sub-identifiers, in the process of controlling the camera disposed on the micro pan-tilt (i.e., the target camera) to shoot a video, the user may further trigger the target camera to enter a delayed (time-lapse) shooting mode through a second target input and set a shooting path for the delayed shooting.
Specifically, the second target input may include a first sub-input and a second sub-input. The first sub-input is an input that drags the path setting control to a first sub-identifier of the plurality of sub-identifiers; it is used to trigger the camera disposed on the micro pan-tilt to enter the automatic delay mode and to determine that the delay starting point is the shooting point indicated by the first sub-identifier. The second sub-input is a click input on a sub-identifier other than the first sub-identifier among the plurality of sub-identifiers; it is used to trigger determination of the delay end point of the camera disposed on the micro pan-tilt.
For example, as shown in fig. 7, assuming that the range identifier includes 4 sub identifiers, the user may drag (i.e., first sub input) the path setting control 70 to the first sub identifier 71 of the edge, and pause for a preset time (e.g., 2 seconds), that is, may trigger the camera set on the micro-pan-tilt to enter the automatic delay mode; an arrow 72 in fig. 7 is used to indicate an input trajectory of the first sub-input, and other sub-identifiers may be freely selected as the end points of the automatic delay, apart from the first sub-identifier 71 where the path setting control 70 is located, for example, assuming that the second sub-input is a click input to the second sub-identifier 73, the photographing control apparatus may determine that the delayed movement path of the target camera is a movement path shown by an arrow 74.
It can be understood that, in the embodiment of the present application, the delay path may be a moving path from the shooting point indicated by the first sub identifier to the shooting point indicated by the second sub identifier. After the delay path is set, the shooting control device can control the micro cloud platform to move along the delay path, and in the process of controlling the micro cloud platform to move along the delay path, the target camera is controlled to shoot in a delay mode, and the frame rate of the delayed shooting can be the preset collection frame rate. Thus, a stable and smooth delayed video can be generated.
Illustratively, in the embodiment of the present application, after the step 103, the shooting control method provided in the embodiment of the present application may further include the following step 112, and the step 102 may be specifically realized by the following step 102a or step 102 b.
In step 112, the shooting control device responds to the second input, and obtains the input duration or the input speed of the second input.
And 102a, in response to the first input, the shooting control device controls the micro pan-tilt to complete the movement along the first motion path within a first time length, and controls the camera disposed on the micro pan-tilt to capture images while the micro pan-tilt moves along the first motion path.
And 102b, in response to the first input, the shooting control device controls the micro pan-tilt to move along the first motion path at a first speed, and controls the camera disposed on the micro pan-tilt to capture images while the micro pan-tilt moves along the first motion path.
The first time length may be H times the input time length of the second input, the first speed may be P times the input speed of the second input, and both H and P are greater than 0.
Optionally, in this embodiment of the application, before the first input is executed, a user may first trigger a setting of a motion mode of the micro-pan/tilt. For example, the micro-pan-tilt is set to move according to the manner shown in the step 102a, or move according to the manner shown in the step 102b, which may be determined according to actual usage requirements, and the embodiment of the present application is not limited.
Optionally, in this embodiment of the application, before the first input is performed, the user may further trigger setting of the values of P and H. Of course, the values of P and H may be preset values. The method can be determined according to actual use requirements, and the embodiment of the application is not limited.
In the embodiment of the application, the shooting control device can control the micro-pan-tilt to complete the movement along the first movement path within the first time length, or control the micro-pan-tilt to move along the first movement path at the first speed, so that the flexibility of shooting videos can be further improved.
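As a trivial numeric sketch of steps 102a and 102b (the function names and example values are assumptions), the motion parameters follow directly from the recorded second input:

def first_time_length(input_duration_s: float, h: float) -> float:
    return h * input_duration_s      # the pan-tilt finishes the path within this time

def first_speed(input_speed: float, p: float) -> float:
    return p * input_speed           # the pan-tilt moves along the path at this speed

print(first_time_length(4.0, 2.0))   # drag took 4 s, H = 2: traverse the path in 8 s
print(first_speed(3.0, 0.5))         # drag speed 3 cm/s, P = 0.5: move at 1.5 cm/s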
Alternatively, in this embodiment of the application, when the shooting control device records the input duration of the second input or the input speed of the second input, the step 102 may be specifically implemented by the step 102c described below.
And 102c, the shooting control device responds to the first input, controls the micro cloud platform to move along the first movement path, and controls a camera arranged on the micro cloud platform to acquire images according to an acquisition frame rate corresponding to the first speed or the first time length in the process of controlling the micro cloud platform to move along the first movement path.
In the embodiment of the application, the input duration input by the user and/or the corresponding relation between the input speed input by the user and the acquisition frame rate can be preset.
For example, the input duration of the user input and the acquisition frame rate may be set in a negative correlation, that is, the longer the input duration, the lower the acquisition frame rate corresponding to the input duration; and the shorter the input duration, the higher the acquisition frame rate corresponding to the input duration.
For example, a long input duration indicates that the user needs to shoot a delayed video, so shooting can be performed at a low frame rate; a short input duration indicates that the user needs to shoot a slow-motion video, so shooting can be performed at a high frame rate.
As another example, the input speed of the user input and the acquisition frame rate may be set in a positive correlation, that is, the higher the input speed, the higher the acquisition frame rate corresponding to the input speed; and the lower the input speed, the lower the acquisition frame rate corresponding to the input speed. For example, a high input speed indicates that the user needs to shoot a slow-motion video, so shooting can be performed at a high frame rate; a low input speed indicates that the user needs to shoot a delayed video, so shooting can be performed at a low frame rate.
Of course, in actual implementation, the input duration of the user input and the acquisition frame rate may instead be set in a positive correlation, or the input speed of the user input and the acquisition frame rate may be set in a negative correlation. This may be determined according to actual usage requirements and is not limited in the embodiments of the present application.
In the embodiment of the present application, since the shooting control device can control the camera arranged on the micro-pan-tilt to acquire images at an acquisition frame rate corresponding to the first speed or the first duration, the flexibility of shooting videos can be further improved.
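One possible way to realize such a correspondence is sketched below. The specific breakpoints (2 s, 300 px/s) and frame-rate values (24/120 fps) are assumptions for illustration; the embodiment only fixes the direction of the correlation.

    # Sketch of one possible duration/speed -> acquisition-frame-rate mapping.
    def frame_rate_from_duration(input_duration_s: float) -> int:
        # Negative correlation: a long input suggests a delayed (time-lapse) video,
        # so capture at a low frame rate; a short input suggests slow motion.
        if input_duration_s >= 2.0:
            return 24       # low rate for a delayed video
        return 120          # high rate for a slow-motion video

    def frame_rate_from_speed(input_speed_px_s: float) -> int:
        # Positive correlation: a fast slide suggests slow motion (high frame rate),
        # a slow slide suggests a delayed video (low frame rate).
        return 120 if input_speed_px_s >= 300.0 else 24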
Optionally, in the embodiment of the present application, in the process of the shooting control device controlling the micro-pan-tilt to move along the first motion path, the user may further trigger the shooting control device to switch the acquisition frame rate of the target camera, for example, to increase or decrease the acquisition frame rate of the target camera. This can further improve the flexibility of shooting.
Optionally, in the embodiment of the present application, the shooting control device may synthesize the images acquired by the target camera in the process of the micro-pan-tilt moving along the first motion path into a slow-motion video, a delayed video, or a panoramic image.
Illustratively, in the embodiment of the present application, after the step 102, the shooting control method provided in the embodiment of the present application may further include step 113 described below.
Step 113: the shooting control device synthesizes Q images into a slow-motion video, a delayed video, or a panoramic image.
The Q images may be images acquired by the camera arranged on the micro-pan-tilt in the process of the micro-pan-tilt moving along the first motion path, and Q may be an integer greater than 1.
Example 1: assuming that the Q images are 300 images, the shooting control device may perform average frame extraction on the 300 images by extracting one frame out of every 5 frames to obtain 60 images, and then synthesize the 60 images into a delayed video in which the viewing angle changes rapidly.
Example 2: assuming that the Q images are 300 images and the acquisition frame rate of the camera arranged on the micro-pan-tilt when the 300 images are acquired is 30 frames per second, the shooting control device may synthesize the 300 images into a delayed video with a playback frame rate of 90 frames per second.
Example 3: assuming that the Q images are 300 images and the acquisition frame rate of the camera arranged on the micro-pan-tilt when the 300 images are acquired is 60 frames per second, the shooting control device may synthesize the 300 images into a slow-motion video with a playback frame rate of 15 frames per second.
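The three synthesis options in Examples 1 to 3 might be expressed as in the following sketch; the helper names and the decimation/speed factors mirror the example numbers above but are otherwise illustrative, and a real implementation would hand the resulting frames to a video muxer.

    # Hedged sketch of the synthesis options in Examples 1-3; no actual encoding.
    def make_delayed_video_by_decimation(frames, keep_every: int = 5):
        """Example 1: average frame extraction - keep one frame out of every
        `keep_every` frames, then play the result back at a normal rate."""
        return frames[::keep_every]

    def playback_rate_for_delayed(capture_fps: float, speedup: float = 3.0) -> float:
        """Example 2: 300 frames captured at 30 fps, played back at 90 fps (3x)."""
        return capture_fps * speedup

    def playback_rate_for_slow_motion(capture_fps: float, slowdown: float = 4.0) -> float:
        """Example 3: 300 frames captured at 60 fps, played back at 15 fps (1/4x)."""
        return capture_fps / slowdown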
Example 4: when the user needs to shoot a galaxy arch bridge based on a small star in the sky, the user may trigger setting of a motion path as shown in fig. 8 (i.e., the first motion path), then adjust the initial shooting position, for example, so that the image of the star is located in the lower area of the preview image, and then click the panoramic image control in the shooting preview interface (i.e., the first input). In response to the click input, the shooting control device may control the micro-pan-tilt to move along the motion path shown by the arrow 80 in fig. 8; specifically, it controls the micro-pan-tilt to move from y0 in the negative direction of the y axis (the positive direction of the y axis being the upward direction relative to the screen of the electronic device), and, when the micro-pan-tilt moves to y1, controls the micro-pan-tilt to move in the positive direction of the y axis until the micro-pan-tilt moves to y2. In the process of controlling the micro-pan-tilt to move, the camera arranged on the micro-pan-tilt is controlled to acquire images.
It can be seen that, in the embodiment of the present application, as shown in fig. 8, in the process of controlling the micro-pan-tilt to move along the first motion path, the camera arranged on the micro-pan-tilt is controlled to capture 7 images, each of which includes an image 81 of the star, and the image 81 of the star is located at a different position in each of the 7 images. The shooting control device may then synthesize the 7 images into one panoramic image 82, in which the star images of the 7 images form the galaxy arch bridge. In the embodiment of the present application, the micro-pan-tilt can perform starry-sky shooting along the preset path, and the stability of the micro-pan-tilt helps ensure the exposure duration and sharpness of each captured image. Thus, the flexibility and accuracy of shooting the panoramic image can be improved.
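A sketch of the y-axis sweep in Example 4 is given below. The gimbal/camera calls and the even spacing of the 7 sample positions are assumptions for illustration; the embodiment describes the y0 to y1 to y2 sweep, the capture of images along it, and the later compositing into the panoramic image.

    # Illustrative sketch of the y-axis sweep in Example 4; gimbal.move_to_y and
    # camera.capture_frame are hypothetical interfaces.
    def sweep_y_axis_and_capture(gimbal, camera, y0: float, y1: float, y2: float,
                                 num_shots: int = 7):
        images = []
        half = num_shots // 2
        # First leg: move from y0 in the negative y direction down to y1.
        for k in range(half + 1):
            gimbal.move_to_y(y0 + (y1 - y0) * k / half)
            images.append(camera.capture_frame())
        # Second leg: move in the positive y direction up to y2.
        for k in range(1, num_shots - half):
            gimbal.move_to_y(y1 + (y2 - y1) * k / (num_shots - half - 1))
            images.append(camera.capture_frame())
        return images  # later composited into the panoramic "galaxy arch bridge"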
Optionally, in the embodiment of the present application, before shooting a video or a panoramic image, the user may trigger display of the path setting control, and then, in the process of shooting the video or the panoramic image, directly control the motion of the micro-pan-tilt in real time through a drag input on the path setting control. It can be understood that the user can drive the micro-pan-tilt within its 360-degree range through the path setting control.
Optionally, in the embodiment of the present application, in a case that the path setting control is displayed on the shooting interface or the shooting preview interface, a double click on the screen by the user indicates a reset of the micro-pan-tilt, that is, the micro-pan-tilt is controlled to reset.
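The real-time control described above could be handled by an event handler along the following lines; the event names and the mapping from normalized drag offsets to angles within the 360-degree range are assumptions for illustration.

    # Rough sketch: drag events on the path setting control drive the micro-pan-tilt
    # directly, and a double tap resets it. All interfaces here are hypothetical.
    class PathControlHandler:
        def __init__(self, gimbal):
            self.gimbal = gimbal

        def on_drag(self, dx_norm: float, dy_norm: float):
            # Map the normalized drag offset on the control to gimbal angles
            # within the full 360-degree stroke range.
            self.gimbal.move_to(pan=dx_norm * 360.0, tilt=dy_norm * 360.0)

        def on_double_tap(self):
            # Double-clicking the screen resets the micro-pan-tilt.
            self.gimbal.reset()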
In the shooting control method provided in the embodiment of the present application, the execution subject may be a shooting control apparatus, or a control module in the shooting control apparatus for executing the shooting control method. In the embodiment of the present application, the shooting control apparatus executing the shooting control method is taken as an example to describe the shooting control apparatus provided in the embodiment of the present application.
As shown in fig. 9, an embodiment of the present application provides a shooting control apparatus 90. The shooting control apparatus 90 may include a receiving module 91 and a control module 92. The receiving module 91 may be configured to receive a first input. The control module 92 may be configured to, in response to the first input received by the receiving module 91, control the micro-pan-tilt in the electronic device to move along a first motion path, and control the camera arranged on the micro-pan-tilt to acquire images in the process of controlling the micro-pan-tilt to move along the first motion path. The first motion path is a motion path whose setting is triggered by the user in advance, the first motion path is composed of at least two shooting points, different shooting points correspond to different position information, and the position information corresponding to one shooting point is used for indicating a moving position of the micro-pan-tilt in the electronic device.
In the shooting control apparatus provided in the embodiment of the present application, after the first input of the user is received, the micro-pan-tilt in the electronic device is controlled to move along the first motion path preset by the user, and, in the process of controlling the micro-pan-tilt to move along the first motion path, the camera arranged on the micro-pan-tilt is controlled to acquire images, so that the camera can acquire images on a path meeting the user's requirements, thereby ensuring that the camera can shoot a panoramic image or video whose composition meets the user's requirements. Therefore, the shooting control apparatus provided in the embodiment of the present application can obtain a panoramic image or video meeting the user's requirements without triggering shooting multiple times, which simplifies the operation process of shooting a panoramic image or video meeting the user's requirements.
Optionally, in the embodiment of the present application, the shooting control apparatus may further include an obtaining module. The receiving module may be further configured to receive a second input before receiving the first input, where the second input is a sliding input, or the second input is an input to at least two preset images of M preset images in the shooting preview interface, each preset image indicates a moving position within the stroke range of the micro-pan-tilt, and M may be an integer greater than 1;
the obtaining module may be configured to acquire target track information in response to the second input received by the receiving module, where the target track information is input track information of the second input, and the input track information includes at least one of track direction information, track size information, and track position information of the second input;
the control module may be further configured to determine, according to the target track information, the first motion path corresponding to the target track information.
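As a hedged sketch of how the obtaining and control modules might turn the second input's track into the first motion path, the following function maps screen-space track points to pan/tilt offsets; the scale factor and the point representation are assumptions, not defined by this application.

    # Illustrative mapping from a sliding track to a motion path; degrees_per_pixel
    # is a hypothetical scale factor.
    from typing import List, Tuple

    def track_to_motion_path(track_points: List[Tuple[float, float]],
                             degrees_per_pixel: float = 0.1):
        """Turn the input track of the second input (screen coordinates) into a
        sequence of shooting points (pan/tilt offsets) for the micro-pan-tilt."""
        if not track_points:
            return []
        x0, y0 = track_points[0]
        # Track position fixes the starting shooting point; the direction and size
        # of each segment fix the direction and length of the gimbal movement.
        return [((x - x0) * degrees_per_pixel, (y - y0) * degrees_per_pixel)
                for x, y in track_points]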
Optionally, in the embodiment of the present application, in a case that the second input is a sliding input, the shooting control apparatus may further include a display module. The display module is configured to, after the receiving module receives the second input, display a first track image in response to the second input and display N reference images corresponding to the first track image, where the first track image may be used to indicate the input track of the second input, and N may be an integer greater than 1. The receiving module may be further configured to receive a third input to a first reference image of the N reference images displayed by the display module, where the third input is used to trigger deletion of at least one reference image of the N reference images. The control module may be further configured to, in response to the third input received by the receiving module, delete the at least one reference image and update the first track image to a second track image. The at least one reference image includes the first reference image and/or a second reference image, and the second reference image is a reference image arranged after the first reference image.
Optionally, in the embodiment of the present application, the sliding input is a drag input on a path setting control in the shooting preview interface. The display module may be further configured to display the path setting control at a target position of the second track image after the control module updates the first track image to the second track image, where the target position is determined according to the at least one reference image.
Optionally, in the embodiment of the present application, the shooting control apparatus may further include a recording module. The recording module is configured to, after the receiving module receives the second input, record the input duration or the input speed of the second input in response to the second input. The control module may be specifically configured to control the micro-pan-tilt to complete the movement along the first motion path within a first duration; or, the control module may be specifically configured to control the micro-pan-tilt to move along the first motion path at a first speed. The first duration is H times the input duration, the first speed is P times the input speed, and both H and P are greater than 0.
Optionally, in the embodiment of the present application, the control module may be specifically configured to control the camera to acquire images at an acquisition frame rate corresponding to the first speed or the first duration.
Optionally, in the embodiment of the present application, the control module may be further configured to, after controlling the camera to acquire images, synthesize Q images into a slow-motion video, a delayed video, or a panoramic image, where the Q images are images acquired by the camera in the process of the micro-pan-tilt moving along the first motion path, and Q is an integer greater than 1.
The shooting control apparatus in the embodiment of the present application may be an electronic device, or may be a component, an integrated circuit, or a chip in an electronic device. The electronic device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile electronic device may be a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like; this is not specifically limited in the embodiments of the present application.
The shooting control apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The shooting control apparatus 90 provided in the embodiment of the present application can implement each process implemented by the shooting control method in the method embodiments of fig. 1 to fig. 6; details are not described here again to avoid repetition.
As shown in fig. 10, an embodiment of the present application further provides an electronic device 200, which includes a processor 202, a memory 201, and a program or instruction stored in the memory 201 and executable on the processor 202. When the program or instruction is executed by the processor 202, the processes of the above shooting control method embodiment are implemented and the same technical effects can be achieved; details are not repeated here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 11 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, and a processor 1010.
Those skilled in the art will appreciate that the electronic device 1000 may further include a power supply (e.g., a battery) for supplying power to the various components, and the power supply may be logically connected to the processor 1010 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 11 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than shown, combine some components, or use a different arrangement of components; details are not repeated here.
The user input unit 1007 may be configured to receive a first input. The processor 1010 may be configured to, in response to the first input received by the user input unit 1007, control the micro-pan-tilt in the electronic device to move along a first motion path, and control the camera arranged on the micro-pan-tilt to acquire images in the process of controlling the micro-pan-tilt to move along the first motion path.
The first motion path is a motion path whose setting is triggered by the user in advance, the first motion path is composed of at least two shooting points, different shooting points correspond to different position information, and the position information corresponding to one shooting point is used for indicating a moving position of the micro-pan-tilt in the electronic device.
Optionally, in the embodiment of the present application, the user input unit 1007 may be further configured to receive a second input before receiving the first input, where the second input is a sliding input, or the second input is an input to at least two preset images of M preset images in the shooting preview interface, each preset image indicates a moving position within the stroke range of the micro-pan-tilt, and M may be an integer greater than 1. The processor 1010 may be further configured to acquire target track information in response to the second input received by the user input unit 1007, where the target track information is input track information of the second input, and the input track information includes at least one of track direction information, track size information, and track position information of the second input. The processor 1010 may be further configured to determine, according to the target track information, the first motion path corresponding to the target track information.
Optionally, in the embodiment of the present application, in a case that the second input is a sliding input, the display unit 1006 may be configured to, after the user input unit 1007 receives the second input, display a first track image in response to the second input and display N reference images corresponding to the first track image, where the first track image may be used to indicate the input track of the second input, and N may be an integer greater than 1. The user input unit 1007 may be further configured to receive a third input to a first reference image of the N reference images displayed by the display unit 1006, where the third input is used to trigger deletion of at least one reference image of the N reference images. The processor 1010 may be further configured to, in response to the third input received by the user input unit 1007, delete the at least one reference image and update the first track image to a second track image. The at least one reference image includes the first reference image and/or a second reference image, and the second reference image is a reference image arranged after the first reference image.
Optionally, in the embodiment of the present application, the sliding input is a drag input on a path setting control in the shooting preview interface. The display unit 1006 may be further configured to display the path setting control at a target position of the second track image after the processor 1010 updates the first track image to the second track image, where the target position is determined according to the at least one reference image.
Optionally, in the embodiment of the present application, the processor 1010 may be further configured to, after the user input unit 1007 receives the second input, record the input duration or the input speed of the second input in response to the second input;
the processor 1010 may be specifically configured to control the micro-pan-tilt to complete the movement along the first movement path within a first duration; or, the processor 1010 may be specifically configured to control the micro-pan-tilt to move along the first moving path at a first speed; the first time length is H times of the input time length, the first speed is P times of the input speed, and both H and P are greater than 0.
Optionally, in the embodiment of the present application, the processor 1010 may be specifically configured to control the camera to acquire images at an acquisition frame rate corresponding to the first speed or the first duration.
Optionally, in the embodiment of the present application, the processor 1010 may be further configured to, after controlling the camera to acquire images, synthesize Q images into a slow-motion video, a delayed video, or a panoramic image, where the Q images are images acquired by the camera in the process of the micro-pan-tilt moving along the first motion path, and Q is an integer greater than 1.
It should be understood that, in the embodiment of the present application, the input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042, and the graphics processing unit 10041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 1009 may be used to store software programs and various data, including but not limited to application programs and an operating system. The processor 1010 may integrate an application processor, which mainly handles the operating system, user interfaces, applications, and the like, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 1010.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium. When the program or instruction is executed by a processor, each process of the above shooting control method embodiment is implemented and the same technical effects can be achieved; to avoid repetition, details are not repeated here.
The processor is a processor in the electronic device in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the above shooting control method embodiment, and the same technical effect can be achieved.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a chip system, or a system-on-chip, among others.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved; for example, the methods described may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. A shooting control method, characterized by comprising:
receiving a first input;
responding to the first input, controlling a micro-pan-tilt in the electronic equipment to move along a first motion path, and controlling a camera arranged on the micro-pan-tilt to acquire images in the process of controlling the micro-pan-tilt to move along the first motion path;
wherein the first motion path is a motion path preset by a user, the first motion path is composed of at least two shooting points, different shooting points correspond to different position information, and the position information corresponding to one shooting point is used for indicating a moving position of the micro-pan-tilt in the electronic equipment;
prior to the receiving the first input, the method further comprises:
receiving a second input, wherein the second input is a sliding input;
responding to the second input, and acquiring target track information, wherein the target track information is input track information of the second input, and the input track information comprises at least one of track direction information, track size information and track position information of the second input;
determining the first motion path corresponding to the target track information according to the target track information;
after the receiving the second input, the method further comprises:
responding to the second input, displaying a first track image and displaying N reference images corresponding to the first track image, wherein the first track image is used for indicating an input track of the second input, and N is an integer greater than 1;
receiving a third input to a first reference picture of the N reference pictures, the third input to trigger deletion of at least one reference picture of the N reference pictures;
in response to the third input, deleting the at least one reference image and updating the first track image to a second track image;
wherein the at least one reference picture comprises the first reference picture and/or a second reference picture, the second reference picture being a reference picture arranged after the first reference picture.
2. The method of claim 1, wherein the sliding input is a drag input to a path setting control in a shooting preview interface;
after the updating the first track image into the second track image, the method further comprises:
displaying the path setting control at a target position of the second track image, wherein the target position is determined according to the at least one reference image.
3. The method of claim 1, wherein after receiving the second input, the method further comprises:
responding to the second input, and acquiring the input duration or the input speed of the second input;
the micro cloud platform in the control electronic device moves along a first motion path, and the micro cloud platform comprises:
controlling the micro-pan-tilt to complete the movement along the first motion path within a first duration; or,
controlling the micro-pan-tilt to move along the first motion path at a first speed;
wherein the first duration is H times the input duration, the first speed is P times the input speed, and both H and P are greater than 0.
4. The method according to claim 3, wherein the controlling the camera arranged on the micro-pan-tilt to acquire images comprises:
controlling the camera to acquire images at an acquisition frame rate corresponding to the first speed or the first duration.
5. The method according to claim 1, wherein after the controlling the camera arranged on the micro-pan-tilt to acquire images, the method further comprises:
synthesizing Q images into a slow-motion video, a delayed video, or a panoramic image, wherein the Q images are images acquired by the camera in the process of the micro-pan-tilt moving along the first motion path, and Q is an integer greater than 1.
6. A shooting control apparatus, characterized in that the apparatus comprises: a receiving module, an obtaining module, a display module, and a control module;
the receiving module is used for receiving a first input;
the control module is configured to, in response to the first input received by the receiving module, control a micro-pan-tilt in the electronic equipment to move along a first motion path, and control a camera arranged on the micro-pan-tilt to acquire images in the process of controlling the micro-pan-tilt to move along the first motion path;
wherein the first motion path is a motion path whose setting is triggered by the user in advance, the first motion path is composed of at least two shooting points, different shooting points correspond to different position information, and the position information corresponding to one shooting point is used for indicating a moving position of the micro-pan-tilt in the electronic equipment;
the receiving module is further configured to receive a second input before receiving the first input, where the second input is a sliding input;
the obtaining module is further configured to obtain target track information in response to the second input received by the receiving module, where the target track information is input track information of the second input, and the input track information includes at least one of track direction information, track size information, and track position information of the second input;
the control module is further configured to determine the first motion path corresponding to the target trajectory information according to the target trajectory information;
the display module is configured to, after the receiving module receives the second input, respond to the second input, display a first track image and display N reference images corresponding to the first track image, where the first track image is used to indicate an input track of the second input, and N is an integer greater than 1;
the receiving module is further configured to receive a third input to a first reference image of the N reference images displayed by the display module, where the third input is used to trigger deletion of at least one reference image of the N reference images;
the control module is further configured to delete the at least one reference image and update the first track image to a second track image in response to the third input received by the receiving module;
wherein the at least one reference picture comprises the first reference picture and/or a second reference picture, the second reference picture being a reference picture arranged after the first reference picture.
7. The apparatus of claim 6, wherein the sliding input is a drag input to a path setting control in a shooting preview interface;
the display module is further configured to display the path setting control at a target position of the second track image after the control module updates the first track image to the second track image, wherein the target position is determined according to the at least one reference image.
8. The apparatus of claim 6, further comprising a logging module;
the recording module is used for responding to the second input after the receiving module receives the second input, and recording the input duration or the input speed of the second input;
the control module is specifically configured to control the micro-pan-tilt to complete the movement along the first motion path within a first duration; or,
the control module is specifically configured to control the micro-pan-tilt to move along the first motion path at a first speed;
wherein the first duration is H times the input duration, the first speed is P times the input speed, and both H and P are greater than 0.
9. The apparatus according to claim 8, wherein the control module is specifically configured to control the camera to acquire images at an acquisition frame rate corresponding to the first speed or the first duration.
10. The apparatus of claim 6, wherein the control module is further configured to synthesize Q images into a slow-motion video, a delayed video, or a panoramic image after controlling the camera to acquire images, wherein the Q images are images acquired by the camera in the process of the micro-pan-tilt moving along the first motion path, and Q is an integer greater than 1.
11. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the photographing control method according to any one of claims 1 to 5.
12. A readable storage medium, characterized in that a program or instructions is stored thereon, which when executed by a processor, implements the steps of the photographing control method according to any one of claims 1 to 5.
CN202011449662.9A 2020-12-09 2020-12-09 Shooting control method and device and electronic equipment Active CN112492215B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011449662.9A CN112492215B (en) 2020-12-09 2020-12-09 Shooting control method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011449662.9A CN112492215B (en) 2020-12-09 2020-12-09 Shooting control method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN112492215A CN112492215A (en) 2021-03-12
CN112492215B true CN112492215B (en) 2022-04-12

Family

ID=74941734

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011449662.9A Active CN112492215B (en) 2020-12-09 2020-12-09 Shooting control method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112492215B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113114933A (en) * 2021-03-30 2021-07-13 维沃移动通信有限公司 Image shooting method and device, electronic equipment and readable storage medium
CN116684724B (en) * 2023-05-19 2024-04-09 中科慧远视觉技术(洛阳)有限公司 Workpiece image acquisition control method and device, workpiece detection equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105894455A (en) * 2016-06-27 2016-08-24 联想(北京)有限公司 Photographing method, device and electronic equipment
CN108449546A (en) * 2018-04-04 2018-08-24 维沃移动通信有限公司 A kind of photographic method and mobile terminal

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102181208B1 (en) * 2014-05-16 2020-11-20 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
CN110800282B (en) * 2018-11-20 2021-07-27 深圳市大疆创新科技有限公司 Holder adjusting method, holder adjusting device, mobile platform and medium
WO2021012081A1 (en) * 2019-07-19 2021-01-28 深圳市大疆创新科技有限公司 Gimbal control method and device, and computer readable storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105894455A (en) * 2016-06-27 2016-08-24 联想(北京)有限公司 Photographing method, device and electronic equipment
CN108449546A (en) * 2018-04-04 2018-08-24 维沃移动通信有限公司 A kind of photographic method and mobile terminal

Also Published As

Publication number Publication date
CN112492215A (en) 2021-03-12

Similar Documents

Publication Publication Date Title
CN112135046B (en) Video shooting method, video shooting device and electronic equipment
CN112738402B (en) Shooting method, shooting device, electronic equipment and medium
CN112954210B (en) Photographing method and device, electronic equipment and medium
CN112714253B (en) Video recording method and device, electronic equipment and readable storage medium
CN112954199B (en) Video recording method and device
CN112492215B (en) Shooting control method and device and electronic equipment
CN112492212A (en) Photographing method and device, electronic equipment and storage medium
CN113794829B (en) Shooting method and device and electronic equipment
CN112333382B (en) Shooting method and device and electronic equipment
CN112822412A (en) Exposure method and electronic apparatus
CN113873151A (en) Video recording method and device and electronic equipment
CN111669495B (en) Photographing method, photographing device and electronic equipment
CN112738397A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN113905175A (en) Video generation method and device, electronic equipment and readable storage medium
CN112702497B (en) Shooting method and device
CN113866782A (en) Image processing method and device and electronic equipment
CN113114933A (en) Image shooting method and device, electronic equipment and readable storage medium
CN114466140B (en) Image shooting method and device
CN113794831B (en) Video shooting method, device, electronic equipment and medium
CN112367467B (en) Display control method, display control device, electronic apparatus, and medium
CN114125226A (en) Image shooting method and device, electronic equipment and readable storage medium
CN114025100A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN114266305A (en) Object identification method and device, electronic equipment and storage medium
CN113923368A (en) Shooting method and device
CN112399092A (en) Shooting method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant