CN111010510A - Shooting control method and device and electronic equipment - Google Patents

Shooting control method and device and electronic equipment

Info

Publication number
CN111010510A
Authority
CN
China
Prior art keywords
shooting
camera
image
area
driving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911260425.5A
Other languages
Chinese (zh)
Other versions
CN111010510B (en)
Inventor
王业
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201911260425.5A priority Critical patent/CN111010510B/en
Publication of CN111010510A publication Critical patent/CN111010510A/en
Application granted granted Critical
Publication of CN111010510B publication Critical patent/CN111010510B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 - Camera processing pipelines; Components thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/62 - Control of parameters via user interfaces
    • H04N23/69 - Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The embodiment of the invention provides a shooting control method, a shooting control apparatus and an electronic device. The method is applied to an electronic device that comprises a plurality of cameras, the plurality of cameras including a first camera and at least one second camera, the field of view angle of the first camera being greater than that of the second camera. The method comprises the following steps: receiving a first touch operation of a user on a first preview image when the first preview image is displayed on a shooting preview interface, the first preview image being an image captured by the first camera; determining a shooting trajectory of the at least one second camera based on the first touch operation; and driving the at least one second camera to shoot along the shooting trajectory. The embodiment of the invention thereby addresses the problems that, when a user needs to shoot multiple images at different angles within a certain area, the shooting process is cumbersome and the shooting quality and efficiency of the images or videos are low.

Description

Shooting control method and device and electronic equipment
Technical Field
The invention relates to the technical field of computers, in particular to a shooting control method and device and electronic equipment.
Background
With the development of terminal technology, mobile terminals (such as mobile phones or tablet computers) have become necessities of daily life. As users demand ever more functionality, the functions of terminal devices keep growing and improving; the camera function in particular adds enjoyment and brings convenience to people's lives.
However, when a user wants to shoot images at several different angles within a certain area (especially a large irregular area), the user has to hold the electronic device and aim at each angle or sub-area in turn, which not only yields poor shooting quality but also captures only one image at a time.
As a result, the shooting process is cumbersome, and the shooting quality and efficiency of the images or videos are low.
Disclosure of Invention
The embodiments of the invention aim to provide a shooting control method, a shooting control apparatus and an electronic device, so as to solve the problems that the shooting process is cumbersome and the shooting quality and efficiency of images or videos are low when a user needs to shoot images at several different angles within a certain area.
In order to solve the above technical problem, the embodiment of the present invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides a shooting control method, which is applied to an electronic device, where the electronic device includes multiple cameras, where the multiple cameras include a first camera and at least one second camera, and a field angle of the first camera is greater than a field angle of the second camera, and the method includes:
receiving a first touch operation of a user on a first preview image under the condition that the first preview image is displayed on a shooting preview interface, wherein the first preview image is an image collected by a first camera;
acquiring a shooting track of at least one second camera based on the first touch operation;
and driving at least one second camera to shoot along the shooting track.
In a second aspect, an embodiment of the present invention provides a shooting control apparatus, which is disposed in an electronic device, where the electronic device includes a plurality of cameras, the plurality of cameras includes a first camera and at least one second camera, and a view angle of the first camera is greater than a view angle of the second camera, the apparatus includes:
the receiving module is used for receiving a first touch operation of a user on a first preview image under the condition that the shooting preview interface displays the first preview image, wherein the first preview image is an image collected by the first camera;
the acquisition module is used for determining a shooting track of at least one second camera based on the first touch operation;
and the driving module is used for driving at least one second camera to shoot along the shooting track.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the shooting control method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the shooting control method according to the first aspect.
As can be seen from the above technical solutions provided by the embodiments of the present invention, an embodiment of the present invention is applied to an electronic device, where the electronic device includes a plurality of cameras, the plurality of cameras includes a first camera and at least one second camera, and a view field angle of the first camera is greater than a view field angle of the second camera, and the method includes: receiving a first touch operation of a user on a first preview image under the condition that the first preview image is displayed on a shooting preview interface, wherein the first preview image is an image collected by a first camera; then, based on the first touch operation, determining a shooting track of at least one second camera, and driving the at least one second camera to shoot along the shooting track, so that after the shooting track of the at least one second camera is obtained by receiving the first touch operation of the user on the first preview image, the at least one second camera is driven to shoot along the shooting track, thereby simplifying the shooting process and improving the shooting efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, it is obvious that the drawings in the following description are only some embodiments described in the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a first flowchart of a shooting control method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an electronic device according to a second embodiment of the present invention;
fig. 4 is a schematic diagram of a shooting track provided in an embodiment of the present invention;
FIG. 5 is a schematic diagram of a preselected image area provided by an embodiment of the present invention;
FIG. 6 is a schematic diagram of a first interface of a main display area outline box according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a second interface of a main display area outline box according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of an image stitching interface according to an embodiment of the present invention;
fig. 9 is a schematic block diagram of a photographing control apparatus according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a third electronic device according to an embodiment of the present invention.
Illustration of the drawings:
20-camera module, 30-shooting control module, 110-first camera, 210-second camera, 160-main display area, 100-first shooting control area, 200-second shooting control area, 120-first shooting key, 130-first zoom control area, 140-first album key, 150-first photographing key, 220-second shooting key, 230-second zoom control area, 240-second album key, 250-second photographing key, 131-first zoom-in key, 132-first zoom indication bar, 133-first zoom handle, 134-first zoom-out key, 231-second zoom-in key, 232-second zoom indication bar, 233-second zoom handle, 234-second zoom-out key, 515-shooting track, 521-first preselected image area, 522-second preselected image area, 523-third preselected image area, 524-fourth preselected image area, 310-first image, 320-second image.
Detailed Description
The embodiment of the invention provides a shooting control method and device and electronic equipment.
In order to make those skilled in the art better understand the technical solution of the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention; the described embodiments are obviously only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
As shown in fig. 1, an execution subject of the method may be an electronic device, which may be a device such as a personal computer, or a mobile electronic device such as a mobile phone and a tablet computer, and the electronic device may be an electronic device used by a user. The method is applied to the electronic equipment which can comprise a plurality of cameras, wherein the plurality of cameras comprise a first camera and at least one second camera, and the view field angle of the first camera is larger than that of the second camera. The method may specifically comprise the steps of:
in S101, when the first preview image is displayed on the shooting preview interface, a first touch operation of a user on the first preview image is received, where the first preview image is an image captured by a first camera.
The shooting preview interface may be a preview interface that presents the image to be shot to the user, for example the shooting interface of a camera application in a mobile phone. The first camera may be a normal wide-angle camera (a camera whose angle of view is greater than a first threshold), an ultra-wide-angle camera (a camera whose angle of view is greater than a second threshold, the second threshold being greater than the first threshold), or a standard camera (a camera whose angle of view is less than the first threshold). The first preview image may be an image captured by the first camera, for example a wide-angle image captured by a normal wide-angle camera, a wide-angle image captured by an ultra-wide-angle camera, or an image captured by a standard camera. The first touch operation may be set according to actual conditions; for example, it may be a sliding operation performed by the user on the first preview image with a finger or with a handheld operation tool such as a stylus.
In implementation, a camera module 20 may be installed in the electronic device, and the camera module 20 may include a plurality of cameras. As shown in fig. 2, two cameras, namely a first camera 110 (e.g., a wide-angle camera) and a second camera 210 (e.g., a telephoto camera), may be installed in the user's electronic device. Alternatively, a plurality of cameras may be installed, including one first camera 110 and at least one second camera 210. A camera application is also installed in the electronic device. When the user needs to shoot, the user can start the camera application installed in the electronic device; the electronic device then starts the first camera 110, collects actual scene data within the corresponding spatial range through the first camera 110 as preview data, and displays the preview data on the shooting preview interface in the main display area 160.
It should be noted that the electronic device may further include a shooting control module 30, which may be configured to control the shooting operations of the first camera 110 and the second camera 210; while the electronic device is shooting an image, the shooting control module 30 may control the shooting operations of the first camera 110 and the second camera 210 simultaneously. The shooting control module 30 may include a first shooting control area 100 for the first camera 110 and a second shooting control area 200 for the second camera 210. As shown in fig. 3, the first shooting control area 100 and the second shooting control area 200 may be located on the two sides of the main display area 160, respectively. The first shooting control area 100 may include: a first shooting key 120, a first zoom control area 130, a first album key 140, and a first photographing key 150. The first zoom control area 130 may include: a first zoom-in key 131, a first zoom indication bar 132, a first zoom handle 133, and a first zoom-out key 134. The second shooting control area 200 may include: a second shooting key 220, a second zoom control area 230, a second album key 240, and a second photographing key 250. The second zoom control area 230 may include: a second zoom-in key 231, a second zoom indication bar 232, a second zoom handle 233, and a second zoom-out key 234.
As noted in the background, when a user shooting with an electronic device needs to capture images at several different angles within a certain area (especially a large irregular area), the user has to hold the device and aim at each angle or sub-area in turn, which yields poor shooting quality and captures only one image at a time; the shooting process is therefore cumbersome, and the shooting quality and efficiency of the images or videos are low. The embodiments of the present invention accordingly provide a technical solution to these problems, as described below.
The electronic device may be preset with an operation setting page for the first preview image, through which the user may set the first touch operation, for example a sliding operation performed on the first preview image; after the setting is completed, the set first touch operation may be stored in the electronic device. Alternatively, the electronic device may not provide such an operation setting page, and the first touch operation may instead be preset by the device provider before the electronic device leaves the factory, in which case the user implements the above process through this fixed first touch operation.
After the shooting preview interface is opened, the electronic device may monitor, in real time, the user's touch operations on the shooting preview interface in the main display area 160; the user may perform a touch operation with a finger, with a stylus, or in a similar manner. If it is detected that the user taps the photographing key of the first camera 110 on the shooting preview interface, the electronic device may acquire preview data through the first camera 110, generate a first preview image based on the preview data, and display the first preview image on the shooting preview interface. When the electronic device detects a first touch operation performed by the user on the first preview image, it receives the first touch operation and executes the processing of S102 described below.
In S102, a shooting trajectory of at least one second camera is determined based on the first touch operation.
The second camera 210 may be a camera with a long focal length (e.g., a telephoto camera) whose angle of view is smaller than that of the first camera; by acquiring the shooting trajectory of the second camera 210, the second camera 210 can subsequently be driven to shoot and obtain images of high quality.
In implementation, when the electronic device receives the first touch operation of the user on the first preview image according to the processing in S101, it may record the touch points over which the user's finger (or a stylus held by the user) slides on the first preview image. When the user completes the first touch operation, the electronic device may connect the recorded touch points in sequence; the resulting straight line or curve may be used as the touch trajectory corresponding to the first touch operation, and this touch trajectory is determined as the shooting trajectory of the at least one second camera, as shown in fig. 4, which illustrates the obtained shooting trajectory 515 of the at least one second camera 210.
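As an illustration only (not the patent's implementation), the following Python sketch shows one way the recorded touch points could be connected in order and resampled into a shooting trajectory; the normalized coordinates and the resampling spacing are assumptions.

```python
import math

def build_trajectory(touch_points, spacing=0.05):
    """Connect touch points in recorded order and resample the resulting
    polyline at a roughly constant arc-length spacing.

    touch_points: list of (x, y) in normalized preview coordinates (assumed).
    spacing: assumed resampling step along the polyline.
    """
    if len(touch_points) < 2:
        return list(touch_points)

    trajectory = [touch_points[0]]
    carried = 0.0  # arc length accumulated since the last sample
    for (x0, y0), (x1, y1) in zip(touch_points, touch_points[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if seg == 0:
            continue
        d = spacing - carried  # distance from segment start to the next sample
        while d <= seg:
            t = d / seg
            trajectory.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            d += spacing
        carried = seg - (d - spacing)
    trajectory.append(touch_points[-1])
    return trajectory

# Example: a rough diagonal swipe recorded as four touch points.
points = [(0.10, 0.10), (0.30, 0.32), (0.55, 0.50), (0.80, 0.78)]
print(build_trajectory(points, spacing=0.1))
```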
In S103, at least one second camera is driven to shoot along the shooting trajectory.
In an implementation, after acquiring the shooting trajectory of the at least one second camera 210 through the processing in S102, the electronic device may analyze the length, direction and other properties of the trajectory to determine the positions to be shot, the shooting step of the images, and so on, and may then drive the motion mechanism of the at least one second camera 210 to rotate along the shooting trajectory and capture images at the determined positions or shooting steps.
In the case of only one second camera, after acquiring the shooting trajectory of the second camera 210 through the processing in S102, the electronic device may drive that second camera to shoot along the shooting trajectory. When there are multiple second cameras and multiple shooting trajectories, the electronic device may call a corresponding number of second cameras according to the number of trajectories; for example, if there are 2 shooting trajectories, 2 second cameras may be called, each second camera is assigned one shooting trajectory, and each second camera is then driven to shoot along its assigned trajectory. When there are multiple second cameras but only one shooting trajectory, the obtained trajectory may be cut into a preset number of sub-trajectories according to the number of second cameras, a corresponding sub-trajectory is assigned to each second camera, and each second camera is driven to shoot along its assigned sub-trajectory.
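A minimal sketch of the two assignment cases just described, assuming a trajectory is simply an ordered list of points; the splitting-by-point-count rule is an assumption, not taken from the patent.

```python
def assign_trajectories(trajectories, num_cameras):
    """Case 1: one trajectory per second camera, paired in order."""
    used = min(len(trajectories), num_cameras)  # call only as many cameras as needed
    return {cam: trajectories[cam] for cam in range(used)}

def split_trajectory(trajectory, num_cameras):
    """Case 2: one trajectory cut into contiguous sub-trajectories, one per
    second camera. Split by point count for simplicity; the extra +1 gives a
    one-point overlap so neighbouring sub-trajectories join end to end."""
    n = max(1, num_cameras)
    size = max(1, len(trajectory) // n)
    subs = [trajectory[i * size:(i + 1) * size + 1] for i in range(n)]
    return {cam: sub for cam, sub in enumerate(subs) if sub}

traj = [(x / 10, x / 10) for x in range(11)]
print(assign_trajectories([traj, traj[::-1]], num_cameras=2))
print(split_trajectory(traj, num_cameras=2))
```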
It should be noted that, when there are multiple second cameras, the individual second cameras may have different functions. For example, when there are two second cameras, they may include a TOF camera and a telephoto camera; when there are three, they may include an infrared camera, a TOF camera and a telephoto camera; and when there are four, they may include a fixed-focus camera, a telephoto camera, a periscope camera and a TOF camera.
As can be seen from the above technical solutions provided by the embodiments of the present invention, an embodiment of the present invention is applied to an electronic device, where the electronic device includes a plurality of cameras, the plurality of cameras includes a first camera and at least one second camera, and a view field angle of the first camera is greater than a view field angle of the second camera, and the method includes: receiving a first touch operation of a user on a first preview image under the condition that the first preview image is displayed on a shooting preview interface, wherein the first preview image is an image collected by a first camera; then, based on the first touch operation, determining a shooting track of at least one second camera, and driving the at least one second camera to shoot along the shooting track, so that after the shooting track of the at least one second camera is obtained by receiving the first touch operation of the user on the first preview image, the at least one second camera is driven to shoot along the shooting track, thereby simplifying the shooting process and improving the shooting efficiency.
The specific processing manner of S103 may be various, and an optional processing manner is provided below, which may specifically refer to the processing of S1031 to S1032 below.
In S1031, capturing parameter information of at least one second camera is acquired, the capturing parameter information including: at least one of a shooting focal length, a shooting step distance, a shooting duration and a shooting speed.
In S1032, at least one second camera is driven along the shooting trajectory and shooting is performed based on the shooting parameter information.
In an implementation, after obtaining the shooting trajectory of the at least one second camera through the process of S102, the electronic device may obtain the shooting parameter information of the at least one second camera 210 from a preset database, and then drive the at least one second camera 210 to shoot along the shooting trajectory based on the obtained shooting parameter information.
Specifically, the shooting duration may include a shooting interval duration and a total shooting duration. If the shooting operation performed by driving the at least one second camera 210 is an image-shooting operation, the electronic device may determine the shooting positions of the second camera 210 according to the shooting trajectory obtained in S102 and the shooting interval duration obtained from the preset database, and may then drive the at least one second camera 210 to shoot along the shooting trajectory according to the shooting positions and shooting parameter information such as the shooting focal length, the shooting step distance and the shooting speed.
Alternatively, if the shooting operation performed by driving the at least one second camera 210 is a video-shooting operation, the at least one second camera 210 may be driven to shoot along the shooting trajectory according to shooting parameter information such as the shooting speed, the total shooting duration, the shooting focal length and the shooting step distance acquired from the preset database.
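The role of these parameters can be pictured with the sketch below: shooting positions are spaced along the trajectory by an assumed shooting step distance for image capture, and a sweep speed is derived from an assumed total shooting duration for video capture. Parameter names and units are illustrative only.

```python
import math

def arc_length(traj):
    return sum(math.hypot(x1 - x0, y1 - y0)
               for (x0, y0), (x1, y1) in zip(traj, traj[1:]))

def photo_positions(traj, step_distance):
    """Pick one shooting position every `step_distance` along the trajectory."""
    positions, travelled, next_stop = [traj[0]], 0.0, step_distance
    for (x0, y0), (x1, y1) in zip(traj, traj[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        while seg > 0 and travelled + seg >= next_stop:
            t = (next_stop - travelled) / seg
            positions.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            next_stop += step_distance
        travelled += seg
    return positions

def video_sweep_speed(traj, total_duration_s):
    """For video, sweep the whole trajectory within the total shooting duration."""
    return arc_length(traj) / total_duration_s  # trajectory units per second

traj = [(0.1, 0.1), (0.4, 0.4), (0.8, 0.5)]
print(photo_positions(traj, step_distance=0.2))
print(video_sweep_speed(traj, total_duration_s=4.0))
```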
Specific processing manners of S1031 may be various, and an optional processing manner is provided below, which may specifically refer to the processing of S10311-S10313 described below.
In S10311, a second touch operation input by the user is received.
The second touch operation may be an operation in which the user triggers the second shooting key 220 or the second photographing key 250 in the second shooting control area 200.
In S10312, first prompt information for prompting the user to input shooting parameter information is displayed.
In S10313, the shooting parameter information input by the user is acquired as the shooting parameter information of the at least one second camera.
It should be noted that after the electronic device obtains the shooting parameter information input by the user, the electronic device may store the shooting parameter information in the preset database, so that after the subsequent electronic device obtains the shooting trajectory of the at least one second camera 210 through the processing in S102, the electronic device may obtain the shooting parameter information of the at least one second camera 210 from the preset database for the user to select, thereby reducing the input operations of the user.
Considering that the first touch operation (e.g., a line-drawing operation) performed by the user on the first preview image may contain slight jitter that affects the shooting effect of the second camera 210 along the shooting trajectory, the electronic device may perform the following processing of S104 after performing the processing of S102 described above.
In S104, the shooting trajectory is adjusted according to the first preview image, so that the images within the preset ranges at both sides of the shooting trajectory are in the shooting range corresponding to the shooting focal length of the second camera.
In an implementation, after acquiring the shooting trajectory of the at least one second camera 210 through the processing in S102, the electronic device may adjust the shooting trajectory based on the part of the first preview image that corresponds to the trajectory, so as to eliminate the slight jitter introduced when the user performed the first touch operation (such as a line-drawing operation) on the first preview image, improve the smoothness of the trajectory, and at the same time keep the images within the preset ranges on both sides of the trajectory within the shooting range corresponding to the shooting focal length of the second camera 210.
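One simple way such jitter removal could be realized is a moving-average smoothing of the touch trajectory, as in the sketch below; the window size is an assumption and the patent does not prescribe a particular smoothing method.

```python
def smooth_trajectory(traj, window=5):
    """Moving-average smoothing of an (x, y) polyline to remove small hand
    jitter while keeping the overall drawn shape."""
    half = window // 2
    smoothed = []
    for i in range(len(traj)):
        lo, hi = max(0, i - half), min(len(traj), i + half + 1)
        xs = [p[0] for p in traj[lo:hi]]
        ys = [p[1] for p in traj[lo:hi]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed

jittery = [(0.1, 0.10), (0.2, 0.13), (0.3, 0.08), (0.4, 0.12), (0.5, 0.10)]
print(smooth_trajectory(jittery, window=3))
```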
The specific processing manner of S103 may be various, and an alternative processing manner is provided below, which may specifically refer to the processing of S1033 to S1035 described below.
In S1033, at least one second camera is driven to acquire at least one preselected image region along the shooting trajectory.
The preselected image area may be an image area obtained by the at least one second camera 210 during shooting along a shooting track, as shown in fig. 5, the preselected image area may be a first preselected image area 521, a second preselected image area 522, a third preselected image area 523, and a fourth preselected image area 524 that are superimposed on the first preview image and are transparent inside, and a partial area of the first preview image that overlaps with the preselected image area is an image area to be shot corresponding to the preselected image area.
In an implementation, after the electronic device obtains the capturing trajectory 515 of the at least one second camera 210 through the processing of S102, the at least one second camera 210 may be moved to obtain at least one preselected image area along the capturing trajectory 515.
In S1034, an adjustment operation of the user on at least one preselected image region is received, and an adjusted preselected image region is obtained.
The adjustment operation may be to adjust the size of the preselected image area by triggering the second zoom-in key 231 or the second zoom-out key 234 in the second zoom control area 230, to adjust the picture size in the preselected image area by moving the second zoom handle 233 along the second zoom indication bar 232, or to adjust the shooting range corresponding to the preselected image area by dragging the position of the preselected image area on the shooting trajectory with a finger.
It should be noted that, before the adjustment operation on the at least one preselected image area is received, the user may activate the picture preview function of the second camera 210 by tapping a preselected image area, so that the transparent preselected image area is switched to the real-time preview picture of the second camera 210. The user can then adjust the size of the real-time preview picture corresponding to the preselected image area by triggering the second zoom-in key 231 or the second zoom-out key 234 in the second zoom control area 230, or by moving the second zoom handle 233 along the second zoom indication bar 232, or can adjust the real-time preview picture corresponding to the preselected image area by dragging the position of the preselected image area on the shooting trajectory 515 with a finger.
In implementation, the electronic device obtains at least one preselected image area through the processing in S1033, and then may receive an adjustment operation of the user on the at least one preselected image area, so as to obtain an adjusted preselected image area.
In S1035, the at least one second camera is driven to capture images in the adjusted preselected image areas, respectively.
In practice, after obtaining the adjusted preselected image area through the processing in S1034, the electronic device may acquire the position information of the adjusted preselected image area, and then drive the at least one second camera 210 to rotate to the position where the adjusted preselected image area is located to capture an image according to the acquired position information.
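As a rough illustration of S1033-S1035, the sketch below maps each (possibly user-adjusted) preselected image region to a pan/tilt target for the second camera and captures there; the linear mapping from normalized preview coordinates to rotation angles, and the field-of-view values, are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Region:
    cx: float      # region centre in normalized preview coordinates (0..1)
    cy: float
    width: float
    height: float

def region_to_angles(region, fov_h_deg=80.0, fov_v_deg=60.0):
    """Map a region centre on the wide preview to pan/tilt angles for the
    second camera, assuming the preview spans the first camera's field of view."""
    pan = (region.cx - 0.5) * fov_h_deg
    tilt = (0.5 - region.cy) * fov_v_deg
    return pan, tilt

def shoot_regions(regions, drive_fn):
    """Drive the second camera to each adjusted region and capture once."""
    for region in regions:
        pan, tilt = region_to_angles(region)
        drive_fn(pan, tilt)        # rotate the camera's motion mechanism
        # ...trigger a capture here (hardware API not specified in the patent)

regions = [Region(0.25, 0.4, 0.2, 0.15), Region(0.7, 0.55, 0.2, 0.15)]
shoot_regions(regions, drive_fn=lambda p, t: print(f"pan={p:.1f} deg, tilt={t:.1f} deg"))
```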
In the actual shooting process, when the second camera 210 is driven to rotate to a corresponding angle to shoot at a certain position on the shooting trajectory 515, there may be a deviation that causes the captured image to differ from the target image the user wants to acquire. To reduce this shooting error, the specific processing of S103 may again take various forms; an optional processing manner is provided below, which may specifically refer to the processing of S1036-S1041 described below.
In S1036, a first position on the shooting trajectory and a shooting focal length of the second camera are acquired.
The first position may be a position of an outline box displayed on the main display area 160 by the second camera 210, and the outline box is used to prompt the user of the content captured by the second camera 210 at this moment.
It should be noted that, during the shooting process, the electronic device acquires the real-time pictures of both the first camera 110 and the at least one second camera 210, but only the first preview image acquired by the first camera 110 may be displayed in the main display area 160, together with the outline box 260 of the second camera 210. The part of the main display area 160 enclosed by the outline box 260 substantially matches the content and size of the real-time picture of the second camera 210 at that moment, and the outline box 260 is used to prompt the user about what the second camera 210 is currently capturing.
In practice, for example, as shown in fig. 7, the first position coordinates of the outline box 260 on the shooting trajectory 515 can be written as (a, b), and the width of the outline box 260 is H and the length is L. After the electronic device acquires the shooting track 515 of the at least one second camera 210 through the processing in S102, first position information of the outline box 260 of the at least one second camera 210 on the shooting track 515 and a shooting focal length of the at least one second camera 210 may be acquired.
In S1037, a first image is acquired by the second camera based on the first position and the photographing focal length.
In implementation, the first image obtained as described above is not displayed on a display screen of the electronic device, as shown in fig. 6.
In S1038, a second image is captured within the first preview image and at the first location, wherein the size of the second image is larger than the size of the first image.
In implementation, as shown in fig. 7, the second image 320 may be an image that the electronic device took within the first preview image and at a first location.
In S1039, feature information of the first image is extracted, the feature information is matched with feature information of the second image, and the second position of the first image is determined.
In an implementation, the acquired first image is not displayed on the display screen of the electronic device. As shown in fig. 6, the first image 310 is the image acquired by the second camera 210 at the first position. The electronic device extracts the feature information of the first image, matches the feature information of the second image with that of the first image, and then determines that the second position coordinates of the first image are (a', b'), with the width of the first image being H' and its length L'.
In S1040, a target angle of rotation of the second camera is determined according to a difference between the first position and the second position.
In implementation, as shown in fig. 7, the electronic device determines the difference between the first position and the second position from the first position (a, b) and the outline-box width H on the one hand, and the second position coordinates (a', b'), first-image width H' and length L' on the other, and then determines the target rotation angle of the second camera 210 according to that difference.
In S1041, the second camera is driven to rotate by the target angle to capture one image.
In implementation, as shown in fig. 7, the electronic device drives the second camera 210 to rotate by the target angle to capture an image according to the determined target angle of rotation of the second camera 210, so that the image matches the picture previewed by the outline box 260 in fig. 7.
An image at a certain position on the capturing trajectory 515 can be obtained through the processing in S1036 to S1041, an image at another position on the capturing trajectory 515 can be obtained through the same or similar processing as in S1036 to S1041, and by analogy, images at a plurality of different positions on the capturing trajectory 515 can be obtained respectively.
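The correction loop of S1036-S1041 can be pictured with the following sketch. The patent speaks of extracting and matching feature information; plain template matching (OpenCV's matchTemplate) is substituted here as one concrete stand-in, and the pixel-to-degree conversion factors are assumptions.

```python
import cv2
import numpy as np

def rotation_correction(first_img, second_img, expected_xy,
                        deg_per_px_x=0.05, deg_per_px_y=0.05):
    """Locate `first_img` (the second-camera frame) inside the larger
    `second_img` (a crop of the wide preview around the first position), then
    turn the pixel offset into a pan/tilt correction.

    expected_xy: where the top-left of first_img should land if the second
    camera were pointing exactly at the first position.
    deg_per_px_*: assumed conversion between preview pixels and degrees.
    """
    result = cv2.matchTemplate(second_img, first_img, cv2.TM_CCOEFF_NORMED)
    _, score, _, matched_xy = cv2.minMaxLoc(result)   # best match location
    dx = matched_xy[0] - expected_xy[0]
    dy = matched_xy[1] - expected_xy[1]
    return dx * deg_per_px_x, dy * deg_per_px_y, score

# Synthetic example: embed a small patch in a larger image and recover it.
big = np.random.randint(0, 255, (240, 320), dtype=np.uint8)
small = big[60:120, 100:180].copy()
print(rotation_correction(small, big, expected_xy=(90, 50)))
```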
The specific processing manner of S103 may be various, and an alternative processing manner is provided below, which may specifically refer to the processing of S1042 to S1045.
In S1042, when all or part of the shooting trajectory 515 lies within a preset shooting area, a predetermined region along the shooting trajectory 515 within the preset shooting area is divided into a preset number of sub-shooting areas.
As shown in fig. 8, the preset shooting area may be a preset shooting area 410 generated by the electronic device, before shooting, from the positions swept by the user's finger between any two points A and B on the main display area 160.
In implementation, if all or part of the shooting trajectory 515 lies within the preset shooting area, the electronic device calculates the preset shooting area 410 of the second camera 210 according to the current shooting parameter information of the second camera 210 and divides a predetermined region along the shooting trajectory 515 within the preset shooting area 410 into a preset number of sub-shooting areas. The size of a single sub-shooting area is larger than the picture 420 shot by the second camera 210, and only the middle region of each picture 420 shot by the second camera 210 is retained during stitching, so as to improve the quality of the captured image.
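One way to picture this division: tile the trajectory points that fall inside the preset shooting area with slightly oversized sub-areas, so that neighbouring captures overlap and only their central parts need to be kept when stitching. The grid-snapping rule and the margin below are assumptions.

```python
def divide_along_trajectory(trajectory, frame_w, frame_h, margin=0.2):
    """Cover the trajectory with sub-shooting areas.

    trajectory: points assumed to already lie within the preset shooting area.
    Each sub-area is (1 + margin) times the second camera's frame in each
    direction, so adjacent captures overlap and only their central regions
    need to be kept when stitching. Returns (cx, cy, w, h) without duplicates.
    """
    w, h = frame_w * (1 + margin), frame_h * (1 + margin)
    areas, seen = [], set()
    for x, y in trajectory:
        # snap the point to a grid of sub-area centres
        cx = round(x / frame_w) * frame_w
        cy = round(y / frame_h) * frame_h
        if (cx, cy) not in seen:
            seen.add((cx, cy))
            areas.append((cx, cy, w, h))
    return areas

traj = [(0.10, 0.10), (0.25, 0.12), (0.40, 0.20), (0.55, 0.22)]
print(divide_along_trajectory(traj, frame_w=0.2, frame_h=0.15))
```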
In S1043, the at least one second camera is driven to capture a plurality of images including the at least one sub-capture area.
In an implementation, the electronic device may drive the at least one second camera 210 to move its shooting picture 420 across the preset shooting area 410 in sequence at certain time intervals, so as to shoot a plurality of images covering the at least one sub-shooting area.
In S1044, the plurality of images are stitched to obtain a captured image in a predetermined area along the capturing trajectory.
In implementation, the electronic device stores a plurality of images captured by the second camera 210 and stitches the plurality of images to obtain a captured image within a predetermined area along the capture trajectory 515.
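A simplified stitching sketch in the same spirit: only the central strip of each frame is kept, and the frames are assumed to have been captured left to right along one row with sufficient overlap. Real stitching would also need registration and blending, which the patent does not detail.

```python
import numpy as np

def stitch_center_strips(frames, keep_ratio=0.6):
    """Concatenate the central vertical strips of left-to-right frames.

    frames: list of equally sized H x W x 3 arrays ordered along the row.
    keep_ratio: fraction of each frame's width kept from the middle (assumed).
    """
    strips = []
    for frame in frames:
        h, w = frame.shape[:2]
        keep = int(w * keep_ratio)
        start = (w - keep) // 2
        strips.append(frame[:, start:start + keep])
    return np.hstack(strips)

frames = [np.full((120, 160, 3), c, dtype=np.uint8) for c in (40, 120, 200)]
print(stitch_center_strips(frames).shape)   # (120, 288, 3)
```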
In S1045, at least one second camera is driven to shoot along the shooting trajectory 515 outside the preset shooting area.
The specific processing manner of S1043 may be various, and an optional processing manner is provided below, which may specifically refer to the processing from S10431 to S10433.
In S10431, a target sub-shooting area is determined in the at least one sub-shooting area, where the target sub-shooting area contains a target object.
The target object is a moving object, for example, the target object may be a moving car, a running child, or the like displayed on the shooting preview interface.
In implementation, after the electronic device divides the predetermined region along the shooting trajectory 515 within the preset shooting area into a preset number of sub-shooting areas through the processing of S1042 described above, it may use an image recognition algorithm to identify the sub-shooting area(s) containing a moving object and determine them as the at least one target sub-shooting area.
In S10432, at least one second camera is driven to shoot the target sub-shooting area to obtain a first sub-image. In an implementation, when there is one second camera, after the electronic device determines the target sub-shooting area in the at least one sub-shooting area through the processing in S10431, it may drive the second camera to shoot the target sub-shooting areas according to their distribution within the preset shooting area, for example in order from top to bottom and from left to right as shown in fig. 8, so as to obtain the first sub-image.
When there are multiple second cameras, the preset shooting area 410 can be divided in advance into rows according to the number of second cameras, each row being labeled, and the electronic device then allocates to each second camera the row(s) corresponding to its label; alternatively, the preset shooting area 410 may be divided into columns according to the number of second cameras, each column being labeled, and the electronic device then allocates to each second camera the column(s) corresponding to its label.
If, in the case of multiple second cameras, the electronic device has allocated rows to the second cameras by label, then after determining the target sub-shooting area in the at least one sub-shooting area through the processing of S10431, it may drive the corresponding second camera to shoot the target sub-shooting area according to the row distribution of the target sub-shooting area within the preset shooting area, so as to obtain the first sub-image.
If, in the case of multiple second cameras, the electronic device has allocated columns to the second cameras by label, then after determining the target sub-shooting area in the at least one sub-shooting area through the processing of S10431, it may drive the corresponding second camera to shoot the target sub-shooting area according to the column distribution of the target sub-shooting area within the preset shooting area, so as to obtain the first sub-image.
In S10433, at least one second camera is driven to shoot the areas other than the target sub-shooting area among the at least one sub-shooting area, so as to obtain a second sub-image. It should be noted that the process of driving the at least one second camera to shoot these remaining areas to obtain the second sub-image may be the same as the process of driving the at least one second camera to shoot the target sub-shooting area to obtain the first sub-image.
In this way, the electronic device can shoot the moving target object in the shooting area first, and then shoot the sub-shooting areas that are unchanged or change little.
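The "moving target first" ordering of S10431-S10433 can be sketched as follows; simple frame differencing stands in for the unspecified image recognition algorithm, and the motion threshold is an assumption.

```python
import numpy as np

def order_sub_areas(sub_areas, prev_frame, curr_frame, motion_thresh=12.0):
    """Return sub-areas with the 'target' areas (containing motion) first.

    sub_areas: list of (x0, y0, x1, y1) pixel boxes on the wide preview.
    prev_frame/curr_frame: two consecutive grayscale preview frames.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving, static = [], []
    for (x0, y0, x1, y1) in sub_areas:
        # mean absolute change inside the box as a crude motion score
        score = float(diff[y0:y1, x0:x1].mean())
        (moving if score > motion_thresh else static).append((x0, y0, x1, y1))
    return moving + static   # shoot target (moving) sub-areas first

prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[30:60, 40:80] = 200   # a "moving car" appears in the second frame
boxes = [(0, 0, 80, 60), (80, 0, 160, 60), (0, 60, 80, 120)]
print(order_sub_areas(boxes, prev, curr))
```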
In addition, this embodiment also provides a remote control method. The electronic device may further include a remote control module, which may include at least the interface contents of the shooting control module and a wireless communication unit (such as a Bluetooth-based, infrared-based or WiFi-based communication unit). By operating a remote interface and relying on the wireless communication unit, the shooting processing of the camera of the electronic device can be controlled remotely, which avoids the problem that hand-held vibration greatly degrades the quality of the shot picture when the user operates the electronic device directly.
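As an illustration only (the patent does not specify a protocol), the remote control module could accept shutter and trajectory commands as JSON messages over a socket carried by any of the wireless links mentioned above; the command names, port and message format below are assumptions.

```python
import json
import socket

def send_remote_command(host, command, payload=None, port=8765):
    """Send one JSON command (e.g. 'start_shoot', 'set_trajectory') to the
    shooting control module over a TCP link; the Bluetooth/WiFi transport is
    abstracted away here."""
    message = json.dumps({"cmd": command, "payload": payload or {}}).encode()
    with socket.create_connection((host, port), timeout=2.0) as conn:
        conn.sendall(message + b"\n")
        return conn.recv(1024).decode()   # simple textual acknowledgement

# Example usage (requires the electronic device to listen on this port):
# send_remote_command("192.168.0.12", "start_shoot",
#                     {"trajectory": [(0.1, 0.1), (0.8, 0.7)]})
```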
As can be seen from the above technical solutions provided by the embodiments of the present invention, an embodiment of the present invention is applied to an electronic device, where the electronic device includes a plurality of cameras, the plurality of cameras includes a first camera and at least one second camera, and a view field angle of the first camera is greater than a view field angle of the second camera, and the method includes: receiving a first touch operation of a user on a first preview image under the condition that the first preview image is displayed on a shooting preview interface, wherein the first preview image is an image collected by a first camera; then, based on the first touch operation, determining a shooting track of at least one second camera, and driving the at least one second camera to shoot along the shooting track, so that after the shooting track of the at least one second camera is obtained by receiving the first touch operation of the user on the first preview image, the at least one second camera is driven to shoot along the shooting track, thereby simplifying the shooting process and improving the shooting efficiency.
On the basis of the same technical concept, a shooting control device is further provided in an embodiment of the present invention, fig. 9 is a schematic diagram of a module of the shooting control device provided in an embodiment of the present invention, where the shooting control device is configured to execute the shooting control method described in fig. 1 to 8, and as shown in fig. 9, the shooting control device is disposed in an electronic device, where the electronic device includes a plurality of cameras, the plurality of cameras includes a first camera and at least one second camera, and a view field angle of the first camera is greater than a view field angle of the second camera, and the shooting control device includes:
a receiving module 901, configured to receive a first touch operation of a user on a first preview image when a first preview image is displayed on a shooting preview interface, where the first preview image is an image acquired by a first camera;
an obtaining module 902, configured to determine, based on the first touch operation, a shooting trajectory of the at least one second camera;
and a driving module 903, configured to drive at least one second camera to shoot along the shooting track.
As can be seen from the above technical solutions provided by the embodiments of the present invention, an embodiment of the present invention is applied to an electronic device, where the electronic device includes a plurality of cameras, the plurality of cameras includes a first camera and at least one second camera, and a view field angle of the first camera is greater than a view field angle of the second camera, and the method includes: receiving a first touch operation of a user on a first preview image under the condition that the first preview image is displayed on a shooting preview interface, wherein the first preview image is an image collected by a first camera; then, based on the first touch operation, determining a shooting track of at least one second camera, and driving the at least one second camera to shoot along the shooting track, so that after the shooting track of the at least one second camera is obtained by receiving the first touch operation of the user on the first preview image, the at least one second camera is driven to shoot along the shooting track, thereby simplifying the shooting process and improving the shooting efficiency.
Optionally, the driving module 903 includes:
a first obtaining unit, configured to obtain shooting parameter information of at least one second camera, where the shooting parameter information includes: at least one of a shooting focal length, a shooting step distance, a shooting duration and a shooting speed;
and the first driving unit is used for driving at least one second camera to shoot along the shooting track and based on shooting parameter information.
Optionally, the first obtaining unit includes:
the first receiving subunit is used for receiving a second touch operation input by the user;
the display subunit is used for displaying first prompt information, and the first prompt information is used for prompting a user to input shooting parameter information;
and the acquisition subunit is used for acquiring shooting parameter information input by a user as the shooting parameter information of at least one second camera.
Optionally, the apparatus further comprises:
and the adjusting module is used for adjusting the shooting track according to the first preview image so as to enable the images in the preset ranges at two sides of the shooting track to be in the shooting range corresponding to the shooting focal length of the second camera.
Optionally, the driving module 903 includes:
the second driving unit is used for driving at least one second camera to acquire at least one preselected image area along the shooting track;
the second receiving unit is used for receiving the adjustment operation of a user on at least one preselected image area to obtain the adjusted preselected image area;
and the third driving unit is used for driving at least one second camera to shoot images in the adjusted preselected image areas respectively.
Optionally, the driving module 903 includes:
the second acquisition unit is used for acquiring a first position on the shooting track and the shooting focal length of the second camera;
a third acquiring unit, configured to acquire a first image through the second camera based on the first position and the shooting focal length;
a clipping unit configured to clip a second image within the first preview image and at the first position, wherein a size of the second image is larger than a size of the first image;
a first determination unit configured to extract feature information of the first image, match the feature information with feature information of the second image, and determine a second position of the first image;
the second determining unit is used for determining a target angle of rotation of the second camera according to the difference value between the first position and the second position;
and the fourth driving unit is used for driving the second camera to rotate the target angle to shoot an image.
Optionally, the driving module 903 includes:
the area dividing unit is used for dividing a preset area in the preset shooting area along the shooting track into a preset number of sub-shooting areas under the condition that all tracks or part of tracks in the shooting track are in the preset shooting area;
the fifth driving unit is used for driving at least one second camera to shoot a plurality of images including at least one sub-shooting area;
the splicing unit is used for splicing the plurality of images to obtain shot images in a preset area along the shooting track;
and the sixth driving unit is used for driving at least one second camera to shoot along the shooting track outside the preset shooting area.
Optionally, the fifth driving unit specifically includes:
a first determining subunit, configured to determine a target sub-shooting area in at least one of the sub-shooting areas, where the target sub-shooting area includes a target object;
the first driving subunit is used for driving at least one second camera to shoot the target sub-shooting area to obtain a first sub-image;
and the second driving subunit is used for driving at least one second camera to shoot an area except the target sub-shooting area in at least one sub-shooting area to obtain a second sub-image.
As can be seen from the above technical solutions provided by the embodiments of the present invention, an embodiment of the present invention is applied to an electronic device, where the electronic device includes a plurality of cameras, the plurality of cameras includes a first camera and at least one second camera, and a view field angle of the first camera is greater than a view field angle of the second camera, and the method includes: receiving a first touch operation of a user on a first preview image under the condition that the first preview image is displayed on a shooting preview interface, wherein the first preview image is an image collected by a first camera; then, based on the first touch operation, determining a shooting track of at least one second camera, and driving the at least one second camera to shoot along the shooting track, so that after the shooting track of the at least one second camera is obtained by receiving the first touch operation of the user on the first preview image, the at least one second camera is driven to shoot along the shooting track, thereby simplifying the shooting process and improving the shooting efficiency.
The shooting control device provided by the embodiment of the invention can realize each process in the embodiment corresponding to the shooting control method, and is not repeated here for avoiding repetition.
It should be noted that the shooting control apparatus provided in the embodiment of the present invention and the shooting control method provided in the embodiment of the present invention are based on the same inventive concept, and therefore, for specific implementation of the embodiment, reference may be made to implementation of the shooting control method described above, and repeated details are not described again.
Based on the same technical concept, the embodiment of the present invention further provides an electronic device, which is configured to execute the shooting control method, where fig. 10 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present invention, and the electronic device 1000 shown in fig. 10 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and a power supply 1011. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 10 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The above electronic device includes a plurality of cameras, the plurality of cameras including a first camera and at least one second camera, the field of view angle of the first camera being greater than that of the second camera, and the processor 1010 is configured to perform the following steps:
under the condition that a first preview image is displayed on a shooting preview interface, receiving a first touch operation of a user on the first preview image through a touch screen, wherein the first preview image is an image collected by a first camera;
determining a shooting track of at least one second camera based on the first touch operation;
and driving at least one second camera to shoot along the shooting track.
Additionally, the processor 1010 is further configured to perform the following steps:
acquiring shooting parameter information of at least one second camera, wherein the shooting parameter information comprises: at least one of a shooting focal length, a shooting step distance, a shooting duration and a shooting speed;
and driving at least one second camera to shoot along the shooting track and based on shooting parameter information.
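The shooting parameter information can be thought of as a sampling and timing policy over the track. The sketch below shows one hedged interpretation: the shooting step distance is read as an angular spacing between shots and the shooting speed as a constant sweep rate; both readings, and the function names, are assumptions made for illustration rather than details of this embodiment.

```python
import math
from typing import List, Tuple

def resample_track(track: List[Tuple[float, float]], step_deg: float) -> List[Tuple[float, float]]:
    """Keep one shooting point roughly every step_deg along the pan/tilt polyline."""
    if not track:
        return []
    points, acc = [track[0]], 0.0
    for (p0, t0), (p1, t1) in zip(track, track[1:]):
        acc += math.hypot(p1 - p0, t1 - t0)   # angular distance of this segment
        if acc >= step_deg:
            points.append((p1, t1))
            acc = 0.0
    return points

def schedule_shots(points: List[Tuple[float, float]], speed_deg_per_s: float) -> List[float]:
    """Timestamp each shooting point assuming the camera sweeps at a constant angular speed."""
    if not points:
        return []
    times, t = [0.0], 0.0
    for (p0, t0), (p1, t1) in zip(points, points[1:]):
        t += math.hypot(p1 - p0, t1 - t0) / speed_deg_per_s
        times.append(t)
    return times
```

The shooting duration could then cap the total sweep time, and the shooting focal length would be applied to the zoom driver before the first shot.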
In addition, the user input unit 1007 is configured to receive a second touch operation input by the user;
the display unit 1006 is configured to display first prompt information, where the first prompt information is used to prompt the user to input shooting parameter information;
and the processor 1010 is configured to acquire the shooting parameter information input by the user as the shooting parameter information of at least one second camera.
Additionally, the processor 1010 is further configured to perform the following steps:
and adjusting the shooting track according to the first preview image, so that images within preset ranges on both sides of the shooting track fall within a shooting range corresponding to the shooting focal length of the second camera.
Further, the processor 1010 is further configured to: drive at least one second camera to acquire at least one preselected image area along the shooting track;
the user input unit 1007 is further configured to receive an adjustment operation of a user on at least one preselected image region, so as to obtain an adjusted preselected image region;
and the processor 1010 is further configured to drive at least one of the second cameras to capture images in the adjusted preselected image areas respectively.
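The preselected-area flow described in the three steps above (acquire candidate areas along the track, accept the user's adjustments, then shoot the adjusted areas) can be sketched as a simple data flow. Everything in the sketch, including the `Region` type and the driver callbacks, is hypothetical and only illustrates how the adjustment could be threaded between acquisition and shooting.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class Region:
    """A candidate shooting region: pan/tilt of its centre plus a zoom level (hypothetical model)."""
    pan: float
    tilt: float
    zoom: float

def collect_preselected_regions(track: List[Tuple[float, float]], zoom: float) -> List[Region]:
    """Place one candidate region at every point of the shooting track."""
    return [Region(pan, tilt, zoom) for pan, tilt in track]

def apply_user_adjustments(regions: List[Region],
                           adjustments: Dict[int, Tuple[float, float]]) -> List[Region]:
    """Apply per-region offsets entered by the user, e.g. {2: (delta_pan, delta_tilt)}."""
    adjusted = []
    for i, region in enumerate(regions):
        dpan, dtilt = adjustments.get(i, (0.0, 0.0))
        adjusted.append(Region(region.pan + dpan, region.tilt + dtilt, region.zoom))
    return adjusted

def shoot_regions(regions: List[Region],
                  set_pan_tilt: Callable[[float, float], None],
                  capture: Callable[..., object]) -> List[object]:
    """Steer the second camera to each adjusted region and capture an image there."""
    images = []
    for region in regions:
        set_pan_tilt(region.pan, region.tilt)
        images.append(capture(zoom=region.zoom))
    return images
```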
Additionally, the processor 1010 is further configured to perform the following steps:
acquiring a first position on the shooting track and a shooting focal length of the second camera;
acquiring a first image through the second camera based on the first position and the shooting focal length;
capturing a second image within the first preview image and at the first location, wherein the size of the second image is larger than the size of the first image;
extracting feature information of the first image, matching the feature information with feature information of the second image, and determining a second position of the first image;
determining a target angle of rotation of the second camera according to the difference value between the first position and the second position;
and driving the second camera to rotate the target angle to shoot an image.
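One readily available way to realize the matching step above is local feature matching, for example with ORB in OpenCV; this is an assumption about a possible implementation, not the matching method of this embodiment. The sketch estimates where the telephoto shot (the first image) lies inside the larger preview crop (the second image) and converts the offset between the expected and matched positions into a rotation correction, using an assumed degrees-per-pixel factor.

```python
import cv2
import numpy as np

DEG_PER_PIXEL = 0.05   # assumed mapping from preview pixels to camera rotation degrees

def _gray(image: np.ndarray) -> np.ndarray:
    """Convert to grayscale if needed so ORB sees a single-channel image."""
    return cv2.cvtColor(image, cv2.COLOR_BGR2GRAY) if image.ndim == 3 else image

def locate_first_image(first_image: np.ndarray, second_image: np.ndarray):
    """Estimate where the first image lies inside the second image via ORB feature matching."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(_gray(first_image), None)
    kp2, des2 = orb.detectAndCompute(_gray(second_image), None)
    if des1 is None or des2 is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:50]
    if not matches:
        return None
    pts = np.float32([kp2[m.trainIdx].pt for m in matches])
    return pts.mean(axis=0)    # rough centre of the matched content, in second-image pixels

def target_rotation(first_pos_px, matched_pos_px):
    """Convert the offset between the expected and matched positions into pan/tilt deltas."""
    dx = matched_pos_px[0] - first_pos_px[0]
    dy = matched_pos_px[1] - first_pos_px[1]
    return dx * DEG_PER_PIXEL, dy * DEG_PER_PIXEL
```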
Additionally, the processor 1010 is further configured to perform the following steps:
under the condition that all or part of the shooting track is in a preset shooting area, dividing a preset area in the preset shooting area along the shooting track into a preset number of sub-shooting areas;
driving at least one second camera to shoot a plurality of images including at least one sub-shooting area;
splicing the plurality of images to obtain shot images in a preset area along the shooting track;
and driving at least one second camera to shoot along the shooting track outside the preset shooting area.
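A minimal sketch of the sub-area division and splicing steps above, assuming the preset area is a pan/tilt rectangle split evenly along the pan axis and that the per-area shots are combined with OpenCV's general-purpose stitcher; both assumptions stand in for whatever division rule and splicing method the embodiment actually uses.

```python
import cv2
import numpy as np
from typing import List, Tuple

Area = Tuple[float, float, float, float]   # (pan_min, tilt_min, pan_max, tilt_max), assumed model

def split_preset_area(area: Area, n: int) -> List[Area]:
    """Split the preset pan/tilt rectangle into n equal sub-shooting areas along the pan axis."""
    pan_min, tilt_min, pan_max, tilt_max = area
    width = (pan_max - pan_min) / n
    return [(pan_min + i * width, tilt_min, pan_min + (i + 1) * width, tilt_max)
            for i in range(n)]

def splice_images(images: List[np.ndarray]) -> np.ndarray:
    """Splice the per-sub-area shots into one image using OpenCV's panorama stitcher."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"splicing failed with status {status}")
    return panorama
```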
Additionally, the processor 1010 is further configured to perform the following steps:
determining a target sub-shooting area in at least one sub-shooting area, wherein a target object is included in the target sub-shooting area;
driving at least one second camera to shoot the target sub-shooting area to obtain a first sub-image;
and driving at least one second camera to shoot the area except the target sub-shooting area in at least one sub-shooting area to obtain a second sub-image.
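The target-first ordering above can be sketched as a simple partition of the sub-shooting areas. The `shoot_area()` callback and the representation of a sub-area as a pan/tilt rectangle are hypothetical; the point is only that the sub-area containing the target object is shot to produce the first sub-image before the remaining sub-areas produce the second sub-images.

```python
from typing import Callable, List, Tuple

Area = Tuple[float, float, float, float]   # (pan_min, tilt_min, pan_max, tilt_max), assumed model

def contains(area: Area, point: Tuple[float, float]) -> bool:
    """True if a pan/tilt point (e.g. the detected target centre) lies inside the sub-area."""
    pan_min, tilt_min, pan_max, tilt_max = area
    pan, tilt = point
    return pan_min <= pan <= pan_max and tilt_min <= tilt <= tilt_max

def shoot_target_first(sub_areas: List[Area],
                       target_centre: Tuple[float, float],
                       shoot_area: Callable[[Area], object]):
    """Shoot the sub-area(s) containing the target first, then the remaining sub-areas."""
    target_areas = [a for a in sub_areas if contains(a, target_centre)]
    other_areas = [a for a in sub_areas if not contains(a, target_centre)]
    first_sub_images = [shoot_area(a) for a in target_areas]
    second_sub_images = [shoot_area(a) for a in other_areas]
    return first_sub_images, second_sub_images
```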
As can be seen from the above technical solutions provided by the embodiments of the present invention, the embodiments are applied to an electronic device, where the electronic device includes a plurality of cameras, the plurality of cameras include a first camera and at least one second camera, and the view field angle of the first camera is larger than that of the second camera. The method includes: receiving a first touch operation of a user on a first preview image when the first preview image is displayed on a shooting preview interface, where the first preview image is an image collected by the first camera; determining a shooting track of the at least one second camera based on the first touch operation; and driving the at least one second camera to shoot along the shooting track. In this way, once the shooting track of the at least one second camera is obtained from the user's first touch operation on the first preview image, the at least one second camera is driven to shoot along the shooting track, which simplifies the shooting process and improves the shooting efficiency.
It should be noted that the electronic device 1000 according to the embodiment of the present invention can implement each process implemented by the electronic device in the foregoing shooting control method embodiment, and for avoiding repetition, details are not described here again.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1001 may be used for receiving and sending signals during message transmission or a call; specifically, it receives downlink data from a base station and forwards the downlink data to the processor 1010 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Further, the radio frequency unit 1001 may also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user through the network module 1002, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 1003 may convert audio data received by the radio frequency unit 1001 or the network module 1002 or stored in the memory 1009 into an audio signal and output as sound. Also, the audio output unit 1003 may also provide audio output related to a specific function performed by the electronic apparatus 1000 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 1003 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1004 is used to receive an audio or video signal. The input unit 1004 may include a Graphics Processing Unit (GPU) 10041 and a microphone 10042. The graphics processor 10041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 1006. The image frames processed by the graphics processor 10041 may be stored in the memory 1009 (or other storage medium) or transmitted via the radio frequency unit 1001 or the network module 1002. The microphone 10042 can receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 1001.
The electronic device 1000 also includes at least one sensor 1005, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 10061 according to the brightness of ambient light and a proximity sensor that can turn off the display panel 10061 and/or the backlight when the electronic device 1000 moves to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 1005 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
The display unit 1006 is used to display information input by the user or information provided to the user. The Display unit 1006 may include a Display panel 10061, and the Display panel 10061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1007 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071, also referred to as a touch screen, may collect touch operations by a user on or near it (for example, operations performed by a user on or near the touch panel 10071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 10071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects a signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1010, and receives and executes commands sent by the processor 1010. In addition, the touch panel 10071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 10071, the user input unit 1007 may include other input devices 10072. Specifically, the other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a track ball, a mouse, and a joystick, which are not described here again.
Further, the touch panel 10071 can be overlaid on the display panel 10061, and when the touch panel 10071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 1010 to determine the type of the touch event, and then the processor 1010 provides a corresponding visual output on the display panel 10061 according to the type of the touch event. Although in fig. 10, the touch panel 10071 and the display panel 10061 are two independent components for implementing the input and output functions of the electronic device, in some embodiments, the touch panel 10071 and the display panel 10061 may be integrated to implement the input and output functions of the electronic device, and the implementation is not limited herein.
The interface unit 1008 is an interface for connecting an external device to the electronic apparatus 1000. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 1008 may be used to receive input from external devices (e.g., data information, power, etc.) and transmit the received input to one or more elements within the electronic device 1000 or may be used to transmit data between the electronic device 1000 and the external devices.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the electronic device (such as audio data and a phonebook), and the like. Further, the memory 1009 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 1010 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 1009 and calling data stored in the memory 1009, thereby integrally monitoring the electronic device. Processor 1010 may include one or more processing units; preferably, the processor 1010 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
The electronic device 1000 may further include a power supply 1011 (e.g., a battery) for supplying power to the various components. Preferably, the power supply 1011 may be logically connected to the processor 1010 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system.
In addition, the electronic device 1000 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides an electronic device, which includes a processor 1010, a memory 1009, and a computer program stored in the memory 1009 and capable of running on the processor 1010. When the computer program is executed by the processor 1010, each process of the foregoing shooting control method embodiment is implemented, and the same technical effect can be achieved; to avoid repetition, details are not described here again.
Further, corresponding to the shooting control method provided in the foregoing embodiment, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium. When the computer program is executed by the processor 1010, the steps of the foregoing shooting control method can be implemented, and the same technical effect can be achieved. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer-readable medium does not include transitory computer-readable media such as modulated data signals and carrier waves.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described in this disclosure may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described in this disclosure. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
It should also be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made herein without departing from the spirit and scope of the invention as defined in the appended claims. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.

Claims (14)

1. A shooting control method is applied to electronic equipment, the electronic equipment comprises a plurality of cameras, the plurality of cameras comprise a first camera and at least one second camera, and the view field angle of the first camera is larger than that of the second camera, and the method comprises the following steps:
receiving a first touch operation of a user on a first preview image under the condition that the first preview image is displayed on a shooting preview interface, wherein the first preview image is an image collected by a first camera;
determining a shooting track of at least one second camera based on the first touch operation;
and driving at least one second camera to shoot along the shooting track.
2. The method of claim 1, wherein said driving at least one of the second cameras to capture along the capture trajectory comprises:
acquiring shooting parameter information of at least one second camera, wherein the shooting parameter information comprises: at least one of a shooting focal length, a shooting step distance, a shooting duration and a shooting speed;
and driving at least one second camera to shoot along the shooting track and based on shooting parameter information.
3. The method according to claim 1, wherein after the determining a shooting trajectory of at least one of the second cameras based on the first touch operation, further comprising:
and adjusting the shooting track according to the first preview image so as to enable images in preset ranges at two sides of the shooting track to be in a shooting range corresponding to the shooting focal length of the second camera.
4. The method of claim 1, wherein said driving at least one of the second cameras to capture along the capture trajectory comprises:
driving at least one second camera to acquire at least one preselected image area along the shooting track;
receiving the adjustment operation of a user on at least one preselected image area to obtain an adjusted preselected image area;
and driving at least one second camera to respectively shoot images in the adjusted preselected image areas.
5. The method of claim 1, wherein said driving at least one of the second cameras to capture along the capture trajectory comprises:
acquiring a first position on the shooting track and a shooting focal length of the second camera;
acquiring a first image through the second camera based on the first position and the shooting focal length;
capturing a second image within the first preview image and at the first location, wherein the size of the second image is larger than the size of the first image;
extracting feature information of the first image, matching the feature information with feature information of the second image, and determining a second position of the first image;
determining a target angle of rotation of the second camera according to the difference value between the first position and the second position;
and driving the second camera to rotate the target angle to shoot an image.
6. The method of claim 1, wherein said driving at least one of the second cameras to capture along the capture trajectory comprises:
under the condition that all or part of the shooting tracks are in a preset shooting area, dividing a preset area in the preset shooting area along the shooting tracks into a preset number of sub-shooting areas;
driving at least one second camera to shoot a plurality of images including at least one sub-shooting area;
splicing the plurality of images to obtain shot images in a preset area along the shooting track;
and driving at least one second camera to shoot along the shooting track outside the preset shooting area.
7. The method according to claim 6, wherein the driving at least one of the second cameras to capture a plurality of images including at least one of the sub-capture regions comprises:
determining a target sub-shooting area in at least one sub-shooting area, wherein a target object is included in the target sub-shooting area;
driving at least one second camera to shoot the target sub-shooting area to obtain a first sub-image;
and driving at least one second camera to shoot the area except the target sub-shooting area in at least one sub-shooting area to obtain a second sub-image.
8. A shooting control apparatus, applied to an electronic device, wherein the electronic device comprises a plurality of cameras, the plurality of cameras comprise a first camera and at least one second camera, and the view field angle of the first camera is larger than that of the second camera, the apparatus comprising:
the device comprises a receiving module, a processing module and a display module, wherein the receiving module is used for receiving a first touch operation of a user on a first preview image under the condition that the shooting preview interface displays the first preview image, and the first preview image is an image collected by a first camera;
the acquisition module is used for determining a shooting track of at least one second camera based on the first touch operation;
and the driving module is used for driving at least one second camera to shoot along the shooting track.
9. The apparatus of claim 8, wherein the drive module comprises:
a first obtaining unit, configured to obtain shooting parameter information of at least one second camera, where the shooting parameter information includes: at least one of a shooting focal length, a shooting step distance, a shooting duration and a shooting speed;
and the first driving unit is used for driving at least one second camera to shoot along the shooting track and based on shooting parameter information.
10. The apparatus of claim 8, further comprising:
and the adjusting module is used for adjusting the shooting track according to the first preview image so as to enable the images in the preset ranges at two sides of the shooting track to be in the shooting range corresponding to the shooting focal length of the second camera.
11. The apparatus of claim 8, wherein the drive module comprises:
the second driving unit is used for driving at least one second camera to acquire at least one preselected image area along the shooting track;
the second receiving unit is used for receiving the adjustment operation of a user on at least one preselected image area to obtain the adjusted preselected image area;
and the third driving unit is used for driving at least one second camera to shoot images in the adjusted preselected image areas respectively.
12. The apparatus of claim 8, wherein the drive module comprises:
the second acquisition unit is used for acquiring a first position on the shooting track and the shooting focal length of the second camera;
a third acquiring unit, configured to acquire a first image through the second camera based on the first position and the shooting focal length;
a clipping unit configured to clip a second image within the first preview image and at the first position, wherein a size of the second image is larger than a size of the first image;
a first determination unit configured to extract feature information of the first image, match the feature information with feature information of the second image, and determine a second position of the first image;
the second determining unit is used for determining a target angle of rotation of the second camera according to the difference value between the first position and the second position;
and the fourth driving unit is used for driving the second camera to rotate the target angle to shoot an image.
13. The apparatus of claim 8, wherein the drive module comprises:
the area dividing unit is used for dividing a preset area in the preset shooting area along the shooting track into a preset number of sub-shooting areas under the condition that all or part of the shooting track is in the preset shooting area;
the fifth driving unit is used for driving at least one second camera to shoot a plurality of images including at least one sub-shooting area;
the splicing unit is used for splicing the plurality of images to obtain shot images in a preset area along the shooting track;
and the sixth driving unit is used for driving at least one second camera to shoot along the shooting track outside the preset shooting area.
14. The device according to claim 13, wherein the fifth driving unit comprises:
a first determining subunit, configured to determine a target sub-shooting area in at least one of the sub-shooting areas, where the target sub-shooting area includes a target object;
the first driving subunit is used for driving at least one second camera to shoot the target sub-shooting area to obtain a first sub-image;
and the second driving subunit is used for driving at least one second camera to shoot an area except the target sub-shooting area in at least one sub-shooting area to obtain a second sub-image.
CN201911260425.5A 2019-12-10 2019-12-10 Shooting control method and device and electronic equipment Active CN111010510B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911260425.5A CN111010510B (en) 2019-12-10 2019-12-10 Shooting control method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911260425.5A CN111010510B (en) 2019-12-10 2019-12-10 Shooting control method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN111010510A true CN111010510A (en) 2020-04-14
CN111010510B CN111010510B (en) 2021-11-16

Family

ID=70114143

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911260425.5A Active CN111010510B (en) 2019-12-10 2019-12-10 Shooting control method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111010510B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010148676A (en) * 2008-12-25 2010-07-08 Yoshida Dental Mfg Co Ltd Radiographic apparatus and panoramic image processing program
CN103176343A (en) * 2013-03-22 2013-06-26 深圳市中印印刷制品有限公司 Third-dimensional (3D) photo shooting device
CN104052931A (en) * 2014-06-27 2014-09-17 宇龙计算机通信科技(深圳)有限公司 Image shooting device, method and terminal
CN104125433A (en) * 2014-07-30 2014-10-29 西安冉科信息技术有限公司 Moving object video surveillance method based on multi-PTZ (pan-tilt-zoom)-camera linkage structure
CN105141851A (en) * 2015-09-29 2015-12-09 杨珊珊 Control system and control method for unmanned aerial vehicle and unmanned aerial vehicle
CN105872372A (en) * 2016-03-31 2016-08-17 纳恩博(北京)科技有限公司 Image acquisition method and electronic device
CN108449546A (en) * 2018-04-04 2018-08-24 维沃移动通信有限公司 A kind of photographic method and mobile terminal

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114071010B (en) * 2020-07-30 2024-05-24 华为技术有限公司 Shooting method and equipment
CN114071010A (en) * 2020-07-30 2022-02-18 华为技术有限公司 Shooting method and equipment
WO2022022715A1 (en) * 2020-07-30 2022-02-03 华为技术有限公司 Photographing method and device
CN114071009A (en) * 2020-07-31 2022-02-18 华为技术有限公司 Shooting method and equipment
CN114071009B (en) * 2020-07-31 2023-04-18 华为技术有限公司 Shooting method and equipment
WO2022022726A1 (en) * 2020-07-31 2022-02-03 华为技术有限公司 Image capture method and device
CN112099689A (en) * 2020-09-14 2020-12-18 Oppo广东移动通信有限公司 Interface display method and device, electronic equipment and computer readable storage medium
CN114285963B (en) * 2020-09-27 2023-04-18 华为技术有限公司 Multi-lens video recording method and related equipment
CN114285963A (en) * 2020-09-27 2022-04-05 华为技术有限公司 Multi-lens video recording method and related equipment
CN112306153A (en) * 2020-10-30 2021-02-02 联想(北京)有限公司 Electronic equipment and information processing method
CN112565589A (en) * 2020-11-13 2021-03-26 北京爱芯科技有限公司 Photographing preview method and device, storage medium and electronic equipment
CN112702497A (en) * 2020-12-28 2021-04-23 维沃移动通信有限公司 Shooting method and device
CN112672059B (en) * 2020-12-28 2022-06-28 维沃移动通信有限公司 Shooting method and shooting device
CN112672059A (en) * 2020-12-28 2021-04-16 维沃移动通信有限公司 Shooting method and shooting device
CN114339047A (en) * 2021-12-31 2022-04-12 维沃移动通信有限公司 Shooting control method and device, electronic equipment and medium
CN117714849A (en) * 2023-08-31 2024-03-15 上海荣耀智慧科技开发有限公司 Image shooting method and related equipment

Also Published As

Publication number Publication date
CN111010510B (en) 2021-11-16

Similar Documents

Publication Publication Date Title
CN111010510B (en) Shooting control method and device and electronic equipment
CN108668083B (en) Photographing method and terminal
CN109361869B (en) Shooting method and terminal
CN110557566B (en) Video shooting method and electronic equipment
CN108089788B (en) Thumbnail display control method and mobile terminal
CN108471498B (en) Shooting preview method and terminal
WO2020042890A1 (en) Video processing method, terminal, and computer readable storage medium
WO2019174628A1 (en) Photographing method and mobile terminal
CN110557683B (en) Video playing control method and electronic equipment
CN108174103B (en) Shooting prompting method and mobile terminal
WO2019184947A1 (en) Image viewing method and mobile terminal
CN111031398A (en) Video control method and electronic equipment
CN110970003A (en) Screen brightness adjusting method and device, electronic equipment and storage medium
CN110602386B (en) Video recording method and electronic equipment
CN109102555B (en) Image editing method and terminal
CN109922294B (en) Video processing method and mobile terminal
CN107888833A (en) A kind of image capturing method and mobile terminal
CN111464746B (en) Photographing method and electronic equipment
CN108174110B (en) Photographing method and flexible screen terminal
CN108132749B (en) Image editing method and mobile terminal
CN111597370A (en) Shooting method and electronic equipment
CN110941378B (en) Video content display method and electronic equipment
CN113747073B (en) Video shooting method and device and electronic equipment
CN110908517A (en) Image editing method, image editing device, electronic equipment and medium
CN108762641B (en) Text editing method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant