CN113810608A - Shooting method, electronic equipment and storage medium - Google Patents
- Publication number
- CN113810608A (application CN202111077227.2A)
- Authority
- CN
- China
- Prior art keywords
- interface
- video
- shooting
- electronic device
- electronic equipment
- Prior art date
- Legal status
- Granted
Classifications
- H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/62—Control of parameters via user interfaces
- H04N23/632—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
- H04N23/80—Camera processing pipelines; Components thereof
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Studio Devices (AREA)
Abstract
The application provides a shooting method, an electronic device, and a storage medium, relating to the technical field of shooting. The method includes: when a video is shot through a camera of the electronic device, the display screen of the electronic device displays, in real time, the video picture captured by the camera on a video shooting interface; a snapshot control is arranged on the video shooting interface, and after the user taps the snapshot control, the electronic device selects one video frame from the video being shot by the camera as a snapshot and displays a floating interface on the video shooting interface in which a thumbnail of the snapshot is shown, so that the user can preview the snapshot without interrupting the current video shooting.
Description
Technical Field
The present disclosure relates to the field of image capture, and in particular, to a shooting method, an electronic device, and a storage medium.
Background
More and more electronic devices are equipped with cameras, and users can carry an electronic device to shoot videos and photos anytime and anywhere. When shooting a video with the electronic device, the user can also take photos without interrupting the video shooting.

However, if the user wants to check how the photos taken during video shooting turned out, the user has to wait until the video shooting is finished and then open the gallery of the electronic device to view them. If a photo turned out poorly, the opportunity to retake it may already have been missed. If the user instead interrupts the current video shooting and opens the gallery to view the photos, the opportunity to capture the video may be missed. The electronic device is therefore not intelligent enough to meet the user's shooting needs.
Disclosure of Invention
The application provides a shooting method, an electronic device, and a storage medium, which solve the problem that the electronic device is not intelligent enough to meet the user's shooting needs.

To achieve this, the following technical solutions are adopted:
in a first aspect, the present application provides a shooting method applied to an electronic device including a display screen and a camera, the shooting method including:
the electronic equipment displays a video shooting interface through a display screen, and the video shooting interface is used for displaying a video picture acquired by the camera in real time;
the electronic equipment detects a first operation, wherein the first operation is used for capturing a first image, and the first image is related to a video picture acquired by a camera in real time;
the electronic equipment displays a first suspension interface on the video shooting interface, and the first suspension interface is used for displaying a thumbnail of the first image.
In this application, while a video is being shot through the camera of the electronic device, a video shooting interface can be displayed on the display screen of the electronic device, and the video shooting interface displays, in real time, the video picture captured by the camera. The electronic device can provide a snapshot function during video shooting: when it detects an operation for taking a snapshot (e.g., the first operation), it selects one frame from the video pictures captured by the camera in real time as the snapshot (e.g., the first image) and displays a floating interface (e.g., the first floating interface) on the video shooting interface, through which a thumbnail of the snapshot is shown. In this way, the snapshot can be shown to the user in time without interrupting the current video shooting, so that the user can preview the snapshot promptly, which improves the user's shooting experience.
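By way of illustration only, the following Kotlin sketch shows how such a snapshot tap might be handled in an Android-style camera application: the most recent preview frame is taken as the snapshot, scaled to a thumbnail, and shown in a small floating view over the recording UI. All class names, the frame cache, and the thumbnail size are assumptions of the sketch and are not defined by this application; saving the full-resolution image is handled separately.

```kotlin
import android.graphics.Bitmap
import android.os.Handler
import android.os.Looper
import android.view.View
import android.widget.ImageView

// Hypothetical helper that always holds the latest preview frame.
class PreviewFrameCache {
    @Volatile var latestFrame: Bitmap? = null
}

class SnapshotController(
    private val frameCache: PreviewFrameCache,
    private val floatingThumbnailView: ImageView   // stands in for the "first floating interface"
) {
    private val mainHandler = Handler(Looper.getMainLooper())

    // Called when the user taps the snapshot control (the "first operation").
    fun onSnapshotTapped() {
        val frame = frameCache.latestFrame ?: return              // nothing captured yet
        val thumbnail = Bitmap.createScaledBitmap(frame, 160, 90, true)
        mainHandler.post {
            floatingThumbnailView.setImageBitmap(thumbnail)        // show the thumbnail
            floatingThumbnailView.visibility = View.VISIBLE        // over the video shooting interface
        }
    }
}
```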
As an implementation manner of the first aspect, a snapshot control is arranged on the video shooting interface, and the first operation acts on the snapshot control.
In this application, a snapshot control is arranged on the video shooting interface, and tapping it snaps a photo, which gives the user a more intuitive way to take snapshots.
As another implementation manner of the first aspect, after the electronic device displays the first floating interface on the video shooting interface, the shooting method further includes:
after the first floating interface has been displayed for a first preset duration, the electronic device closes the first floating interface.

In this application, the first preset duration can be set so that the user has enough time to preview the snapshot; once it has elapsed, the electronic device closes the first floating interface that displays the thumbnail of the snapshot on its own, so the user does not have to close it manually in a way that would disturb video shooting. This gives a better user experience.
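A minimal sketch of such an auto-dismiss behaviour, assuming the first floating interface is an ordinary Android view and assuming a first preset duration of 3 seconds (both illustrative choices, not values fixed by this application):

```kotlin
import android.os.Handler
import android.os.Looper
import android.view.View

class FloatingThumbnailDismisser(private val floatingView: View) {
    private val handler = Handler(Looper.getMainLooper())
    private val firstPresetDurationMs = 3_000L   // assumed "first preset duration"

    fun showThenAutoClose() {
        floatingView.visibility = View.VISIBLE
        handler.removeCallbacksAndMessages(null)  // reset any pending close
        handler.postDelayed({ floatingView.visibility = View.GONE }, firstPresetDurationMs)
    }
}
```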
As another implementation manner of the first aspect, after the electronic device displays the first floating interface on the video shooting interface, the shooting method further includes:
the electronic equipment detects a second operation, and the second operation is used for closing the first suspension interface;
the electronic device closes the first floating interface based on the second operation.
In this application, the user can also be allowed to close the first floating interface on his or her own, which provides a more flexible mode and makes the method suitable for electronic devices aimed at different groups of users.
As another implementation manner of the first aspect, a closing control is arranged on the first suspension interface; the second operation acts on the close control.
As another implementation manner of the first aspect, after the electronic device closes the first floating interface, the shooting method further includes:
the electronic equipment detects a third operation, and the third operation is used for triggering the electronic equipment to display a second suspension interface;
the electronic equipment displays a second suspension interface on the video shooting interface based on a third operation, the second suspension interface is used for displaying a thumbnail of the snapshot image, the snapshot image is a snapshot image in the current video shooting process, and the snapshot image comprises the first image.
In this application, a further function can be provided that is triggered by the third operation: through the third operation, the user can display another floating interface on the video shooting interface, and this floating interface can display all the photos snapped during the current video shooting. Providing this richer set of functions meets the user's need to look up any photo snapped during the current video shooting.
As another implementation manner of the first aspect, after the electronic device closes the first floating interface, the shooting method further includes:
the electronic device displays a hover button on the video capture interface, wherein the third operation acts on the hover button.
As an implementation manner of the first aspect, in a case where the electronic device detects the third operation, the shooting method further includes:
the electronic device pauses the current video capture based on the third operation.
In this application, the current video shooting can be paused while the user previews the snapped photos via the third operation, which avoids the situation where the user is distracted and fails to shoot a video that meets expectations.
As another implementation manner of the first aspect, after the electronic device displays the second floating interface on the video shooting interface based on the third operation, the shooting method further includes:
the electronic device detects a fourth operation;
the electronic equipment closes the second suspension interface based on the fourth operation;
the electronic device continues the currently paused video photographing based on the fourth operation.
As another implementation manner of the first aspect, the fourth operation acts on a first control arranged on the video shooting interface, or on an area of the video shooting interface that lies outside the second floating interface and contains no control, where the first control is used to pause the current video shooting or to resume the currently paused video shooting.

In this application, multiple ways are provided to resume the currently paused video shooting.
As another implementation manner of the first aspect, in a case where the electronic device detects the first operation, the shooting method further includes:
the electronic equipment records a first time when the first operation is detected;
the electronic device generates the first image from the video frame that follows the video picture displayed on the video shooting interface at the first time;
the electronic device generates a thumbnail of the first image.
As another implementation manner of the first aspect, after generating the first image from a video frame next to a video frame displayed by the video capture interface at the first time, the method further includes:
the electronic device stores a first image;
and the electronic equipment sends the storage address of the first image and the index identification of the snapshot image to a media database.
In this application, the first image can be stored in a memory of the electronic device. A media database can be provided to cache the storage addresses of the photos snapped during the current video shooting; when a storage address of a snapshot is written into the media database, the index identifier serves as the index of that storage address. Each video shooting corresponds to a different index identifier.
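The following sketch illustrates this bookkeeping with a simple in-memory map standing in for the media database; the MediaDatabase object, file naming, and storage layout are assumptions of the sketch, not the media database component described later in this application.

```kotlin
import java.io.File

// Hypothetical stand-in for the media database described in this application.
object MediaDatabase {
    private val addressesByIndexId = mutableMapOf<String, MutableList<String>>()

    fun addSnapshotAddress(indexId: String, storageAddress: String) {
        addressesByIndexId.getOrPut(indexId) { mutableListOf() }.add(storageAddress)
    }

    fun snapshotAddresses(indexId: String): List<String> =
        addressesByIndexId[indexId].orEmpty()
}

// Store the first image, then register its address under the current video's index identifier.
fun storeSnapshot(imageBytes: ByteArray, indexId: String, outputDir: File): File {
    val file = File(outputDir, "snapshot_${System.currentTimeMillis()}.jpg")
    file.writeBytes(imageBytes)
    MediaDatabase.addSnapshotAddress(indexId, file.absolutePath)
    return file
}
```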
As another implementation manner of the first aspect, the displaying, by the electronic device, the second floating interface on the video shooting interface based on the third operation includes:
the electronic equipment acquires an index identification of the snapshot image based on the third operation;
the electronic equipment acquires a storage address of the snapshot image in the current video shooting process based on the index identification of the snapshot image;
the electronic equipment acquires a snapshot image based on the acquired storage address;
the electronic equipment generates a thumbnail of the snapshot image;
the electronic device displays the generated thumbnail of the snap image within the second floating interface, wherein the thumbnail of the snap image comprises the thumbnail of the first image.
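Assuming the storage addresses have already been fetched from the media database by the index identifier, the thumbnails for the second floating interface could be produced along the following lines; the thumbnail size and the use of BitmapFactory are illustrative assumptions, not requirements of this application.

```kotlin
import android.graphics.Bitmap
import android.graphics.BitmapFactory

// Load each snapped image from its storage address and build a thumbnail for display.
fun loadSnapshotThumbnails(storageAddresses: List<String>): List<Bitmap> =
    storageAddresses.mapNotNull { address ->
        BitmapFactory.decodeFile(address)?.let { full ->
            Bitmap.createScaledBitmap(full, 160, 90, true)   // thumbnail size is illustrative
        }
    }
```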
As another implementation manner of the first aspect, the index identifier of the snapshot image is a unique identifier of a currently shot video.
In a second aspect, an electronic device is provided, comprising a processor for executing a computer program stored in a memory, implementing the method of any of the first aspect of the present application.
In a third aspect, a chip system is provided, which includes a processor coupled to a memory, and the processor executes a computer program stored in the memory to implement the method of any one of the first aspect of the present application.
In a fourth aspect, there is provided a computer readable storage medium storing a computer program which, when executed by one or more processors, performs the method of any one of the first aspects of the present application.
In a fifth aspect, the present application provides a computer program product for causing an apparatus to perform the method of any one of the first aspect of the present application when the computer program product is run on the apparatus.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Drawings
Fig. 1 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic view of a video shooting interface displayed on a display screen of an electronic device when the electronic device shoots a video according to an embodiment of the present application;
fig. 3 is a schematic operation diagram of a user snapping a photo in a video shooting process according to an embodiment of the present application;
fig. 4 is a schematic diagram of a process of a user viewing a snapshot after taking the snapshot in a video shooting process according to an embodiment of the present application;
fig. 5 is a schematic view of a floating interface a displayed by an electronic device after a user takes a snapshot in a video shooting process according to an embodiment of the present application;
FIG. 6 is a schematic view of another suspension interface A provided in the embodiments of the present application;
FIG. 7 is a schematic diagram illustrating an operation of a floating interface A according to an embodiment of the present disclosure;
fig. 8 is a schematic view of a hover button after the hover interface a provided in the embodiment of the present application is closed;
fig. 9 is a schematic diagram of a floating interface B triggered and displayed by a floating button according to an embodiment of the present application;
FIG. 10 is another schematic view of a floating interface B provided in an embodiment of the present application;
FIG. 11 is another schematic view of a floating interface B provided in an embodiment of the present application;
FIG. 12 is another schematic view of a floating interface B provided in an embodiment of the present application;
fig. 13 is a technical architecture diagram on which the photographing method provided in the embodiment of the present application depends;
fig. 14 is a timing diagram for implementing a shooting method according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that in the embodiments of the present application, "one or more" means one, two, or more than two; "and/or" describes the association relationship of the associated objects, indicating that three relationships may exist; for example, a and/or B, may represent: a alone, both A and B, and B alone, where A, B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," "fourth," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The shooting method provided by the embodiment of the application can be suitable for electronic equipment provided with a display screen and a camera. The electronic device may be a tablet computer, a mobile phone, a wearable device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), and other electronic devices. The embodiment of the present application does not limit the specific type of the electronic device.
Fig. 1 shows a schematic structural diagram of an electronic device. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a sensor module 180, keys 190, a camera 193, a display 194, and the like. The sensor module 180 may include a pressure sensor 180A, a touch sensor 180K, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
For example, the processor 110 is configured to execute the shooting method in the embodiment of the present application.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
As an example, the memory may have cached therein the storage address and index identification of the snapshot.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function.
As an example, the electronic device may be connected to an external memory that may store video captured by the electronic device and snap shots during the capture of the video.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, and an application program required by at least one function (such as a video capture function, a video playback function, and the like).
In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
As an example, the internal memory may store a computer program for implementing the video shooting method provided by the embodiments of the present application. Of course, videos taken by the electronic device and snap shots during the video taking process may also be stored in the internal memory.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
By way of example, the electronic device may detect, through the pressure sensor and the touch sensor, an operation of the user acting on the display screen, for example, an operation of the user clicking on the snapshot control, an operation of the user clicking on the hover button, and the like.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
As an example, the user may also trigger operations during video shooting through physical keys; for example, a shortcut key for snapping a photo during video shooting can be set, such as the volume up key. The user can rest a finger on the volume up key while shooting the video and simply press it to snap a photo, which provides yet another way to take snapshots.
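A sketch of such a shortcut in an Android-style activity, assuming the volume up key is mapped to the snapshot action only while recording; the isRecording and takeSnapshot callbacks are hypothetical names introduced for the sketch:

```kotlin
import android.view.KeyEvent

class RecordingKeyHandler(
    private val isRecording: () -> Boolean,
    private val takeSnapshot: () -> Unit        // hypothetical snapshot callback
) {
    // Call from Activity.onKeyDown(); returns true when the event is consumed.
    fun onKeyDown(keyCode: Int): Boolean {
        if (keyCode == KeyEvent.KEYCODE_VOLUME_UP && isRecording()) {
            takeSnapshot()                      // snap a photo without pausing the video
            return true                         // suppress the default volume change
        }
        return false
    }
}
```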
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The camera 193 is used to capture still images or video. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
In the implementation of the application, the camera can collect video pictures, the GPU processes the video pictures collected by the camera, and the display screen displays an interface processed by the GPU. The specific content displayed on the display screen can refer to the description in the following embodiments.
The embodiment of the present application does not particularly limit the specific structure of the execution subject of the shooting method, as long as the subject can run code recording the shooting method provided by the embodiment of the present application and thereby perform the method. For example, the execution subject of the shooting method provided in an embodiment of the present application may be a functional module in the electronic device that is capable of calling and executing a program, or a processing apparatus, such as a chip, applied in the electronic device.
The interface schematic diagram of the embodiment of the present application will be described below, and it should be noted that the size, the position, and the style of the control icon in the interface shown in the embodiment of the present application are only used for example, and do not cause any limitation to the present application.
Referring to fig. 2, a schematic view of a video capture interface displayed on a display screen of an electronic device is shown.
As shown in fig. 2, during the process of shooting the video by the electronic device, a video shooting interface displayed on a display screen of the electronic device may display a currently shot video picture (a running puppy shown in fig. 2), a shooting progress (e.g., 00:06 in fig. 2), a status icon (e.g., a shooting icon in front of the shooting progress in fig. 2 indicates that the shooting status is currently in the shooting status), and may further include a control.
As an example, the video capture interface may include: pause/continue control, end control, and snapshot control. Of course, in practical applications, other controls may be included, such as the zoom control shown in fig. 2.
The end control is used to end the current video shooting. The pause/continue control displays a pause icon during video shooting (as shown in fig. 2), and the user taps it to pause the current video shooting; when video shooting is paused, it displays a shooting icon (see fig. 12), and the user taps it to resume the current video shooting. The snapshot control is used to snap a photo without pausing or ending the current video shooting.
Taking one application scene as an example, referring to fig. 3, while shooting a video of a running puppy, the user may want to snap a photo of the puppy as it runs. The user can tap the snapshot control in the video shooting interface to obtain the snapshot, which is stored as an image in the gallery of the electronic device.

However, the snapshot may not meet the user's expectations. Taking the motion scene shown in fig. 3 (a moving person, animal, or object in the shot) as an example, when the running dog is snapped, the following may happen: the dog appears blurred in the snapshot because it is running; or the user wanted to catch the dog with its feet off the ground, but in the snapshot its feet are on the ground. So the snapshot may well fail to meet the user's expectations. The embodiment of the application uses a motion scene as an example to illustrate this; in practice the same can happen in other shooting scenes, which are not enumerated here.

Since the display screen of the electronic device displays the video shooting interface shown in fig. 3 while the video of the running puppy is being shot, and that interface shows the picture being shot in real time, the user cannot immediately confirm whether the snapshot is as expected. Only after the current video shooting ends can the user enter the gallery of the electronic device to check whether the snapshot is as expected; if it is not, the photo has to be taken again, and the best moment is likely to have been missed.
If the user wants to immediately see whether the snap shot is as expected, the user needs to return to the main interface in the manner shown in fig. 4 (a). That is, the user can slide upwards from the bottom of the electronic device using a finger, and the interface displayed on the display screen of the electronic device is controlled to be the main interface, and the schematic diagram of the main interface may refer to the interface schematic diagram in (b) in fig. 4.
Then, entering a gallery of the electronic device in a manner shown in (b) of fig. 4; in the case that the main interface of the electronic device includes a gallery application, the user may click the gallery application to control an interface displayed on a display screen of the electronic device to be a preview interface in the gallery, and the schematic diagram of the preview interface may refer to the interface schematic diagram in (c) in fig. 4.
Whether the snap shot is as expected is checked in the manner shown in (c) of fig. 4. Under the condition that the thumbnails are displayed on a preview interface displayed on the electronic equipment in a reverse time order, the first thumbnail (the thumbnail of the image 1) in the preview interface is the latest snapshot. The user can click on the thumbnail of image 1 to see if the snap shot is as expected.
The processes shown in fig. 4 (a) to 4 (c) interrupt the current video photographing process. Of course, the modes shown in fig. 4 (a) to 4 (c) are only for example and do not limit the present application.
In the case that the snapshot is not as expected, the user needs to take the picture again to obtain a picture that can be expected.
Even if the snapshot is as expected, the current video shooting has already been interrupted, so shooting cannot continue from where the interrupted video stopped; if the video needs to be continued, a new video segment has to be shot.

The whole point of snapping photos during video shooting is to obtain photos without interrupting the video shooting; with the workflow above, the user's experience of this function is poor and the function cannot be fully exploited.
In view of this, the embodiment of the present application provides a shooting method, which may provide a way for a user to check whether a captured picture in a video shooting process meets the expectations of the user.
As an example, in a case where the display screen of the electronic device displays the video shooting interface shown in fig. 3, the user clicks the snapshot control, the electronic device obtains a snapshot, and displays a floating interface on the video shooting interface shown in fig. 3.
In the embodiment of the application, a user operation for triggering the electronic device to obtain a snapshot (which may be recorded as a first image) is recorded as a first operation. For example, the user clicks the snapshot control as described above. In the embodiment of the present application, an interface displayed above another interface is referred to as a floating interface. For convenience of description, a floating interface triggered and displayed by the snapshot control is referred to as a floating interface a, and the floating interface a may also be referred to as a first floating interface.
Referring to fig. 5, which shows the interface after the user taps the snapshot control on the video shooting interface shown in fig. 3: in fig. 5 (a), floating interface A displays a thumbnail of the snapshot. The user can preview the thumbnail displayed on floating interface A to determine whether the snapshot meets expectations.

As described above, the snapshot corresponding to a single tap of the snapshot control may not turn out as expected. Therefore, the display method shown in fig. 5 (b) can also be used in the embodiment of the application: floating interface A may display thumbnails of a preset number of snapshots, for example 2, 3, or 4.
In this embodiment, after detecting that the user has tapped the snapshot control, the electronic device may record the time of the tap, take a preset number of video frames related to that time from the frames captured by the camera as snapshots, and generate thumbnails and image files from them. The thumbnails are displayed on floating interface A, and the image files are stored in a memory of the electronic device.

As an example, when thumbnails of 3 snapshots are displayed, the video frames corresponding to the three thumbnails may be: the video frame corresponding to the picture displayed on the video shooting interface of the electronic device at that time, together with the frame immediately before it and the frame immediately after it.

In practice, the video frames corresponding to the three thumbnails may also be any three video frames displayed on the display screen within a window extending from a short time before that time to a short time after it.

Of course, the snapshots presented by these 3 video frames may look essentially the same, slightly different, or quite different.
As another example, it may also be arranged that, after the electronic device detects that the user has tapped the snapshot control, it records the time of the tap and takes a preset number of video frames related to that time from the frames captured by the camera as snapshots, generating only thumbnails from them; the thumbnails are displayed on floating interface A.

Referring to fig. 5 (b), the user may then select, from the thumbnails of the multiple video frames displayed on floating interface A, one or more snapshots that meet expectations. After detecting that the user has selected a thumbnail on floating interface A, the electronic device generates an image file from the video frame corresponding to that thumbnail and stores it in a memory of the electronic device.

The embodiment of the application does not limit the display mode of floating interface A. When the display mode shown in fig. 5 (b) is used, it is likewise not limited whether image files are generated and stored for all the displayed video frames or only for the video frames selected by the user.
As can be seen from fig. 5, floating interface A, which displays the thumbnail of the snapshot, may block part of the video picture being shot. So as not to affect the user's view of the picture being shot, floating interface A may be set to close after being displayed for a period of time (e.g., 2 s, 3 s, or 4 s). The duration for which floating interface A is displayed can be recorded as the first preset duration.

As another example, floating interface A may be displayed for 2 s and then fade gradually until it disappears, taking 3 s in total from the start of display to the end.
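A sketch of such a timed fade-out using Android's view property animator, taking the 2 s hold and roughly 1 s fade from the example above; the view handle and exact durations are otherwise illustrative:

```kotlin
import android.view.View

// Show the floating interface, hold it for 2 s, then fade it out over 1 s (3 s total).
fun showWithFadeOut(floatingInterfaceA: View) {
    floatingInterfaceA.alpha = 1f
    floatingInterfaceA.visibility = View.VISIBLE
    floatingInterfaceA.animate()
        .alpha(0f)
        .setStartDelay(2_000L)      // hold fully visible for 2 s
        .setDuration(1_000L)        // fade over the final 1 s
        .withEndAction { floatingInterfaceA.visibility = View.GONE }
        .start()
}
```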
As another example, referring to fig. 6, a close control may be arranged on floating interface A, which closes floating interface A when the user taps it. In the embodiment shown in fig. 6, the display time of floating interface A is controlled by the user, so the user can close it promptly once the snapshot is confirmed to be satisfactory, or keep it open long enough to examine the thumbnail carefully when unsure whether the snapshot is as expected.
In this embodiment of the application, a user operation that triggers closing of the floating interface a is denoted as a second operation, for example, an operation that a user clicks a closing control in the floating interface a as described above.
As another example, the user may also adjust the position of floating interface A. Referring to fig. 7 (a), the user may drag floating interface A to any position on the video shooting interface.

The user may also resize floating interface A. Referring to fig. 7 (b), the user can enlarge (or shrink) floating interface A by placing two fingers on its border and then spreading (or pinching) them.

The user may also zoom the snapshot displayed within floating interface A. Referring to fig. 7 (c), the user can zoom in on (or out of) the snapshot in floating interface A by placing two fingers inside the interface and then spreading (or pinching) them.
The manner in which the user operates the floating interface a with a finger in fig. 7 is only for example and does not limit the present application.
In addition, the user may snap multiple photographs during the video capture process. The user may also view other photographs taken by snap in another manner.
As another example, after the floating interface a is closed, a floating button is displayed above a video capture interface displayed on a display screen of the electronic device. The hover button is used to present a plurality of snap shots to the user after the user clicks.
Referring to fig. 8, after the floating interface a for displaying the thumbnail of the snapshot displayed above the video capture interface disappears, a floating button is displayed above the video capture interface.
As previously described, the hover button is displayed after hover interface a is closed. For example, the floating interface a displays the floating button after a certain time disappears, or the user clicks a close control on the floating interface a to trigger the floating interface a to close and then display the floating button.
In addition, in order to facilitate display of other interfaces or controls, a shooting screen will not be displayed in the video shooting interfaces of the embodiments shown in fig. 8 to 12.
When the user clicks the hover button, the hover button on the video shooting interface disappears, and another hover interface is displayed at the same time, and the another hover interface is used for displaying information (for example, thumbnails) of the photos captured in the current video shooting process in a list form (in practical application, other display modes are also possible).
Referring to fig. 9, another floating interface is displayed above the video capture interface after the user clicks the floating button. The floating interface shown in fig. 9 can display the photos taken during the current video shooting process. For convenience of description, the floating interface triggered and displayed by the floating button may be referred to as a floating interface B, and may also be referred to as a second floating interface. In the embodiment of the application, a user operation for triggering the electronic device to display the floating interface B is denoted as a third operation. For example, the operation of the user clicking the hover button as described above.
For example, if 3 photos are captured during the video shooting process, thumbnails of the 3 captured photos may be displayed in the floating interface B illustrated in fig. 9.
Of course, since the area of the floating interface B shown in fig. 9 is limited, and each captured picture has a fixed size when being displayed in the floating interface B, it is possible that the floating interface B cannot completely display each captured picture in the current video shooting process.
For example, in the floating interface B shown in fig. 10, thumbnails of 6 snap shots are displayed in full, and thumbnails of 2 snap shots are displayed in part. If the number of the snap shots is larger than 8 in the current video shooting process, at least 1 thumbnail of the snap shot may be hidden.
When a user needs to view a partially displayed snapshot or a hidden snapshot in the floating interface B, the floating interface B can be triggered to display other snapshots in the floating interface B through a sliding gesture.
For example, in floating interface B shown in fig. 11, the user slides within floating interface B to scroll through the snapped photos. As illustrated there, at least 10 photos have been snapped during the current video shooting. After the user's slide gesture, floating interface B currently hides snapshot 1 and snapshot 2; partially displays snapshot 3, snapshot 4, snapshot 9, and snapshot 10; and fully displays snapshots 5 to 8. As the slide gesture continues, the snapshots displayed inside floating interface B slide along with it, and which snapshots are fully displayed, partially displayed, or hidden changes accordingly. The embodiments of the application do not enumerate these cases.

In practice, if the user uses the hover button shown in fig. 8 to bring up floating interface B shown in fig. 9, this indicates that the user wants to preview the photos snapped during the current video shooting; that is, the user's attention may be focused on the snapshots displayed on floating interface B, possibly for a relatively long time. So as not to affect the video being shot, the current video shooting may be paused when the user taps the hover button shown in fig. 8 to trigger the display screen of the electronic device to show floating interface B shown in fig. 9.
Referring to the interfaces illustrated in fig. 9 to 11, in the scenes illustrated in fig. 9 to 11, the current video shooting process is in a paused state, a pause icon is displayed in front of the shooting progress displayed on the video shooting interface, and a pause/continue control in the video shooting interface displays the shooting icon.
When the user taps the pause/continue control (displayed as a shooting icon), or taps an area of the current video shooting interface that lies outside floating interface B and contains no other control, the current video shooting switches from the paused state back to the shooting state and floating interface B is closed at the same time. In the embodiment of the application, the operation that triggers closing floating interface B (while resuming the currently paused video shooting) is denoted as the fourth operation, and the pause/continue control is denoted as the first control.
It should be noted that, in the case that the floating button shown in fig. 8 triggers the display of the floating interface B, if the current video shooting process is paused, the shooting status displayed in front of the shooting progress displayed in the video shooting interfaces shown in fig. 9 to 11 is a pause icon, and the pause/continue control displays the shooting icon, so that the user can click the shooting icon to continue the current video shooting process, and simultaneously close the displayed floating interface B.
In the case that the floating button shown in fig. 8 triggers the display of the floating interface B, if the current video shooting process is not suspended, referring to the embodiment shown in fig. 12, the shooting status displayed in front of the shooting progress displayed in the video shooting interface shown in fig. 12 is a shooting icon, and the pause/continue control displays a pause icon, so that when the user needs to pause the current video shooting process, the user clicks the pause icon to pause the current video shooting process.
In this case, the user needs to close the currently displayed floating interface B in another way. For example, a close control may be arranged on floating interface B in fig. 12, and the user taps it to close floating interface B; the user may also close floating interface B through a preset gesture. The embodiment of the application does not limit this.

It should be further noted that the hover button shown in fig. 8 starts to be displayed on the video shooting interface after floating interface A, which corresponds to the first snapshot of the video shooting, is closed. After the user taps the hover button to bring up floating interface B, the hover button shown in fig. 8 is no longer displayed on the video shooting interface; after floating interface B is closed, the video shooting interface displays the hover button shown in fig. 8 again.
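A compact sketch of this hover-button/floating-interface-B state handling, assuming hypothetical pauseRecording and resumeRecording callbacks and treating the hover button and floating interface B as plain views (none of these names come from this application):

```kotlin
import android.view.View

class SnapshotBrowserController(
    private val hoverButton: View,
    private val floatingInterfaceB: View,
    private val pauseRecording: () -> Unit,     // hypothetical recorder callbacks
    private val resumeRecording: () -> Unit
) {
    // Third operation: tap on the hover button.
    fun onHoverButtonTapped() {
        hoverButton.visibility = View.GONE          // hover button disappears
        floatingInterfaceB.visibility = View.VISIBLE
        pauseRecording()                            // optionally pause the current shooting
    }

    // Fourth operation: tap on the first control or on an empty area of the interface.
    fun onResumeTapped() {
        floatingInterfaceB.visibility = View.GONE
        hoverButton.visibility = View.VISIBLE       // hover button is shown again
        resumeRecording()
    }
}
```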
After describing the scene interface diagrams of the shooting process, in order to more clearly understand the application scenes, the following describes technical details for implementing the application scenes.
Referring to fig. 13, a technical architecture diagram for a shooting method according to an embodiment of the present application is provided. As shown in fig. 13, the technical architecture includes: the system comprises an application layer, an application framework layer and a hardware abstraction layer.
Only a part of the layers related to the embodiment of the present application is shown in fig. 13, and other layers, for example, a system runtime layer and a kernel layer, may be included in addition to the above layers. The embodiments of the present application are not illustrated.
In addition, the embodiments of the present application will describe partial functions of each module in each layer, and the partial functions of each module described in the embodiments of the present application are functions related to the embodiments of the present application, and do not mean that each module provides only the above functions.
The application layer has a camera application for implementing video and image (photo) capture and other functions in the capture process.
The camera application is provided with a user interface module, and the user interface module can provide each display interface in the above embodiments.
The camera application is also provided with a photographing module, which implements image (photo) capture.

The camera application is further provided with a video module, which implements video capture.
The camera application is further provided with a data processing module, and the data processing module is used for providing support for pictures displayed by each interface in the scene, for example, the pictures displayed by each interface can be drawn based on data transmitted from a lower layer.
The application framework layer is provided with a media framework, which enables data interaction between the camera application in the upper layer (the application layer) and the modules in the lower layer (the hardware abstraction layer).
The media frame is provided with an audio and video coding and decoding module which can carry out coding processing on video data streams transmitted by a lower layer (hardware abstraction layer).
The media framework is further provided with a camera service (camera service), and the camera service can encapsulate the video recording request transmitted by the upper layer application. The encapsulated record request is transmitted to the lower layer (hardware abstraction layer).
The media frame is also provided with a media database which is used for temporarily storing the storage address of the captured photo.
The hardware abstraction layer encapsulates the camera drivers and interacts with the underlying camera hardware, so that the camera can be invoked to implement the video shooting function and the image shooting function.
The hardware abstraction layer is provided with a Camera module, the Camera module is used for defining a universal standard interface, and the Camera service realizes communication with the underlying Camera hardware based on the standard interface provided by the Camera module.
The hardware abstraction layer is also provided with Graphics classes that provide methods for drawing objects (borders of the floating interface, thumbnails of snap-shot images, etc.) to the display device.
The timing interaction diagram between the layers in the above technical architecture will be described by fig. 14.
Step A1, the user clicks a video recording control on the display screen of the electronic device to start the video recording function of the electronic device.
Step A2, after receiving a video recording instruction corresponding to the operation of clicking the video recording control by the user, the camera application of the electronic device sends a video recording request to the media framework.
Step A3, after the media framework of the electronic device receives the video recording request, the video recording request is encapsulated to obtain the encapsulated video recording request.
Step a4, the media framework of the electronic device sends the encapsulated video recording request to the hardware abstraction layer.
Step A5, the hardware abstraction layer of the electronic device calls the bottom camera to start collecting the video data stream, and the video data stream collected by the camera returns to the hardware abstraction layer.
Step A6, the hardware abstraction layer of the electronic device receives the video data stream collected by the bottom layer camera and sends the received video data stream to the media frame.
Step a7, the media framework of the electronic device performs encoding processing on the received video data stream.
Step A8, the media framework of the electronic device sends the encoded video data stream to the camera application.
And step A9, after the camera application of the electronic equipment receives the video data stream after the encoding processing, drawing a video picture based on the video data stream after the encoding processing and displaying the video picture through a display screen of the electronic equipment.
When the display screen of the electronic device displays the video picture, it is displayed within the video shooting interface.

In addition, after step A5, the camera keeps capturing the video data stream, and steps A6 to A9 are repeated accordingly. As time goes on, the camera application continuously renders the data stream received in real time into video pictures and sends them to the display screen for display, thereby realizing the video shooting process.
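The layered request/response flow of steps A1 to A9 can be pictured with the hypothetical Kotlin interfaces below; they merely mirror the structure described above and are not the real media framework or camera HAL APIs:

```kotlin
// Hypothetical stand-ins for the layers in fig. 13 / steps A1-A9.
interface HardwareAbstractionLayer {
    fun startCapture(onFrame: (ByteArray) -> Unit)   // A5/A6: camera delivers the data stream
}

class MediaFramework(private val hal: HardwareAbstractionLayer) {
    fun handleRecordRequest(onEncodedFrame: (ByteArray) -> Unit) {
        // A3/A4: encapsulate the record request and pass it down to the HAL.
        hal.startCapture { rawFrame ->
            val encoded = encode(rawFrame)           // A7: audio/video codec module
            onEncodedFrame(encoded)                  // A8: hand the encoded stream back up
        }
    }
    private fun encode(raw: ByteArray): ByteArray = raw   // placeholder for real encoding
}

class CameraApp(private val framework: MediaFramework) {
    fun onRecordButtonTapped() {                     // A1: user taps the record control
        framework.handleRecordRequest { encodedFrame ->   // A2: send the record request
            renderToViewfinder(encodedFrame)         // A9: draw and display the video picture
        }
    }
    private fun renderToViewfinder(frame: ByteArray) { /* draw on the video shooting interface */ }
}
```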
Step B1 is performed when the user wants to take a snapshot during video capture.
And step B1, clicking a snapshot control in the video shooting interface displayed on the display screen of the electronic equipment by the user.
And step B2, after the camera application of the electronic equipment receives a snapshot instruction corresponding to the operation of clicking the snapshot control by the user, the camera application generates a thumbnail from a video frame corresponding to the video picture currently displayed on the video shooting interface.
It should be noted that the content of several consecutive video frames usually differs little; in particular, at a high frame rate, consecutive frames may be almost identical. At the same time, the picture in some frames may be blurred because of the motion of people, animals, or objects in the scene being shot.
In addition, the instant the user wants to take a snapshot and the time the user clicks the snapshot control are not exactly the same. Therefore, when the thumbnail is generated, the time when the camera application of the electronic device receives the snapshot instruction or the time when the touch screen detects the user operation triggering the snapshot can be recorded first. As an example, the time may be recorded as a first time.
The video frame used to generate the thumbnail may be: the video frame corresponding to the video picture displayed on the video shooting interface of the electronic device's display screen at that time.

The video frame used to generate the thumbnail may also be: one of the n frames before, or the n frames after, the frame corresponding to the picture displayed on the video shooting interface at that time, where n is a positive integer greater than or equal to 1, for example 1, 2, 3, 4, or 5.

The video frame used to generate the thumbnail may also be: the sharpest frame among the n frames before, the current frame, and the n frames after the frame corresponding to the picture displayed on the video shooting interface at that time. As an example, a detection model that measures image sharpness may be provided: the n preceding frames, the current frame, and the n following frames are fed into the model, and the frame with the highest sharpness that it outputs is used as the video frame for generating the thumbnail.
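This application does not specify the detection model. As one common stand-in, sharpness can be approximated by the variance of a Laplacian-like gradient response and the sharpest candidate frame picked as below; the pure-Kotlin sketch operates on grayscale pixel arrays and is illustrative only.

```kotlin
// Approximate sharpness as the variance of a 4-neighbour Laplacian response.
// `gray` is a row-major grayscale image with values 0..255; width and height must be >= 3.
fun sharpness(gray: IntArray, width: Int, height: Int): Double {
    val responses = ArrayList<Double>((width - 2) * (height - 2))
    for (y in 1 until height - 1) {
        for (x in 1 until width - 1) {
            val c = gray[y * width + x]
            val lap = 4 * c - gray[y * width + x - 1] - gray[y * width + x + 1] -
                      gray[(y - 1) * width + x] - gray[(y + 1) * width + x]
            responses.add(lap.toDouble())
        }
    }
    val mean = responses.average()
    return responses.sumOf { (it - mean) * (it - mean) } / responses.size
}

// Pick the sharpest of the candidate frames (previous n, current, next n); frames must be non-empty.
fun pickSharpest(frames: List<IntArray>, width: Int, height: Int): IntArray =
    frames.maxByOrNull { sharpness(it, width, height) } ?: frames.first()
```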
And step B3, the camera application of the electronic equipment instructs the display screen of the electronic equipment to display a floating interface A, and the picture displayed in the floating interface A is the generated thumbnail.
The thumbnail may be temporarily stored in a memory space (for example, 3M memory space may be pre-allocated), and the memory space is used for temporarily storing the thumbnail of the snapshot displayed in the floating interface a.
Step B4: the camera application of the electronic device generates an image file from the video frame used to generate the thumbnail, and stores the image file.
In this step, the image file may be stored in the gallery of the electronic device. After the user exits the current video shooting process, the image file corresponding to the snapshot photo can then be viewed in the gallery of the electronic device.
Step B5: the camera application of the electronic device stores the storage address (and other information) of the image file, together with an index identifier, into the media database of the media framework.
The index identifier is stored in the media database in association with the storage address of the image file; that is, each storage address is stored together with its index identifier, and the storage address of one image file corresponds to one index identifier. The index identifiers corresponding to the storage addresses of several image files may be the same. The image files corresponding to an index identifier can later be retrieved from the media database based on that index identifier: if the storage addresses of the image files of all the photos captured during the current video shooting share the same index identifier, the storage addresses of all of them can be obtained based on that single index identifier.
In view of the above, the index identifier may be associated with the current video shooting. As an example, each time a video is shot, a video identifier of the video to be shot may be generated before shooting starts; for example, the current timestamp plus a random number may be used as the video identifier. A video identifier is generated for each video shoot, and each generated video identifier is unique. The index identifier may then be the video identifier generated when the video is shot.
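As an illustrative sketch of this association (not the media framework's real API), the Kotlin code below generates a hypothetical video identifier and keeps a minimal in-memory stand-in for the media database; later sketches reuse the same SnapshotIndex object. All names here are assumptions made for illustration.

```kotlin
import java.util.concurrent.ConcurrentHashMap
import kotlin.random.Random

// Hypothetical video identifier: current timestamp plus a random number,
// generated once before each video shoot, as described above.
fun newVideoId(): String =
    "${System.currentTimeMillis()}_${Random.nextInt(100_000)}"

// Minimal in-memory stand-in for the media database: each index identifier
// (the video identifier of one shoot) maps to the storage addresses of all
// image files captured during that shoot.
object SnapshotIndex {
    private val addressesByIndexId =
        ConcurrentHashMap<String, MutableList<String>>()

    // Step B5: associate the image file's storage address with the index identifier.
    fun addSnapshot(indexId: String, storageAddress: String) {
        addressesByIndexId.getOrPut(indexId) { mutableListOf() }.add(storageAddress)
    }

    // Steps C4 to C5: all storage addresses recorded for one video shoot.
    fun addressesFor(indexId: String): List<String> =
        addressesByIndexId[indexId]?.toList() ?: emptyList()

    // End of shooting: remove the index entries only; the image files remain.
    fun clear(indexId: String) {
        addressesByIndexId.remove(indexId)
    }
}
```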
It should be noted that steps B2 and B3 are used to generate the thumbnail of the snapshot photo and display it, so that the user can preview the snapshot photo. Steps B4 and B5 are used to generate an image file from the snapshot and store it; since the snapshot has already been written to an image file stored in the memory of the electronic device, the user can preview the snapshot photo whenever needed during the current video shooting.
As an example, when the user previews the snapshot photos during video shooting by clicking the floating button as described above, the storage address of the image file can be obtained from the media database, the image file can be obtained from that storage address, a thumbnail of the image file can be generated, and the thumbnail can then be displayed on floating interface B, whose display is triggered by the floating button.
Of course, after the current video shooting is finished, when the user enters the gallery to view the snapshot photos, the thumbnails of the image files can also be previewed in the preview interface of the gallery.
In practice, the camera application may perform step B4 after performing step B3. Alternatively, upon receiving the snapshot instruction, the camera application may start a child thread for executing steps B4 and B5; that is, steps B2 to B3 and steps B4 to B5 are performed simultaneously on two threads.
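A minimal Kotlin sketch of this two-thread arrangement is shown below; the function and parameter names are assumptions, and the two lambdas stand for the step groups described above.

```kotlin
import kotlin.concurrent.thread

// Hypothetical dispatcher for the snapshot instruction: steps B2 to B3
// (generate and show the thumbnail on floating interface A) run immediately,
// while steps B4 to B5 (write the image file and record it in the media
// database) run on a worker thread so that file I/O does not delay the preview.
fun onSnapshotInstruction(
    showThumbnail: () -> Unit,    // steps B2 to B3
    persistSnapshot: () -> Unit   // steps B4 to B5
) {
    thread(name = "snapshot-persist") { persistSnapshot() }
    showThumbnail()
}
```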
Step B5 stores the storage address of the image file corresponding to the snapshot photo in the media database. In practice, the media database may also store other information about that image file, for example its size and the snapshot time.
Step B6: after floating interface A has displayed the thumbnail for 3 s, the camera application of the electronic device closes floating interface A and displays the floating button.
As described above, while floating interface A displays the thumbnail, its transparency may be increased slowly until the thumbnail becomes completely transparent and disappears; the floating button is then displayed on the video shooting interface. In this embodiment of the application, the transparency may take a value from 0 to 1: when floating interface A is displayed over the video shooting picture, the transparency starts at 0 and gradually increases to 1. When the value reaches 1, floating interface A disappears and the floating button is displayed on the video shooting interface. This is only one implementation; other manners may be adopted in a specific implementation.
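As one possible Android-style sketch of this fade-out, assuming floating interface A and the floating button are ordinary views: the "transparency" rising from 0 to 1 in the text corresponds to the view's alpha falling from 1 to 0 here, and the view references and the 3000 ms duration are assumptions made for illustration.

```kotlin
import android.view.View

// Fades floating interface A out and then shows the floating button.
fun dismissFloatingInterfaceA(floatingInterfaceA: View, floatingButton: View) {
    floatingInterfaceA.animate()
        .alpha(0f)                       // alpha 0 corresponds to "transparency 1" in the text
        .setDuration(3000L)              // assumed preset display duration
        .withEndAction {
            floatingInterfaceA.visibility = View.GONE
            floatingInterfaceA.alpha = 1f          // reset for the next snapshot
            floatingButton.visibility = View.VISIBLE
        }
        .start()
}
```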
The floating button provides the user with an entry for viewing the photos captured during the current video shooting. Therefore, whether floating interface A is closed by the user clicking a close control on it or closes by itself after being displayed for a certain time, the electronic device displays the floating button, so as to provide the user with an entry for viewing any photo captured during the current video shooting.
As described in step B3, the thumbnail of the snapshot photo may be temporarily stored in a memory space. After step B6, the thumbnail stored in that memory space is cleared, so that the memory space can again store the thumbnail of the next snapshot.
It should be noted that, during video shooting, steps B1 to B6 need to be executed for the 1st snapshot. On the one hand, the thumbnail of the 1st snapshot needs to be displayed so that the user can preview it. On the other hand, the storage address of the image file of the 1st snapshot needs to be stored in the media database in association with the index identifier, so that the user can preview all snapshots of the current video shooting based on the index identifier when needed.
For the i-th snapshot (i is a positive integer greater than or equal to 2), steps B1 to B6 still need to be executed. On the one hand, the thumbnail of the i-th snapshot needs to be displayed so that the user can preview it. On the other hand, the storage address of the image file of the i-th snapshot needs to be stored in the media database in association with the index identifier (which may be the same index identifier as when the storage address of the 1st image file was stored), so that the user can preview all snapshots of the current video shooting based on the index identifier when needed.
As an example, in the embodiment shown in fig. 14, if the user clicks the snapshot control again (e.g., for the 2nd time) after step B6, the electronic device repeats steps B1 to B6. The photo captured by the 2nd snapshot is displayed in the same manner (the floating button disappears while floating interface A is displayed). After the thumbnail of the 2nd snapshot has been displayed on floating interface A (which closes by itself after the preset time, or is closed by the user clicking a close control), the floating button is displayed again.
As another example, in the embodiment shown in fig. 14, if the user clicks the snapshot control again (e.g., for the 2nd time) after step B3 and before step B6, that is, before floating interface A disappears, the electronic device may control floating interface A to stop displaying the thumbnail of the 1st snapshot and display the thumbnail of the 2nd snapshot instead. After the thumbnail of the 2nd snapshot has been displayed on floating interface A (which closes by itself after the preset time, or is closed by the user clicking a close control), the floating button is displayed.
After the user performs step B1, while the electronic device performs steps B2 to B6 it continues to perform steps A5 (receiving the video data stream) to A9; that is, the video shooting interface displayed on the display screen of the electronic device still continuously displays the video picture.
If the user captures a plurality of photos during the current video shooting, the image files corresponding to those photos are stored in the gallery of the electronic device. Correspondingly, the media database stores the storage addresses of the image files corresponding to the plurality of photos captured during the current video shooting.
After each snapshot, the floating button is displayed again once floating interface A (which displays a single thumbnail of the current snapshot) shown on the video shooting interface disappears.
If the user needs to view all the photos taken during the current video capture process (or other photos taken before the current snapshot), the user may perform step C1.
Step C1: the user clicks the floating button on the video shooting interface displayed on the display screen of the electronic device.
Step C2: after the camera application of the electronic device receives a query instruction corresponding to the user's operation of clicking the floating button, the camera application obtains the index identifier (the video identifier of the currently shot video).
Step C3: the camera application sends the index identifier to the media framework.
Step C4: based on the index identifier, the media framework obtains from the media database the storage addresses of the image files corresponding to all the photos captured during the current video shooting.
Step C5: the media framework returns the found storage addresses of the image files to the camera application.
Step C6: the camera application obtains the image files corresponding to all the photos captured during the current video shooting based on the received storage addresses, and displays thumbnails of all the snapshot photos (or of some of them) in a list on floating interface B. After floating interface B is displayed, the floating button on the video shooting interface disappears.
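A minimal sketch of steps C2 to C6, reusing the hypothetical SnapshotIndex stand-in introduced earlier: the storage addresses recorded under the current video's index identifier are looked up, a down-sampled thumbnail is decoded for each image file, and the resulting list would be handed to floating interface B for display. The function name and the sampling factor are assumptions.

```kotlin
import android.graphics.Bitmap
import android.graphics.BitmapFactory

// Look up every storage address recorded for this shoot and decode a
// down-sampled thumbnail for each stored image file.
fun loadThumbnailsForFloatingInterfaceB(indexId: String): List<Bitmap> {
    val options = BitmapFactory.Options().apply { inSampleSize = 8 } // rough thumbnail scale
    return SnapshotIndex.addressesFor(indexId)
        .mapNotNull { path -> BitmapFactory.decodeFile(path, options) }
}
```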
In addition, after the user performs step C1, in an application scenario where the video shooting process needs to be paused, the camera application needs to transmit a pause instruction to the lower layers so that the camera pauses the capture. For details, refer to steps D2 to D5.
Step D2: after the camera application of the electronic device receives the query instruction corresponding to the user's operation of clicking the floating button, the camera application sends a pause instruction to the media framework.
Step D3: after the media framework of the electronic device receives the pause instruction, it encapsulates the pause instruction to obtain an encapsulated pause instruction.
Step D4: the media framework of the electronic device sends the encapsulated pause instruction to the hardware abstraction layer.
Step D5: the hardware abstraction layer of the electronic device instructs the underlying camera to stop collecting the video data stream.
Steps D2 to D4 are similar to steps A2 to A4; however, what is transmitted in steps A2 to A4 is a capture request, and accordingly the camera starts capturing and returning the video data stream, whereas what is transmitted in steps D2 to D5 is a pause instruction, and accordingly the camera stops capturing the video data stream.
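The layering of this instruction path can be sketched conceptually as follows; the interfaces and class names are purely illustrative and are not real platform APIs.

```kotlin
// Conceptual sketch of the instruction path in steps D2 to D5.
sealed interface CameraCommand {
    object StartCapture : CameraCommand   // carried by steps A2 to A4
    object PauseCapture : CameraCommand   // carried by steps D2 to D5
}

class HardwareAbstractionLayer {
    fun dispatch(command: CameraCommand) = when (command) {
        CameraCommand.StartCapture ->
            println("camera starts collecting and returning the video data stream")
        CameraCommand.PauseCapture ->
            println("camera stops collecting the video data stream")
    }
}

class MediaFramework(private val hal: HardwareAbstractionLayer) {
    // "Encapsulating" the instruction is modelled here as a simple pass-through.
    fun forward(command: CameraCommand) = hal.dispatch(command)
}

class CameraApplication(private val framework: MediaFramework) {
    // Step D2: triggered when the user clicks the floating button.
    fun onFloatingButtonClicked() = framework.forward(CameraCommand.PauseCapture)
}
```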
Of course, after the user finishes viewing the thumbnails of the snapshot photos on floating interface B, the user may continue the current video shooting process.
Step E1: the user clicks an area of the video shooting interface outside floating interface B (or clicks a currently displayed control for pausing/continuing shooting).
Step E2: after the camera application of the electronic device receives an instruction to continue recording corresponding to the user's operation of clicking the area outside floating interface B, it sends a request to continue recording to the media framework.
For steps E3 to E9, refer to the description of steps A3 to A9; details are not repeated here.
Of course, in step E9, the floating button may be displayed on the video shooting interface while the video pictures continue to be drawn and displayed.
As another example, in the embodiment shown in fig. 14, if the user clicks the snapshot control again (e.g., for the 2nd time) between step C6 and step E9, that is, from the moment the electronic device displays floating interface B until floating interface B disappears, the electronic device may skip steps B1 to B6, because at that time the camera in the electronic device is in the paused state rather than the capturing state, so no snapshot needs to be generated.
As another example, in the embodiment shown in fig. 14, if the user clicks the snapshot control again (e.g., for the 2nd time) after the floating button is displayed in step E9, the electronic device repeats steps B1 to B6. The photo captured by the 2nd snapshot is displayed in the same manner (the floating button disappears while floating interface A is displayed). After the thumbnail of the 2nd snapshot has been displayed on floating interface A (which closes by itself after the preset time, or is closed by the user clicking a close control), the floating button is displayed.
As another example, in the embodiment shown in fig. 14, if the user clicks the floating button again (e.g., for the 2nd time) after the floating button is displayed in step E9, the electronic device repeats steps C1 to C6 to display the thumbnails of all the photos captured during the current video shooting (the floating button disappears while floating interface B is displayed).
In summary, the snapshot control, as a control on the video shooting interface, may be configured to be displayed from the start of the current video shooting until the end of the current video shooting. The snapshot control is used to trigger the display of floating interface A on the video shooting interface so as to show the photo captured when the snapshot control is triggered. Whether or not floating interface A is displayed does not affect the display of the snapshot control.
The floating button starts to be displayed after the floating interface A that is displayed for the first time during video shooting disappears. The floating button is used to trigger the display of floating interface B on the video shooting interface so as to show all the photos captured during the current video shooting. The floating button may be set to disappear after floating interface B is displayed; of course, it may also be set to disappear after floating interface A is displayed. In other words, after the floating button is displayed for the first time during the current video shooting, it disappears whenever floating interface A or floating interface B is displayed on the video shooting interface, and it is displayed whenever the video shooting interface displays neither floating interface A nor floating interface B.
Finally, after the video shooting ends, the storage addresses stored in the media database and associated with the video identifier of the video whose shooting has just finished may be cleared together. The image files corresponding to those storage addresses (the image files generated by taking snapshots) are not cleared.
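A short sketch of this end-of-shoot cleanup, again reusing the hypothetical SnapshotIndex stand-in from the earlier sketch; the function name is an assumption.

```kotlin
// Only the media-database entries keyed by this shoot's video identifier
// are removed; the image files already saved to the gallery are kept.
fun onVideoShootingFinished(videoId: String) {
    SnapshotIndex.clear(videoId)
    // Intentionally no file deletion: the snapshot photos remain in the gallery.
}
```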
The camera application provided in the embodiments of the present application may be called by other application programs; for example, when another application needs to use the camera function, it may call all or part of the functions of the camera application provided in the embodiments of the present application.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the steps in the above-mentioned method embodiments may be implemented.
Embodiments of the present application further provide a computer program product, which when run on a first device, enables the first device to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to a first device, including a recording medium, a computer memory, a read-only memory (ROM), a random-access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
An embodiment of the present application further provides a chip system, where the chip system includes a processor, the processor is coupled to a memory, and the processor executes a computer program stored in the memory to implement the steps of any of the method embodiments of the present application. The chip system may be a single chip or a chip module composed of a plurality of chips.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (16)
1. A shooting method is applied to electronic equipment, the electronic equipment comprises a display screen and a camera, and the shooting method comprises the following steps:
the electronic equipment displays a video shooting interface through the display screen, and the video shooting interface is used for displaying a video picture acquired by the camera in real time;
the electronic equipment detects a first operation, wherein the first operation is used for capturing a first image, and the first image is related to a video picture acquired by the camera in real time;
the electronic equipment displays a first floating interface on the video shooting interface, and the first floating interface is used for displaying the thumbnail of the first image.
2. The shooting method according to claim 1, wherein a snapshot control is provided on the video shooting interface, and the first operation acts on the snapshot control.
3. The shooting method of claim 1 or 2, wherein after the electronic device displays a first floating interface on the video shooting interface, the shooting method further comprises:
and under the condition that the first suspension interface displays a first preset time, the electronic equipment closes the first suspension interface.
4. The shooting method of claim 1 or 2, wherein after the electronic device displays a first floating interface on the video shooting interface, the shooting method further comprises:
the electronic equipment detects a second operation, and the second operation is used for closing the first floating interface;
the electronic device closes the first floating interface based on the second operation.
5. The shooting method according to claim 4, wherein a closing control is provided on the first floating interface; the second operation acts on the close control.
6. The photographing method according to any one of claims 3 to 5, wherein after the electronic device closes the first floating interface, the photographing method further includes:
the electronic equipment detects a third operation, wherein the third operation is used for triggering the electronic equipment to display a second floating interface;
the electronic equipment displays the second floating interface on the video shooting interface based on the third operation, the second floating interface is used for displaying a thumbnail of a snapshot image, the snapshot image is an image captured during the current video shooting process, and the snapshot image comprises the first image.
7. The shooting method of claim 6, wherein after the electronic device closes the first floating interface, the shooting method further comprises:
the electronic equipment displays a floating button on the video shooting interface, wherein the third operation acts on the floating button.
8. The photographing method according to claim 6 or 7, wherein in a case where the electronic apparatus detects a third operation, the photographing method further comprises:
the electronic device pauses the current video capture based on the third operation.
9. The shooting method of claim 8, wherein after the electronic device displays the second floating interface on the video shooting interface based on the third operation, the shooting method further comprises:
the electronic device detects a fourth operation;
the electronic equipment closes the second floating interface based on the fourth operation;
the electronic device continues the currently paused video capture based on the fourth operation.
10. The shooting method according to claim 9, wherein the fourth operation is performed on a first control provided on the video shooting interface, or on an area of the video shooting interface other than the second floating interface and not provided with a control, the first control being used to pause a current video shooting or continue a currently paused video shooting.
11. The photographing method according to any one of claims 6 to 10, wherein in a case where the electronic device detects the first operation, the photographing method further comprises:
the electronic device records a first time when the first operation is detected;
the electronic equipment generates the first image from a video picture next to a video picture displayed by the video shooting interface at the first time;
the electronic device generates a thumbnail of the first image.
12. The shooting method of claim 11, wherein after generating a first image from a video frame next to a video frame displayed by the video shooting interface at the first time, the shooting method further comprises:
the electronic device stores the first image;
and the electronic equipment sends the storage address of the first image and the index identification of the snapshot image to a media database.
13. The shooting method of claim 12, wherein the electronic device displaying the second floating interface on the video shooting interface based on the third operation comprises:
the electronic equipment acquires an index identifier of the snapshot image based on the third operation;
the electronic equipment acquires a storage address of the snapshot image in the current video shooting process based on the index identification of the snapshot image;
the electronic equipment acquires the snapshot image based on the acquired storage address;
the electronic equipment generates a thumbnail of the snapshot image;
the electronic equipment displays the generated thumbnail of the snapshot image in the second floating interface, wherein the thumbnail of the snapshot image comprises the thumbnail of the first image.
14. The photographing method according to claim 12 or 13, wherein the index identification of the snap-shot image is a unique identification of a currently photographed video.
15. An electronic device, characterized in that the electronic device comprises a processor for executing a computer program stored in a memory, so that the electronic device implements the method according to any of claims 1 to 14.
16. A computer-readable storage medium, in which a computer program is stored which, when run on a processor, implements the method of any one of claims 1 to 14.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111077227.2A CN113810608B (en) | 2021-09-14 | 2021-09-14 | Shooting method, electronic equipment and storage medium |
CN202211416840.7A CN115866390B (en) | 2021-09-14 | 2021-09-14 | Shooting method, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111077227.2A CN113810608B (en) | 2021-09-14 | 2021-09-14 | Shooting method, electronic equipment and storage medium |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211416840.7A Division CN115866390B (en) | 2021-09-14 | 2021-09-14 | Shooting method, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113810608A true CN113810608A (en) | 2021-12-17 |
CN113810608B CN113810608B (en) | 2022-11-25 |
Family
ID=78941013
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211416840.7A Active CN115866390B (en) | 2021-09-14 | 2021-09-14 | Shooting method, electronic equipment and storage medium |
CN202111077227.2A Active CN113810608B (en) | 2021-09-14 | 2021-09-14 | Shooting method, electronic equipment and storage medium |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211416840.7A Active CN115866390B (en) | 2021-09-14 | 2021-09-14 | Shooting method, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN115866390B (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114205531A (en) * | 2021-12-23 | 2022-03-18 | 北京罗克维尔斯科技有限公司 | Intelligent photographing method, equipment and device for vehicle and storage medium |
CN115328357A (en) * | 2022-08-15 | 2022-11-11 | 北京达佳互联信息技术有限公司 | Captured image processing method and device, electronic device and storage medium |
CN115525188A (en) * | 2022-02-28 | 2022-12-27 | 荣耀终端有限公司 | Shooting method and electronic equipment |
WO2023160238A1 (en) * | 2022-02-28 | 2023-08-31 | 荣耀终端有限公司 | Image display method and related electronic device |
WO2023160186A1 (en) * | 2022-02-28 | 2023-08-31 | 荣耀终端有限公司 | Video processing method, and electronic device and readable storage medium |
CN116781822A (en) * | 2022-03-15 | 2023-09-19 | 荣耀终端有限公司 | Video processing method, electronic device and readable medium |
WO2024055797A1 (en) * | 2022-09-14 | 2024-03-21 | 荣耀终端有限公司 | Method for capturing images in video, and electronic device |
WO2024179053A1 (en) * | 2023-02-27 | 2024-09-06 | 荣耀终端有限公司 | Snapshot method, terminal device, and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105827935A (en) * | 2015-07-23 | 2016-08-03 | 维沃移动通信有限公司 | Terminal screenshot method and terminal |
CN107580234A (en) * | 2017-09-01 | 2018-01-12 | 歌尔科技有限公司 | Photographic method, display end, shooting head end and system in wireless live broadcast |
CN109726179A (en) * | 2018-12-29 | 2019-05-07 | 努比亚技术有限公司 | Screenshot picture processing method, storage medium and mobile terminal |
CN111290675A (en) * | 2020-03-02 | 2020-06-16 | Oppo广东移动通信有限公司 | Screenshot picture sharing method and device, terminal and storage medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160217328A1 (en) * | 2013-09-30 | 2016-07-28 | Danielle YANAI | Image and video processing and optimization |
CN105635614A (en) * | 2015-12-23 | 2016-06-01 | 小米科技有限责任公司 | Recording and photographing method, device and terminal electronic equipment |
CN105681648A (en) * | 2015-12-31 | 2016-06-15 | 北京金山安全软件有限公司 | Picture viewing method and device and electronic equipment |
CN109922266B (en) * | 2019-03-29 | 2021-04-06 | 睿魔智能科技(深圳)有限公司 | Snapshot method and system applied to video shooting, camera and storage medium |
- 2021-09-14 CN CN202211416840.7A patent/CN115866390B/en active Active
- 2021-09-14 CN CN202111077227.2A patent/CN113810608B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105827935A (en) * | 2015-07-23 | 2016-08-03 | 维沃移动通信有限公司 | Terminal screenshot method and terminal |
CN107580234A (en) * | 2017-09-01 | 2018-01-12 | 歌尔科技有限公司 | Photographic method, display end, shooting head end and system in wireless live broadcast |
CN109726179A (en) * | 2018-12-29 | 2019-05-07 | 努比亚技术有限公司 | Screenshot picture processing method, storage medium and mobile terminal |
CN111290675A (en) * | 2020-03-02 | 2020-06-16 | Oppo广东移动通信有限公司 | Screenshot picture sharing method and device, terminal and storage medium |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114205531A (en) * | 2021-12-23 | 2022-03-18 | 北京罗克维尔斯科技有限公司 | Intelligent photographing method, equipment and device for vehicle and storage medium |
CN114205531B (en) * | 2021-12-23 | 2024-06-04 | 北京罗克维尔斯科技有限公司 | Intelligent photographing method, device and apparatus for vehicle and storage medium |
CN115525188A (en) * | 2022-02-28 | 2022-12-27 | 荣耀终端有限公司 | Shooting method and electronic equipment |
WO2023160238A1 (en) * | 2022-02-28 | 2023-08-31 | 荣耀终端有限公司 | Image display method and related electronic device |
WO2023160186A1 (en) * | 2022-02-28 | 2023-08-31 | 荣耀终端有限公司 | Video processing method, and electronic device and readable storage medium |
CN116700846A (en) * | 2022-02-28 | 2023-09-05 | 荣耀终端有限公司 | Picture display method and related electronic equipment |
CN116700846B (en) * | 2022-02-28 | 2024-04-02 | 荣耀终端有限公司 | Picture display method and related electronic equipment |
CN116781822A (en) * | 2022-03-15 | 2023-09-19 | 荣耀终端有限公司 | Video processing method, electronic device and readable medium |
EP4273716A4 (en) * | 2022-03-15 | 2024-07-10 | Honor Device Co Ltd | Video processing method, electronic device and readable medium |
CN115328357A (en) * | 2022-08-15 | 2022-11-11 | 北京达佳互联信息技术有限公司 | Captured image processing method and device, electronic device and storage medium |
WO2024055797A1 (en) * | 2022-09-14 | 2024-03-21 | 荣耀终端有限公司 | Method for capturing images in video, and electronic device |
WO2024179053A1 (en) * | 2023-02-27 | 2024-09-06 | 荣耀终端有限公司 | Snapshot method, terminal device, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN113810608B (en) | 2022-11-25 |
CN115866390B (en) | 2023-11-07 |
CN115866390A (en) | 2023-03-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113810608B (en) | Shooting method, electronic equipment and storage medium | |
JP7536090B2 (en) | Machine translation method and electronic device | |
CN111654629B (en) | Camera switching method and device, electronic equipment and readable storage medium | |
CN113475092B (en) | Video processing method and mobile device | |
CN112954210B (en) | Photographing method and device, electronic equipment and medium | |
CN103581544B (en) | Dynamic area-of-interest adjusts and provided the image-capturing apparatus of dynamic area-of-interest adjustment | |
CN115002340A (en) | Video processing method and electronic equipment | |
WO2023134583A1 (en) | Video recording method and apparatus, and electronic device | |
CN112136309B (en) | System and method for performing rewind operations with a mobile image capture device | |
EP4436198A1 (en) | Method for capturing images in video, and electronic device | |
WO2022179331A1 (en) | Photographing method and apparatus, mobile terminal, and storage medium | |
CN111669495B (en) | Photographing method, photographing device and electronic equipment | |
US11551452B2 (en) | Apparatus and method for associating images from two image streams | |
EP2939411A1 (en) | Image capture | |
WO2024179101A1 (en) | Photographing method | |
WO2024179100A1 (en) | Photographing method | |
CN118474447A (en) | Video processing method, electronic device, chip system and storage medium | |
CN115883958A (en) | Portrait shooting method | |
WO2022257655A1 (en) | Video photographing method and electronic device | |
WO2022262451A1 (en) | Video photographing method and electronic device | |
WO2022206605A1 (en) | Method for determining target object, and photographing method and device | |
CN113794833B (en) | Shooting method and device and electronic equipment | |
CN115802148A (en) | Method for acquiring image and electronic equipment | |
EP4439382A1 (en) | Code scanning method and electronic device | |
WO2023231696A1 (en) | Photographing method and related device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |