CN113810608B - Shooting method, electronic equipment and storage medium - Google Patents

Shooting method, electronic equipment and storage medium

Info

Publication number
CN113810608B
CN113810608B (application CN202111077227.2A)
Authority
CN
China
Prior art keywords
interface
video
electronic equipment
shooting
snapshot
Prior art date
Legal status
Active
Application number
CN202111077227.2A
Other languages
Chinese (zh)
Other versions
CN113810608A (en)
Inventor
银康林
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211416840.7A priority Critical patent/CN115866390B/en
Priority to CN202111077227.2A priority patent/CN113810608B/en
Publication of CN113810608A publication Critical patent/CN113810608A/en
Application granted granted Critical
Publication of CN113810608B publication Critical patent/CN113810608B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/62: Control of parameters via user interfaces
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632: Graphical user interfaces [GUI] specially adapted for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N23/80: Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Studio Devices (AREA)

Abstract

The present application provides a shooting method, an electronic device, and a storage medium, relating to the technical field of shooting. The method comprises the following: when a video is shot through a camera on the electronic device, the display screen of the electronic device can display, through a video shooting interface, the video picture acquired by the camera in real time. A snapshot control is arranged on the video shooting interface; after the user taps the snapshot control, the electronic device can select one video frame from the video shot by the camera to serve as a snapshot, and display a floating interface on the video shooting interface showing a thumbnail of the snapshot.

Description

Shooting method, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of image capture, and in particular, to a shooting method, an electronic device, and a storage medium.
Background
More and more electronic devices are equipped with cameras, and a user can carry such a device to shoot videos and photos anytime and anywhere. When shooting a video with the device, the user can also snap photos without interrupting the video shooting.
If the user wants to check the snapped photos during video shooting, the user has to wait until the video shooting is finished and then open the gallery of the electronic device to view them. If a snapped photo turns out poorly, the opportunity to retake it may already have been missed. If instead the user interrupts the current video shooting and then opens the gallery to view the photos, the opportunity to shoot the video may be missed. The degree of intelligence of the electronic device is therefore low, and it cannot meet the user's shooting needs.
Disclosure of Invention
The present application provides a shooting method, an electronic device, and a storage medium, which address the problem that the degree of intelligence of the electronic device is low and cannot meet the user's shooting needs.
To this end, the following technical solutions are adopted in the present application:
in a first aspect, the present application provides a shooting method applied to an electronic device including a display screen and a camera, the shooting method including:
the electronic device displays a video shooting interface through the display screen, the video shooting interface being used to display the video picture acquired by the camera in real time;
the electronic device detects a first operation, the first operation being used to snap a first image, the first image being related to the video picture acquired by the camera in real time;
the electronic device displays a first floating interface on the video shooting interface, the first floating interface being used to display a thumbnail of the first image.
In the present application, while a video is being shot through the camera of the electronic device, a video shooting interface can be displayed on the display screen, showing the video picture acquired by the camera in real time. The electronic device can provide a snapshot function during video shooting: after detecting an operation for taking a snapshot (e.g., the first operation), it can select one frame from the video captured in real time by the camera as the snapshot (e.g., the first image), display a floating interface (e.g., the first floating interface) on the video shooting interface, and show a thumbnail of the snapshot in that interface. In this way, the snapshot is presented to the user promptly without interrupting the current video shooting, so that the user can preview it in time, which improves the user's shooting experience.
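The flow described above can be sketched in a few lines. This is a hypothetical illustration only (the patent does not supply code): recording continues while a tap on the snapshot control copies the most recent frame and shows its thumbnail in a floating overlay. All class and attribute names are assumptions.

```python
class Recorder:
    """Minimal stand-in for the camera pipeline; frames keep arriving while recording."""
    def __init__(self):
        self.recording = True
        self.latest_frame = None

    def push_frame(self, frame):
        self.latest_frame = frame


class CaptureSession:
    def __init__(self, recorder):
        self.recorder = recorder
        self.floating_thumbnail = None  # None means no overlay is shown

    def on_snapshot_tapped(self):
        # The video keeps recording; we only copy the current frame as the snapshot
        # and surface its thumbnail in the floating interface.
        snapshot = self.recorder.latest_frame
        self.floating_thumbnail = f"thumb:{snapshot}"
        return snapshot


recorder = Recorder()
session = CaptureSession(recorder)
recorder.push_frame("frame_41")
recorder.push_frame("frame_42")
shot = session.on_snapshot_tapped()
print(shot, recorder.recording, session.floating_thumbnail)
# → frame_42 True thumb:frame_42
```

Note that `recording` stays `True` throughout: the key property claimed is that the snapshot and its preview never interrupt the ongoing capture.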
As an implementation manner of the first aspect, a snapshot control is arranged on the video shooting interface, and the first operation acts on the snapshot control.
In the present application, a snapshot control is arranged on the video shooting interface, and tapping it snaps a photo, which provides the user with a more intuitive way to take snapshots.
As another implementation manner of the first aspect, after the electronic device displays the first floating interface on the video shooting interface, the shooting method further includes:
and after the first floating interface has been displayed for a first preset duration, the electronic device closes the first floating interface.
In this method, the first preset duration can be set so that the user has enough time to preview the snapped photo; once that duration has elapsed, the electronic device closes the first floating interface on its own, so that the thumbnail does not linger on the screen and disturb the video shooting, and the user does not have to close it manually. The user experience is thus better.
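The timed auto-dismiss can be sketched as follows; this is an illustrative model, not the patent's implementation. The preset duration value is an assumption, and timestamps are injected so the behavior is deterministic.

```python
PRESET_DURATION = 3.0  # seconds; an assumed value, not specified by the patent


class FloatingInterface:
    def __init__(self, shown_at):
        self.shown_at = shown_at
        self.visible = True

    def tick(self, now):
        # Close automatically once the preset display duration has elapsed.
        if self.visible and now - self.shown_at >= PRESET_DURATION:
            self.visible = False


ui = FloatingInterface(shown_at=10.0)
ui.tick(now=11.0)       # only 1.0 s shown: still visible
before = ui.visible
ui.tick(now=13.5)       # 3.5 s shown: auto-closed
print(before, ui.visible)
# → True False
```

In a real app the `tick` would be driven by a UI timer rather than explicit calls.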
As another implementation manner of the first aspect, after the electronic device displays the first floating interface on the video shooting interface, the shooting method further includes:
the electronic device detects a second operation, the second operation being used to close the first floating interface;
the electronic device closes the first floating interface based on the second operation.
In the present application, the user can also be allowed to close the first floating interface autonomously, which provides a more varied set of interactions and makes the method suitable for electronic devices aimed at different user groups.
As another implementation manner of the first aspect, a close control is arranged on the first floating interface, and the second operation acts on the close control.
As another implementation manner of the first aspect, after the electronic device closes the first floating interface, the shooting method further includes:
the electronic device detects a third operation, the third operation being used to trigger the electronic device to display a second floating interface;
the electronic device displays, based on the third operation, a second floating interface on the video shooting interface; the second floating interface is used to display thumbnails of the snapshot images, the snapshot images being those snapped during the current video shooting and including the first image.
In the present application, a further function can be provided, triggered by the third operation: through it, the user can display another floating interface on the video shooting interface, and this interface can show all photos snapped during the current video shooting. Providing this more varied functionality meets the user's need to find any photo captured during the current video shooting.
As another implementation manner of the first aspect, after the electronic device closes the first floating interface, the shooting method further includes:
the electronic device displays a floating button on the video shooting interface, and the third operation acts on the floating button.
As an implementation manner of the first aspect, in a case where the electronic device detects the third operation, the shooting method further includes:
the electronic device pauses the current video capture based on the third operation.
In this method, the current video shooting can be paused while the user, via the third operation, previews the photos snapped during the current shooting, which avoids the situation in which a distracted user fails to shoot a video that meets his or her needs.
As another implementation manner of the first aspect, after the electronic device displays the second floating interface on the video shooting interface based on the third operation, the shooting method further includes:
the electronic device detects a fourth operation;
the electronic device closes the second floating interface based on the fourth operation;
the electronic device resumes the currently paused video shooting based on the fourth operation.
As another implementation manner of the first aspect, the fourth operation acts either on a first control arranged on the video shooting interface, or on an area of the video shooting interface that is outside the second floating interface and contains no controls; the first control is used to pause the current video shooting or to resume the currently paused video shooting.
In the present application, varied ways are provided to resume the currently paused video shooting.
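The pause/resume coupling between the snapshot browser and the recorder can be sketched as a small state machine. This is a hypothetical model: opening the second floating interface (the third operation) pauses recording, and closing it (the fourth operation) resumes; names are illustrative.

```python
class Session:
    def __init__(self):
        self.recording = True
        self.browser_open = False

    def open_snapshot_browser(self):
        # Third operation: show the second floating interface and pause
        # recording so the user is not distracted while browsing snapshots.
        self.browser_open = True
        self.recording = False

    def close_snapshot_browser(self):
        # Fourth operation: dismiss the overlay and resume the paused recording.
        self.browser_open = False
        self.recording = True


s = Session()
s.open_snapshot_browser()
paused = not s.recording
s.close_snapshot_browser()
print(paused, s.recording, s.browser_open)
# → True True False
```

The fourth operation could equally be a tap on the pause/continue control or on an empty area outside the overlay; both paths would call `close_snapshot_browser`.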
As another implementation manner of the first aspect, in a case where the electronic device detects the first operation, the shooting method further includes:
the electronic device records a first time at which the first operation is detected;
the electronic device generates the first image from the video frame next to the one displayed on the video shooting interface at the first time;
the electronic device generates a thumbnail of the first image.
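The frame-selection step above can be illustrated with a short sketch: record the timestamp of the snapshot gesture, then use the first frame arriving after that instant (the "next frame") as the captured image. This is an assumption-laden illustration; the frame format is invented.

```python
def pick_next_frame(frames, tap_time):
    """frames: list of (timestamp, frame_id) pairs in arrival order.

    Returns the id of the first frame whose timestamp is strictly after
    the tap, or None if no later frame arrives."""
    for ts, frame_id in frames:
        if ts > tap_time:
            return frame_id
    return None


frames = [(0.00, "f0"), (0.03, "f1"), (0.07, "f2"), (0.10, "f3")]
print(pick_next_frame(frames, tap_time=0.05))
# → f2
```

Choosing the frame *after* the tap, rather than the one on screen at tap time, compensates for the display lagging slightly behind the capture pipeline.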
As another implementation manner of the first aspect, after generating the first image from a video frame next to a video frame displayed by the video capture interface at the first time, the method further includes:
the electronic device stores the first image;
and the electronic device sends the storage address of the first image and the index identifier of the snapshot image to a media database.
In the present application, the first image can be stored in a memory of the electronic device, and a media database can be provided to cache the storage addresses of the snapshots taken during the current video shooting. When a snapshot's storage address is stored in the media database, the index identifier serves as the index for that address. Each video shooting session has a different index identifier.
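The media-database cache described above can be sketched as a per-session index. This is a hypothetical illustration: each recording session gets its own index identifier, and the storage addresses of that session's snapshots are filed under it, so later lookups touch only the current session. Paths and identifiers are invented examples.

```python
class MediaDatabase:
    """Caches snapshot storage addresses keyed by a per-session index identifier."""

    def __init__(self):
        self._index = {}  # index_id -> list of storage addresses, in snap order

    def add_snapshot(self, index_id, address):
        self._index.setdefault(index_id, []).append(address)

    def addresses_for(self, index_id):
        return list(self._index.get(index_id, []))


db = MediaDatabase()
db.add_snapshot("video_001", "/sdcard/DCIM/snap_a.jpg")
db.add_snapshot("video_001", "/sdcard/DCIM/snap_b.jpg")
db.add_snapshot("video_002", "/sdcard/DCIM/snap_c.jpg")  # a different session
print(db.addresses_for("video_001"))
# → ['/sdcard/DCIM/snap_a.jpg', '/sdcard/DCIM/snap_b.jpg']
```

Because each session's identifier is unique, snapshots from a previous recording never leak into the current session's floating interface.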
As another implementation manner of the first aspect, the displaying, by the electronic device, the second floating interface on the video shooting interface based on the third operation includes:
the electronic device acquires the index identifier of the snapshot images based on the third operation;
the electronic device acquires, based on the index identifier, the storage addresses of the snapshot images taken during the current video shooting;
the electronic device acquires the snapshot images based on the acquired storage addresses;
the electronic device generates thumbnails of the snapshot images;
the electronic device displays the generated thumbnails of the snapshot images within the second floating interface, wherein the thumbnails of the snapshot images include the thumbnail of the first image.
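The steps above form a simple retrieval pipeline, sketched below. Image loading and downscaling are stubbed with strings purely for illustration; the function names are assumptions, not the patent's API.

```python
def load_image(address):
    return f"image@{address}"      # stand-in for decoding the file at that address


def make_thumbnail(image):
    return f"thumb({image})"       # stand-in for downscaling to a thumbnail


def thumbnails_for_session(index, index_id):
    """Resolve the session's index identifier to storage addresses, load each
    snapshot, and produce one thumbnail per image, in snap order."""
    addresses = index.get(index_id, [])
    return [make_thumbnail(load_image(a)) for a in addresses]


index = {"video_001": ["/a.jpg", "/b.jpg"]}
thumbs = thumbnails_for_session(index, "video_001")
print(thumbs)
# → ['thumb(image@/a.jpg)', 'thumb(image@/b.jpg)']
```

The second floating interface would simply render this thumbnail list, with the first image's thumbnail among them.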
As another implementation manner of the first aspect, the index identifier of the snapshot images is a unique identifier of the currently shot video.
In a second aspect, an electronic device is provided, which includes a processor configured to execute a computer program stored in a memory to implement the method of any one of the first aspect of the present application.
In a third aspect, a chip system is provided, which includes a processor coupled to a memory, and the processor executes a computer program stored in the memory to implement the method of any one of the first aspect of the present application.
In a fourth aspect, there is provided a computer readable storage medium storing a computer program which, when executed by one or more processors, performs the method of any one of the first aspects of the present application.
In a fifth aspect, the present application provides a computer program product for causing a device to perform the method of any one of the first aspects of the present application when the computer program product is run on the device.
It is understood that, for the beneficial effects of the second to fifth aspects, reference may be made to the relevant description of the first aspect; details are not repeated here.
Drawings
Fig. 1 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic view of a video shooting interface displayed on a display screen of an electronic device when the electronic device shoots a video according to an embodiment of the present application;
fig. 3 is a schematic operation diagram of a user snapping a photo in a video shooting process according to an embodiment of the present application;
fig. 4 is a schematic diagram of a process of a user viewing a snapshot after taking the snapshot in a video shooting process according to an embodiment of the present application;
fig. 5 is a schematic view of the floating interface A displayed by the electronic device after a user takes a snapshot during video shooting, according to an embodiment of the present application;
fig. 6 is another schematic view of the floating interface A provided in an embodiment of the present application;
fig. 7 is a schematic diagram of an operation on the floating interface A according to an embodiment of the present application;
fig. 8 is a schematic view of the floating button shown after the floating interface A is closed, according to an embodiment of the present application;
fig. 9 is a schematic diagram of the floating interface B whose display is triggered by the floating button, according to an embodiment of the present application;
fig. 10 is another schematic view of the floating interface B provided in an embodiment of the present application;
fig. 11 is another schematic view of the floating interface B provided in an embodiment of the present application;
fig. 12 is another schematic view of the floating interface B provided in an embodiment of the present application;
fig. 13 is a technical architecture diagram on which the photographing method provided in the embodiment of the present application depends;
fig. 14 is a timing diagram for implementing a shooting method according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that in the embodiments of the present application, "one or more" means one, two, or more than two; "and/or" describes the association relationship of the associated objects, indicating that three relationships may exist; for example, a and/or B, may represent: a exists singly, A and B exist simultaneously, and B exists singly, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," "fourth," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The shooting method provided by the embodiment of the application can be suitable for electronic equipment provided with a display screen and a camera. The electronic device may be a tablet computer, a mobile phone, a wearable device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), and other electronic devices. The embodiment of the present application does not limit the specific type of the electronic device.
Fig. 1 shows a schematic structural diagram of an electronic device. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a sensor module 180, keys 190, a camera 193, a display 194, and the like. The sensor module 180 may include a pressure sensor 180A, a touch sensor 180K, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
For example, the processor 110 is configured to execute the shooting method in the embodiment of the present application.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
As an example, the memory may have cached therein the storage address and index identification of the snapshot.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function.
As an example, the electronic device may be connected to an external memory that may store video captured by the electronic device and snap shots during the capture of the video.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, and an application program required by at least one function (such as a video capture function, a video playback function, and the like).
In addition, the internal memory 121 may include a high speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like.
As an example, the internal memory may store a computer program for implementing the video shooting method provided by the embodiments of the present application. Of course, videos taken by the electronic device and snap shots during the video taking process may also be stored in the internal memory.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
By way of example, the electronic device may detect, through the pressure sensor and the touch sensor, operations performed by the user on the display screen, for example, tapping the snapshot control or tapping the floating button.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
As an example, the user may also perform operations during video shooting through physical keys; for instance, a shortcut key for snapping a photo during video shooting can be set as a physical key (e.g., the volume-up key). The user can rest a finger on the volume-up key while shooting and simply press it to take a snapshot, which provides a more varied way of snapping photos.
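Rerouting the volume-up key to the snapshot action while recording can be sketched as a small key-event handler. This is an illustrative assumption, not the patent's implementation; the key name and action strings are invented.

```python
def handle_key(key, recording, actions):
    """Route a key event: while recording, volume-up snaps a photo;
    otherwise it keeps its normal volume behavior."""
    if key == "VOLUME_UP" and recording:
        actions.append("take_snapshot")  # shortcut active only during shooting
    elif key == "VOLUME_UP":
        actions.append("volume_up")      # normal behavior when not recording
    return actions


log = []
handle_key("VOLUME_UP", recording=True, actions=log)
handle_key("VOLUME_UP", recording=False, actions=log)
print(log)
# → ['take_snapshot', 'volume_up']
```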
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The camera 193 is used to capture still images or video. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
In the implementation of the application, the camera can collect video pictures, the GPU processes the video pictures collected by the camera, and the display screen displays an interface processed by the GPU. The specific content displayed on the display screen can refer to the description in the following embodiments.
The embodiment of the present application does not particularly limit the specific structure of the subject that executes the shooting method, as long as that subject can run code implementing the shooting method provided by the embodiment of the present application. For example, the execution subject may be a functional module in the electronic device capable of calling and executing a program, or a processing apparatus applied to the electronic device, such as a chip.
The following describes an interface schematic diagram of the embodiment of the present application, and it should be noted that the size, the position, and the style of the control icon in the interface shown in the embodiment of the present application are only used for example, and do not cause any limitation to the present application.
Referring to fig. 2, a schematic view of a video capture interface displayed on a display screen of an electronic device is shown.
As shown in fig. 2, while the electronic device is shooting a video, the video shooting interface displayed on its display screen may display the video picture currently being shot (the running puppy shown in fig. 2), a shooting progress indicator (the elapsed recording time), and a number of controls.
As an example, the video capture interface may include: pause/continue control, end control, and snapshot control. Of course, in practical applications, other controls may be included, such as the zoom control shown in fig. 2.
The end control is used to end the current video shooting. The pause/continue control displays a pause icon during video shooting (shown in fig. 2), and the user taps it to pause the current shooting; while shooting is paused, it displays a shooting icon (see fig. 12), and the user taps it to resume the current shooting. The snapshot control is used to snap a photo without pausing or ending the current video shooting.
Taking an application scene as an example, referring to fig. 3, while shooting a video of a running puppy, a user who wants a photo of the puppy mid-run can tap the snapshot control on the video shooting interface to obtain a snapshot. The snapshot is stored as an image in the gallery of the electronic device.
However, the snapshot may not meet the user's expectations. Taking the motion scene shown in fig. 3 (where a moving person, animal, or object is in the shot) as an example, when snapping a photo of a running dog, the following may occur: the dog appears blurred in the snapshot because it is running; or the user wanted to catch the dog with its feet off the ground, but in the snapshot its feet are on the ground. The snapshot is therefore quite likely not to meet the user's expectations. The embodiment of the present application uses a motion scene as an example to illustrate this; in other shooting scenes the snapshot may likewise fall short of expectations, and these cases are not enumerated here.
While the running-puppy video is being shot, the display screen of the electronic device shows the video shooting interface of fig. 3, which displays the picture being shot in real time. The user therefore cannot immediately confirm whether a snapshot meets expectations. Only after the current video shooting finishes can the user open the gallery of the electronic device to check; if the snapshot falls short, the photo must be retaken, and by then the best shooting opportunity has likely been missed.
If the user wants to see immediately whether the snapshot is as expected, the user first needs to return to the main interface in the manner shown in fig. 4 (a): sliding a finger upward from the bottom of the screen, so that the display of the electronic device shows the main interface, a schematic of which is shown in fig. 4 (b).
The user then enters the gallery of the electronic device in the manner shown in fig. 4 (b): since the main interface of the electronic device includes a gallery application, the user clicks the gallery application, so that the display shows the preview interface of the gallery, a schematic of which is shown in fig. 4 (c).
Finally, the user checks whether the snapshot is as expected in the manner shown in fig. 4 (c). When the thumbnails on the preview interface are displayed in reverse chronological order, the first thumbnail (the thumbnail of image 1) is the latest snapshot, and the user clicks it to check whether the snapshot is as expected.
The process shown in fig. 4 (a) to fig. 4 (c) interrupts the current video shooting process. Of course, the manner shown in fig. 4 (a) to fig. 4 (c) is only an example and does not limit the present application.
If the snapshot is not as expected, the user needs to take the photo again to obtain one that meets expectations.
If the snapshot is as expected, shooting still cannot be continued on the basis of the interrupted video, since the current video shooting process has already been interrupted; if the video still needs to be shot, a new video shooting must be started.
The purpose of the snapshot function is precisely to take a photo without interrupting the video shooting process; the workflow above therefore gives the user a poor experience of the function and prevents the function from being used to full effect.
In view of this, the embodiment of the present application provides a shooting method that gives the user a way to check, during video shooting, whether a snapshot meets the user's expectations.
As an example, when the display screen of the electronic device shows the video shooting interface of fig. 3 and the user clicks the snapshot control, the electronic device obtains a snapshot and displays a floating interface on top of the video shooting interface of fig. 3.
In the embodiment of the present application, a user operation that triggers the electronic device to obtain a snapshot (which may be denoted the first image) is denoted the first operation, for example the operation of clicking the snapshot control as described above. An interface displayed above another interface is referred to as a floating interface. For convenience of description, the floating interface triggered by the snapshot control is referred to as floating interface A, which may also be called the first floating interface.
Fig. 5 is a schematic view of the interface after the user clicks the snapshot control on the video shooting interface of fig. 3. In fig. 5 (a), floating interface A displays a thumbnail of the snapshot; the user can preview the thumbnail to determine whether the effect of the snapshot meets the user's expectations.
As described above, after the user clicks the snapshot control, the corresponding snapshot may not meet expectations. Therefore, the display manner shown in fig. 5 (b) may also be employed in the embodiment of the present application: floating interface A displays thumbnails of a preset number of snapshots, for example 2, 3, or 4.
In this embodiment, after the electronic device detects that the user has clicked the snapshot control, it may record the time of the click. A preset number of video frames near that time, among the frames collected by the camera, are taken as snapshots, and thumbnails and image files are generated from them. The thumbnails are displayed on floating interface A, and the image files are stored in the memory of the electronic device.
As an example, when thumbnails of 3 snapshots are displayed, the three corresponding video frames may be: the video frame whose picture is displayed in the video shooting interface on the display screen of the electronic device at that time, together with the frame immediately before it and the frame immediately after it.
In practical applications, the three video frames may also be: any three video frames among the pictures displayed by the display screen in a window extending from shortly before that time to shortly after it.
Of course, the snapshot effects presented by the 3 video frames may be identical, slightly different, or substantially different.
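As a minimal sketch of the candidate selection just described, the fragment below picks the buffered frame closest to the click time plus its neighbours. The function and parameter names are illustrative assumptions, not taken from the embodiment:

```python
from bisect import bisect_left

def pick_snapshot_candidates(frame_timestamps, click_time, count=3):
    """Pick `count` buffered frames around the moment the snapshot
    control was clicked: the frame displayed at that moment plus its
    neighbours. `frame_timestamps` is the sorted list of capture times
    (in ms) of the frames collected so far."""
    # index of the frame whose timestamp is closest to click_time
    i = bisect_left(frame_timestamps, click_time)
    if i == len(frame_timestamps) or (
        i > 0 and click_time - frame_timestamps[i - 1] < frame_timestamps[i] - click_time
    ):
        i -= 1
    half = count // 2
    lo = max(0, i - half)
    hi = min(len(frame_timestamps), lo + count)
    lo = max(0, hi - count)  # re-anchor when the window hits the end of the buffer
    return list(range(lo, hi))
```

With frames captured every 33 ms and a click at 70 ms, this returns the frame displayed at that moment (the 66 ms frame) together with the previous and next frames.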
As another example, it may further be provided that after the electronic device detects that the user has clicked the snapshot control, it records the time of the click and takes a preset number of video frames near that time, among the frames collected by the camera, as snapshots from which only thumbnails are generated; the thumbnails are displayed on floating interface A.
Referring to fig. 5 (b), the user may then select, for example by clicking, one or more of the displayed thumbnails whose snapshot effect meets expectations. After the electronic device detects that the user has selected the thumbnail of a video frame in floating interface A, it generates an image file from the video frame corresponding to the selected thumbnail and stores the image file in the memory of the electronic device.
The embodiment of the present application does not limit the display manner of floating interface A. When the display manner of fig. 5 (b) is adopted, it is likewise not limited whether image files are generated and stored for all displayed video frames or only for the video frames selected by the user.
As can be seen from fig. 5, floating interface A, which displays the thumbnail of the snapshot, may block part of the shot video picture. So as not to prevent the user from viewing the shot picture, floating interface A may be set to close after being displayed for a period of time (e.g., 2 s, 3 s, or 4 s). The display duration of floating interface A may be denoted the first preset duration.
As another example, floating interface A may fade gradually after being displayed for 2 s until it disappears, taking 3 s in total from the start of display to disappearance.
As yet another example, referring to fig. 6, a close control may be placed on floating interface A, which closes floating interface A upon receiving a click from the user. The embodiment shown in fig. 6 lets the user control the display duration of floating interface A, so that the user can close it promptly once the snapshot is confirmed to meet expectations, or take as much time as needed to view the thumbnail when careful confirmation is required.
In this embodiment of the application, a user operation that triggers the closing of floating interface A is denoted the second operation, for example the operation of clicking the close control in floating interface A as described above.
As another example, the user may also adjust the position of floating interface A. Referring to fig. 7 (a), the user may drag floating interface A to any position on the video shooting interface.
The user may also resize floating interface A. Referring to fig. 7 (b), the user can enlarge (or shrink) floating interface A by pressing two fingers on its border and then spreading (or pinching) them.
The user can also zoom the snapshot displayed in floating interface A. Referring to fig. 7 (c), the user zooms the snapshot in (or out) by pressing two fingers inside floating interface A and then spreading (or pinching) them.
The finger operations on floating interface A in fig. 7 are only examples and do not limit the present application.
In addition, a user may take multiple snapshots during video shooting, and may view the other snapshots in another manner.
As another example, after floating interface A is closed, a floating button is displayed above the video shooting interface on the display screen of the electronic device. After the user clicks it, the floating button presents the multiple snapshots to the user.
Referring to fig. 8, after the floating interface A that displayed the snapshot thumbnail above the video shooting interface disappears, a floating button is displayed above the video shooting interface.
As described above, the floating button is displayed after floating interface A closes: for example, floating interface A disappears after a certain duration and the floating button is then displayed, or the user clicks the close control on floating interface A to trigger it to close, after which the floating button is displayed.
In addition, to make the other interfaces and controls easier to see, the shot picture is not drawn in the video shooting interfaces of the embodiments shown in fig. 8 to fig. 12.
When the user clicks the floating button, the floating button on the video shooting interface disappears and another floating interface is displayed, which presents information about the photos snapped during the current video shooting process (for example, thumbnails) in list form (other display manners are possible in practical applications).
Referring to fig. 9, after the user clicks the floating button, this other floating interface is displayed above the video shooting interface and can display the photos snapped during the current video shooting process. For convenience of description, the floating interface triggered by the floating button is referred to as floating interface B, which may also be called the second floating interface. In the embodiment of the present application, a user operation that triggers the electronic device to display floating interface B is denoted the third operation, for example the operation of clicking the floating button as described above.
For example, if 3 photos are captured during the video shooting process, thumbnails of the 3 captured photos may be displayed in the floating interface B illustrated in fig. 9.
Of course, since the area of floating interface B shown in fig. 9 is limited and each snapshot is displayed in floating interface B at a fixed size, floating interface B may be unable to display every snapshot of the current video shooting process in full.
For example, in the floating interface B shown in fig. 10, the thumbnails of 6 snapshots are displayed in full and the thumbnails of 2 snapshots are displayed in part. If more than 8 photos have been snapped during the current video shooting process, at least 1 snapshot thumbnail is hidden.
When the user needs to view a partially displayed or hidden snapshot in floating interface B, a slide gesture can trigger floating interface B to display the other snapshots.
For example, in the floating interface B shown in fig. 11, the user slides within floating interface B to scroll the snapshots. As illustrated in fig. 11, at least 10 photos have been snapped during the current video shooting: the slide gesture currently hides snapshots 1 and 2; snapshots 3, 4, 9, and 10 are partially displayed; and snapshots 5 to 8 are displayed in full. As the gesture continues, the snapshots inside floating interface B slide with it, and which snapshots are fully displayed, partially displayed, or hidden changes accordingly. The embodiments of the present application do not enumerate the cases.
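The full/partial/hidden distinction above is a simple geometric test. The sketch below classifies fixed-size thumbnails in a horizontally scrolling strip; the function and parameter names are illustrative assumptions, not from the embodiment:

```python
def classify_thumbnails(n, thumb_w, gap, viewport_w, scroll_x):
    """Classify each of n fixed-size thumbnails in a horizontally
    scrolling strip as 'hidden', 'partial', or 'full' for a given
    scroll offset (all sizes in pixels)."""
    states = []
    for i in range(n):
        left = i * (thumb_w + gap) - scroll_x
        right = left + thumb_w
        if right <= 0 or left >= viewport_w:
            states.append("hidden")   # entirely outside the viewport
        elif left >= 0 and right <= viewport_w:
            states.append("full")     # entirely inside the viewport
        else:
            states.append("partial")  # clipped by a viewport edge
    return states
```

As the slide gesture changes `scroll_x`, the classification of each thumbnail changes accordingly, matching the behaviour described for fig. 11.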
In practical applications, a user who opens the floating interface B of fig. 9 via the floating button of fig. 8 evidently wants to preview the photos snapped during the current video shooting process; the user's attention may be focused on the snapshots in floating interface B, possibly for a long time. So as not to spoil the recorded video, the current video shooting process may be paused when the user clicks the floating button of fig. 8 to trigger the display screen of the electronic device to display the floating interface B of fig. 9.
Referring to the interfaces illustrated in fig. 9 to fig. 11, in these scenes the current video shooting process is paused: a pause icon is displayed in front of the shooting progress shown on the video shooting interface, and the pause/continue control in the video shooting interface displays a shooting icon.
When the user clicks the pause/continue control (displayed as a shooting icon), or clicks an area of the current video shooting interface outside floating interface B where no other control is placed, the current video shooting process switches from the paused state to the shooting state and floating interface B is closed at the same time. In the embodiment of the present application, the operation that triggers the closing of floating interface B (while resuming the paused video shooting) is denoted the fourth operation, and the pause/continue control is denoted the first control.
It should be noted that when the floating button of fig. 8 triggers the display of floating interface B and the current video shooting process is paused, the status shown in front of the shooting progress in the video shooting interfaces of fig. 9 to fig. 11 is a pause icon and the pause/continue control displays a shooting icon; the user can therefore click the shooting icon to continue the current video shooting process and close the displayed floating interface B at the same time.
If, when the floating button of fig. 8 triggers the display of floating interface B, the current video shooting process is not to be paused, then, referring to the embodiment shown in fig. 12, the status shown in front of the shooting progress in the video shooting interface of fig. 12 is a shooting icon and the pause/continue control displays a pause icon; when the user needs to pause the current video shooting process, the user clicks the pause icon.
In this case, the user must trigger the closing of the currently displayed floating interface B in some other way: for example, a close control may be set on the floating interface B of fig. 12, which the user clicks to close it, or the user may close floating interface B via a preset gesture. The embodiments of the present application do not limit this.
It should further be noted that the floating button of fig. 8 first appears on the video shooting interface after the floating interface A corresponding to the first snapshot of the video shooting process is closed. After the user clicks the floating button to trigger the display of floating interface B, the floating button of fig. 8 is no longer shown on the video shooting interface; once floating interface B is closed, the video shooting interface displays the floating button of fig. 8 again.
Having described the interface diagrams of the shooting process, the following describes the technical details of implementing these application scenes, so that the application scenes can be understood more clearly.
Referring to fig. 13, a technical architecture diagram for a shooting method according to an embodiment of the present application is provided. As shown in fig. 13, the technical architecture includes: the system comprises an application layer, an application framework layer and a hardware abstraction layer.
Fig. 13 shows only the layers related to the embodiment of the present application; other layers, for example a system runtime layer and a kernel layer, may also be included. The embodiments of the present application do not enumerate them.
In addition, the embodiments of the present application describe only some functions of each module in each layer, namely those related to the embodiments of the present application; this does not mean that each module provides only the functions described.
The application layer has a camera application, which implements video shooting, image (photo) shooting, and other functions of the shooting process.
The camera application is provided with a user interface module, which provides each of the display interfaces in the above embodiments.
The camera application is also provided with a photographing module, which implements the image (photo) shooting function.
The camera application is further provided with a video module, which implements the video shooting function.
The camera application is additionally provided with a data processing module, which supports the pictures displayed by each interface in the scenes above; for example, the pictures displayed by each interface can be drawn based on data passed up from the lower layers.
The application framework layer contains a media framework, which enables data interaction between the camera application in the upper layer (application layer) and the modules in the lower layer (hardware abstraction layer).
The media framework is provided with an audio/video codec module, which encodes the video data stream passed up from the lower layer (hardware abstraction layer).
The media framework is further provided with a camera service (camera service), which encapsulates the video recording request passed down by the upper-layer application; the encapsulated recording request is passed to the lower layer (hardware abstraction layer).
The media framework is also provided with a media database, which temporarily stores the storage addresses of the snapshots.
The hardware abstraction layer encapsulates the drivers of the cameras and interacts with the underlying camera hardware, so that the cameras can be invoked to implement the video shooting and image shooting functions.
The hardware abstraction layer is provided with a Camera module, which defines a universal standard interface; the camera service communicates with the underlying camera hardware through the standard interface provided by the Camera module.
The hardware abstraction layer is also provided with a Graphics class, which provides methods for drawing objects (the border of the floating interface, the thumbnails of snapshots, etc.) to the display device.
The timing interaction diagram between the layers in the above technical architecture will be described by fig. 14.
Step A1, a user clicks a video recording control on a display screen of the electronic equipment to start a video recording function of the electronic equipment.
Step A2, after the camera application of the electronic device receives the video recording instruction corresponding to the user's click on the video recording control, it sends a video recording request to the media framework.
Step A3, after receiving the video recording request, the media framework of the electronic device encapsulates it to obtain the encapsulated video recording request.
Step A4, the media framework of the electronic device sends the encapsulated video recording request to the hardware abstraction layer.
Step A5, the hardware abstraction layer of the electronic device invokes the underlying camera to start collecting the video data stream, and the camera returns the collected video data stream to the hardware abstraction layer.
Step A6, the hardware abstraction layer of the electronic device receives the video data stream collected by the underlying camera and sends it to the media framework.
Step A7, the media framework of the electronic device encodes the received video data stream.
Step A8, the media framework of the electronic device sends the encoded video data stream to the camera application.
Step A9, after receiving the encoded video data stream, the camera application of the electronic device draws the video picture based on it and displays the picture through the display screen of the electronic device.
When the display screen of the electronic device displays the video picture, it is displayed in the video shooting interface.
In addition, after step A5 the camera continues to collect the video data stream, and steps A6 to A9 accordingly continue as well. As time goes on, the camera application keeps drawing the video data stream received in real time into video pictures and sends them to the display screen for display, thereby realizing the video shooting process.
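The continuous pipeline of steps A5 to A9 can be sketched as a chain of stages, each standing in for one layer. The stage functions below are toy stand-ins with illustrative names, not the real HAL, codec, or drawing APIs:

```python
def hal_frames(raw_frames):
    """Steps A5/A6: the hardware abstraction layer receives frames from
    the camera and forwards the stream to the media framework."""
    for frame in raw_frames:
        yield frame

def encode(frames):
    """Step A7: the media framework encodes the stream (a toy stand-in
    for the audio/video codec module)."""
    for frame in frames:
        yield f"encoded({frame})"

def draw_loop(encoded_frames, display):
    """Steps A8/A9: the camera application draws each encoded frame
    into a picture and sends it to the display screen."""
    for frame in encoded_frames:
        display.append(f"picture<{frame}>")

screen = []
draw_loop(encode(hal_frames(["f0", "f1", "f2"])), screen)
```

Because the stages are generators, each frame flows through all three layers as it arrives, mirroring the fact that steps A6 to A9 run continuously while the camera keeps collecting frames.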
When the user wants to take a snapshot during video shooting, step B1 is performed.
Step B1, the user clicks the snapshot control in the video shooting interface displayed on the display screen of the electronic device.
Step B2, after the camera application of the electronic device receives the snapshot instruction corresponding to the user's click on the snapshot control, it generates a thumbnail from the video frame corresponding to the video picture currently displayed on the video shooting interface.
It should be noted that the content of several consecutive video frames differs little; in particular, at a high video frame rate the content of several consecutive frames may be almost identical. Meanwhile, the pictures in some video frames may be blurry due to the motion of people, animals, or objects in the scene being shot.
In addition, the instant the user wants to snap and the moment the user clicks the snapshot control are not exactly the same. Therefore, when the thumbnail is generated, the time at which the camera application of the electronic device receives the snapshot instruction, or at which the touch screen detects the user operation triggering the snapshot, may be recorded first. As an example, this time may be denoted the first time.
The video frame from which the thumbnail is generated may be: the video frame corresponding to the picture displayed at that time by the video shooting interface on the display screen of the electronic device.
It may also be: one of the n frames before or the n frames after that video frame, where n is a positive integer greater than or equal to 1, for example 1, 2, 3, 4, or 5.
It may also be: the sharpest frame among the n frames before, the current frame, and the n frames after. As an example, a detection model that detects the sharpness of an image may be provided; the previous n frames, the current frame, and the next n frames are input into the model, and the video frame output as having the highest sharpness is used to generate the thumbnail.
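The text does not specify the detection model, so the sketch below substitutes a common classical sharpness proxy, the variance of a Laplacian filter response, purely for illustration: blurry (motion-smeared) frames produce a low variance, sharp frames a high one. Names and the metric itself are assumptions, not the embodiment's model:

```python
def laplacian_variance(gray):
    """Variance of a 4-neighbour Laplacian over a 2-D grayscale image:
    a common sharpness proxy, used here as a stand-in for the
    detection model mentioned in the text (blurry frames score low)."""
    h, w = len(gray), len(gray[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            vals.append(gray[y - 1][x] + gray[y + 1][x]
                        + gray[y][x - 1] + gray[y][x + 1]
                        - 4 * gray[y][x])
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def sharpest_frame(frames):
    """Index of the sharpest frame among the 2n+1 candidates
    (n frames before, the current frame, n frames after)."""
    return max(range(len(frames)), key=lambda i: laplacian_variance(frames[i]))
```

Feeding the 2n+1 candidate frames to `sharpest_frame` returns the index of the frame to use for the thumbnail, as the paragraph above describes for the detection model.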
Step B3, the camera application of the electronic device instructs the display screen to display floating interface A, and the picture shown in floating interface A is the generated thumbnail.
The thumbnail may be temporarily stored in a segment of memory space (for example, 3 MB may be pre-allocated), which is reserved for the snapshot thumbnails displayed in floating interface A.
Step B4, the camera application of the electronic device generates an image file from the video frame used to generate the thumbnail and stores the image file.
In this step, the image file may be stored in the gallery of the electronic device; after the user exits the current video shooting process, the image file corresponding to the snapshot can be viewed in the gallery of the electronic device.
Step B5, the camera application of the electronic device stores the storage address (and other information) of the image file, together with an index identifier, in the media database of the media framework.
The index identifier is stored in the media database in association with the storage address of the image file; that is, storage addresses and index identifiers are stored in pairs, each storage address corresponding to one index identifier. The index identifiers corresponding to the storage addresses of multiple image files may be identical, and the image files corresponding to an index identifier can later be retrieved from the media database based on that identifier. If the storage addresses of the image files of all snapshots in the current video shooting process share the same index identifier, then the storage addresses of all those image files can be obtained from that one index identifier.
In view of this, the index identifier may be associated with the current video shooting. As an example, before each video is shot, a video identifier may be generated for the video about to be shot, for example the current timestamp plus a random number. A video identifier is generated for every shoot, and each generated identifier is unique. The index identifier may be the video identifier generated when the video is shot.
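The identifier scheme and the one-identifier-to-many-addresses mapping can be sketched as follows. The class, method names, and file paths are illustrative assumptions, not the media database's real interface:

```python
import random
import time

def new_video_identifier():
    """Video identifier generated before each shoot: the current
    timestamp plus a random number, as suggested in the text."""
    return f"{int(time.time() * 1000)}-{random.randrange(10**6)}"

class MediaDatabase:
    """Minimal stand-in for the media database: every stored address is
    associated with an index identifier, and one identifier may map to
    the addresses of many image files."""
    def __init__(self):
        self._by_index = {}

    def store(self, index_id, storage_address):
        # one (address, identifier) pair per snapshot image file
        self._by_index.setdefault(index_id, []).append(storage_address)

    def addresses_for(self, index_id):
        """Storage addresses of all snapshots of one video shoot,
        retrieved from its single index identifier."""
        return list(self._by_index.get(index_id, []))
```

Since every snapshot of one shoot is stored under the shoot's video identifier, a single `addresses_for` lookup yields all snapshots of the current video shooting process.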
It should be noted that steps B2 and B3 generate and display the thumbnail of the snapshot so that the user can preview it. Steps B4 and B5 generate and store the image file of the snapshot; because the snapshot has an image file stored in the memory of the electronic device, the user can preview the snapshot whenever needed during the current video shooting process.
As an example, when the user previews the snapshots of the video shooting process by clicking the floating button as described above, the storage addresses of the image files are obtained from the media database, the image files are read from those addresses, thumbnails are generated from them, and the thumbnails are displayed on the floating interface B triggered by the floating button.
Of course, after the current video shooting ends, when the user enters the gallery to view the snapshots, the thumbnails of the image files can be previewed in the preview interface of the gallery.
In practical applications, the camera application may perform step B4 after performing step B3. Alternatively, the camera application may start a sub-thread after receiving the snapshot instruction, the sub-thread performing steps B4 to B5; that is, steps B2 to B3 and steps B4 to B5 are performed simultaneously on two threads.
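The two-thread split can be sketched as below: the calling thread produces the thumbnail (steps B2 to B3) while a sub-thread saves the image file and records it in the database (steps B4 to B5). All helpers here are illustrative stand-ins, not the camera application's real code:

```python
import threading

def take_snapshot(frame, results):
    """Run steps B2 to B3 (thumbnail for floating interface A) on the
    calling thread while a sub-thread runs steps B4 to B5 (image file
    and media-database entry)."""
    def save_and_index():
        results["image_file"] = f"file({frame})"    # step B4: store image file
        results["db_entry"] = f"indexed({frame})"   # step B5: record address + index id
    worker = threading.Thread(target=save_and_index)
    worker.start()                                  # B4 to B5 proceed in parallel
    results["thumbnail"] = f"thumb({frame})"        # steps B2 to B3 on this thread
    worker.join()  # sketch only; a real app would not block its UI thread
    return results

out = take_snapshot("frame42", {})
```

The design point is that the user-visible work (displaying the thumbnail) is not delayed by file I/O; the `join` here exists only so the sketch is deterministic.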
Step B5 takes storing the storage address of the snapshot's image file in the media database as an example; in practical applications, the media database may also store other information about the image file, such as its size and the snapshot time.
Step B6, after floating interface A has displayed the thumbnail for 3 s, the camera application of the electronic device closes floating interface A and displays the floating button.
As described above, while floating interface A displays the thumbnail, its transparency may slowly increase until it becomes fully transparent and disappears; the floating button is then displayed on the video shooting interface. In the embodiment of the present application, transparency may be set as a value from 0 to 1: when floating interface A is displayed over the video shooting picture, the transparency starts at 0 and then gradually increases to 1. When it reaches 1, floating interface A disappears and the floating button is displayed on the video shooting interface. This is one implementation manner; other manners are possible in a specific implementation.
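Combining the earlier example (opaque for 2 s, fully faded at 3 s) with this 0-to-1 transparency scale gives a simple schedule; this is one possible reading of the fade, with illustrative names:

```python
def interface_transparency(t, hold=2.0, total=3.0):
    """Transparency of floating interface A at t seconds after it
    appears: fully opaque (0) for the first `hold` seconds, then rising
    linearly to 1 at `total` seconds, when the interface disappears and
    the floating button is shown."""
    if t <= hold:
        return 0.0
    if t >= total:
        return 1.0
    return (t - hold) / (total - hold)
```

Sampling this function once per frame of the video shooting interface yields the gradual fade from display start to disappearance.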
The setting of suspension button can provide the entry for the user, and this entry is used for looking over the candid photograph of current video shooting in-process, therefore, no matter be the mode that the user closes the control through clicking on the suspension interface A, still should suspend interface A and show the mode that closes by oneself after the certain time, electronic equipment all can show this suspension button to for the user provides the entry of looking over the arbitrary candid photograph of current video shooting in-process.
As shown in step B3, the thumbnail of the snapshot may be temporarily stored in a segment of memory space. After step B6, the thumbnail images stored in the memory space are emptied. So that the thumbnail of the snapshot is still stored in the memory space when the next snapshot is taken.
It should be noted that, during video shooting, when a picture is captured for the 1st time, steps B1 to B6 need to be executed. On the one hand, the thumbnail of the 1st snapshot needs to be displayed so that the user can preview it. On the other hand, the storage address of the image file of the 1st snapshot needs to be stored in the media database in association with the index identifier, so that the user can later preview all snapshots of the current video shooting based on that index identifier.
When a picture is captured for the i-th time (i being a positive integer greater than or equal to 2), steps B1 to B6 still need to be executed. On the one hand, the thumbnail of the i-th snapshot needs to be displayed so that the user can preview it. On the other hand, the storage address of the image file of the i-th snapshot needs to be stored in the media database in association with the index identifier (which may be the same index identifier used when the storage address of the 1st image file was stored), so that the user can preview all snapshots of the current video shooting based on that index identifier.
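The association between storage addresses and the shared index identifier can be sketched with an in-memory stand-in for the media database. The class and method names are illustrative assumptions; the patent does not specify this API, only that every snapshot of one recording is keyed by the same index identifier.

```python
from collections import defaultdict


class MediaDatabase:
    """Illustrative stand-in for the media database.

    Storage addresses of image files are stored in association with the
    index identifier (the unique identifier of the video being shot), so
    the 1st and every i-th snapshot of one recording share the same key
    and can later be fetched together.
    """

    def __init__(self):
        self._addresses = defaultdict(list)

    def associate(self, index_id: str, storage_address: str) -> None:
        self._addresses[index_id].append(storage_address)

    def addresses_for(self, index_id: str) -> list:
        return list(self._addresses[index_id])

    def clear_video(self, index_id: str) -> None:
        # Drops only the address records, never the image files themselves.
        self._addresses.pop(index_id, None)


db = MediaDatabase()
db.associate("video-001", "/DCIM/snap_1.jpg")  # 1st snapshot
db.associate("video-001", "/DCIM/snap_2.jpg")  # i-th snapshot, same index id
```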
As an example, in the embodiment shown in fig. 14, if the user clicks the snapshot control again (for example, a 2nd time) after step B6, the electronic device repeats steps B1 to B6. The picture captured by the user's 2nd snapshot is displayed in the same manner (the hover button disappears while floating interface A is displayed). After the thumbnail of the 2nd snapshot has been displayed on floating interface A (which closes automatically after the preset time or when the user clicks the close control), the hover button is displayed again.
As another example, in the embodiment shown in fig. 14, if the user clicks the snapshot control again (for example, a 2nd time) before the electronic device performs step B6, that is, before floating interface A disappears, the electronic device may control floating interface A to stop displaying the thumbnail of the 1st snapshot and display the thumbnail of the 2nd snapshot instead. After the thumbnail of the 2nd snapshot has been displayed on floating interface A (which closes automatically after the preset time or when the user clicks the close control), the hover button is displayed.
After the user performs step B1, the electronic device continues to perform steps A5 (receiving the video data stream) to A9 while it performs steps B2 to B6. That is, the video shooting interface displayed on the display screen of the electronic device still continuously displays video pictures.
If the user captures a plurality of photos during the current video shooting, the image files corresponding to those photos are stored in the gallery of the electronic device. Correspondingly, the media database stores the storage addresses of the image files corresponding to the photos captured during the current video shooting.
After each snapshot, once floating interface A (which displays the single thumbnail of the current snapshot) disappears from the video shooting interface, the hover button is displayed again.
If the user wants to view all the photos captured during the current video shooting (or the photos captured before the current snapshot), the user may execute step C1.
Step C1: the user clicks the hover button on the video shooting interface displayed on the display screen of the electronic device.
Step C2: after the camera application of the electronic device receives a query instruction corresponding to the user's click on the hover button, the camera application obtains the index identifier (the video identifier of the currently shot video).
Step C3: the camera application sends the index identifier to the media framework.
Step C4: based on the index identifier, the media framework obtains from the media database the storage addresses of the image files corresponding to all snapshots of the current video.
Step C5: the media framework returns the storage addresses of the found image files to the camera application.
Step C6: based on the received storage addresses, the camera application obtains the image files corresponding to all snapshots of the current video, and displays the thumbnails of all snapshots (or of some of them) in a list on floating interface B. After floating interface B is displayed, the hover button on the video shooting interface disappears.
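The lookup in steps C2 to C6 can be sketched as a single query function. The dict standing in for the media database and the `load_thumbnail` callable standing in for file reading and downscaling are illustrative assumptions.

```python
def query_snapshots(index_id, address_index, load_thumbnail):
    """Steps C2 to C6 in miniature: resolve the index identifier of the
    current video to the storage addresses recorded for it, then produce
    one thumbnail per address so floating interface B can list them."""
    addresses = address_index.get(index_id, [])          # steps C3 to C5
    return [load_thumbnail(addr) for addr in addresses]  # step C6


# Illustrative stand-ins for the media database and the image loader.
address_index = {"video-001": ["/DCIM/snap_1.jpg", "/DCIM/snap_2.jpg"]}
thumbs = query_snapshots("video-001", address_index, lambda p: ("thumb", p))
```

An unknown index identifier simply yields an empty list, so floating interface B would show no thumbnails rather than fail.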
In addition, in application scenarios where the video shooting process needs to enter a paused state after the user performs step C1, the camera application needs to pass a pause instruction to the lower layers so that the camera pauses capturing. For details, refer to steps D2 to D5.
Step D2: after the camera application of the electronic device receives a query instruction corresponding to the user's click on the hover button, the camera application sends a pause instruction to the media framework.
Step D3: after the media framework of the electronic device receives the pause instruction, it encapsulates the pause instruction to obtain an encapsulated pause instruction.
Step D4: the media framework of the electronic device sends the encapsulated pause instruction to the hardware abstraction layer.
Step D5: the hardware abstraction layer of the electronic device instructs the underlying camera to stop collecting the video data stream.
Steps D2 to D4 are similar to steps A2 to A4. Steps A2 to A4 transmit an acquisition request, in response to which the camera starts collecting a video data stream and returns the collected stream; steps D2 to D5 transmit an instruction indicating that acquisition should pause, in response to which the camera stops collecting the video data stream.
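The pause path in steps D2 to D5, and the symmetric continue path in the E steps, can be sketched as a small command pipeline. The envelope format, the class name, and the operation strings are illustrative assumptions; the patent only specifies that the media framework encapsulates the instruction and that the hardware abstraction layer tells the camera to stop or resume collecting.

```python
def encapsulate(command: str) -> dict:
    """Step D3: the media framework wraps the camera application's command
    in the form expected by the lower layer (fields invented here)."""
    return {"target": "camera-hal", "op": command}


class FakeHal:
    """Stands in for the hardware abstraction layer plus camera."""

    def __init__(self):
        self.capturing = True            # recording in progress

    def dispatch(self, envelope: dict) -> None:
        if envelope["op"] == "pause":
            self.capturing = False       # step D5: stop collecting the stream
        elif envelope["op"] == "resume":
            self.capturing = True        # E steps: collection continues


hal = FakeHal()
hal.dispatch(encapsulate("pause"))       # steps D2 to D5: pause propagates down
hal.dispatch(encapsulate("resume"))      # steps E2 to E9: shooting continues
```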
Of course, after the user has finished viewing the thumbnails of the snapshots through floating interface B, the user may continue the current video shooting process.
Step E1: the user clicks an area of the video shooting interface outside floating interface B (or clicks the pause/continue control, which currently displays the shooting icon).
Step E2: after the camera application of the electronic device receives a video-continue instruction corresponding to the user's click on the area outside floating interface B, it sends a video-continue request to the media framework.
For steps E3 to E9, reference may be made to the related descriptions of steps A3 to A9, which are not repeated here.
Of course, in step E9, while the video picture continues to be drawn and displayed, the hover button may be displayed above the video shooting interface.
As another example, in the embodiment shown in fig. 14, between step C6 and step E9 (that is, from the time the electronic device displays floating interface B until it disappears), if the user clicks the snapshot control again (for example, a 2nd time), the electronic device does not repeat steps B1 to B6: the camera in the electronic device is in the paused state rather than the acquisition state at this time, so no snapshot needs to be generated.
As another example, in the embodiment shown in fig. 14, if the user clicks the snapshot control again (for example, a 2nd time) after the hover button is displayed in step E9, the electronic device repeats steps B1 to B6. The picture captured by the user's 2nd snapshot is displayed in the same manner (the hover button disappears while floating interface A is displayed). After the thumbnail of the 2nd snapshot has been displayed on floating interface A (which closes automatically after the preset time or when the user clicks the close control), the hover button is displayed.
As another example, in the embodiment shown in fig. 14, if the user clicks the hover button again (for example, a 2nd time) after it is displayed in step E9, the electronic device repeats steps C1 to C6 to display the thumbnails of all photos captured in the current video (the hover button disappears while floating interface B is displayed).
In summary, the snapshot control, as a control on the video shooting interface, may be displayed from the start of the current video shooting until the current video shooting ends. The snapshot control is used to trigger the display of floating interface A on the video shooting interface, so as to show the photo captured when the snapshot control is triggered. The display of floating interface A does not affect the display of the snapshot control.
The hover button is first displayed after the first instance of floating interface A shown during the video shooting disappears. The hover button is used to trigger the display of floating interface B on the video shooting interface, so as to show all photos captured during the current video shooting. The hover button may be set to disappear after floating interface B is displayed, and likewise after floating interface A is displayed. In other words, after the hover button has been displayed for the first time in the current video shooting process, it disappears whenever floating interface A or floating interface B is displayed on the video shooting interface, and is displayed whenever neither floating interface A nor floating interface B is shown.
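The visibility rule summarized above reduces to a small predicate. The parameter names are illustrative; the logic follows the text: the button exists only after floating interface A has been shown and closed once, and is then visible exactly when neither floating interface is on screen.

```python
def hover_button_visible(interface_a_shown: bool,
                         interface_b_shown: bool,
                         first_interface_a_done: bool) -> bool:
    """True when the hover button should be drawn on the shooting interface."""
    if not first_interface_a_done:
        return False   # button first appears after interface A closes once
    return not (interface_a_shown or interface_b_shown)
```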
Finally, after the video shooting ends, the storage addresses stored in the media database in association with the video identifier of the video that has just finished shooting may be cleared together. The image files corresponding to those storage addresses (the image files generated by taking snapshots) are not deleted.
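A minimal illustration of this cleanup, with a plain dict standing in for the media database's address records and a set standing in for the files on disk (both illustrative stand-ins):

```python
# Address records for the just-finished video, keyed by its video identifier.
recorded = {"video-001": ["/DCIM/snap_1.jpg", "/DCIM/snap_2.jpg"]}
files_on_disk = set(recorded["video-001"])   # stand-in for the gallery

# Shooting has ended: clear only the address records for this video.
recorded.pop("video-001", None)

# The image files themselves remain untouched in the gallery.
```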
The camera application provided by the embodiments of the present application may be called by other application programs: when another application program needs to use the camera function, it may call all or part of the functions of the camera application provided by the embodiments of the present application.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
The embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above method embodiments.
Embodiments of the present application further provide a computer program product, which when run on a first device, enables the first device to implement the steps in the foregoing method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above may be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to the first device, including a recording medium, computer memory, read-only memory (ROM), random-access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
An embodiment of the present application further provides a chip system. The chip system includes a processor coupled to a memory, and the processor executes a computer program stored in the memory to implement the steps of any of the method embodiments of the present application. The chip system may be a single chip or a chip module composed of a plurality of chips.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present application, and should be construed as falling within the scope of the present application.

Claims (13)

1. A shooting method, applied to an electronic device comprising a display screen and a camera, the shooting method comprising:
the electronic device displays a video shooting interface through the display screen, the video shooting interface being used for displaying a video picture acquired by the camera in real time;
the electronic device detects a first operation, the first operation being used for capturing a first image, the first image being related to the video picture acquired by the camera in real time, and an index identifier of the first image being a unique identifier of a currently shot video;
the electronic device displays a first floating interface on the video shooting interface, the first floating interface being used for displaying a thumbnail of the first image;
in a case where the first floating interface has been displayed for a first preset time, the electronic device closes the first floating interface;
the electronic device detects a third operation, the third operation being used for triggering the electronic device to display a second floating interface;
the electronic device displays the second floating interface on the video shooting interface based on the third operation, the second floating interface being used for displaying thumbnails of all snapshot images, the snapshot images being images captured during the current video shooting, and the index identifier of each snapshot image being the unique identifier of the currently shot video.
2. The shooting method according to claim 1, wherein a snapshot control is provided on the video shooting interface, and the first operation acts on the snapshot control.
3. The shooting method according to claim 1 or 2, wherein after the electronic device displays the first floating interface on the video shooting interface, the shooting method further comprises:
the electronic device detects a second operation, the second operation being used for closing the first floating interface;
the electronic device closes the first floating interface based on the second operation.
4. The shooting method according to claim 3, wherein a closing control is provided on the first floating interface; the second operation acts on the close control.
5. The shooting method according to claim 1, wherein after the electronic device closes the first floating interface, the shooting method further comprises:
the electronic device displays a hover button on the video shooting interface, wherein the third operation acts on the hover button.
6. The shooting method according to claim 1 or 5, wherein, in a case where the electronic device detects the third operation, the shooting method further comprises:
the electronic device pauses the current video capture based on the third operation.
7. The shooting method of claim 6, wherein after the electronic device displays the second floating interface on the video shooting interface based on the third operation, the shooting method further comprises:
the electronic device detects a fourth operation;
the electronic equipment closes the second floating interface based on the fourth operation;
the electronic device continues the currently paused video capture based on the fourth operation.
8. The shooting method according to claim 7, wherein the fourth operation acts on a first control provided on the video shooting interface, or on an area of the video shooting interface that is outside the second floating interface and is not provided with a control, and wherein the first control is used for pausing the current video shooting or continuing the currently paused video shooting.
9. The photographing method according to claim 1, wherein in a case where the electronic device detects the first operation, the photographing method further comprises:
the electronic device records a first time when the first operation is detected;
the electronic equipment generates the first image from a video picture next to a video picture displayed by the video shooting interface at the first time;
the electronic device generates a thumbnail of the first image.
10. The shooting method according to claim 9, wherein after the first image is generated from the video picture next to the video picture displayed by the video shooting interface at the first time, the shooting method further comprises:
the electronic device stores the first image;
the electronic device sends the storage address of the first image and the index identifier of the snapshot image to a media database.
11. The shooting method according to claim 10, wherein the electronic device displaying the second floating interface on the video shooting interface based on the third operation comprises:
the electronic device acquires an index identifier of the snapshot image based on the third operation;
the electronic device acquires a storage address of the snapshot image of the current video shooting process based on the index identifier of the snapshot image;
the electronic device acquires the snapshot image based on the acquired storage address;
the electronic device generates a thumbnail of the snapshot image;
the electronic device displays the generated thumbnail of the snapshot image in the second floating interface, wherein the thumbnail of the snapshot image comprises the thumbnail of the first image.
12. An electronic device, characterized in that the electronic device comprises a processor configured to execute a computer program stored in a memory, so that the electronic device implements the method according to any one of claims 1 to 11.
13. A computer-readable storage medium, in which a computer program is stored which, when executed by a processor, implements the method of any one of claims 1 to 11.
CN202111077227.2A 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium Active CN113810608B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211416840.7A CN115866390B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium
CN202111077227.2A CN113810608B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111077227.2A CN113810608B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202211416840.7A Division CN115866390B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113810608A CN113810608A (en) 2021-12-17
CN113810608B true CN113810608B (en) 2022-11-25

Family

ID=78941013

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111077227.2A Active CN113810608B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium
CN202211416840.7A Active CN115866390B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202211416840.7A Active CN115866390B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium

Country Status (1)

Country Link
CN (2) CN113810608B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114205531B (en) * 2021-12-23 2024-06-04 北京罗克维尔斯科技有限公司 Intelligent photographing method, device and apparatus for vehicle and storage medium
CN116700577A (en) * 2022-02-28 2023-09-05 荣耀终端有限公司 Video processing method, electronic device and readable storage medium
CN116700846B (en) * 2022-02-28 2024-04-02 荣耀终端有限公司 Picture display method and related electronic equipment
CN115525188A (en) * 2022-02-28 2022-12-27 荣耀终端有限公司 Shooting method and electronic equipment
CN116781822A (en) * 2022-03-15 2023-09-19 荣耀终端有限公司 Video processing method, electronic device and readable medium
CN115328357A (en) * 2022-08-15 2022-11-11 北京达佳互联信息技术有限公司 Captured image processing method and device, electronic device and storage medium
CN116320783B (en) * 2022-09-14 2023-11-14 荣耀终端有限公司 Method for capturing images in video and electronic equipment
CN118555477A (en) * 2023-02-27 2024-08-27 荣耀终端有限公司 Snapshot method, terminal equipment and storage medium

Citations (4)

Publication number Priority date Publication date Assignee Title
CN105827935A (en) * 2015-07-23 2016-08-03 维沃移动通信有限公司 Terminal screenshot method and terminal
CN107580234A (en) * 2017-09-01 2018-01-12 歌尔科技有限公司 Photographic method, display end, shooting head end and system in wireless live broadcast
CN109726179A (en) * 2018-12-29 2019-05-07 努比亚技术有限公司 Screenshot picture processing method, storage medium and mobile terminal
CN111290675A (en) * 2020-03-02 2020-06-16 Oppo广东移动通信有限公司 Screenshot picture sharing method and device, terminal and storage medium

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
WO2015044947A1 (en) * 2013-09-30 2015-04-02 Yanai Danielle Image and video processing and optimization
CN105635614A (en) * 2015-12-23 2016-06-01 小米科技有限责任公司 Recording and photographing method, device and terminal electronic equipment
CN105681648A (en) * 2015-12-31 2016-06-15 北京金山安全软件有限公司 Picture viewing method and device and electronic equipment
CN109922266B (en) * 2019-03-29 2021-04-06 睿魔智能科技(深圳)有限公司 Snapshot method and system applied to video shooting, camera and storage medium

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN105827935A (en) * 2015-07-23 2016-08-03 维沃移动通信有限公司 Terminal screenshot method and terminal
CN107580234A (en) * 2017-09-01 2018-01-12 歌尔科技有限公司 Photographic method, display end, shooting head end and system in wireless live broadcast
CN109726179A (en) * 2018-12-29 2019-05-07 努比亚技术有限公司 Screenshot picture processing method, storage medium and mobile terminal
CN111290675A (en) * 2020-03-02 2020-06-16 Oppo广东移动通信有限公司 Screenshot picture sharing method and device, terminal and storage medium

Also Published As

Publication number Publication date
CN113810608A (en) 2021-12-17
CN115866390B (en) 2023-11-07
CN115866390A (en) 2023-03-28

Similar Documents

Publication Publication Date Title
CN113810608B (en) Shooting method, electronic equipment and storage medium
JP7536090B2 (en) Machine translation method and electronic device
CN111654629B (en) Camera switching method and device, electronic equipment and readable storage medium
CN113475092B (en) Video processing method and mobile device
CN112954210B (en) Photographing method and device, electronic equipment and medium
CN103581544B (en) Dynamic area-of-interest adjusts and provided the image-capturing apparatus of dynamic area-of-interest adjustment
CN109040474B (en) Photo display method, device, terminal and storage medium
WO2023134583A1 (en) Video recording method and apparatus, and electronic device
EP4436198A1 (en) Method for capturing images in video, and electronic device
CN112136309B (en) System and method for performing rewind operations with a mobile image capture device
WO2022179331A1 (en) Photographing method and apparatus, mobile terminal, and storage medium
CN111669495B (en) Photographing method, photographing device and electronic equipment
CN115484403A (en) Video recording method and related device
WO2024179100A1 (en) Photographing method
KR20160088719A (en) Electronic device and method for capturing an image
EP3550817A1 (en) Apparatus and method for associating images from two image streams
WO2024061134A1 (en) Photographing method and apparatus, electronic device, and medium
CN115883958A (en) Portrait shooting method
WO2022262451A1 (en) Video photographing method and electronic device
WO2022262540A1 (en) Photographing method and electronic device
WO2022257655A1 (en) Video photographing method and electronic device
WO2022206605A1 (en) Method for determining target object, and photographing method and device
CN113794833B (en) Shooting method and device and electronic equipment
CN117692759A (en) Method, terminal, storage medium and chip for generating preview image
CN115802148A (en) Method for acquiring image and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant