CN115866390B - Shooting method, electronic equipment and storage medium - Google Patents

Shooting method, electronic equipment and storage medium

Info

Publication number
CN115866390B
Authority
CN
China
Prior art keywords
interface
video
image
electronic device
electronic equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211416840.7A
Other languages
Chinese (zh)
Other versions
CN115866390A (en)
Inventor
银康林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211416840.7A priority Critical patent/CN115866390B/en
Publication of CN115866390A publication Critical patent/CN115866390A/en
Application granted granted Critical
Publication of CN115866390B publication Critical patent/CN115866390B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Studio Devices (AREA)

Abstract

The present application provides a shooting method, an electronic device, and a storage medium, and relates to the technical field of photography. The method includes: when a video is shot through a camera of the electronic device, a display screen of the electronic device displays, on a video shooting interface, the video picture captured by the camera in real time; a snapshot control is provided on the video shooting interface, and after the user taps the snapshot control, the electronic device selects one frame from the video captured by the camera as a snapshot photo and displays a hover interface above the video shooting interface to show a thumbnail of the snapshot photo. In this way, the user can preview the snapshot photo without interrupting the current video shooting, which improves the degree of intelligence of the electronic device.

Description

Shooting method, electronic equipment and storage medium
This application is a divisional application of the Chinese patent application filed with the China National Intellectual Property Administration on September 14, 2021, with application number 202111077227.2 and entitled "A shooting method, electronic device and storage medium".
Technical Field
The present application relates to the field of image capturing, and in particular, to a capturing method, an electronic device, and a storage medium.
Background
More and more electronic devices are provided with cameras, so that users can carry the electronic devices to take videos and pictures anytime and anywhere. When a user takes a video by using the electronic device, the user can take a picture without interrupting the video.
If the user wants to check the effect of a photo taken during video shooting, the user has to enter the gallery of the electronic device after the video shooting is finished. If the photo turns out poorly, the moment for retaking it may already have passed. If instead the user interrupts the current video shooting and enters the gallery to view the photo, the opportunity to capture the video may be missed. The electronic device is therefore not intelligent enough to meet the user's shooting needs.
Disclosure of Invention
The present application provides a shooting method, an electronic device, and a storage medium, which solve the problem that the low degree of intelligence of the electronic device cannot meet the user's shooting needs.
In order to achieve the above purpose, the application adopts the following technical scheme:
In a first aspect, the present application provides a photographing method applied to an electronic apparatus including a display screen and a camera, the photographing method including:
the electronic equipment displays a video shooting interface through a display screen, and the video shooting interface is used for displaying video pictures acquired by a camera in real time;
the electronic equipment detects a first operation, wherein the first operation is used for capturing a first image, and the first image is related to a video picture acquired by the camera in real time;
the electronic device displays a first hover interface on the video capture interface, the first hover interface configured to display a thumbnail of the first image.
In the present application, while a video is being shot through the camera of the electronic device, a video shooting interface can be displayed on the display screen of the electronic device, and this interface displays the video picture captured by the camera in real time. The electronic device can provide a snapshot function during video shooting: when the electronic device detects an operation for snapping a photo (e.g., the first operation), it selects one frame from the video frames captured by the camera in real time as the snapped photo (e.g., the first image), and it displays a hover interface (e.g., the first hover interface) on the video shooting interface, through which a thumbnail of the snapped photo is shown. In this way, without interrupting the current video shooting, the snapped photo is presented to the user in time so that the user can preview it immediately, which improves the user's shooting experience.
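To make the first aspect concrete, the following is a minimal Android/Kotlin sketch of the tap-to-snapshot flow. It assumes a hypothetical latestPreviewFrame() hook that returns the most recent camera frame as a Bitmap, and it uses a plain ImageView as a stand-in for the first hover interface; it is an illustrative outline, not the patent's actual implementation.

```kotlin
import android.graphics.Bitmap
import android.media.ThumbnailUtils
import android.view.View
import android.widget.ImageButton
import android.widget.ImageView

class SnapshotOverlayController(
    private val snapshotButton: ImageButton,      // the snapshot control on the video shooting interface
    private val hoverThumbnail: ImageView,        // stands in for "hover interface A", initially GONE
    private val latestPreviewFrame: () -> Bitmap  // hypothetical hook returning the newest camera frame
) {
    fun attach() {
        snapshotButton.setOnClickListener {
            // First operation: pick a frame related to the live preview as the first image.
            val firstImage = latestPreviewFrame()
            // Build a small thumbnail and show it in the floating view without stopping recording.
            val thumbnail = ThumbnailUtils.extractThumbnail(firstImage, 320, 180)
            hoverThumbnail.setImageBitmap(thumbnail)
            hoverThumbnail.visibility = View.VISIBLE
        }
    }
}
```

In a real camera application the frame would come from the recording pipeline itself, and the overlay would be laid out on top of the preview inside the video shooting interface.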
As an implementation manner of the first aspect, a snapshot control is provided on the video shooting interface, and the first operation acts on the snapshot control.
In the present application, a snapshot control is arranged on the video shooting interface, so that tapping the snapshot control gives the user a more intuitive way to snap a photo.
As another implementation manner of the first aspect, after the electronic device displays the first hover interface on the video capturing interface, the capturing method further includes:
the electronic device closes the first hover interface when the first hover interface has been displayed for a first preset duration.
In the present application, a first preset duration can be set so that the user has enough time to preview the snapped photo; once the first hover interface has been displayed for this duration, the electronic device closes it on its own, so that the user does not have to close it manually and video shooting is not disturbed. This gives the user a better experience.
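A sketch of this auto-dismiss behaviour, reusing the hoverThumbnail view from the previous sketch; the 3000 ms value is only an illustration of the first preset duration, which the application leaves open (e.g. 2 s, 3 s or 4 s).

```kotlin
import android.view.View
import android.widget.ImageView

// Illustrative value only; the first preset duration is not fixed by the application.
const val HOVER_DISMISS_MS = 3000L

fun showHoverInterfaceTemporarily(hoverThumbnail: ImageView) {
    hoverThumbnail.visibility = View.VISIBLE
    hoverThumbnail.postDelayed({
        // The electronic device closes the first hover interface on its own after the preset
        // duration, so the user never has to interrupt recording to dismiss it.
        hoverThumbnail.visibility = View.GONE
    }, HOVER_DISMISS_MS)
}
```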
As another implementation manner of the first aspect, after the electronic device displays the first hover interface on the video capturing interface, the capturing method further includes:
the electronic device detects a second operation, where the second operation is used to close the first hover interface;
the electronic device closes the first hover interface based on the second operation.
In the present application, the first hover interface can also be closed by the user, which provides a more varied interaction and makes the method suitable for electronic devices used by different groups of users.
As another implementation manner of the first aspect, a close control is provided on the first hover interface, and the second operation acts on the close control.
As another implementation manner of the first aspect, after the electronic device closes the first hover interface, the photographing method further includes:
the electronic device detects a third operation, where the third operation is used to trigger the electronic device to display a second hover interface;
the electronic device displays a second hover interface on the video shooting interface based on the third operation, where the second hover interface is used to display thumbnails of the snapshot images taken during the current video shooting, and the snapshot images include the first image.
In the present application, another function can be provided that is triggered by the third operation: through the third operation the user can trigger the display of another hover interface on the video shooting interface, and that hover interface can display all the photos snapped during the current video shooting. Providing this additional function meets the user's need to look up any photo snapped during the current video shooting.
As another implementation manner of the first aspect, after the electronic device closes the first hover interface, the photographing method further includes:
the electronic device displays a hover button on the video shooting interface, where the third operation acts on the hover button.
As an implementation manner of the first aspect, in a case where the electronic device detects the third operation, the photographing method further includes:
the electronic device pauses the current video capture based on the third operation.
In the present application, when the user previews the snapped photos through the third operation during the current video shooting, the current video shooting is paused, which prevents the user's attention from being divided and the resulting video from failing to meet expectations.
As another implementation manner of the first aspect, after the electronic device displays the second hover interface on the video capturing interface based on the third operation, the capturing method further includes:
the electronic device detects a fourth operation;
the electronic device closes the second hover interface based on the fourth operation;
the electronic device resumes the currently paused video shooting based on the fourth operation.
As another implementation manner of the first aspect, the fourth operation acts on a first control provided on the video shooting interface, or on an area of the video shooting interface that is outside the second hover interface and contains no control, where the first control is used to pause the current video shooting or to resume the currently paused video shooting.
In the present application, multiple ways are provided to resume the currently paused video shooting.
As another implementation manner of the first aspect, in a case where the electronic device detects the first operation, the photographing method further includes:
the electronic device records a first time when the first operation is detected;
the electronic device generates the first image from the video frame immediately following the video picture displayed on the video shooting interface at the first time;
the electronic device generates a thumbnail of the first image.
As another implementation manner of the first aspect, after generating the first image from the video frame following the video picture displayed on the video shooting interface at the first time, the method further includes:
the electronic device stores the first image;
the electronic device sends the storage address of the first image and the index identifier of the snapshot images to a media database.
In the present application, the first image can be stored in the memory of the electronic device, and a media database can be provided to cache the storage addresses of the photos snapped during the current video shooting. When a storage address is written to the media database, the index identifier serves as the index of that storage address. The index identifier is different for each video shooting session.
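On Android, the storage step could look roughly as follows, assuming MediaStore is used for the gallery copy of the image; the in-memory map is only a stand-in for the patent's media database, which in practice sits in the media framework.

```kotlin
import android.content.ContentResolver
import android.content.ContentValues
import android.graphics.Bitmap
import android.net.Uri
import android.provider.MediaStore

// indexId -> storage addresses of every photo snapped during that recording session
// (illustrative substitute for the media database).
val mediaDatabase = mutableMapOf<String, MutableList<Uri>>()

fun storeSnapshot(resolver: ContentResolver, firstImage: Bitmap, indexId: String): Uri? {
    val values = ContentValues().apply {
        put(MediaStore.Images.Media.DISPLAY_NAME, "snapshot_${System.currentTimeMillis()}.jpg")
        put(MediaStore.Images.Media.MIME_TYPE, "image/jpeg")
    }
    // Store the first image in the device's gallery storage.
    val uri = resolver.insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values) ?: return null
    resolver.openOutputStream(uri)?.use { out ->
        firstImage.compress(Bitmap.CompressFormat.JPEG, 95, out)
    }
    // "Send the storage address and the index identifier to the media database."
    mediaDatabase.getOrPut(indexId) { mutableListOf() }.add(uri)
    return uri
}
```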
As another implementation manner of the first aspect, displaying, by the electronic device, the second hover interface on the video capturing interface based on the third operation includes:
the electronic device obtains the index identifier of the snapshot images based on the third operation;
the electronic device obtains the storage addresses of the snapshot images taken during the current video shooting based on the index identifier;
the electronic device obtains the snapshot images based on the obtained storage addresses;
the electronic device generates thumbnails of the snapshot images;
the electronic device displays the generated thumbnails in the second hover interface, where the thumbnails of the snapshot images include the thumbnail of the first image.
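A matching sketch of how the second hover interface could be populated: every storage address filed under the current recording's index identifier is looked up, the images are loaded (ImageDecoder requires API 28+), and thumbnails are produced for display. The map parameter is the illustrative "media database" from the previous sketch, not a platform API.

```kotlin
import android.content.ContentResolver
import android.graphics.Bitmap
import android.graphics.ImageDecoder
import android.media.ThumbnailUtils
import android.net.Uri

fun loadSnapshotThumbnails(
    resolver: ContentResolver,
    mediaDatabase: Map<String, List<Uri>>,  // indexId -> stored snapshot addresses
    indexId: String
): List<Bitmap> {
    // Look up every storage address recorded under the current recording's index identifier.
    val addresses = mediaDatabase[indexId] ?: emptyList()
    return addresses.map { uri ->
        val full = ImageDecoder.decodeBitmap(ImageDecoder.createSource(resolver, uri))
        // Shrink each snapshot to a thumbnail for display in the second hover interface.
        ThumbnailUtils.extractThumbnail(full, 240, 240)
    }
}
```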
As another implementation manner of the first aspect, the index identifier of the snapshot images is a unique identifier of the video currently being shot.
In a second aspect, there is provided an electronic device comprising a processor for executing a computer program stored in a memory, implementing the method of any one of the first aspects of the application.
In a third aspect, there is provided a system on a chip comprising a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the method of any of the first aspects of the application.
In a fourth aspect, there is provided a computer readable storage medium storing a computer program which when executed by one or more processors performs the method of any of the first aspects of the application.
In a fifth aspect, the application provides a computer program product for causing a device to perform the method of any of the first aspects of the application when the computer program product is run on the device.
It will be appreciated that the advantages of the second to fifth aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
Fig. 1 is a schematic hardware structure of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic diagram of a video capturing interface displayed on a display screen of an electronic device when the electronic device captures a video according to an embodiment of the present application;
fig. 3 is a schematic diagram illustrating an operation of a user capturing a photograph during a video capturing process according to an embodiment of the present application;
fig. 4 is a schematic diagram of a process of viewing a snap shot photo after a user captures a photo in a video shooting process according to an embodiment of the present application;
fig. 5 is a schematic diagram of hover interface A displayed by the electronic device after the user snaps a photo during video shooting according to an embodiment of the present application;
FIG. 6 is a schematic diagram of another form of hover interface A according to an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating operations on hover interface A according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a hover button displayed after hover interface A is closed according to an embodiment of the present application;
fig. 9 is a schematic diagram of hover interface B triggered and displayed by the hover button according to an embodiment of the present application;
FIG. 10 is another schematic diagram of hover interface B according to an embodiment of the present application;
FIG. 11 is another schematic diagram of hover interface B according to an embodiment of the present application;
FIG. 12 is another schematic diagram of hover interface B according to an embodiment of the present application;
FIG. 13 is a schematic diagram of a shooting method according to an embodiment of the present application;
fig. 14 is a timing chart for implementing a shooting method according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that in embodiments of the present application, "one or more" means one, two, or more than two; "and/or", describes an association relationship of the association object, indicating that three relationships may exist; for example, a and/or B may represent: a alone, a and B together, and B alone, wherein A, B may be singular or plural. The character "/" generally indicates that the context-dependent object is an "or" relationship.
Furthermore, in the description of the present specification and the appended claims, the terms "first," "second," "third," "fourth," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The shooting method provided by the embodiment of the application can be applied to electronic equipment provided with a display screen and a camera. The electronic device can be a tablet computer, a mobile phone, a wearable device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA) and other electronic devices. The embodiment of the application does not limit the specific type of the electronic equipment.
Fig. 1 shows a schematic structural diagram of an electronic device. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a sensor module 180, keys 190, a camera 193, a display 194, and the like. Among other things, the sensor module 180 may include a pressure sensor 180A, a touch sensor 180K, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
For example, the processor 110 is configured to perform the photographing method in the embodiment of the present application.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses repeatedly. If the processor 110 needs to reuse the instructions or data, it may call them directly from this memory. Repeated accesses are avoided and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the system.
As an example, the memory may have cached therein the storage address and index identification of the snapshot.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions.
As an example, the electronic device may be connected to an external memory that may store video taken by the electronic device and snap shots during video taking.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store application programs (such as a video capturing function, a video playing function, etc.) required for at least one function of the operating system.
In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
As an example, the internal memory may store a computer program for implementing the video photographing method provided by the embodiment of the present application. Of course, videos taken by the electronic device and snap shots during video taking may also be stored in the internal memory.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensor 180A, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. A capacitive pressure sensor may include at least two parallel plates having conductive material. The capacitance between the electrodes changes when a force is applied to the pressure sensor 180A. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A.
The touch sensor 180K, also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
As an example, the electronic apparatus may detect an operation of a user acting on the display screen, such as an operation of a user clicking a snap control, an operation of a user clicking a hover button, or the like, through a pressure sensor and a touch sensor.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
As an example, the user may also perform operations during video shooting through a physical key; for example, a shortcut key (e.g., the volume up key) may be set as the physical key for snapping a photo during video shooting. While shooting a video, the user can rest a finger on the volume up key and press it to snap a photo, which provides a more diversified way to snap photos.
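A sketch of such a physical-key shortcut, assuming the capture Activity overrides key handling; isRecording and takeSnapshot() are hypothetical names standing in for the app's own recording state and snapshot entry point.

```kotlin
import android.app.Activity
import android.view.KeyEvent

class CaptureActivity : Activity() {

    private var isRecording = false   // toggled by the recording pipeline (assumption)

    override fun onKeyDown(keyCode: Int, event: KeyEvent): Boolean {
        if (keyCode == KeyEvent.KEYCODE_VOLUME_UP && isRecording) {
            takeSnapshot()   // same path as tapping the on-screen snapshot control
            return true      // consume the key so the volume is not changed
        }
        return super.onKeyDown(keyCode, event)
    }

    private fun takeSnapshot() {
        // Hypothetical entry point into the snapshot flow described above.
    }
}
```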
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The camera 193 is used to capture still images or video. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
In the implementation of the application, the camera can acquire video pictures, the GPU processes the video pictures acquired by the camera, and the display screen displays the interface processed by the GPU. For details of display on the display screen, reference is made to the description of the embodiments described later.
The embodiment of the present application does not particularly limit the specific structure of the execution subject of the photographing method, as long as it can run code recording the photographing method of the embodiment of the present application and thereby perform photographing according to the photographing method provided by the embodiment. For example, the execution subject of the photographing method provided by the embodiment of the application may be a functional module in an electronic device that can call and execute a program, or a processing apparatus, such as a chip, applied to the electronic device.
The following will describe an interface schematic diagram of an embodiment of the present application, and it should be noted that the size, position and style of the control icon in the interface shown in the embodiment of the present application are only used as examples, and are not limiting.
Referring to fig. 2, a schematic diagram of a video capturing interface displayed on a display screen of an electronic device is shown.
As shown in fig. 2, during the process of capturing video, the video capturing interface displayed by the display screen of the electronic device may display a currently captured video frame (the running dog shown in fig. 2), a capturing progress (e.g., 00:06 in fig. 2), a status icon (e.g., a capturing icon in front of the capturing progress in fig. 2 indicates that the capturing status is currently in a capturing state), and a control may also be included.
As an example, the video capturing interface may include: pause/continue control, end control, and snapshot control. Of course, in practical applications, other controls may be included, such as the zoom control shown in fig. 2.
The ending control is used for ending the current video shooting process. The pause/continue control is used for displaying a pause icon (see fig. 2) in the video shooting process, and the user clicks the pause icon to pause the current video shooting process; and displaying a photographing icon (see fig. 12) when the video photographing is paused, and the user clicks the photographing icon to continue the current video photographing process. The snapshot control is used to take a photograph without pausing the current video capture process and without ending the current video capture process.
Taking an application scenario as an example, referring to fig. 3, in the process of capturing a video of a dog running, a user wants to capture a photo of the dog running, and the user can click a capture control in a video capturing interface to obtain a captured photo. The snap shots are stored as images in a gallery of the electronic device.
However, the snapped photo may not meet the user's expectations. Taking the motion scene shown in fig. 3 as an example (a person, animal or object is moving in the shot picture), when a moving puppy is snapped, the following may happen: the puppy appears blurred in the snapped photo because it is running; or the user wants a photo of the puppy in mid-air with its feet off the ground, but the snapped photo instead captures the puppy with its feet on the ground. It is therefore quite possible that the snapped photo does not meet the user's expectations. The embodiment of the present application uses a motion scene to illustrate that the snapped photo may fall short of expectations; in practice, the same can happen in other shooting scenes, which the embodiments of the present application do not list one by one.
Since the display screen of the electronic device displays the video photographing interface shown in fig. 3 during the process of photographing the video of the puppy running, the video photographing interface is used for displaying the video picture photographed in real time. Thus, the user cannot confirm whether or not the snapshot is in line with expectations immediately after the snapshot is obtained. After the current video shooting process is finished, the user can enter a gallery of the electronic device to check whether the snap shot photos meet expectations, and if not, the photos need to be shot again, however, the best shooting time is likely to be missed currently.
If the user wants to immediately see if the snapshot is as expected, he needs to go back to the main interface in the manner shown in fig. 4 (a). That is, the user can slide upwards from the bottom of the electronic device with a finger, and control the interface displayed on the display screen of the electronic device to be the main interface, and the schematic diagram of the main interface can refer to the schematic diagram of the interface in (b) in fig. 4.
Then, entering a gallery of the electronic device in the manner shown in (b) of fig. 4; in the case where the main interface of the electronic device includes a gallery application, the user may click on the gallery application to control the interface displayed on the display screen of the electronic device to be a preview interface in the gallery, and the schematic diagram of the preview interface may refer to the schematic diagram of the interface in (c) in fig. 4.
It is checked whether the snap shots are in line with expectations in the manner shown in fig. 4 (c). In the case that the preview interface displayed by the electronic device displays the thumbnails in reverse order of time, the first thumbnail (thumbnail of image 1) in the preview interface is the latest snapshot. The user can click on the thumbnail of image 1 to see if the snapshot is as expected.
The processes shown in (a) to (c) of fig. 4 interrupt the current video photographing process. Of course, the manner shown in fig. 4 (a) to 4 (c) is merely for example, and does not limit the present application.
In the case where the snap shot is not in line with expectations, the user needs to re-take the photo to obtain a photo that can be in line with expectations.
In case the snap shot is in line with expectations, it is not possible to continue the video shooting on the basis of the interrupted video, as the current video shooting process has been interrupted. If the video still needs to be continuously shot, the next video shooting needs to be performed again.
The purpose of the snapshot function during video shooting is to capture photos without interrupting the video shooting process. With the workflow described above, the user experience of this function is poor and the function cannot be brought into full play.
In view of this, the embodiment of the present application provides a shooting method, which can provide a way for a user to check whether a snapshot taken in a video shooting process meets the expectations of the user.
As one example, where the display screen of the electronic device displays the video capture interface shown in fig. 3, the user clicks on the snapshot control, the electronic device takes a snapshot, and a hover interface is displayed over the video capture interface shown in fig. 3.
In the embodiment of the application, the user operation that triggers the electronic device to obtain the snapped photo (which can be denoted as the first image) is denoted as the first operation, for example, the user tapping the snapshot control as described above. In the embodiment of the present application, an interface displayed above another interface is referred to as a hover interface. For convenience of description, the hover interface triggered and displayed by the snapshot control is denoted as hover interface A, which may also be denoted as the first hover interface.
Referring to fig. 5 (a), which illustrates the interface after the user taps the snapshot control on the video shooting interface shown in fig. 3, hover interface A displays a thumbnail of the snapped photo. The user may preview this thumbnail to determine whether the effect of the snapped photo meets expectations. In fig. 5 (a), hover interface A displays the thumbnail of a single snapped photo.
As described above, after the user taps the snapshot control, the snapped photo corresponding to that operation may not be as expected. Therefore, the embodiment of the present application may also employ the display method shown in fig. 5 (b): hover interface A may display a preset number of thumbnails of snapped photos, for example 2, 3, or 4.
In this embodiment, after the electronic device detects that the user clicks the snapshot control, the time when the user clicks the snapshot control may be recorded. And taking a preset number of video frames which are related to the time in the video frames collected by the camera as snap shots to generate a thumbnail and an image file. The thumbnail is for display at this hover interface a and the image file is stored in the memory of the electronic device.
As an example, when thumbnails of 3 snapped photos are displayed, the video frames corresponding to the three thumbnails may be: the video frame corresponding to the video picture displayed on the video shooting interface of the display screen at that time, together with the frame immediately before it and the frame immediately after it.
In practical applications, the video frames corresponding to the three thumbnails may also be: any three video frames displayed on the display screen within a window extending from some period before that time to some period after it.
Of course, the snapshot effect presented by the 3 video frames may be nearly identical, slightly different, or quite different.
As another example, it may also be set to: after the electronic equipment detects that the user clicks the snapshot control, the time of detecting that the user clicks the snapshot control can be recorded. Taking a preset number of video frames which are related to the time in the video frames collected by the camera as snap shots to generate thumbnail images; the thumbnail is for display at this hover interface a.
Referring to fig. 5 (b), the user may select, for example by tapping, one or more thumbnails that meet expectations from the thumbnails of the multiple video frames displayed in hover interface A. After the electronic device detects that the user has selected the thumbnail of any video frame in hover interface A, it generates an image file from the video frame corresponding to the selected thumbnail and stores the image file in the memory of the electronic device.
The embodiment of the application does not limit the display mode of hover interface A. When the display method shown in fig. 5 (b) is adopted, whether image files are generated and stored for all the displayed video frames or only for the video frames selected by the user is also not limited.
As can be appreciated from fig. 5, the hover interface a for displaying a thumbnail of a snap shot photo may obscure a partial area of the captured video frame. In order not to affect the user's view of the captured video, the hover interface a may be configured to close after being displayed for a period of time (e.g., 2s, 3s, 4s, etc.). The time that the hover interface a is displayed may be noted as a first preset duration.
As another example, after hover interface A has been displayed for 2 s it starts to fade until it disappears, so that hover interface A is displayed for 3 s in total from beginning to end.
As another example, referring to fig. 6, a close control may be provided on hover interface A, which closes hover interface A after receiving a tap from the user. In the embodiment shown in fig. 6, the display time of hover interface A is controlled by the user, so the user can close hover interface A promptly once it is clear that the snapped photo meets expectations, or can take enough time to examine the thumbnail when careful confirmation is needed.
In the embodiment of the present application, the user operation of triggering the closing of the hover interface a is referred to as a second operation, for example, the operation of clicking the closing control in the hover interface a by the user as described above.
As another example, the user may also adjust the position of the hover interface a. Referring to (a) of fig. 7, the user can drag the hover interface a to an arbitrary position on the video capturing interface by way of dragging.
The user may also adjust the size of hover interface A. Referring to fig. 7 (b), the user presses two fingers against the border of hover interface A and then stretches (or pinches) with the two fingers to enlarge (or shrink) hover interface A.
The user may also zoom in or out on the snap shots displayed in the hover interface a. Referring to (c) of fig. 7, the user controls the enlargement (or reduction) of the snap shot picture in the hovering interface a by pressing two fingers inside the hovering interface a and then by opening (or pinching) the fingers.
The ways in which the user operates hover interface A with fingers in fig. 7 are merely examples and do not limit the present application.
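The drag and pinch behaviour described for fig. 7 could be wired up roughly as follows, assuming hover interface A is an ordinary View placed in an overlay container that allows free translation; this is a sketch under those assumptions, not the patent's implementation.

```kotlin
import android.annotation.SuppressLint
import android.content.Context
import android.view.MotionEvent
import android.view.ScaleGestureDetector
import android.view.View

@SuppressLint("ClickableViewAccessibility")
fun enableHoverGestures(context: Context, hoverView: View) {
    val scaleDetector = ScaleGestureDetector(context,
        object : ScaleGestureDetector.SimpleOnScaleGestureListener() {
            override fun onScale(detector: ScaleGestureDetector): Boolean {
                // Two-finger stretch/pinch: resize the floating interface.
                hoverView.scaleX *= detector.scaleFactor
                hoverView.scaleY *= detector.scaleFactor
                return true
            }
        })
    var lastX = 0f
    var lastY = 0f
    hoverView.setOnTouchListener { v, event ->
        scaleDetector.onTouchEvent(event)
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> { lastX = event.rawX; lastY = event.rawY }
            MotionEvent.ACTION_MOVE -> {
                // Single-finger drag: move the floating interface anywhere on the capture UI.
                v.translationX += event.rawX - lastX
                v.translationY += event.rawY - lastY
                lastX = event.rawX; lastY = event.rawY
            }
        }
        true
    }
}
```

Zooming the photo inside the hover interface, as in fig. 7 (c), would use the same ScaleGestureDetector pattern applied to the inner image view instead of the hover container.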
In addition, the user may take multiple photographs during the video capture process. The user may also view other photographs taken in another way.
As another example, after the above-mentioned hover interface a is closed, a hover button is displayed above a video capturing interface displayed by a display screen of the electronic device. The hover button is used to present a plurality of snap shots to a user after the user clicks.
Referring to fig. 8, after the hover interface a for displaying the thumbnail of the snap photo displayed above the video capturing interface disappears, a hover button is displayed above the video capturing interface.
As previously described, the hover button is displayed after hover interface a is closed. For example, the hover button is displayed after a certain time has elapsed, or the hover button is displayed after the user clicks a close control on hover interface a to trigger the hover interface a to close.
In addition, to make the other interfaces and controls easier to see, the captured video picture is no longer drawn in the video shooting interfaces of the embodiments shown in fig. 8 to 12.
The user clicks the hover button, and the hover button on the video capture interface disappears, while another hover interface is displayed, which is used to display information (e.g., thumbnail images) of the photographs currently captured during video capture in the form of a list (in practice, other display modes are also possible).
Referring to fig. 9, after the user clicks the hover button, another hover interface is displayed over the video capture interface. The hover interface shown in fig. 9 may display a photograph taken during the current video capture. For convenience of description, the hover interface displayed by triggering the hover button may be referred to as hover interface B, or may be referred to as a second hover interface. The embodiment of the application records the user operation for triggering the electronic equipment to display the suspension interface B as a third operation. For example, the user clicks the hover button as described above.
For example, if 3 photographs are snap shot during video shooting, thumbnail images of the 3 snap shot photographs may be displayed in the hover interface B illustrated in fig. 9.
Of course, since the area of the hover interface B shown in fig. 9 is limited and each snap shot is displayed in the hover interface B with a fixed size, it is possible that the hover interface B cannot completely display each snap shot during the current video capturing.
For example, in the hover interface B shown in fig. 10, thumbnail images of 6 snap shots are displayed in full, and thumbnail images of 2 snap shots are displayed in part. If more than 8 photographs are snap shot during the current video capture, at least 1 thumbnail of the snap shot photographs may be hidden.
When the user needs to view a snapshot that is only partially displayed or hidden on hover interface B, a sliding gesture inside hover interface B can trigger it to display the other snapshots.
For example, in the hover interface B illustrated in fig. 11, the user slides inside hover interface B to make the snapshots scroll. As shown in fig. 11, at least 10 photos were snapped during the current video shooting. After the user's sliding gesture, hover interface B currently hides snapshot 1 and snapshot 2; partially displays snapshot 3, snapshot 4, snapshot 9 and snapshot 10; and fully displays snapshots 5 to 8. As the sliding gesture proceeds, the snapshots inside hover interface B slide along with it, so which snapshots are fully displayed, partially displayed, or hidden changes as the user slides. The embodiments of the present application do not illustrate every case.
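One plausible way to build this scrollable thumbnail strip is an androidx RecyclerView (dependency assumed); the adapter below shows one thumbnail per cell, and partially visible or hidden thumbnails scroll into view as the user slides.

```kotlin
import android.graphics.Bitmap
import android.view.ViewGroup
import android.widget.ImageView
import androidx.recyclerview.widget.RecyclerView

class SnapshotAdapter(private val thumbnails: List<Bitmap>) :
    RecyclerView.Adapter<SnapshotAdapter.Holder>() {

    class Holder(val image: ImageView) : RecyclerView.ViewHolder(image)

    override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): Holder {
        // Each cell of hover interface B is a fixed-size thumbnail (size is illustrative).
        val image = ImageView(parent.context).apply {
            layoutParams = ViewGroup.LayoutParams(240, 240)
        }
        return Holder(image)
    }

    override fun onBindViewHolder(holder: Holder, position: Int) {
        holder.image.setImageBitmap(thumbnails[position])
    }

    override fun getItemCount(): Int = thumbnails.size
}
```

Attached to a RecyclerView with a LinearLayoutManager (or GridLayoutManager), the sliding-gesture handling described above comes for free from the list widget.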
In practical applications, if the user triggers the display of hover interface B shown in fig. 9 through the hover button shown in fig. 8, this indicates that the user wants to preview the photos snapped during the current video shooting. That is, the user's attention may be focused on the snapshots displayed in hover interface B, possibly for a relatively long time. In order not to affect the video shooting, the current video shooting process may be paused when the user taps the hover button shown in fig. 8 to trigger the display screen of the electronic device to display the hover interface B shown in fig. 9.
Referring to the interfaces shown in fig. 9 to 11, in the scenes shown in fig. 9 to 11, the current video photographing process is in a pause state, a pause icon is displayed in front of the photographing progress displayed by the video photographing interface, and a pause/continue control in the video photographing interface displays the photographing icon.
In case the user clicks the pause/continue control (displayed as a shooting icon) or an area of the current video shooting interface other than the hover interface B and not provided with other controls, the current video shooting process is switched from the pause state to the shooting state while the hover interface B is closed. In the embodiment of the present application, the operation for triggering the closing of the hover interface B (while continuing to capture the currently paused video capture) is denoted as a fourth operation. The pause/resume control is denoted as a first control.
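A sketch tying the third and fourth operations to the recording state, assuming a MediaRecorder-based pipeline (pause() and resume() require API 24+); hoverB stands in for the second hover interface.

```kotlin
import android.media.MediaRecorder
import android.view.View

class HoverBController(private val recorder: MediaRecorder, private val hoverB: View) {

    // Third operation: the hover button was tapped -> show the snapshot list and pause.
    fun open() {
        hoverB.visibility = View.VISIBLE
        recorder.pause()   // current video shooting is suspended while the user browses
    }

    // Fourth operation: tap on the pause/continue control or on an empty area outside
    // hover interface B -> close the list and resume the paused shooting.
    fun close() {
        hoverB.visibility = View.GONE
        recorder.resume()
    }
}
```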
It should be noted that, if the current video shooting process is paused when the hover button shown in fig. 8 triggers the display of hover interface B, the status icon shown in front of the shooting progress in the video shooting interfaces of fig. 9 to 11 is a pause icon and the pause/continue control displays the shooting icon; the user taps the shooting icon to resume the current video shooting process and at the same time close the displayed hover interface B.
If the current video shooting process is not paused when the hover button shown in fig. 8 triggers the display of hover interface B, reference may be made to the embodiment shown in fig. 12, in which the status icon shown in front of the shooting progress in the video shooting interface is a shooting icon and the pause/continue control displays a pause icon; when the user wants to pause the current video shooting process, the user taps the pause icon.
In this case, the user needs to trigger the currently displayed hover interface B to close in another way. For example, in fig. 12 a close control is provided on hover interface B, and the user taps it to close hover interface B; the user may also close hover interface B through a preset gesture. The embodiments of the present application are not limited in this regard.
It should be further noted that, after the hover interface a corresponding to the first snapshot in the video capturing process is closed, the hover button shown in fig. 8 starts to be displayed on the video capturing interface. After the user clicks the hover button to trigger the hover interface B to be displayed, the hover button shown in fig. 8 is no longer displayed on the video capturing interface. After the hover interface B is closed, the video capture interface continues to display the hover button shown in fig. 8.
After describing the respective scene interface diagrams of the shooting process, technical details for realizing the respective application scenes will be described below for a clearer understanding of the respective application scenes.
Referring to fig. 13, a technical architecture diagram on which a photographing method according to an embodiment of the present application depends is provided. As shown in fig. 13, the technical architecture includes: an application layer, an application framework layer and a hardware abstraction layer.
Only some of the layers associated with embodiments of the present application are shown in fig. 13, and other layers, e.g., a system runtime layer, a kernel layer, may be included in addition to the above layers. The embodiments of the present application are not illustrated one by one.
In addition, the embodiment of the present application describes some functions of each module in the above layers. The functions described are those related to the embodiment of the present application; this does not mean that each module provides only these functions.
The application layer has camera applications for implementing video capture and image (photo) capture as well as other functions in the capture process.
The camera application is provided with a user interface module, which can provide the display interfaces in the above embodiments.
The camera application is also provided with a photographing module which can realize an image (photo) photographing function.
The camera application is also provided with a video recording module, and the video recording module can realize a video shooting function.
The camera application is also provided with a data processing module, which is used for providing support for the pictures displayed by the interfaces in the scene, for example, the pictures displayed by the interfaces can be drawn based on the data transmitted from the lower layer.
The application framework layer presents a media framework that can enable data interaction between camera applications in the upper layer (application layer) and various modules in the lower layer (hardware abstraction layer).
The media frame is provided with an audio/video encoding/decoding module which can encode and process video data streams transmitted by a lower layer (hardware abstraction layer).
The media framework is also provided with a camera service (camera service) that can encapsulate video requests transmitted by upper layer applications. The encapsulated video request is transmitted to the lower layer (hardware abstraction layer).
The media framework is also provided with a media database for temporarily storing the storage address of the snap shot photos.
The hardware abstraction layer can encapsulate the drivers of some cameras to interact with the camera hardware at the bottom layer, so that the camera is called to realize the video shooting function and the image shooting function.
The hardware abstraction layer is provided with a Camera module, the Camera module is used for defining a universal standard interface, and Camera service realizes communication with Camera hardware at the bottom layer based on the standard interface provided by the Camera module.
The hardware abstraction layer is also provided with graphics classes that provide a method for drawing objects (frames of a hover interface, thumbnail images of snap shots, etc.) to a display device.
A timing interaction diagram between the various layers in the above-described technical architecture will be described below with reference to fig. 14.
And step A1, clicking a video recording control on a display screen of the electronic equipment by a user to start a video recording function of the electronic equipment.
And step A2, after the camera application of the electronic equipment receives a video recording instruction corresponding to the operation of clicking the video recording control by a user, sending a video recording request to the media framework.
And step A3, after the media framework of the electronic equipment receives the video recording request, packaging the video recording request to obtain the packaged video recording request.
And step A4, the media framework of the electronic equipment sends the encapsulated video request to a hardware abstraction layer.
And step A5, the hardware abstraction layer of the electronic equipment calls the camera at the bottom layer to start to collect the video data stream, and the video data stream collected by the camera is returned to the hardware abstraction layer.
And step A6, the hardware abstraction layer of the electronic equipment receives the video data stream collected by the bottom camera and sends the received video data stream to the media frame.
And step A7, the media framework of the electronic equipment encodes the received video data stream.
Step A8, the media framework of the electronic device sends the encoded video data stream to the camera application.
And step A9, after the camera application of the electronic equipment receives the video data stream after the coding processing, drawing a video picture based on the video data stream after the coding processing and displaying the video picture through a display screen of the electronic equipment.
When the display screen of the electronic device displays the video picture, the video picture is displayed within the video shooting interface.
In addition, after step A5, the camera continues to collect the video data stream, and accordingly, steps A6 to A9 are also continuously performed. As time increases, the camera application continues to draw the video data stream received in real time into video frames and send the video frames to the display screen for display, thereby realizing the video shooting process.
Step B1 is performed when the user wants to take a snapshot during video capturing.
And step B1, clicking a snapshot control in a video shooting interface displayed on a display screen of the electronic equipment by a user.
And B2, after the camera application of the electronic equipment receives a snapshot instruction corresponding to the operation of clicking the snapshot control by a user, the camera application generates a thumbnail from a video frame corresponding to a video picture currently displayed on the video shooting interface.
It should be noted that the content of several consecutive video frames may differ very little; in particular, when the video frame rate is high, consecutive frames may be almost identical. At the same time, the picture in some video frames may be blurred simply because people, animals or objects in the shot scene are moving.
In addition, the instant the user wants to snap and the time the user clicks the snap control are not exactly the same. Therefore, when the thumbnail is generated, the time when the camera application of the electronic device receives the snapshot instruction or the time when the touch screen detects the user operation triggering the snapshot photo can be recorded first. As an example, this time may be noted as a first time.
The video frame used to generate the thumbnail may be: the video frame corresponding to the video picture displayed by the video shooting interface on the display screen of the electronic device at the first time.
The video frame used to generate the thumbnail may also be: one of the n frames before or the n frames after the video frame corresponding to the video picture displayed by the video shooting interface at the first time, where n is a positive integer greater than or equal to 1. For example, n may be 1, 2, 3, 4 or 5.
The video frame used to generate the thumbnail may also be: the clearest frame among the n frames before, the current frame, and the n frames after the video frame corresponding to the video picture displayed by the video shooting interface at the first time. As an example, a detection model that measures image sharpness may be used: the aforementioned "previous n frames, current frame and following n frames" are input to the detection model, and the video frame with the highest output sharpness is used as the video frame for generating the thumbnail. A minimal sketch of this selection is shown below.
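The following Kotlin sketch illustrates the third option, assuming grayscale frames of at least 3×3 pixels and using the variance of a Laplacian response as a stand-in sharpness score. The embodiment itself only requires some detection model that outputs sharpness, so the scoring function here is an assumption.

```kotlin
// Variance of a simple Laplacian response: a common classical proxy for sharpness.
fun laplacianVariance(gray: Array<IntArray>): Double {
    val h = gray.size
    val w = gray[0].size
    val responses = mutableListOf<Double>()
    for (y in 1 until h - 1) {
        for (x in 1 until w - 1) {
            val lap = 4 * gray[y][x] - gray[y - 1][x] - gray[y + 1][x] -
                      gray[y][x - 1] - gray[y][x + 1]
            responses.add(lap.toDouble())
        }
    }
    val mean = responses.average()
    return responses.sumOf { (it - mean) * (it - mean) } / responses.size
}

// candidates = previous n frames + current frame + next n frames (2n + 1 frames in total).
fun pickSharpestFrame(candidates: List<Array<IntArray>>): Array<IntArray> =
    candidates.maxByOrNull { laplacianVariance(it) } ?: candidates.first()
```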
Step B3: the camera application of the electronic device instructs the display screen of the electronic device to display hover interface A, and the picture displayed in hover interface A is the generated thumbnail.
The thumbnail may be temporarily stored in a section of memory space (for example, 3 MB of memory space may be pre-allocated) that is used to temporarily hold the thumbnail of the snapshot photo displayed in hover interface A. A sketch of such a temporary slot is given below.
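A minimal sketch of that temporary slot, under the assumptions that the thumbnail is held as an encoded byte array and that 3 MB is the reserved capacity; the class itself is illustrative, not part of the embodiment.

```kotlin
// Holds only the thumbnail currently shown in hover interface A (step B3) and
// is cleared when the interface closes (step B6), freeing the space for the next snapshot.
object HoverThumbnailSlot {
    private const val CAPACITY_BYTES = 3 * 1024 * 1024   // the 3 MB example from the text
    private var thumbnail: ByteArray? = null

    fun put(encodedThumbnail: ByteArray): Boolean {
        if (encodedThumbnail.size > CAPACITY_BYTES) return false  // would not fit the reserved space
        thumbnail = encodedThumbnail
        return true
    }

    fun get(): ByteArray? = thumbnail

    fun clear() { thumbnail = null }   // called after hover interface A disappears
}
```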
Step B4: the camera application of the electronic device generates an image file from the video frame used to generate the thumbnail and stores the image file.
In this step, the image file may be stored in the gallery of the electronic device. After the user exits the current video shooting process, the image file corresponding to the snapshot photo can then be viewed in the gallery of the electronic device.
Step B5: the camera application of the electronic device stores the storage address (and possibly other information) of the image file, together with an index identifier, in the media database of the media framework.
The index identifier is stored in the media database in association with the storage address of the image file, that is, each storage address is stored together with one index identifier. The index identifiers associated with the storage addresses of several image files may be identical. The image files corresponding to an index identifier can then be retrieved from the media database based on that index identifier. If the storage addresses of the image files corresponding to all the snapshot photos taken in the current video shooting process share the same index identifier, all of those storage addresses can be obtained based on that single index identifier.
In view of the above, the index identifier may be associated with the current video shooting. As an example, each time a video is shot, a video identifier for the video to be shot may be generated before shooting starts, for example by combining the current timestamp with a random number. A video identifier is generated for every video that is shot, and each generated video identifier is unique. The index identifier may be this video identifier generated when the video is shot. A sketch of such an identifier is given below.
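A minimal sketch of generating such an identifier; the exact format (timestamp plus a six-digit random number) is assumed for illustration only.

```kotlin
import kotlin.random.Random

// One identifier per recording session, built from the current timestamp plus a
// random number as the text suggests; used as the index identifier in step B5.
fun newVideoIdentifier(): String {
    val timestamp = System.currentTimeMillis()
    val salt = Random.nextInt(100_000, 999_999)
    return "${timestamp}_$salt"   // e.g. "1726300800000_483920"
}
```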
It should be noted that steps B2 and B3 are used to generate and display the thumbnail of the snapshot photo, so that the user can preview it immediately. Steps B4 and B5 are used to generate and store the snapshot photo as an image file; because the snapshot photo has already been written to the memory of the electronic device, the user can later preview all the photos snapped during the current video shooting process whenever needed.
As an example, when the user previews the snapshot photos taken during video shooting by clicking the hover button described above, the storage addresses of the image files can be obtained from the media database, the image files can be read from those storage addresses, thumbnails of the image files can be generated, and the thumbnails can be displayed on hover interface B, whose display is triggered by the hover button.
Of course, after the current video shooting is finished, the user can also view the snapshot photos by entering the gallery, where the thumbnails of the image files can be previewed in the gallery's preview interface.
In practical applications, the camera application may execute step B4 after executing step B3. Alternatively, the camera application may start a sub-thread after receiving the snapshot instruction, the sub-thread being used to perform steps B4 and B5; that is, steps B2 to B3 and steps B4 to B5 are performed in parallel by two threads.
In step B5, the storage address of the image file corresponding to the snapshot photo is stored in the media database. In practical applications, the media database may also store other information about that image file, for example its size and the snapshot time. An illustrative record layout is sketched below.
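An illustrative record layout and lookup for the media database; the field names and the in-memory backing are assumptions made for the sketch, since the patent does not specify the real schema.

```kotlin
// What step B5 stores per snapshot: the image file's storage address, the
// session's index identifier, and the optional fields mentioned in the text.
data class SnapshotRecord(
    val storageAddress: String,    // where the image file was saved in step B4
    val indexId: String,           // video identifier of the current recording
    val sizeBytes: Long,
    val snapshotTimeMillis: Long,
)

class MediaDatabase {
    private val records = mutableListOf<SnapshotRecord>()

    fun insert(record: SnapshotRecord) { records += record }            // step B5

    // Steps C2-C4: find every snapshot belonging to the current recording.
    fun addressesForIndex(indexId: String): List<String> =
        records.filter { it.indexId == indexId }.map { it.storageAddress }

    // Used after shooting ends to clear the session's entries (the image files stay).
    fun removeByIndex(indexId: String) { records.removeAll { it.indexId == indexId } }
}
```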
Step B6: after the thumbnail has been displayed on hover interface A for 3 s, the camera application of the electronic device closes hover interface A and displays the hover button.
As described above, while hover interface A is displaying the thumbnail, its transparency may slowly increase until it becomes completely transparent and disappears, after which the hover button is displayed on the video shooting interface. In the embodiment of the application, transparency may take a value between 0 and 1: when hover interface A is first displayed over the video shooting picture its transparency is 0, and it then gradually increases to 1. When the transparency reaches 1, hover interface A disappears and the hover button is displayed on the video shooting interface. This is only one implementation; the closing of hover interface A may also be implemented in other ways, such as the sketch below.
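A hedged sketch of one such alternative, assuming hover interface A and the hover button are backed by Android Views and using the platform's alpha animation. Note that Android's alpha counts opacity the opposite way from the transparency value described above, so the animation drives alpha from 1 down to 0.

```kotlin
import android.view.View

// Fade hover interface A out, then reveal the hover button on the video shooting interface.
fun fadeOutHoverInterfaceA(hoverA: View, hoverButton: View, durationMillis: Long = 300L) {
    hoverA.animate()
        .alpha(0f)                      // transparency gradually increases until fully transparent
        .setDuration(durationMillis)
        .withEndAction {
            hoverA.visibility = View.GONE          // hover interface A disappears
            hoverButton.visibility = View.VISIBLE  // the hover button is then shown
        }
        .start()
}
```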
The hover button provides the user with an entrance for viewing the photos snapped during the current video shooting process. Therefore, whether hover interface A is closed by the user clicking its close control or closes by itself after being displayed for a certain time, the electronic device displays the hover button so that this entrance is always available.
As described in step B3, the thumbnail of the snapshot photo may be temporarily stored in a section of memory space. After step B6, the thumbnail stored in that memory space is cleared, so that the space is free to hold the thumbnail of the next snapshot photo.
It should be noted that, during video shooting, steps B1 to B6 need to be executed for the 1st snapshot. On the one hand, the thumbnail of the 1st snapshot photo needs to be displayed so that the user can preview it. On the other hand, the storage address of the image file of the 1st snapshot photo needs to be stored in the media database in association with the index identifier, so that the user can later preview, based on that index identifier, all the photos snapped during the current video shooting.
Steps B1 to B6 likewise need to be executed for the i-th snapshot (i being a positive integer greater than or equal to 2). On the one hand, the thumbnail of the i-th snapshot photo needs to be displayed so that the user can preview it. On the other hand, the storage address and index identifier of the image file of the i-th snapshot photo need to be stored in the media database (the index identifier may be the same as the one stored with the 1st image file), so that the user can preview, based on the index identifier, all the photos snapped during the current video shooting.
As an example, in the embodiment shown in fig. 14, if the user clicks the snapshot control again (for example, the 2nd time) after step B6 shown in fig. 14, the electronic device repeats steps B1 to B6 after step B6. The photo snapped by the user the 2nd time is displayed in the same manner (hover interface A is displayed and the hover button disappears). After the thumbnail of the 2nd snapshot photo has been displayed on hover interface A, hover interface A is closed (either by itself after the preset time or by the user clicking the close control) and the hover button continues to be displayed.
As another example, in the embodiment shown in fig. 14, if the user clicks the snapshot control again (for example, the 2nd time) between step B3 and step B6, that is, before hover interface A disappears, the electronic device may make hover interface A stop displaying the thumbnail of the 1st snapshot photo and display the thumbnail of the 2nd snapshot photo instead. After the thumbnail of the 2nd snapshot photo has been displayed on hover interface A, hover interface A is closed (either by itself after the preset time or by the user clicking the close control) and the hover button is displayed.
After the user performs step B1, the electronic device still continuously performs step A5 (receiving the video data stream) to step A9 while steps B2 to B6 are being executed; that is, video pictures are still continuously displayed in the video shooting interface on the display screen of the electronic device.
If the user snaps multiple photos during the current video shooting process, the image files corresponding to those photos are stored in the gallery of the electronic device, and correspondingly their storage addresses are stored in the media database.
After each snapshot, the hover button is redisplayed once hover interface A (which displays the single thumbnail of the current snapshot) disappears from the video shooting interface.
If the user wants to view all the photos snapped in the current video shooting process (or the photos snapped before the current one), the user may execute step C1.
Step C1: the user clicks the hover button on the video shooting interface displayed on the display screen of the electronic device.
Step C2: after the camera application of the electronic device receives the query instruction corresponding to the user's operation of clicking the hover button, the camera application obtains the index identifier (the video identifier of the video currently being shot).
Step C3: the camera application sends the index identifier to the media framework.
Step C4: based on the index identifier, the media framework obtains from the media database the storage addresses of the image files corresponding to all the photos snapped during the current video shooting.
Step C5: the media framework returns the retrieved storage addresses of the image files to the camera application.
Step C6: the camera application obtains, based on the received storage addresses, the image files corresponding to all the photos snapped during the current video shooting, and displays the thumbnails of all (or part) of the snapshot photos in a list through hover interface B. After hover interface B is displayed, the hover button above the video shooting interface disappears. A sketch of this query flow is given below.
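A sketch of steps C2 to C6 built on the illustrative MediaDatabase above; the image-loading and thumbnail-scaling callbacks are placeholders for whatever the gallery and image pipeline actually provide.

```kotlin
// C2: the camera application holds the index identifier of the current recording.
// C3-C5: it asks the (illustrative) MediaDatabase for every stored address.
// C6: it loads each image file and builds the thumbnail list for hover interface B.
fun buildHoverInterfaceBThumbnails(
    db: MediaDatabase,
    currentVideoIndexId: String,
    loadImage: (String) -> ByteArray,         // reads the image file at a storage address
    makeThumbnail: (ByteArray) -> ByteArray,  // scales the image down for the list
): List<ByteArray> =
    db.addressesForIndex(currentVideoIndexId)
        .map { address -> makeThumbnail(loadImage(address)) }
```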
In addition, after the user executes step C1, in application scenarios where the video shooting process needs to be paused, the camera application needs to send a pause instruction to the lower layers so that the camera suspends collection. Reference may be made to steps D2 to D5.
Step D2: after the camera application of the electronic device receives the query instruction corresponding to the user's operation of clicking the hover button, the camera application sends a pause instruction to the media framework.
Step D3: after receiving the pause instruction, the media framework of the electronic device encapsulates it to obtain an encapsulated pause instruction.
Step D4: the media framework of the electronic device sends the encapsulated pause instruction to the hardware abstraction layer.
Step D5: the hardware abstraction layer of the electronic device instructs the underlying camera to stop collecting the video data stream.
Steps D2 to D4 are similar to steps A2 to A4: steps A2 to A4 transmit a collection request, in response to which the camera starts collecting the video data stream and returns it, while steps D2 to D5 transmit an instruction to pause collection, in response to which the camera stops collecting the video data stream. The sketch below illustrates this symmetry.
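A sketch of that symmetry, reducing the media framework's "encapsulation" to tagging a command before it is forwarded to the hardware abstraction layer; the command names are assumptions.

```kotlin
// The same request path carries either a start-recording request (A2-A4) or a
// pause-capture instruction (D2-D4); only the command type changes.
sealed class CameraCommand {
    object StartRecording : CameraCommand()   // camera begins returning the video data stream
    object PauseCapture : CameraCommand()     // camera stops collecting the video data stream
}

fun dispatchToHal(command: CameraCommand, hal: (String) -> Unit) {
    val encapsulated = when (command) {
        CameraCommand.StartRecording -> "CMD_START_RECORDING"
        CameraCommand.PauseCapture -> "CMD_PAUSE_CAPTURE"
    }
    hal(encapsulated)   // the hardware abstraction layer acts on the underlying camera
}
```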
Of course, after the user has finished viewing the thumbnails of the snapshot photos through hover interface B, the user may continue the current video shooting process.
Step E1: the user clicks an area of the video shooting interface outside hover interface B (or clicks the pause/continue control of the currently displayed shooting icon).
Step E2: after the camera application of the electronic device receives the continue-recording instruction corresponding to the user's operation of clicking the area outside hover interface B, it sends a continue-recording request to the media framework.
For steps E3 to E9, reference may be made to the related descriptions of steps A3 to A9, which are not repeated here.
Of course, in step E9, while the video picture continues to be drawn and displayed, the hover button may be displayed over the video shooting interface.
As another example, in the embodiment shown in fig. 14, if the user clicks the snapshot control again (for example, the 2nd time) between step C6 and step E9, that is, while hover interface B is displayed and before it disappears, the electronic device may skip steps B1 to B6: at this moment the camera in the electronic device is not collecting but paused, so there is no need to generate a snapshot photo.
As another example, in the embodiment shown in fig. 14, after the hover button is displayed in step E9, if the user clicks the snapshot control again (for example, the 2nd time), the electronic device repeats steps B1 to B6 after step E9 shown in fig. 14. The photo snapped by the user the 2nd time is displayed in the same manner (hover interface A is displayed and the hover button disappears). After the thumbnail of the 2nd snapshot photo has been displayed on hover interface A, hover interface A is closed (either by itself after the preset time or by the user clicking the close control) and the hover button is displayed.
As another example, in the embodiment shown in fig. 14, after the hover button is displayed in step E9, if the user clicks the hover button again (for example, the 2nd time), the electronic device repeats steps C1 to C6 after step E9 shown in fig. 14 to display the thumbnails of all the photos snapped during the current video shooting (hover interface B is displayed and the hover button disappears).
In summary, the snapshot control, as a control on the video shooting interface, may be set to be displayed from the start of the current video shooting until the current video shooting ends. The snapshot control is used to trigger hover interface A to be displayed on the video shooting interface so as to show the photo currently snapped via the snapshot control; whether hover interface A is displayed or not does not affect the display of the snapshot control.
The hover button starts to be displayed after the first hover interface A shown during the current video shooting disappears. The hover button is used to trigger hover interface B to be displayed on the video shooting interface so as to show all the photos snapped during the current video shooting. The hover button may be set to disappear after hover interface B is displayed, and of course it may also be set to disappear after hover interface A is displayed. In other words, once the hover button has been displayed for the first time during the current video shooting, it disappears whenever hover interface A or hover interface B is displayed on the video shooting interface, and it is displayed whenever neither hover interface A nor hover interface B is displayed on the video shooting interface. This rule is sketched below.
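A sketch of this visibility rule as a small state holder; the field names are illustrative assumptions.

```kotlin
// Snapshot control: always shown while recording. Hover button: shown only once
// it has appeared for the first time and neither hover interface A nor B is open.
data class OverlayState(
    val hoverAVisible: Boolean,
    val hoverBVisible: Boolean,
    val hoverButtonHasAppeared: Boolean,   // true after the first hover interface A closes
) {
    val snapshotControlVisible: Boolean get() = true
    val hoverButtonVisible: Boolean
        get() = hoverButtonHasAppeared && !hoverAVisible && !hoverBVisible
}
```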
Finally, after the video shooting ends, the storage addresses stored in the media database in association with the video identifier of the video just shot can be cleared together. The image files that those storage addresses point to (the image files generated from the snapshot photos) are not deleted. A sketch of this cleanup is given below.
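A sketch of this cleanup, reusing the illustrative MediaDatabase from earlier; only the database entries are removed.

```kotlin
// When recording ends, drop the session's records from the (illustrative) media
// database while leaving the snapshot image files in the gallery untouched.
fun clearSessionRecords(db: MediaDatabase, finishedVideoIndexId: String) {
    db.removeByIndex(finishedVideoIndexId)   // clears only the database entries
    // Intentionally no file deletion: the snapshot photos remain available in the gallery.
}
```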
The camera application provided by the embodiment of the application can be called by other application programs; for example, when another application program needs to use the camera function, it can call all or part of the functions of the camera application provided by the embodiment of the application.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not limit the implementation of the embodiments of the present application in any way.
The embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor, implements the steps of the above-described method embodiments.
The embodiments of the present application also provide a computer program product which, when run on a first device, enables the first device to carry out the steps of the method embodiments described above.
If the integrated units are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a computer readable storage medium. Based on this understanding, the present application may implement all or part of the flow of the methods of the above embodiments by means of a computer program instructing the related hardware; the computer program may be stored in a computer readable storage medium, and when executed by a processor, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, and so on. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to the first device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, or a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk or an optical disk. In some jurisdictions, in accordance with legislation and patent practice, the computer readable medium may not include electrical carrier signals and telecommunications signals.
The embodiments of the present application also provide a chip system comprising a processor coupled with a memory; the processor executes a computer program stored in the memory to implement the steps of any of the method embodiments of the present application. The chip system may be a single chip or a chip module composed of a plurality of chips.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or illustrated in detail in one embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the elements and method steps of the examples described in connection with the embodiments disclosed herein can be implemented as electronic hardware, or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents, and that such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (16)

1. A photographing method, characterized by being applied to an electronic device, the electronic device including a display screen and a camera, the photographing method comprising:
after the electronic equipment starts video recording, displaying a video shooting interface through the display screen, wherein the video shooting interface is used for displaying video pictures acquired by the camera in real time;
in the process of video recording of the electronic equipment, the electronic equipment captures a first image, wherein the first image is related to a video picture acquired by the camera in real time, and the electronic equipment captures a second image, and the second image is related to the video picture acquired by the camera in real time;
the electronic equipment detects a third operation, and the third operation is used for triggering the electronic equipment to display a second suspension interface;
the electronic equipment displays the second suspension interface on the video shooting interface based on the third operation, wherein the second suspension interface is used for displaying a thumbnail of a snap shot image, the snap shot image is a snap shot image in the current video shooting process, and the snap shot image comprises the first image and the second image.
2. The shooting method as claimed in claim 1, wherein, in the process of video recording of the electronic device, the electronic device capturing a first image, the first image being related to the video picture acquired by the camera in real time, comprises:
In the process of video recording of the electronic equipment, the electronic equipment detects a first operation, and the first operation is used for capturing a first image;
and the electronic equipment displays a first suspension interface on the video shooting interface, wherein the first suspension interface is used for displaying the thumbnail of the first image.
3. A shooting method as claimed in claim 2, wherein a snapshot control is provided on the video shooting interface, and the first operation acts on the snapshot control.
4. A shooting method as claimed in claim 2 or 3, wherein, after the electronic device displays the first hover interface on the video shooting interface, the shooting method further comprises:
and closing the first suspension interface by the electronic equipment under the condition that the first suspension interface displays a first preset time period.
5. A shooting method as claimed in claim 2 or 3, wherein, after the electronic device displays the first hover interface on the video shooting interface, the shooting method further comprises:
the electronic equipment detects a second operation, wherein the second operation is used for closing the first suspension interface;
the electronic device closes the first hover interface based on the second operation.
6. The shooting method of claim 5, wherein a closing control is arranged on the first suspension interface; the second operation acts on the close control.
7. The shooting method of claim 4, wherein, after the electronic device closes the first hover interface, the shooting method further comprises:
the electronic device displays a hover button on the video capture interface, wherein the third operation acts on the hover button.
8. The photographing method according to claim 1 or 7, wherein in the case where the electronic device detects a third operation, the photographing method further comprises:
the electronic device pauses the current video capture based on the third operation.
9. The photographing method of claim 8, wherein, after the electronic device displays the second hover interface on the video shooting interface based on the third operation, the photographing method further comprises:
the electronic device detecting a fourth operation;
the electronic equipment closes the second suspension interface based on the fourth operation;
the electronic device continues the currently paused video capture based on the fourth operation.
10. The shooting method as claimed in claim 9, wherein the fourth operation acts on a first control provided on the video shooting interface or acts on an area of the video shooting interface other than the second suspension interface, where no control is provided, and the first control is used to pause the current video shooting or continue the currently paused video shooting.
11. A photographing method as claimed in claim 2 or 3, wherein in the event that the electronic device detects the first operation, the photographing method further comprises:
the electronic device records a first time at which the first operation is detected;
the electronic equipment generates the first image from a video picture of a next frame of the video picture displayed by the video shooting interface at the first time;
the electronic device generates a thumbnail of the first image.
12. The photographing method of claim 11, wherein, after the first image is generated from the video picture of the frame following the video picture displayed by the video shooting interface at the first time, the photographing method further comprises:
the electronic device stores the first image;
and the electronic equipment sends the storage address of the first image and the index identification of the snap shot image to a media database.
13. The shooting method of claim 12, wherein the electronic device displaying the second hover interface on the video shooting interface based on the third operation comprises:
the electronic equipment acquires an index identifier of the snap-shot image based on the third operation;
the electronic equipment acquires a storage address of the snapshot image in the current video shooting process based on the index identification of the snapshot image;
the electronic equipment acquires the snapshot image based on the acquired storage address;
the electronic equipment generates a thumbnail of the snap-shot image;
and the electronic equipment displays the generated thumbnail of the snap image in the second suspension interface, wherein the thumbnail of the snap image comprises the thumbnail of the first image.
14. A shooting method as claimed in claim 12 or 13, wherein the index identity of the snap shot image is a unique identity of the currently shot video.
15. An electronic device comprising a processor for executing a computer program stored in a memory to cause the electronic device to implement the method of any one of claims 1 to 14.
16. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program which, when run on a processor, implements the method according to any one of claims 1 to 14.
CN202211416840.7A 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium Active CN115866390B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211416840.7A CN115866390B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211416840.7A CN115866390B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium
CN202111077227.2A CN113810608B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202111077227.2A Division CN113810608B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115866390A CN115866390A (en) 2023-03-28
CN115866390B true CN115866390B (en) 2023-11-07

Family

ID=78941013

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202211416840.7A Active CN115866390B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium
CN202111077227.2A Active CN113810608B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202111077227.2A Active CN113810608B (en) 2021-09-14 2021-09-14 Shooting method, electronic equipment and storage medium

Country Status (1)

Country Link
CN (2) CN115866390B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114205531B (en) * 2021-12-23 2024-06-04 北京罗克维尔斯科技有限公司 Intelligent photographing method, device and apparatus for vehicle and storage medium
CN116700577A (en) * 2022-02-28 2023-09-05 荣耀终端有限公司 Video processing method, electronic device and readable storage medium
CN115525188A (en) * 2022-02-28 2022-12-27 荣耀终端有限公司 Shooting method and electronic equipment
CN116700846B (en) * 2022-02-28 2024-04-02 荣耀终端有限公司 Picture display method and related electronic equipment
CN114827342B (en) * 2022-03-15 2023-06-06 荣耀终端有限公司 Video processing method, electronic device and readable medium
CN115328357A (en) * 2022-08-15 2022-11-11 北京达佳互联信息技术有限公司 Captured image processing method and device, electronic device and storage medium
CN116320783B (en) * 2022-09-14 2023-11-14 荣耀终端有限公司 Method for capturing images in video and electronic equipment


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160217328A1 (en) * 2013-09-30 2016-07-28 Danielle YANAI Image and video processing and optimization
CN105827935B (en) * 2015-07-23 2018-10-16 维沃移动通信有限公司 A kind of method and terminal of terminal sectional drawing
CN107580234B (en) * 2017-09-01 2020-06-30 歌尔科技有限公司 Photographing method, display end, camera head end and system in wireless live broadcast
CN111290675B (en) * 2020-03-02 2023-02-17 Oppo广东移动通信有限公司 Screenshot picture sharing method and device, terminal and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105635614A (en) * 2015-12-23 2016-06-01 小米科技有限责任公司 Recording and photographing method, device and terminal electronic equipment
CN105681648A (en) * 2015-12-31 2016-06-15 北京金山安全软件有限公司 Picture viewing method and device and electronic equipment
CN109726179A (en) * 2018-12-29 2019-05-07 努比亚技术有限公司 Screenshot picture processing method, storage medium and mobile terminal
CN109922266A (en) * 2019-03-29 2019-06-21 睿魔智能科技(深圳)有限公司 Grasp shoot method and system, video camera and storage medium applied to video capture

Also Published As

Publication number Publication date
CN113810608A (en) 2021-12-17
CN113810608B (en) 2022-11-25
CN115866390A (en) 2023-03-28

Similar Documents

Publication Publication Date Title
CN115866390B (en) Shooting method, electronic equipment and storage medium
US11995530B2 (en) Systems and methods for providing feedback for artificial intelligence-based image capture devices
CN111654629B (en) Camera switching method and device, electronic equipment and readable storage medium
WO2021052458A1 (en) Machine translation method and electronic device
CN113475092B (en) Video processing method and mobile device
CN112954210B (en) Photographing method and device, electronic equipment and medium
CN113099146B (en) Video generation method and device and related equipment
WO2022262475A1 (en) Image capture method, graphical user interface, and electronic device
WO2024055797A1 (en) Method for capturing images in video, and electronic device
CN115689963B (en) Image processing method and electronic equipment
WO2023134583A1 (en) Video recording method and apparatus, and electronic device
WO2023083132A1 (en) Photographing method and apparatus, and electronic device and readable storage medium
WO2023035921A1 (en) Method for image snapshot in video recording, and electronic device
CN115484403A (en) Video recording method and related device
CN112136309B (en) System and method for performing rewind operations with a mobile image capture device
US11551452B2 (en) Apparatus and method for associating images from two image streams
CN115883958A (en) Portrait shooting method
WO2023036007A1 (en) Method for acquiring image, and electronic device
WO2022206605A1 (en) Method for determining target object, and photographing method and device
CN113794833B (en) Shooting method and device and electronic equipment
WO2024055817A1 (en) Code scanning method and electronic device
WO2023231696A1 (en) Photographing method and related device
CN118102079A (en) Shooting method, shooting device, electronic equipment and storage medium
CN116668857A (en) Method and device for displaying light field photo
CN117692759A (en) Method, terminal, storage medium and chip for generating preview image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant