WO2023160238A1 - Picture display method and related electronic device - Google Patents

Picture display method and related electronic device

Info

Publication number
WO2023160238A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
picture
electronic device
thumbnail
interface
Prior art date
Application number
PCT/CN2022/143658
Other languages
English (en)
French (fr)
Inventor
贾桂卿
赫伽宁
Original Assignee
荣耀终端有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 荣耀终端有限公司
Priority to EP22924557.6A (published as EP4276619A1)
Publication of WO2023160238A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/70 Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F 16/73 Querying
    • G06F 16/738 Presentation of query results
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/4316 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N 21/44008 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H04N 21/4402 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N 21/440245 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display the reformatting operation being performed only on part of the stream, e.g. a region of the image or a time segment
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/47205 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H04N 21/47217 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/85 Assembly of content; Generation of multimedia applications
    • H04N 21/854 Content authoring
    • H04N 21/8549 Creating video summaries, e.g. movie trailer
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • The present application relates to the field of picture display, and in particular to a picture display method and related electronic devices.
  • The application of the video recording function on mobile devices has also developed rapidly.
  • However, photos cannot be taken while a video is being recorded, so after recording is completed the user obtains only the video and not the pictures taken during the recording process, which degrades the user experience.
  • An embodiment of the present application provides a picture display method, which solves the problem that the user cannot browse the pictures corresponding to a video at any time while watching the video.
  • In a first aspect, an embodiment of the present application provides a picture display method applied to an electronic device. The method includes: starting a gallery application; displaying a first interface of the gallery, where the first interface includes a first thumbnail and the first thumbnail is the thumbnail corresponding to a first video; detecting a user's first input operation on the first thumbnail; obtaining, according to a group ID of the first video, thumbnails of N associated pictures associated with the first video, where N is an integer greater than 0, the associated pictures are pictures snapped during the recording of the first video, and the group ID is used to identify the first video; and displaying a video playback interface of the first video, where the video playback interface includes a first display frame and a second display frame, the first display frame is used to display the frames of the first video, and the second display frame is used to display the thumbnails of the N associated pictures.
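For illustration only, the following Kotlin sketch models the interface structure just described with a simple in-memory map from group ID to thumbnails; the type and function names are assumptions of this description, not structures defined by the patent.

```kotlin
// Illustrative only: a simplified model of the video playback interface described above.
data class VideoPlaybackInterface(
    val firstDisplayFrame: String,        // shows the frames of the first video
    val secondDisplayFrame: List<String>  // shows the thumbnails of the N associated pictures
)

fun buildPlaybackInterface(
    groupId: String,                              // identifies the first video
    thumbnailsByGroup: Map<String, List<String>>  // group ID -> thumbnails of the snapped pictures
): VideoPlaybackInterface =
    VideoPlaybackInterface(
        firstDisplayFrame = "video:$groupId",
        secondDisplayFrame = thumbnailsByGroup[groupId].orEmpty()
    )

fun main() {
    val thumbnails = mapOf("group-1" to listOf("thumb-pic2", "thumb-pic3", "thumb-pic4"))
    println(buildPlaybackInterface("group-1", thumbnails))
}
```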
  • In this way, the user can click the thumbnail of a picture associated with the video on the video playback interface; the playback interface is then switched to a picture preview interface in which the picture corresponding to the thumbnail can be browsed.
  • In a possible implementation, the method further includes: reading first information from a media file library, where the first information includes a mapping relationship between the storage path of the first video and the group ID of the first video; and obtaining the group ID of the first video according to the first information.
  • In this way, the group ID of the first video can be acquired from the media file library, and the thumbnails of the associated pictures of the first video can be read according to the group ID.
  • In a possible implementation, obtaining the thumbnails of the N associated pictures associated with the first video according to the group ID of the first video includes: reading second information from a media information library according to the group ID of the first video, where the second information includes a mapping relationship among the group ID of the first video, the storage paths of the N associated pictures, and the thumbnails of the N associated pictures; and, when the storage paths of the N associated pictures are read, obtaining the thumbnails of the N associated pictures associated with the first video.
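As a non-authoritative sketch of the two mappings described in this implementation, the first information and second information could be modeled as plain records; the Kotlin below assumes in-memory lists and illustrative field names rather than the actual media-library schema.

```kotlin
// Minimal sketch of the mappings described above; names and paths are assumptions.
data class FirstInformation(val videoStoragePath: String, val groupId: String)

data class SecondInformation(
    val groupId: String,
    val picturePaths: List<String>,      // storage paths of the N associated pictures
    val pictureThumbnails: List<String>  // thumbnails of the N associated pictures
)

// Media file library: video storage path -> group ID.
fun groupIdOf(videoPath: String, mediaFileLibrary: List<FirstInformation>): String? =
    mediaFileLibrary.firstOrNull { it.videoStoragePath == videoPath }?.groupId

// Media information library: group ID -> thumbnails, returned only when the
// storage paths of the associated pictures can be read.
fun associatedThumbnails(groupId: String, mediaInfoLibrary: List<SecondInformation>): List<String> {
    val info = mediaInfoLibrary.firstOrNull { it.groupId == groupId } ?: return emptyList()
    return if (info.picturePaths.isNotEmpty()) info.pictureThumbnails else emptyList()
}

fun main() {
    val fileLib = listOf(FirstInformation("/videos/video1.mp4", "group-1"))           // hypothetical path
    val infoLib = listOf(SecondInformation("group-1", listOf("/pics/pic2.jpg"), listOf("thumb-pic2")))
    val groupId = groupIdOf("/videos/video1.mp4", fileLib)
    println(groupId?.let { associatedThumbnails(it, infoLib) })
}
```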
  • In this way, the thumbnails of the associated pictures of the first video can be displayed on the video playback interface of the first video, so that the user can browse the associated pictures based on the thumbnails.
  • In a possible implementation, the video playback interface includes a first positioning control, and after the video playback interface of the first video is displayed, the method further includes: detecting a user's second input operation on the first positioning control, where the first positioning control indicates a first target thumbnail and the first target thumbnail is a thumbnail in the second display frame; obtaining, based on the first target thumbnail, the storage path of a first associated picture from the second information, where the second information is information in the media information library and includes the mapping relationship among the group ID of the first video, the storage paths of the N associated pictures, and the thumbnails of the N associated pictures, and the first target thumbnail is the thumbnail of the first associated picture; retrieving the first associated picture according to the storage path of the first associated picture; and displaying a first preview interface, where the first preview interface includes a first picture preview frame and the second display frame, and the first picture preview frame is used to display the first associated picture.
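A hedged Kotlin sketch of this positioning-control flow follows: the indicated target thumbnail is resolved to a storage path through the second information and the corresponding picture is placed in the preview frame. The map-based lookup and all names are assumptions, not the patent's implementation.

```kotlin
// Illustrative preview flow; reading from storage is represented by a placeholder string.
data class FirstPreviewInterface(
    val picturePreviewFrame: String,      // displays the first associated picture
    val secondDisplayFrame: List<String>  // thumbnails remain visible alongside the preview
)

fun openPreview(
    targetThumbnail: String,
    thumbnailToPath: Map<String, String>, // derived from the second information (thumbnail -> storage path)
    allThumbnails: List<String>
): FirstPreviewInterface? {
    val storagePath = thumbnailToPath[targetThumbnail] ?: return null
    val picture = "picture-at:$storagePath" // stands in for retrieving the associated picture
    return FirstPreviewInterface(picturePreviewFrame = picture, secondDisplayFrame = allThumbnails)
}

fun main() {
    val mapping = mapOf("thumb-pic3" to "/pics/pic3.jpg") // hypothetical path
    println(openPreview("thumb-pic3", mapping, listOf("thumb-video1", "thumb-pic3")))
}
```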
  • In a possible implementation, the video playback interface further includes a first control. The first control is displayed when a configuration file exists, and it is used to trigger generation of a second video that is different from the first video. In this way, the user can trigger the electronic device to generate the second video by clicking the first control.
  • In a possible implementation, the method further includes: detecting a user's third input operation on the first control; reading third information according to the group ID of the first video, where the third information includes a mapping relationship between the configuration-file storage path of the first video and the group ID; calling the configuration file of the first video according to the configuration-file storage path of the first video; processing the first video based on the configuration file of the first video to obtain the second video; and displaying a video playback interface of the second video, where the video playback interface of the second video is different from the video playback interface of the first video and includes a first video preview frame, and the first video preview frame is used to display the frames of the second video.
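The following Kotlin sketch illustrates, under the assumption of a simple in-memory mapping from group ID to configuration-file path, how the second video could be derived from the first video's configuration file. The processing step is a placeholder, not the actual template or effects pipeline, and all names and paths are hypothetical.

```kotlin
// Hedged sketch of generating the second video from the first video's configuration file.
data class ConfigMapping(val groupId: String, val configFilePath: String)

fun generateSecondVideo(
    groupId: String,
    configMappings: List<ConfigMapping>,
    firstVideoPath: String
): String? {
    val configPath = configMappings.firstOrNull { it.groupId == groupId }?.configFilePath
        ?: return null // no configuration file: the first control would not be displayed
    // Placeholder for processing the first video based on the configuration file.
    return "second-video(source=$firstVideoPath, config=$configPath)"
}

fun main() {
    val mappings = listOf(ConfigMapping("group-1", "/config/video1.cfg")) // hypothetical path
    println(generateSecondVideo("group-1", mappings, "/videos/video1.mp4"))
}
```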
  • In a possible implementation, the first preview interface includes the second display frame, and the second display frame includes a second target thumbnail. After the first preview interface is displayed, the method further includes: detecting a user's input operation on the second target thumbnail; and displaying the video playback interface of the first video. In this way, the user can switch from the preview interface of an associated picture back to the video playback interface of the first video by clicking the second target thumbnail.
  • In a possible implementation, the maximum number of thumbnails of associated pictures displayed in the second display frame is M, and the second display frame includes a first switching control. After the video playback interface of the first video is displayed, the method further includes: when N is greater than M, detecting a user's input operation on the first switching control; and switching the thumbnails in the second display frame, where the thumbnails displayed in the second display frame before switching are different from the thumbnails displayed in the second display frame after switching. In this way, the thumbnails of the associated pictures shown in the second display frame can be switched.
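A minimal Kotlin sketch of this switching behaviour follows, assuming a simple page-based strategy; the actual switching on the device (for example, scrolling by one thumbnail as in FIG. 2F) may differ, and the function name is illustrative.

```kotlin
// The second display frame shows at most M thumbnails; when N > M, an input on the
// switching control replaces the visible set with a different one.
fun switchThumbnails(
    allThumbnails: List<String>, // N thumbnails of the associated pictures
    maxVisible: Int,             // M, the most the second display frame can show
    currentPage: Int
): List<String> {
    if (allThumbnails.size <= maxVisible) return allThumbnails // N <= M: nothing to switch
    val pageCount = (allThumbnails.size + maxVisible - 1) / maxVisible
    val nextPage = (currentPage + 1) % pageCount
    return allThumbnails.drop(nextPage * maxVisible).take(maxVisible)
}

fun main() {
    val thumbs = listOf("pic1", "pic2", "pic3", "pic4", "pic5") // N = 5
    println(switchThumbnails(thumbs, maxVisible = 4, currentPage = 0)) // prints [pic5]
}
```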
  • An embodiment of the present application provides an electronic device, which includes: one or more processors and a memory. The memory is coupled to the one or more processors and is used to store computer program code.
  • The computer program code includes computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to execute: starting the gallery application; displaying the first interface of the gallery, where the first interface includes the first thumbnail and the first thumbnail is the thumbnail corresponding to the first video; detecting the user's first input operation on the first thumbnail; obtaining, according to the group ID of the first video, the thumbnails of the N associated pictures associated with the first video, where the associated pictures are pictures snapped during the recording of the first video and the group ID is used to identify the first video; and displaying the video playback interface of the first video, where the video playback interface includes the first display frame and the second display frame, the first display frame is used to display the frames of the first video, and the second display frame is used to display the thumbnails of the N associated pictures.
  • In a possible implementation, the one or more processors invoke the computer instructions to cause the electronic device to execute: reading the first information from the media file library, where the first information includes the mapping relationship between the storage path of the first video and the group ID of the first video; and acquiring the group ID of the first video according to the first information.
  • In a possible implementation, the one or more processors invoke the computer instructions to cause the electronic device to execute: reading the second information from the media information library according to the group ID of the first video, where the second information includes the mapping relationship among the group ID of the first video, the storage paths of the N associated pictures, and the thumbnails of the N associated pictures; and, when the storage paths of the N associated pictures are read, obtaining the thumbnails of the N associated pictures associated with the first video.
  • In a possible implementation, the one or more processors invoke the computer instructions to cause the electronic device to execute: detecting the user's second input operation on the first positioning control, where the first positioning control indicates the first target thumbnail and the first target thumbnail is a thumbnail in the second display frame; obtaining, based on the first target thumbnail, the storage path of the first associated picture from the second information, where the second information is information in the media information library and includes the mapping relationship among the group ID of the first video, the storage paths of the N associated pictures, and the thumbnails of the N associated pictures, and the first target thumbnail is the thumbnail of the first associated picture; calling the first associated picture according to the storage path of the first associated picture; and displaying the first preview interface, where the first preview interface includes the first picture preview frame and the second display frame, and the first picture preview frame is used to display the first associated picture.
  • In a possible implementation, the one or more processors invoke the computer instructions to cause the electronic device to execute: detecting the user's third input operation on the first control; reading the third information according to the group ID of the first video, where the third information includes the mapping relationship between the configuration-file storage path of the first video and the group ID; calling the configuration file of the first video according to the configuration-file storage path of the first video; processing the first video based on the configuration file of the first video to obtain the second video; and displaying the video playback interface of the second video, where the video playback interface of the second video is different from the video playback interface of the first video and includes the first video preview frame, and the first video preview frame is used to display the frames of the second video.
  • In a possible implementation, the one or more processors invoke the computer instructions to cause the electronic device to execute: detecting the user's input operation on the second target thumbnail; and displaying the video playback interface of the first video.
  • In a possible implementation, the one or more processors invoke the computer instructions to cause the electronic device to execute: when N is greater than M, detecting the user's input operation on the first switching control; and switching the thumbnails in the second display frame, where the thumbnails displayed in the second display frame before switching are different from the thumbnails displayed in the second display frame after switching.
  • An embodiment of the present application provides an electronic device, including: a touch screen, a camera, one or more processors, and one or more memories. The one or more processors are coupled to the touch screen, the camera, and the one or more memories; the one or more memories are used to store computer program code; the computer program code includes computer instructions; and when the one or more processors execute the computer instructions, the electronic device is caused to execute the method described in the first aspect or any possible implementation of the first aspect.
  • An embodiment of the present application provides a chip system applied to an electronic device. The chip system includes one or more processors, and the processors are used to invoke computer instructions so that the electronic device executes the method described in the first aspect or any possible implementation of the first aspect.
  • An embodiment of the present application provides a computer program product containing instructions. When the computer program product is run on an electronic device, the electronic device is caused to execute the method described in the first aspect or any possible implementation of the first aspect.
  • An embodiment of the present application provides a computer-readable storage medium including instructions. When the instructions are run on an electronic device, the electronic device is caused to execute the method described in the first aspect or any possible implementation of the first aspect.
  • FIG. 1A to FIG. 1H are a set of exemplary user interface diagrams provided by an embodiment of the present application.
  • FIG. 2A to FIG. 2K are another set of exemplary user interface diagrams provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a hardware structure of an electronic device 100 provided by an embodiment of the present application.
  • FIG. 4 is a framework flowchart of a picture display method provided by an embodiment of the present application.
  • FIG. 5 is a flowchart of a picture display method provided by an embodiment of the present application.
  • FIG. 6 is a flowchart of displaying a second display frame and a first control provided by an embodiment of the present application.
  • FIG. 7A is a schematic diagram of a playback interface of the first video, provided by an embodiment of the present application, when the storage paths of the associated pictures of the first video have not been read.
  • FIG. 7B is a schematic diagram of a playback interface of the first video, provided by an embodiment of the present application, when the storage path of the configuration file of the first video has not been read.
  • A unit may be, but is not limited to, a process running on a processor, a processor, an object, an executable, a thread of execution, or a program, and/or may be distributed between two or more computers.
  • These units can execute from various computer-readable media having various data structures stored thereon.
  • A unit may communicate by way of local and/or remote processes, for example, based on a signal having one or more data packets (e.g., data from one unit interacting with another unit in a local system, in a distributed system, and/or across a network such as the Internet interacting with other systems by way of the signal).
  • an embodiment of the present application proposes a picture display method, which includes: when the electronic device enables the function of obtaining pictures related to the recorded video, after the video recording ends, When the user browses the recorded video, the electronic device displays thumbnails of pictures associated with the video on the video playback interface. After detecting the user's click operation on the thumbnail, the electronic device displays a picture preview interface, and the picture preview interface displays a picture corresponding to the thumbnail. Users can perform operations such as sharing, editing, deleting, forwarding, etc. on the picture in the picture preview interface.
  • the user can browse the related pictures of the video at any time during the process of browsing the video, without having to perform a series of cumbersome operations such as "exit the video playback interface-find the related picture in the gallery-browse the related picture", Thereby, the time cost of browsing pictures of the user is saved, and the use experience of the user is improved.
  • FIG. 1A to FIG. 1H are a set of exemplary user interfaces provided by an embodiment of the present application.
  • The user interface 10 is the main interface (also called the desktop or home screen) of the electronic device 100; the main interface includes a gallery icon 101, a camera icon 102, and other function icons.
  • When the electronic device 100 detects an input operation (for example, a click) on the camera icon 102, in response to the input operation the electronic device 100 displays the user interface 11 shown in FIG. 1B.
  • The user interface 11 is a shooting interface of the electronic device 100.
  • The shooting interface includes a preview area 111, a zoom ratio display frame 112, a zoom ratio adjustment control 113, a function area 114, a gallery control 115, a start control 116, an echo control 117, a function indication icon 118, and a "more functions" control 119.
  • The preview area 111 is used to display in real time the preview picture of the shooting environment acquired by the camera, and the preview area 111 includes a first shooting object 1111, a second shooting object 1112, and a third shooting object 1113.
  • The zoom ratio display frame 112 is used to display the zoom ratio of the preview picture in the preview area 111, where 1X is a zoom ratio of 1x, 2X is a zoom ratio of 2x, 3X is a zoom ratio of 3x, and so on.
  • The zoom ratio is positively correlated with the focal length of the picture: the larger the zoom ratio, the larger the focal length and the larger the subject appears in the captured picture. It can be seen from FIG. 1B that the zoom ratio of the current preview picture is 1x.
  • The echo control 117 is used to switch between the front camera and the rear camera of the electronic device 100.
  • When the rear camera of the electronic device 100 is currently working and a click operation of the user on the echo control 117 is detected, in response to the operation the electronic device 100 switches the working camera to the front camera. At this time, the preview area 111 displays the preview picture obtained by the front camera and no longer displays the preview picture obtained by the rear camera.
  • When the front camera is working and a click operation of the user on the echo control 117 is detected, in response to the operation the electronic device 100 switches the working camera to the rear camera. At this time, the preview area 111 displays the preview picture obtained by the rear camera and no longer displays the preview picture obtained by the front camera.
  • The function area 114 is used to indicate the current function mode of the camera application, and the function area 114 includes a "portrait" control 1141, a "64M" control 1142, a "video recording" control 1143, a "photographing" control 1144, a "more" control 1145, and the function indication icon 118.
  • The function indication icon 118 is used to indicate the current shooting mode of the electronic device. As shown in FIG. 1B, the function indication icon is below the "video recording" control 1143, and the current shooting mode of the electronic device 100 is the video recording mode.
  • The zoom ratio adjustment control 113 is used to adjust the zoom ratio of the preview picture.
  • When the electronic device 100 detects an input operation (for example, sliding up) on the zoom ratio adjustment control 113, in response to the input operation the electronic device increases the zoom ratio of the preview picture and displays the user interface 12 shown in FIG. 1C.
  • The user interface 12 is the shooting interface of the electronic device 100. It can be seen from the zoom ratio display frame that the zoom ratio of the current preview picture is 3x, and the first shooting object, the second shooting object, and the third shooting object are enlarged compared with the subjects in FIG. 1B.
  • When the electronic device 100 detects an input operation (for example, a click) on the "more functions" control, in response to the input operation the electronic device 100 displays the user interface 13 shown in FIG. 1D.
  • The user interface 13 includes a "more functions" display area 131, and the "more functions" display area 131 includes: a return control 1311, a "one record multiple get" control 1312, a "subtitle synchronization" control 1313, a "reference line" control 1314, a "macro mode" control 1315, and a "settings" control 1316.
  • The "one record multiple get" control 1312 is used to enable the one-record-multiple-get function, that is, a shooting function in which, while the electronic device is shooting a video, it acquires all or part of the pictures associated with the video and processes and saves these pictures, so that the video and the pictures are obtained at the same time.
  • The associated pictures of the video may be pictures of highlight moments in the video.
  • When the electronic device 100 detects an input operation (for example, a single click) on the "one record multiple get" control 1312, in response to the input operation the electronic device 100 displays the user interface 14 shown in FIG. 1E.
  • The color of the "one record multiple get" control in the user interface 14 is different from the color of the "one record multiple get" control in the user interface 13, which indicates that the electronic device 100 has enabled the one-record-multiple-get function.
  • When the electronic device 100 detects an input operation on the return control 141, in response to the operation the electronic device 100 displays the user interface 12 shown in FIG. 1G.
  • For the user interface 12, please refer to the related description of the user interface 12 in FIG. 1C above; the embodiment of the present application does not repeat it here.
  • When the electronic device 100 detects an input operation such as a click on the "one record multiple get" control, in response to the operation the electronic device 100 may display the user interface 15 shown in FIG. 1F, and the user interface 15 includes a prompt box 151, which is used to remind the user that the one-record-multiple-get function has been enabled.
  • The prompt box 151 includes the text message "the one-record-multiple-get function has been enabled".
  • When the electronic device 100 detects a click operation on the start control, in response to the operation the electronic device 100 starts recording a video and displays the user interface 16 shown in FIG. 1H.
  • The electronic device 100 may also display a "one record multiple get" control on the user interface 12. After detecting a single-click operation on the "one record multiple get" control, in response to the operation the electronic device 100 enables the one-record-multiple-get function.
  • The user interface 16 is a video recording interface, which includes a preview picture display area 161, a recording time display area 162, a zoom ratio display area 163, a zoom ratio adjustment control 164, a pause recording control 165, and a stop recording control 166.
  • The preview picture display area 161 is used to display the preview picture of the current shooting environment in real time.
  • The recording time display area 162 is used to display the shooting duration of the current video. As shown in FIG. 1H, it can be seen from the recording time display area 162 that the current video has been recorded for 25s.
  • The zoom ratio display area 163 is used to display the zoom ratio of the current preview picture, and it can be seen from the zoom ratio display area 163 that the zoom ratio of the current preview picture is 3x.
  • The zoom ratio adjustment control 164 is used to adjust the zoom ratio of the preview picture.
  • The pause recording control 165 is used by the user to pause the current video recording.
  • The stop recording control 166 is used by the user to stop recording the video. As shown in FIG. 1H, after the electronic device 100 detects a click operation on the stop recording control 166, in response to the operation the electronic device 100 stops recording the video and saves the video and the associated pictures of the video.
  • FIG. 1A to FIG. 1H above introduce the application scenario in which the electronic device 100 records a video with the one-record-multiple-get function turned on.
  • The following introduces the application scenario of playing the video recorded in the application scenario of FIG. 1A to FIG. 1H above.
  • The description takes the video recorded in FIG. 1A to FIG. 1H above, namely video 1, as an example.
  • FIG. 2A to FIG. 2K are a set of exemplary user interfaces provided by an embodiment of the present application.
  • The user interface 20 is the main interface of the electronic device 100, and the main interface includes a gallery icon 201 and other application icons.
  • When an input operation (for example, a click) on the gallery icon 201 is detected, the electronic device 100 displays the user interface 21 shown in FIG. 2B.
  • The user interface 21 is an album display interface.
  • The album display interface includes an "album" icon 211, a screenshot icon 212, a video icon 213, and a picture icon 214.
  • The screenshot icon 212 is the icon of the collection of screenshot files stored in the electronic device 100.
  • When a click operation of the user on the screenshot icon 212 is detected, the electronic device 100 displays thumbnails of the stored screenshot pictures. The video icon 213 is the icon of the collection of video files stored in the electronic device.
  • After a click operation on the video icon 213 is detected, the electronic device 100 displays thumbnails of the video files stored in it.
  • The picture icon 214 is the icon of the collection of video-associated picture files stored in the electronic device 100, where an associated picture file is a picture file related to a video. After the electronic device 100 detects a click operation on the picture icon 214, the electronic device 100 displays thumbnails of the stored associated picture files.
  • When the electronic device 100 detects a click operation on the video icon 213, in response to the operation the electronic device 100 displays the user interface 22 shown in FIG. 2C.
  • The user interface 22 is a video display interface, and the video display interface is used to display the thumbnails of the video files stored in the electronic device 100.
  • The thumbnail 221 includes a picture indicator icon 2221, and the picture indicator icon 2221 is used to indicate that video 1 includes associated pictures acquired through the one-record-multiple-get function. The thumbnail 221 can also display the duration of video 1; it can be seen that the duration of video 1 is 30 seconds.
  • When the electronic device 100 detects a click operation on the thumbnail 221, in response to the operation the electronic device 100 displays the user interface 23 shown in FIG. 2D.
  • The user interface 23 is the video playback interface of the gallery APP, and the video playback interface includes a video display area 231, a first video play control 232, a picture display area 233, a return control 234, a video generation control 235, a video sharing control 236, a video editing control 237, a video deletion control 238, and a "more" control 239.
  • The video display area 231 is used to display the video.
  • The video sharing control 236 is used to trigger the electronic device 100 to share the video.
  • The first video play control 232 is used to trigger playback of video 1.
  • The video editing control 237 is used to trigger the electronic device to edit the video.
  • After detecting a click operation on the video editing control 237, the electronic device 100 can edit video 1 (for example, perform operations such as clipping the video).
  • The video deletion control 238 is used by the user to delete the video. When the electronic device 100 detects a click operation on the video deletion control 238, the electronic device 100 can delete video 1.
  • The picture display area 233 is used to display the thumbnails of the pictures associated with video 1. The picture display area 233 can include the thumbnails of picture 1 to picture 4, where picture 1 is the thumbnail of video 1 (which can be the cover of video 1) and picture 2 to picture 4 are the associated pictures of video 1; the associated pictures can be pictures of highlight moments during the shooting of video 1. Of course, there may be more or fewer associated pictures of video 1 in the picture display area 233.
  • The picture display area 233 may also include an indicator icon 2331, and the indicator icon 2331 is used to locate the thumbnails in the picture display area 233. As shown in FIG. 2D, the indicator icon 2331 is currently above the thumbnail 2332 of picture 1, which indicates that the current interface is the playback interface of video 1.
  • The picture display area 233 may also include a slide control 2333. After the electronic device 100 detects a click operation on the first video play control 232, in response to the operation the electronic device 100 plays video 1 and displays the user interface 24 shown in FIG. 2E.
  • The user interface 24 is a video playback interface, and the video playback interface is used to play the video.
  • The video playback interface includes a progress bar 241 and a pause control 242.
  • The progress bar 241 is used to display the current playback progress of video 1, and it can be seen from the progress bar 241 that the playback progress of video 1 is 15 seconds.
  • The pause control 242 is used to pause the playback of the video.
  • When the electronic device 100 detects a click operation on the pause control 242, the electronic device 100 can pause the playback of video 1 in response to the operation.
  • When the electronic device 100 detects a click operation on the sliding control 243, the electronic device 100 displays the user interface 25 shown in FIG. 2F.
  • Because the picture display area (for example, the above-mentioned picture display area 233) can display only a fixed number of thumbnails (for example, only 4 picture thumbnails), if the number of pictures associated with the video is greater than the maximum number of thumbnails that the picture display area can display, the user can view the thumbnails of the other pictures by clicking the sliding control.
  • The user interface 25 is a video playback interface, and in this video playback interface the picture display area 251 displays the thumbnails of picture 1 and picture 3 to picture 5.
  • The picture display area in FIG. 2F displays the thumbnail of picture 5 and does not display the thumbnail of picture 2; in this way, the user can view the thumbnail of picture 5.
  • The user can also view the thumbnails of the other pictures of video 1 in other ways.
  • For example, when the electronic device 100 detects a user's input operation (for example, sliding to the left) on the picture display area, in response to the operation the electronic device 100 displays the user interface 25 shown in FIG. 2F.
  • When the electronic device 100 detects an input operation on the indicator icon 252 (for example, sliding to the left so that the indicator icon 252 is above the thumbnail 253 of picture 2), in response to the operation the electronic device 100 displays the user interface 26 shown in FIG. 2G.
  • The user interface 26 is a picture display interface.
  • The picture display interface includes a picture preview area 261, a function setting area 262, a picture display area 263, and a return control 264.
  • The picture preview area 261 is used to display the picture corresponding to the thumbnail of picture 3 in the figure above. The function setting area 262 includes a sharing control 2621, an editing control 2622, a delete control 2623, and a "more" control 2624.
  • The sharing control 2621 is used by the user to share picture 3.
  • When the electronic device 100 detects a click operation on the sharing control 2621, the electronic device 100 forwards picture 3 in response to the operation.
  • The editing control 2622 is used by the user to edit picture 3.
  • When the electronic device 100 detects a click operation on the editing control 2622, in response to the operation the electronic device 100 edits picture 3 (for example, adjusts the brightness or contrast of picture 3).
  • The delete control 2623 is used by the user to delete picture 3.
  • When the electronic device 100 detects a click operation on the delete control 2623, the electronic device 100 deletes picture 3 in response to the operation.
  • The electronic device 100 displays the user interface 27 shown in FIG. 2H.
  • When the electronic device 100 detects an input operation on the indicator icon 265 (for example, when the indicator icon is moved above the thumbnail of picture 1), in response to the operation the electronic device 100 may also display the user interface 27 shown in FIG. 2H.
  • The user interface 27 is a video playback interface, and the state of this video playback interface is consistent with that of the user interface 25 in FIG. 2F above, that is, the playback progress of video 1 in the user interface 27 is 15 seconds, the same as in the user interface 25.
  • When the electronic device 100 detects a click operation on the video generation control 271, in response to the operation the electronic device 100 displays the user interface 28 shown in FIG. 2I.
  • The user interface 28 is a video playback interface, and the user interface 28 includes a first prompt box 281. The first prompt box 281 is used to ask the user whether to generate video 2; as shown in FIG. 2I, the first prompt box 281 displays the text message "whether to generate the second video".
  • The first prompt box 281 includes a confirm control 2811 and a cancel control 2812.
  • The confirm control 2811 is used to trigger the electronic device 100 to generate video 2.
  • After the electronic device 100 detects a click operation on the confirm control 2811, the user interface 29 shown in FIG. 2J is displayed.
  • The user interface 29 is a video playback interface.
  • The video playback interface includes a video preview area 291 and a save control 292.
  • The video preview area 291 is used to display the frames of video 2.
  • The user can search for and browse video 2 in the gallery.
  • For example, the user can view video 2 in the video display interface.
  • The user interface 30 has one more thumbnail, namely the thumbnail 301 of video 2, than the user interface 22.
  • The electronic device 100 may then play video 2.
  • FIG. 3 is a schematic diagram of a hardware structure of an electronic device 100 provided by an embodiment of the present application.
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an environmental Light sensor 180L, bone conduction sensor 180M, etc.
  • The structure illustrated in this embodiment of the present application does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves and radiate them through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN) (such as a Wi-Fi network), Bluetooth (BT), BLE broadcasting, global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used for displaying pictures, videos and the like.
  • the display screen 194 includes a display panel.
  • The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • Light is transmitted to the photosensitive element of the camera through the lens, the optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, which converts it into a picture visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Digital signal processors are used to process digital signals. In addition to digital picture signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, voice recognition, text understanding, and the like.
  • the electronic device 100 can implement audio functions through the audio module 170 , the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • The speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • Electronic device 100 can listen to music through speaker 170A, or listen to hands-free calls.
  • The receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • the receiver 170B can be placed close to the human ear to receive the voice.
  • The microphone 170C, also called a "mike" or a "mic", is used to convert sound signals into electrical signals.
  • the user can put his mouth close to the microphone 170C to make a sound, and input the sound signal to the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In some other embodiments, the electronic device 100 may be provided with two microphones 170C, which may also implement a noise reduction function in addition to collecting sound signals. In some other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to realize sound signal collection, noise reduction, sound source identification, and directional recording functions.
  • the electronic device 100 may further include one or more items such as a button 190, a motor 191, an indicator 192, and a SIM card interface 195 (or eSIM card), which is not limited in this embodiment of the present application.
  • The foregoing FIGS. 1A-2K have introduced the application scenarios involved in the picture display method described in the embodiments of the present application.
  • The following introduces, with reference to FIG. 4, the framework of a picture display method provided by an embodiment of the present application.
  • FIG. 4 is a framework flowchart of a picture display method provided by an embodiment of the present application; the framework diagram is briefly introduced below.
  • After the video recording function of the camera APP is turned on, the electronic device performs video recording (step S401). During recording, the electronic device can identify the current scene through an AI algorithm (step S402), so as to obtain one or more wonderful-moment pictures related to the video during shooting (step S403). After recording ends, the electronic device saves the recorded video and the wonderful-moment pictures corresponding to the video (step S404).
  • The user can locate the recorded video through the gallery APP and trigger the electronic device to enter the video playback interface, so as to browse the recorded video (step S405). The video playback interface displays the thumbnails of the wonderful-moment pictures corresponding to the video (with an indicator control above one of the thumbnails) and an AI video control which, when triggered, causes the electronic device to generate an AI video. While browsing the recorded video:
  • On the one hand, the user can browse any wonderful-moment picture by moving the indicator control (which may be the indicator icon 2331 in FIG. 2D) to the corresponding thumbnail (step S406); the electronic device then switches from the video playback interface to the picture browsing interface, so as to display the wonderful-moment picture corresponding to the thumbnail indicated by the indicator control.
  • In the picture browsing interface, the user can share, edit, delete, copy, or cloud-synchronize the wonderful-moment picture (step S407).
  • On the other hand, the user can click the AI video control in the video playback interface, and the electronic device can then generate an AI video based on the video (step S408).
  • The AI video can be understood as a video, generated by the electronic device based on the recorded video, that is automatically matched to a template (with special effects, music, and so on).
  • After generating the AI video, the electronic device can display the AI video on the preview interface (step S409).
  • While previewing the AI video, the user can edit it (for example, cropping or adding motion effects); after the save control on the preview interface is clicked, the electronic device saves the AI video (step S410).
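The FIG. 4 pipeline above (record, recognize scenes, capture wonderful moments, save) can be summarized in code. The following Kotlin sketch is only illustrative: the detector callback, frame interface, and file name are assumptions made for this sketch, not APIs described in this application.

```kotlin
// Condensed sketch of steps S401-S404: while recording, an AI detector flags wonderful
// moments, the corresponding frames are captured, and video plus pictures are saved.
data class HighlightPicture(val timestampSec: Double)

class RecordingSession(private val isHighlight: (Double) -> Boolean) {
    private val highlights = mutableListOf<HighlightPicture>()

    fun onFrame(timestampSec: Double) {                    // S401: frames arrive while recording
        if (isHighlight(timestampSec)) {                   // S402: AI scene recognition
            highlights.add(HighlightPicture(timestampSec)) // S403: capture a wonderful-moment picture
        }
    }

    fun stop(): Pair<String, List<HighlightPicture>> =     // S404: save video + pictures
        "recorded_video.mp4" to highlights.toList()
}

fun main() {
    val session = RecordingSession(isHighlight = { t -> t.toInt() % 10 == 0 }) // toy detector
    (0..25).forEach { session.onFrame(it.toDouble()) }
    println(session.stop()) // (recorded_video.mp4, highlights at 0 s, 10 s, 20 s)
}
```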
  • FIG. 5 is a flow chart of a picture display method provided in the embodiment of the present application. The specific process is as follows:
  • Step S501: Display a first shooting preview interface, where the first shooting preview interface includes a first picture preview frame and a first target control, and the first picture preview frame displays the picture captured by the camera in real time.
  • the first shooting preview interface is a preview interface before the electronic device shoots a video.
  • the first shooting preview interface may be the user interface 11 shown in FIG. 1B above.
  • the first picture preview frame is used to display the picture captured by the camera in real time, that is, the real-time picture of the shooting environment.
  • the preview frame of the first picture may be the preview area 111 in FIG. 1B above.
  • the first control is used for the user to activate the electronic device to start recording video.
  • the first target control may be the above-mentioned start control 116 in FIG. 1B .
  • Step S502 After detecting the first operation on the first shooting preview interface, the electronic device activates the one-record-multiple-get function.
  • the one-record-multiple-get function is a function for the electronic device to acquire one or more associated pictures during the recording process of the first video.
  • a second target control may be included in the first shooting preview interface, and the second target control is used to trigger the electronic device to start the function of "one record multiple".
  • One-record-multiple-get means that the electronic device acquires one or more target pictures during the recording of the first video.
  • the associated picture is a picture taken during the recording of the first video, for example, a picture manually taken by the user during the recording of the first video, or a picture automatically taken by the electronic device during the recording of the first video.
  • the associated picture of the first video may be a picture of a wonderful moment of the first video, or may be a picture processed by AI on the picture of a wonderful moment of the first video.
  • AI processing can include: identifying the shooting scene, and matching filters suitable for the shooting scene for these wonderful moment pictures.
  • the first operation may be a user's click operation on the second target control. After the electronic device detects the user's first operation on the second target control, in response to the first operation, the electronic device enables the one-record-multiple-get function.
  • The second target control can be a setting item in "Setting Options", and the one-record-multiple-get function can be enabled by default when the electronic device records a video. If a click operation on the second target control is detected, the electronic device can turn off the one-record-multiple-get function.
  • the first shooting preview interface may include a first setting control.
  • the first setting control can be the "more functions" control 119 in the above-mentioned Fig. 1B.
  • After the electronic device 100 detects a click operation on the first setting control, the electronic device displays a first function display frame that includes the second target control. The first function display frame may be the "More Functions" display area 131 in FIG. 1D above.
  • The second target control may be the "One Record Multiple Get" control 1312 in FIG. 1D above.
  • The first operation may be an input operation (for example, a click) on the "One Record Multiple Get" control 1312 in FIG. 1D above.
  • Step S503 After detecting the second operation on the first target control, display the first shooting interface and start shooting the first video.
  • the electronic device displays the first shooting interface and starts shooting the first video.
  • the first shooting interface is different from the first shooting preview interface.
  • The first shooting interface may include a second picture preview frame and a third target control, and the second picture preview frame is used to display the picture of the shooting object.
  • the second operation may be the click operation on the start control in the above-mentioned FIG. 1F
  • the first shooting interface may be the user interface 16 in the above-mentioned FIG. 1H
  • The second picture preview frame may be the preview picture display area 161 shown in FIG. 1H above.
  • the third target control may be the stop recording control 166 in FIG. 1H.
  • Step S504 At the first moment when the shooting of the first video starts, the real-time captured picture of the camera is displayed in the second picture preview frame of the first shooting interface.
  • Step S505 At the second moment, after detecting the third operation on the third target control in the first shooting interface, save the first video and its corresponding associated picture.
  • The third operation may be the click operation on the stop recording control 166 in FIG. 1H above. After the electronic device detects the third operation on the third target control, the electronic device stops recording the first video and saves the first video and the associated pictures corresponding to the video.
  • the associated picture is a picture acquired by the electronic device during the recording process of the first video.
  • the associated picture may be a wonderful moment picture in the recording process of the first video.
  • After the electronic device detects the third operation on the third target control, the camera generates a configuration file and a first identifier, and inserts the first identifier as meta-information into the first video and into the media file corresponding to the first video.
  • the media file corresponding to the first video is generated after the electronic device detects the third operation on the third target control, and the media file corresponding to the first video includes: a thumbnail of an associated picture and a configuration file.
  • the configuration file may be used for the electronic device to generate the second video based on the first video, and the second video may be the AI video described in the above-mentioned embodiment in FIG. 4 .
  • The first identifier may be a group ID.
  • After the video recording ends, the camera generates a group ID for the video and inserts the group ID into the video file.
  • The group ID is used to identify different videos and is unique across different electronic devices.
  • For example, after the recording of video 1 ends, the electronic device can generate group ID 0010 for video 1, where 0010 is the unique identifier of video 1; after the recording of video 2 ends, the electronic device can generate group ID 0020 for video 2, where 0020 is the unique identifier of video 2.
  • the format of the group ID may be a binary string in the format of Universally Unique Identifier (UUID). In this embodiment of the present application, description is made by taking the first identifier as a group ID as an example.
  • the electronic device may write information such as the mapping relationship between the group ID and the first video storage path into the media file library.
  • During the recording of the first video, if the electronic device has turned on the one-record-multiple-get function and detects that an associated picture of the first video exists, the electronic device can write the mapping relationship among the group ID, the storage path of the associated picture, and the thumbnail of the associated picture into the media information library (for example, MediaLibrary).
  • During the recording of the first video, if the electronic device detects that a configuration file of the first video exists, the electronic device can generate a thumbnail of the second video and write the mapping relationship among the group ID, the storage path of the configuration file, and the thumbnail of the second video into the media information library (for example, MediaLibrary).
  • Table 1 is an exemplary mapping relationship table between a video group ID and the storage path of the video file (that is, the recorded original video file).
  • Table 2 is a mapping relationship table among associated-picture thumbnails, the video group ID, and associated-picture storage paths.
  • Table 3 is a mapping relationship table among the thumbnail of the second video, the storage path of the video configuration file, and the video group ID. Table 1, Table 2, and Table 3 are as follows:

    Table 1
    Video    | Group ID | Storage path
    Video 1  | 001      | XXX/XXX/XXX
    ...      | ...      | ...

    Table 2
    Picture   | Group ID | Thumbnail   | Storage path
    Picture 1 | 001      | Thumbnail 1 | XXX/XXX/001
    Picture 2 | 001      | Thumbnail 2 | XXX/XXX/002
    Picture 3 | 001      | Thumbnail 3 | XXX/XXX/003
    Picture 4 | 001      | Thumbnail 4 | XXX/XXX/004
    Picture 5 | 001      | Thumbnail 5 | XXX/XXX/005
    ...       | ...      | ...         | ...

    Table 3
    Thumbnail   | Group ID | Configuration file storage path
    Thumbnail A | 001      | XXX/XXX/005
    ...         | ...      | ...
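As a rough illustration of how these three mapping tables could be held and queried in code, the following Kotlin sketch models them with in-memory lists. The class and field names (MediaInfoStore, VideoRecord, and so on) are assumptions for this sketch and do not correspond to an actual MediaLibrary schema.

```kotlin
// Illustrative records mirroring Tables 1-3 (names are assumptions, not the real schema).
data class VideoRecord(val groupId: String, val videoPath: String)                           // Table 1
data class PictureRecord(val groupId: String, val thumbnail: String, val path: String)       // Table 2
data class ConfigRecord(val groupId: String, val thumbnail: String, val configPath: String)  // Table 3

class MediaInfoStore {
    private val videos = mutableListOf<VideoRecord>()
    private val pictures = mutableListOf<PictureRecord>()
    private val configs = mutableListOf<ConfigRecord>()

    fun addVideo(r: VideoRecord) = videos.add(r)
    fun addPicture(r: PictureRecord) = pictures.add(r)
    fun addConfig(r: ConfigRecord) = configs.add(r)

    // Look up the group ID of a video from its storage path (compare step S601 below).
    fun groupIdForVideo(path: String): String? = videos.firstOrNull { it.videoPath == path }?.groupId

    // All associated-picture records sharing the video's group ID (compare steps S602-S603).
    fun picturesForGroup(groupId: String): List<PictureRecord> = pictures.filter { it.groupId == groupId }

    // The configuration-file record for the video, if any (compare steps S605-S606).
    fun configForGroup(groupId: String): ConfigRecord? = configs.firstOrNull { it.groupId == groupId }
}

fun main() {
    val store = MediaInfoStore()
    store.addVideo(VideoRecord("001", "XXX/XXX/XXX"))
    store.addPicture(PictureRecord("001", "Thumbnail 1", "XXX/XXX/001"))
    store.addPicture(PictureRecord("001", "Thumbnail 2", "XXX/XXX/002"))
    store.addConfig(ConfigRecord("001", "Thumbnail A", "XXX/XXX/005"))

    val groupId = store.groupIdForVideo("XXX/XXX/XXX") ?: return        // "001"
    println(store.picturesForGroup(groupId).map { it.thumbnail })       // [Thumbnail 1, Thumbnail 2]
    println(store.configForGroup(groupId)?.configPath)                  // XXX/XXX/005
}
```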
  • the storage path is the address of the first video or associated picture or configuration file in the memory of the electronic device.
  • By reading the storage path, the electronic device can locate the first video, the associated picture, or the configuration file in its memory, retrieve the first video, configuration file, or associated picture, and thereby display the first video or associated picture on the screen of the electronic device, or generate the second video based on the configuration file.
  • For example, if the storage path of picture 1 is "Download/email/Video", picture 1 is stored on the electronic device under the Download folder, in the email directory, and within that, in the Video directory.
  • the configuration file of the first video can be used by the electronic device to generate the second video
  • the configuration file is a video description file, including video clip information
  • the video clip information is related to the first video.
  • the video clip information may be time information for instructing the electronic device to clip the first video.
  • For example, the format of the video clip information may be "30-32, 60-64, 100-115", which indicates that the 30th-32nd second, 60th-64th second, and 100th-115th second clips of the first video are cut out, and that these three video clips are synthesized and processed to obtain a second video with a duration of 21 seconds.
  • the configuration file may also include motion effect processing information, where the motion effect processing information is used to instruct the electronic device to configure motion effects and other operations for the second video.
  • configuring the motion effect for the second video by the electronic device may be: the electronic device adds a motion effect scene (for example, a rain scene) to the second video, so that the rain scene will be displayed during the playback of the second video.
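A minimal Kotlin sketch of parsing such clip information is shown below. The string format follows only the "30-32, 60-64, 100-115" example above, and the parser is an assumption rather than the configuration-file syntax actually used by the device.

```kotlin
// Illustrative parser for clip information such as "30-32, 60-64, 100-115".
data class ClipSegment(val startSec: Int, val endSec: Int) {
    val durationSec: Int get() = endSec - startSec
}

fun parseClipInfo(clipInfo: String): List<ClipSegment> =
    clipInfo.split(",")
        .map { it.trim() }
        .filter { it.isNotEmpty() }
        .map { range ->
            val (start, end) = range.split("-").map { it.trim().toInt() }
            ClipSegment(start, end)
        }

fun main() {
    val segments = parseClipInfo("30-32, 60-64, 100-115")
    // 2 s + 4 s + 15 s = 21 s, matching the 21-second second video in the example above.
    println(segments.sumOf { it.durationSec })  // 21
}
```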
  • Step S506: In response to the first input operation, display the first video playback interface, where the first video playback interface includes a first display frame, a second display frame, and a first control; the first display frame is used to display the picture of the first video, the second display frame is used to display thumbnails in the first target atlas, and the first target atlas includes the thumbnails of N associated pictures.
  • the first input operation may be the input operation for the thumbnail image 221 in FIG. 2D above
  • the first video playback interface may be the video playback interface of the gallery APP in FIG. 2D
  • The first display frame may be the video display area 231 in FIG. 2D above.
  • the second display frame may be the picture display area 233 in FIG. 2D above
  • the first control may be the video generation control 235 in FIG. 2D above.
  • the first target atlas is a collection of all associated pictures.
  • The following describes, with reference to FIG. 6, the specific process by which the electronic device displays the second display frame and the first control on the first video playback interface.
  • FIG. 6 is a flowchart of displaying the second display frame and the first control provided by an embodiment of the present application; the specific process is as follows:
  • Step S601 the electronic device reads the mapping information between the group ID and the first video storage path in the media file library.
  • the mapping information is used to represent the mapping relationship corresponding to the first video and its group ID.
  • the mapping information between the group ID and the first video storage path may be Table 1 in the above step S505.
  • As described in step S505 above, after the recording of the first video ends, the camera generates a group ID for the first video and inserts the group ID into the first video file; the electronic device then writes the mapping relationship between the group ID and the storage path of the first video file into the media file library.
  • the electronic device After detecting the first input operation, the electronic device reads the second information in the media file library, so as to obtain the group ID of the first video.
  • Step S602 When the group ID (first identification) of the first video is read, the electronic device reads the first information in the media information database.
  • the second information may be the mapping relationship between the thumbnail of the associated picture of the first video, the group ID of the first video, and the storage path of the associated picture of the first video, and the mapping relationship may be the table in the above step S505 2.
  • Step S603: In the case where the storage path of the associated pictures of the first video is read, the electronic device displays the second display frame on the first video playback interface and displays the thumbnails in the first target atlas in the second display frame.
  • Specifically, when the electronic device reads, based on the group ID of the first video, the storage path of the associated pictures of the first video in the media information library, the electronic device can determine that the first video has associated pictures, display the second display frame on the first video playback interface, and display the thumbnails in the first target atlas in the second display frame.
  • The maximum number of thumbnails that the second display frame can display is M, and the number of thumbnails in the first target atlas is N.
  • The display of the thumbnails of the first target atlas in the second display frame can be divided into the following two cases:
  • The first case: when N is greater than or equal to M, the second display frame displays M thumbnails from the first target atlas. For example, as shown in FIG. 2D, assuming that the first video has 5 associated pictures, since the picture display area can display at most 4 thumbnails, only the thumbnails of picture 1 to picture 4 are displayed in the picture display area.
  • The second case: when N is smaller than M, the second display frame displays all the thumbnails in the first target atlas. For example, if the first video has only 3 associated pictures and the second display frame can display at most 4 thumbnails, the thumbnails of these 3 associated pictures can all be displayed in the second display frame.
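The two cases amount to showing at most M thumbnails. A small Kotlin sketch of that selection is given below, assuming the thumbnails are available as a simple list; the names are illustrative only.

```kotlin
// At most maxVisible thumbnails of the first target atlas are shown in the second display frame.
fun thumbnailsToDisplay(atlasThumbnails: List<String>, maxVisible: Int): List<String> =
    if (atlasThumbnails.size >= maxVisible) atlasThumbnails.take(maxVisible)  // case 1: N >= M
    else atlasThumbnails                                                      // case 2: N < M

fun main() {
    val m = 4
    println(thumbnailsToDisplay(listOf("pic1", "pic2", "pic3", "pic4", "pic5"), m)) // [pic1, pic2, pic3, pic4]
    println(thumbnailsToDisplay(listOf("pic1", "pic2", "pic3"), m))                 // [pic1, pic2, pic3]
}
```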
  • Step S604 If the storage path of the first video-associated picture is not read, the electronic device does not display the second display frame on the first video playback interface.
  • When the electronic device does not read, based on the group ID of the first video, the storage path of the associated pictures of the first video in the media information library, the electronic device may determine that the first video has no associated pictures; in this case, the electronic device does not display the second display frame on the first video playback interface.
  • FIG. 7A is a schematic diagram of the first video playback interface when the storage path of the first video-associated picture is not read.
  • Compared with the user interface 23 in FIG. 2D above, the user interface 70 in FIG. 7A does not display the picture display area 233.
  • Step S605 When the group ID (first identification) of the first video is read, the electronic device reads the third information in the media information database.
  • the third information may be the storage path of the first video configuration file, the group ID of the first video, and the thumbnail of the second video, and the mapping relationship between the three, and the third information may be the above-mentioned FIG. 5 embodiment Table 3 of step S505.
  • Step S606 When the storage path of the first video configuration file is read, the electronic device displays the first control on the first video playback interface.
  • When the electronic device reads the storage path of the configuration file of the first video, the electronic device determines that a configuration file exists for the first video and that the second video can be generated based on the configuration file. Therefore, the electronic device can display the first control on the first video playback interface, and the first control can display a thumbnail of the second video, where the thumbnail of the second video can be obtained based on the configuration file of the first video.
  • For example, after the shooting of the first video is completed, if the configuration file indicates that the generated second video includes the content of the 30th-32nd second video segment of the first video, the electronic device can take any frame within that segment of the first video as the cover of the second video, generate a thumbnail of that frame, and store the thumbnail in the media information library.
  • When the electronic device needs to display the first control on the first video playback interface, the electronic device retrieves the thumbnail from the media information library and displays it on the first control.
  • In a possible implementation, the thumbnail of the second video may also be the thumbnail of any associated picture, or the thumbnail of any picture in the album of the electronic device.
  • This embodiment of the present application does not limit the source of the thumbnail of the second video.
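The following Kotlin fragment sketches one way a cover frame could be chosen from the first configured segment; frame extraction and thumbnail generation are stubbed out, since the application only states that any frame within the segment may be used.

```kotlin
// Pick a cover timestamp for the second video from the first configured clip (30-32 s in the
// example above); any timestamp inside the clip would satisfy the description.
data class CoverFrame(val timestampSec: Double)

fun pickCoverFrame(clipStartSec: Int, clipEndSec: Int): CoverFrame =
    CoverFrame((clipStartSec + clipEndSec) / 2.0)

fun main() {
    val cover = pickCoverFrame(30, 32)
    // A real implementation would decode this frame, scale it to a thumbnail,
    // and write the thumbnail into the media information library (Table 3).
    println("Cover frame at ${cover.timestampSec} s")  // 31.0 s
}
```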
  • Step S607 If the storage path of the first video configuration file is not read, the electronic device does not display the first control on the first video playback interface.
  • When the electronic device does not read the storage path of the configuration file of the first video, the electronic device determines that the first video has no configuration file and therefore cannot generate the second video based on a configuration file; accordingly, the electronic device does not display the first control on the first video playback interface.
  • FIG. 7B is a schematic diagram of the first video playback interface when the storage path of the first video configuration file is not read. As shown in FIG. 7B , compared to the user interface 23 in FIG. 2D , the user interface 71 in FIG. 7B does not display the video generation control 235 .
  • steps S601 to S607 describe the specific process of displaying the first control and the second display frame on the first video playback interface after the electronic device responds to the first input operation.
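Steps S601-S607 can be condensed into a single decision routine, sketched below in Kotlin. It reuses the illustrative MediaInfoStore from the earlier mapping-table sketch; the PlaybackUiPlan type and all names are assumptions made for this sketch only.

```kotlin
// Decide, from the media libraries, whether to show the second display frame (associated-picture
// thumbnails) and the first control (AI-video entry) on the first video playback interface.
// MediaInfoStore, picturesForGroup, etc. come from the earlier mapping-table sketch.
data class PlaybackUiPlan(
    val showSecondDisplayFrame: Boolean,
    val thumbnails: List<String>,
    val showFirstControl: Boolean,
    val firstControlThumbnail: String?
)

fun buildPlaybackUiPlan(store: MediaInfoStore, videoPath: String, maxVisible: Int): PlaybackUiPlan {
    val groupId = store.groupIdForVideo(videoPath)               // S601
        ?: return PlaybackUiPlan(false, emptyList(), false, null)
    val pictures = store.picturesForGroup(groupId)               // S602-S604
    val config = store.configForGroup(groupId)                   // S605-S607
    return PlaybackUiPlan(
        showSecondDisplayFrame = pictures.isNotEmpty(),
        thumbnails = pictures.map { it.thumbnail }.take(maxVisible),
        showFirstControl = config != null,
        firstControlThumbnail = config?.thumbnail
    )
}
```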
  • step S506 is described by taking the electronic device detecting that the first video has the first identifier, the configuration file of the first video, and the associated picture as an example. It can be understood that, after the electronic device responds to the first input operation, the electronic device may also display the first control and the second display frame on the first video playback interface.
  • When the number N of associated pictures is greater than M, after the electronic device detects a fourth input operation on the second display frame, the electronic device switches, in response to the fourth input operation, the thumbnails displayed in the second display frame.
  • The fourth input operation may be the click operation on the sliding control 243 in FIG. 2E above.
  • Before the fourth input operation is responded to, the thumbnails displayed in the second display frame may be the thumbnails of picture 1 to picture 4 in the picture display area in FIG. 2F above.
  • After the fourth input operation is responded to, the thumbnails displayed in the second display frame may be the thumbnail of picture 1 and the thumbnails of picture 3 to picture 5 in the picture display area in FIG. 2G above.
  • Step S507: After detecting a second input operation on the first positioning control in the second display frame, the electronic device displays a first preview interface, where the first preview interface includes a first picture preview frame, the first picture preview frame is used to display a first associated picture, the first associated picture is the picture corresponding to the first thumbnail, and the first preview interface is different from the first video playback interface.
  • the first preview interface is used to display the associated picture corresponding to the first thumbnail.
  • The first thumbnail may be the thumbnail 253 in FIG. 2F above.
  • The second input operation may be the input operation on the indicator icon 252 in FIG. 2F above (for example, sliding it to the left so that the indicator icon 252 is above the thumbnail 253 of picture 2).
  • The first preview interface may be the picture display interface in FIG. 2G above.
  • The first picture preview frame may be the picture preview area 261 in FIG. 2G above.
  • The first positioning control may be the indicator icon 252 in FIG. 2F above.
  • After the electronic device detects the second input operation on the first positioning control, the electronic device retrieves, based on the storage path of the associated picture corresponding to the first thumbnail, the file of that associated picture from the corresponding folder and displays the associated picture on the first preview interface.
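A hedged Kotlin sketch of that lookup is shown below; the Table 2 mapping is represented as a plain map from thumbnail to storage path, and decoding or rendering the picture is out of scope.

```kotlin
import java.io.File

// Illustrative lookup of the associated picture behind a tapped thumbnail (step S507).
fun openAssociatedPicture(thumbnail: String, thumbnailToPath: Map<String, String>): File? {
    val storagePath = thumbnailToPath[thumbnail] ?: return null   // e.g. "XXX/XXX/002" from Table 2
    val pictureFile = File(storagePath)
    return if (pictureFile.exists()) pictureFile else null        // caller would then display it
}

fun main() {
    val table2 = mapOf("Thumbnail 2" to "XXX/XXX/002")
    println(openAssociatedPicture("Thumbnail 2", table2)?.path)   // XXX/XXX/002 if the file exists
}
```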
  • the first preview interface may also include a first thumbnail image.
  • The first thumbnail may be the thumbnail 2632 of picture 1 in FIG. 2G above.
  • After the electronic device detects a third input operation on the first positioning control, the electronic device displays the first video playback interface.
  • At this time, the state of the first video playback interface may be the same as the state before the switch to the first preview interface, and the state of the first video playback interface includes the playback progress of the first video.
  • the third input operation may be the input operation on the indication icon 265 in the above-mentioned FIG. 2G (for example, the indication icon is moved above the thumbnail 2632 of picture 1).
  • the first preview interface may further include a fourth control.
  • The fourth control may be the return control 264 in FIG. 2G above.
  • When the electronic device detects an input operation (for example, a click) on the fourth control, the electronic device may switch from the first preview interface back to the first video playback interface.
  • the state of the first video playing interface may be the same as the state before being switched to the first preview interface, and the state of the first video playing interface includes the playing progress of the first video.
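The "same state as before switching" behaviour essentially means remembering the playback progress across the interface switch. The Kotlin sketch below models this with a simple state holder; the class and field names are assumptions for illustration.

```kotlin
// Preserve the first video's playback progress across the switch to the preview interface and back.
data class PlaybackState(var positionSec: Int, var isPlaying: Boolean)

class PlaybackInterfaceController {
    private var savedState: PlaybackState? = null

    fun switchToPreview(current: PlaybackState) {
        savedState = current.copy()           // remember progress before leaving the playback interface
    }

    fun switchBackToPlayback(): PlaybackState =
        savedState ?: PlaybackState(0, false) // restore the same progress, e.g. the 15 s shown in FIG. 2H
}

fun main() {
    val controller = PlaybackInterfaceController()
    controller.switchToPreview(PlaybackState(positionSec = 15, isPlaying = false))
    println(controller.switchBackToPlayback()) // PlaybackState(positionSec=15, isPlaying=false)
}
```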
  • Step S508 After detecting the third input operation on the first control, display the second video playback interface, the second video playback interface is used to display the picture of the second video, and the second video is a video obtained based on the first video.
  • the second video playback interface is an interface for playing a second video
  • the second video is a video generated by the electronic device based on the first video and its configuration file.
  • the third input operation may be the click operation on the video generation control 271 in the above-mentioned FIG. 2H
  • the second video playback interface may be the video playback interface in the above-mentioned FIG. 2J .
  • After the electronic device detects the third input operation on the first control, the electronic device can read the configuration file from the memory of the electronic device according to the storage path of the configuration file of the first video, generate the second video based on the first video and its corresponding configuration file, and display the second video on the second video playback interface.
  • the second video playback interface includes a first video preview frame, and the first video preview frame is used to display the picture of the second video.
  • the first video preview frame can be the video preview area 291 in the above-mentioned FIG. 2J .
  • the second video playback interface includes a third control, and when the electronic device detects a fifth input operation on the third control, the electronic device saves the second video.
  • the third control may be the save control 292 in the above-mentioned FIG. 2J
  • the fifth input operation may be the click operation on the save control 292 in the above-mentioned FIG. 2J .
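Putting the configuration file to use, a compressed sketch of second-video generation (step S508) is given below; it reuses parseClipInfo and ClipSegment from the clip-information sketch above, and models cutting and synthesis only at the level of second boundaries, since the actual editing and motion-effect pipeline is not specified here.

```kotlin
// High-level sketch: cut the segments listed in the configuration file out of the first video
// and synthesize them into the second video. A real implementation would call a media
// editing/transcoding framework; motion effects (e.g. a rain scene) would be applied here too.
data class SecondVideo(val segments: List<ClipSegment>, val durationSec: Int)

fun generateSecondVideo(clipInfo: String): SecondVideo {
    val segments = parseClipInfo(clipInfo)               // e.g. "30-32, 60-64, 100-115"
    return SecondVideo(segments, segments.sumOf { it.durationSec })
}

fun main() {
    println(generateSecondVideo("30-32, 60-64, 100-115").durationSec) // 21
}
```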
  • In this embodiment of the present application, the video and the pictures associated with the video are presented on the same user interface. While browsing or playing the video, the user can click a thumbnail of a picture associated with the video on the video playback interface to switch from the video playback interface to the picture preview interface, where the picture corresponding to the thumbnail can be browsed.
  • The foregoing embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • When implemented using software, the embodiments may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the present application will be generated in whole or in part.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (for example, coaxial cable, optical fiber, or digital subscriber line) or wireless (for example, infrared, radio, or microwave) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrated with one or more available media.
  • the available medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, DVD), or a semiconductor medium (for example, a Solid State Disk).
  • Those of ordinary skill in the art can understand that all or part of the processes of the foregoing method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium, and when the program is executed, it may include the processes of the foregoing method embodiments.
  • The aforementioned storage medium includes various media that can store program code, such as a ROM, a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Security & Cryptography (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

本申请提供一种图片显示方法及相关电子设备,该方法包括:启动图库应用;显示图库的第一界面,第一界面包括第一缩略图,第一缩略图为第一视频对应的缩略图;检测到用户针对第一缩略图的第一输入操作;根据第一视频的分组ID获取与第一视频关联的N张关联图片的缩略图,关联图片为第一视频录制过程中抓拍图片,分组ID用于标识第一视频;显示第一视频的视频播放界面,该视频播放界面包括第一显示框和第二显示框,第一显示框用于显示第一视频的画面,第二显示框用于显示N张关联图片的缩略图。

Description

一种图片显示方法及相关电子设备
本申请要求于2022年02月28日提交中国专利局、申请号为202210188529.5、发明名称为“一种图片显示方法及相关电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及图片显示领域,尤其涉及一种图片显示方法及相关电子设备。
背景技术
随着计算机技术和互联网技术的发展和普及,互联网也在向移动化发展,用户可以通过智能手机实现对网络的访问,以获取网络服务,得到相应的网络体验。随着电子消费、计算机、通信(3C)融合的到来,人们越来越多地将注意力放到了对各个不同领域的信息设备进行综合利用的研究上,以充分利用现有资源设备来为人们更好的服务。
目前,在移动设备上的视频录制功能应用也得到了很快的发展。但是,在进行视频录制时,不能同时进行拍照,使得用户在使用视频录制完成后,只能获取视频,不能得到视频录制过程中的图片,这降低了用户体验。
发明内容
本申请实施例提供了一种图片显示方法,解决了用户在观看视频的过程中,不能随时浏览该视频对应的图片的问题。
第一方面,本申请实施例提供了一种图片显示方法,应用于电子设备,该方法包括:启动图库应用;显示图库的第一界面,第一界面包括第一缩略图,第一缩略图为第一视频对应的缩略图;检测到用户针对第一缩略图的第一输入操作;根据第一视频的分组ID获取与第一视频关联的N(N为大于0的整数)张关联图片的缩略图,关联图片为第一视频录制过程中抓拍图片,分组ID用于标识第一视频;显示第一视频的视频播放界面,该视频播放界面包括第一显示框和第二显示框,第一显示框用于显示第一视频的画面,第二显示框用于显示N张关联图片的缩略图。
在上述实施例中,通过将视频与该视频关联图片在同一用户界面上进行展示,用户浏览视频或播放视频的过程中可以通过单击视频播放界面上与该视频关联图片的缩略图,从视频播放界面切换到图片预览界面,可以浏览该缩略图对应的图片。通过上述方式,用户可以在观看视频的情况下,可以快速浏览该视频相关的图片,而不必执行“退出视频播放-进入图库-查找关联图片-浏览图片”等繁琐操作,大大节约了用户的时间,提高了用户体验。
结合第一方面,在一种可能实现的方式中,检测到用户针对第一缩略图的第一输入操作之后,根据第一视频的分组ID获取与第一视频关联的N张关联图片的缩略图之前,还包括:在媒体文件库中读取第一信息,第一信息包括第一视频存储路径与第一视频的分组ID之间的映射关系;根据第一信息获取第一视频的分组ID。这样,可以在媒体文件库中获取第一视频的分组ID,并根据该分组ID读取第一视频的关联图片的缩略图。
结合第一方面,在一种可能实现的方式中,根据第一视频的分组ID获取与第一视频关 联的N张关联图片的缩略图,包括:根据第一视频的分组ID在媒体信息库中读取第二信息;第二信息包括第一视频的分组ID、N张关联图片的存储路径以及N张关联图片的缩略图三者之间的映射关系;在读取到N张关联图片的存储路径的情况下,获取与第一视频关联的N张关联图片的缩略图。这样,在读取到关联图片的缩略图后,在第一视频的视频播放界面可以显示第一视频关联图片的缩略图,从而使得用户可以基于该缩略图,浏览关联图片。
结合第一方面,在一种可能实现的方式中,视频播放界面包括第一定位控件显示第一视频的视频播放界面之后,还包括:检测到用户针对第一定位控件的第二输入操作;第一定位控件指示第一目标缩略图,第一目标缩略图为第二显示框中的缩略图;基于第一目标缩略图在第二信息中获取第一关联图片的存储路径;第二信息为媒体信息库中的信息,第二信息包括第一视频的分组ID、N张关联图片的存储路径以及N张关联图片的缩略图三者之间的映射关系,第一目标缩略图为第一关联图片的缩略图;根据第一关联图片的存储路径调取第一关联图片;显示第一预览界面,第一预览界面包括第一图片预览框和第二显示框;第一图片预览框用于显示第一关联图片,第一定位控件指示第一目标缩略图,第一目标缩略图为第二显示框中的缩略图,第一目标缩略图为第一关联图片的缩略图。这样,用户可以根据关联图片的缩略图,从视频播放界面切换到图片预览界面,从而实现对关联图片的浏览。
结合第一方面,在一种可能实现的方式中,视频播放界面还包括第一控件,第一控件为存在配置文件情况下显示的控件,第一控件用于触发生成第二视频,第二视频与所述第一视频不同。这样,用户可以通过单击第一控件,触发电子设备生成第二视频。
结合第一方面,在一种可能实现的方式中,显示第一视频的视频播放界面之后,还包括:检测到用户针对第一控件的第三输入操作;根据第一视频的分组ID读取第三信息,第一信息包括第一视频的配置文件存储路径、分组ID之间的映射关系;根据第一视频的配置文件存储路径调取第一视频的配置文件;基于第一视频的配置文件对第一视频进行处理,得到第二视频;显示第二视频的视频播放界面,第二视频的视频播放界面与第一视频的视频播放界面不同,第二视频的视频播放界面包括第一视频预览框,第一视频预览框用于显示第二视频的画面。
结合第一方面,在一种可能实现的方式中,第一预览界面包括第二显示框,第二显示框包括第二目标缩略图,显示第一预览界面之后,还包括:检测到用户针对第二目标缩略图的输入操作;显示第一视频的视频播放界面。这样,用户可以通过单击第二目标缩略图,从关联图片的预览界面切换到第一视频的视频播放界面。
结合第一方面,在一种可能实现的方式中,第二显示框显示的关联图片的缩略图的最大数量为M,第二显示框包括第一切换控件,显示所述第一视频的视频播放界面之后,还包括:在N大于M的情况下,检测到用户针对第一切换控件的输入操作;切换第二显示框中的缩略图;第二显示框在切换前显示的缩略图与第二显示框在切换后显示的缩略图不同。这样,可以使得第二显示框切换关联图片的缩略图。
第二方面,本申请实施例提供了一种电子设备,该电子设备包括:一个或多个处理器 和存储器;该存储器与该一个或多个处理器耦合,该存储器用于存储计算机程序代码,该计算机程序代码包括计算机指令,该一个或多个处理器调用该计算机指令以使得该电子设备执行:启动图库应用;显示图库的第一界面,第一界面包括第一缩略图,第一缩略图为第一视频对应的缩略图;检测到用户针对第一缩略图的第一输入操作;根据第一视频的分组ID获取与第一视频关联的N张关联图片的缩略图,关联图片为第一视频录制过程中抓拍图片,分组ID用于标识第一视频;显示第一视频的视频播放界面,该视频播放界面包括第一显示框和第二显示框,第一显示框用于显示第一视频的画面,第二显示框用于显示N张关联图片的缩略图。
结合第二方面,在一种可能实现的方式中,该一个或多个处理器调用该计算机指令以使得该电子设备执行:在媒体文件库中读取第一信息,第一信息包括第一视频存储路径与第一视频的分组ID之间的映射关系;根据第一信息获取第一视频的分组ID。
结合第二方面,在一种可能实现的方式中,该一个或多个处理器调用该计算机指令以使得该电子设备执行:根据第一视频的分组ID在媒体信息库中读取第二信息;第二信息包括第一视频的分组ID、N张关联图片的存储路径以及N张关联图片的缩略图三者之间的映射关系;在读取到N张关联图片的存储路径的情况下,获取与第一视频关联的N张关联图片的缩略图。
结合第二方面,在一种可能实现的方式中,该一个或多个处理器调用该计算机指令以使得该电子设备执行:检测到用户针对第一定位控件的第二输入操作;第一定位控件指示第一目标缩略图,第一目标缩略图为第二显示框中的缩略图;基于第一目标缩略图在第二信息中获取第一关联图片的存储路径;第二信息为媒体信息库中的信息,第二信息包括第一视频的分组ID、N张关联图片的存储路径以及N张关联图片的缩略图三者之间的映射关系,第一目标缩略图为第一关联图片的缩略图;根据第一关联图片的存储路径调取第一关联图片;显示第一预览界面,第一预览界面包括第一图片预览框和第二显示框;第一图片预览框用于显示第一关联图片,第一定位控件指示第一目标缩略图,第一目标缩略图为第二显示框中的缩略图,第一目标缩略图为第一关联图片的缩略图。
结合第二方面,在一种可能实现的方式中,该一个或多个处理器调用该计算机指令以使得该电子设备执行:检测到用户针对第一控件的第三输入操作;根据第一视频的分组ID读取第三信息,第一信息包括第一视频的配置文件存储路径、分组ID之间的映射关系;根据第一视频的配置文件存储路径调取第一视频的配置文件;基于第一视频的配置文件对第一视频进行处理,得到第二视频;显示第二视频的视频播放界面,第二视频的视频播放界面与第一视频的视频播放界面不同,第二视频的视频播放界面包括第一视频预览框,第一视频预览框用于显示第二视频的画面。
结合第二方面,在一种可能实现的方式中,该一个或多个处理器调用该计算机指令以使得该电子设备执行:检测到用户针对第二目标缩略图的输入操作;显示第一视频的视频播放界面。
结合第二方面,在一种可能实现的方式中,该一个或多个处理器调用该计算机指令以使得该电子设备执行:在N大于M的情况下,检测到用户针对第一切换控件的输入操作;切换第二显示框中的缩略图;第二显示框在切换前显示的缩略图与第二显示框在切换后显 示的缩略图不同。
第三方面,本申请实施例提供了一种电子设备,包括:触控屏、摄像头、一个或多个处理器和一个或多个存储器;所述一个或多个处理器与所述触控屏、所述摄像头、所述一个或多个存储器耦合,所述一个或多个存储器用于存储计算机程序代码,计算机程序代码包括计算机指令,当所述一个或多个处理器执行所述计算机指令时,使得所述电子设备执行如第一方面或第一方面的任意一种可能实现的方式所述的方法。
第四方面,本申请实施例提供了一种芯片系统,该芯片系统应用于电子设备,该芯片系统包括一个或多个处理器,该处理器用于调用计算机指令以使得该电子设备执行如第一方面或第一方面的任意一种可能实现的方式所述的方法。
第五方面,本申请实施例提供了一种包含指令的计算机程序产品,当该计算机程序产品在电子设备上运行时,使得该电子设备执行如第一方面或第一方面的任意一种可能实现的方式所述的方法。
第六方面,本申请实施例提供了一种计算机可读存储介质,包括指令,当该指令在电子设备上运行时,使得该电子设备执行如第一方面或第一方面的任意一种可能实现的方式所述的方法。
附图说明
图1A-图1H为本申请实施例提供的一组示例性用户界面图;
图2A-图2K为本申请实施例提供的另一组示例性的用户界面图;
图3为本申请实施例提供的电子设备100的硬件结构示意图;
图4为本申请实施例提供的一种图片显示方法的框架流程图;
图5为本申请实施例提供的一种图片显示方法的流程图;
图6为本申请实施例提供的一种第二显示框和第一控件显示的流程图;
图7A是本申请实施例提供而一种在未读取到第一视频关联图片存储路径的情况下,第一视频播放界面图;
图7B是本申请实施例提供而一种在未读取到第一视频配置文件的存储路径的情况下,第一视频播放界面的示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述。显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。在本文中提及“实施例”意味着,结合实施例描述的特定特征、结构或者特性可以包含在本实施例申请的至少一个实施例中。在说明书中的各个位置出现该短语并不一定均是相同的实施例,也不是与其它实施例互斥的独立的或是备选的实施例。本领域技术人员可以显式地和 隐式地理解的是,本文所描述的实施例可以与其它实施例相结合。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
本申请的说明书和权利要求书及所述附图中术语“第一”、“第二”、“第三”等是区别于不同的对象,而不是用于描述特定顺序。此外,术语“包括”和“具有”以及它们的任何变形,意图在于覆盖不排他的包含。例如,包含了一系列步骤或单元,或者可选地,还包括没有列出的步骤或单元,或者可选地还包括这些过程、方法、产品或设备固有的其它步骤或单元。
附图中仅示出了与本申请相关的部分而非全部内容。在更加详细地讨论示例性实施例之前,应当提到的是,一些示例性实施例被描述成作为流程图描绘的处理或方法。虽然流程图将各项操作(或步骤)描述成顺序的处理,但是其中的许多操作可以并行地、并发地或者同时实施。此外,各项操作的顺序可以被重新安排。当其操作完成时所述处理可以被终止,但是还可以具有未包括在附图中的附加步骤。所述处理可以对应于方法、函数、规程、子例程、子程序等等。
在本说明书中使用的术语“部件”、“模块”、“系统”、“单元”等用于表示计算机相关的实体、硬件、固件、硬件和软件的组合、软件或执行中的软件。例如,单元可以是但不限于在处理器上运行的进程、处理器、对象、可执行文件、执行线程、程序和/或分布在两个或多个计算机之间。此外,这些单元可从在上面存储有各种数据结构的各种计算机可读介质执行。单元可例如根据具有一个或多个数据分组(例如来自与本地系统、分布式系统和/或网络间的另一单元交互的第二单元数据。例如,通过信号与其它系统交互的互联网)的信号通过本地和/或远程进程来通信。
如今,随着智能手机技术的不断发展,越来越多的用户喜欢使用手机中的相机功能进行拍照和摄像,以此来记录生活中的点点滴滴。特别的,在用户通过相机进行视频录制时,用户还想获取一些与录制相关联的图片,例如,视频录制过程中的某个精彩画面的图片等等。同时,电子设备也可以对这些图片进行处理(例如,通过AI算法识别场景,添加相应的滤镜等),得到视觉效果更好的图片。
但是,当前电子设备对于图片和视频是分开展示的,即:用户在浏览视频的过程中,若想要浏览与该视频相关联的图片,用户必须先退出当前的视频播放界面。然后在图库应用中寻找该视频的关联图片,才能实现对视频关联图片的浏览。在用户浏览图片之后,若用户又想观看视频,用户又必须退出当前的图片预览界面,又要重新寻找目标视频,才能播放浏览目标视频。这就使得,用户在观看视频时,若要浏览该视频的关联图片的步骤过于繁琐,且浪费用户的时间,大大浪费了用户的时间成本,降低了用户的使用体验。
为了解决上述视频及其关联图片不能联合显示的问题,本申请实施例提出了一种图片显示方法,该方法包括:在电子设备开启获取录制视频相关图片功能的情况下,在视频录制结束后,当用户浏览录制视频时,电子设备在视频播放界面显示该视频的相关联图片的缩略图。当检测到用户针对该缩略图的单击操作后,电子设备显示图片预览界面,该图片预览界面显示该缩略图对应的图片。用户可以在该图片预览界面中可以对图片进行分享、编辑、删除、转发等操作。通过上述图片显示方法,用户在浏览视频的过程中,可以随时 浏览该视频的关联图片,而不必执行“退出视频播放界面-在图库中寻找关联图片-浏览关联图片”等一系列的繁琐操作,从而节约了用户浏览图片的时间成本,提高了用户的使用体验。
下面,结合图1A-图1H对本申请实施例提供的一种图片显示方法的应用场景进行介绍。图1A-图1H是本申请实施例提供的一组示例性用户界面。
如图1A所示,用户界面10为电子设备100的主界面(也可称为桌面或主屏幕等),在该主界面中包括图库图标101、相机图标102以及其它功能图标。当电子设备100检测到针对相机图标102的输入操作(例如,单击),响应该输入操作,电子设备100显示如图1B所示的用户界面11。
如图1B所示,用户界面为11为电子设备100的拍摄界面。在该拍摄界面中,包括预览区域111、变焦倍率显示框112、变焦倍率调节控件113、功能区域114、图库控件115、开始控件116、回显控件117、功能指示图标118以及“更多功能”控件119。预览区域111用于实时显示相机获取的拍摄环境的实时预览图片,在该预览区域111中,包括第一拍摄对象1111、第二拍摄对象1112以及第三拍摄对象1113。变焦倍率显示框112用于显示预览区域111中预览图片的变焦倍数,其中,1X为1倍变焦倍数,2X为2倍变焦倍数,3X为3倍变焦倍数……以此类推。变焦倍数与图片的焦距呈正相关,变焦倍数越大,图片的焦距越大,拍摄图片中拍摄对象就越大,由图1B可知,当前预览图片的变焦倍数为1倍变焦倍数。回显控件117用于切换电子设备100的前置摄像头和后置摄像头,在电子设备100当前工作的摄像头为后置摄像头的情况下,检测到用户对回显控件117的单击操作,响应该操作,电子设备100将工作摄像头切换为前置摄像头,此时,在预览区域111中,显示前置摄像头获取的预览图片,不显示后置摄像头获取的预览图片,当电子设备100当前工作的摄像头为前置摄像头的情况下,检测到用户对回显控件117的单击操作,响应该操作,电子设备100将工作摄像头切换为后置摄像头,此时,在预览区域111中,显示后置摄像头获取的预览图片,不显示前置摄像头获取的预览图片。当电子设备100检测到针对图库控件115的输入操作后,电子设备100会显示图库界面,从而使得用户可以浏览电子设备100拍摄的视频或图片。功能区域114用于相机应用当前的功能模式,功能区域114包括“人像”控件1141、“64M”控件1142、“录像”控件1143、“拍照”控件1144、“更多”控件1145以及功能指示图标118。其中,功能指示图标118用于指示电子设备当前的拍摄模式,如图1B所示,功能指示图标在“录像”控件1143的下方,电子设备100当前的拍摄模式为录像模式。变焦倍率调节控件113用于调节预览图片的变焦倍率,当电子设备100检测到针对变焦倍率调节控件113的输入操作(例如,上滑),响应该输入操作,电子设备增大预览图片的变焦倍率,显示如图1C所示的用户界面12。
如图1C所示,用户界面12为电子设备100的拍摄界面,由变焦倍率显示框可知,当前预览图片的变焦倍率为3倍变焦倍率,预览区域中的第一拍摄对象、第二拍摄对象以及第三拍摄对象相较于图1B中的拍摄对象被放大。当电子设备100检测到针对“更多功能”控件的输入操作(例如,单击),响应该输入操作,电子设备100显示如图1D所示的电子设备13。
如图1D所示,用户界面13包括“更多功能”显示区域131,在该“更多功能”显示区域131中包括:返回控件1311、“一录多得”控件1312、“字幕同步”控件1313、“参考线”控件1314、“微距模式”控件1315以及“设置”控件1316。其中,“一录多得”控件1312用于开启一录多得功能,即:电子设备在拍摄视频的同时,获取全部或部分与该视频的关联图片,并对这些图片进行处理并保存,从而同时得到视频和图片的一种拍摄功能。其中,视频的关联图片可以为该视频精彩画面的图片。当电子设备100检测到针对“一录多得”控件1312的输入操作(例如,单击)后,响应该输入操作,电子设备100显示如图1E所示的用户界面14。
如图1E所示,用户界面14中的“一录多得”控件的颜色相较于用户界面13中“一录多得”控件的颜色不同,用于指示电子设备100已开启一录多得功能。当电子设备100检测到针对返回控件141的输入操作,响应该操作,电子设备100显示如图1G所示的用户界面12。用户界面12的相关叙述请参见上述图1C中用户界面12的相关叙述,本申请实施例在此不再赘述。
在一种可能实现的方式中,当电子设备100检测到针对“一录多得”控件的单击等输入操作后,响应该操作,电子设备100可以如图1F所示的用户界面15,用户界面15包括提示框151,该提示框151用于提示用户一录多得功能已开启。如图1F所示,提示框151包括文字信息“一录多得功能已开启”。如图1F所示,电子设备100检测到针对开始控件的单击操作,响应该操作,电子设备100开始录制视频,并显示如图1H所示的用户界面16。
在一些实施例中,电子设备100可以在用户界面12上显示“一录多得”控件。当检测到针对“一录多得”控件的单击操作后,响应该操作,电子设备100开启一录多得功能。
应当理解的是,上述图1C-图1F仅是对电子设备100开启“一录多得”功能的示例性描述,不应该对本申请实施例的保护范围构成限制,本申请实施例对电子设备100开启“一录多得”功能的方式不做任何限制。
如图1H所示,用户界面16为视频录制界面,该视频录制界面中包括预览图片显示区域161、录制时间显示区域162、变焦倍率显示区域163、变焦倍率调节控件164、暂停录制控件165、停止录制控件166。其中,预览图片显示区域161用于实时显示当前拍摄环境的预览图片,录制时间显示区域162用于显示当前视频的拍摄时长,如图1H所示,由录制时间显示区域162可知,当前视频的录制时间为25s。变焦倍率显示区域163用于显示当前预览图片的变焦倍率,由变焦倍率显示区域163可知,当前预览图片的变焦倍率为3倍变焦倍率。变焦倍率调节控件164用于调节预览图片的变焦倍率,变焦倍率调节控件164的相关叙述请参见上述图1B中变焦倍率调节控件113的相关叙述,本申请实施例在此不再赘述。暂停录制控件165用于用户暂停当前视频录制,当电子设备100检测到针对暂停录制控件165的单击操作后,响应该操作,电子设备100暂停录制视频。停止录制控件166用于用户停止录制视频,如图1H所示,电子设备100检测到针对停止录制控件166的单击操作后,响应该操作,电子设备100停止录制该视频,并且保存该视频以及该视频的关联图片。
上述图1A-图1H对电子设备100在开启“一录多得”功能的情况下,进行视频录制的应用场景进行了介绍。下面,结合图2A-图2K,对播放上述图1A-图1H应用场景中录制视频的应用场景进行介绍,本申请实施例以上述图1A-图1H录制的视频为视频1为例,进行说明。图2A-图2J为本申请实施例提供的一组示例性的用户界面。
如图2A所示,用户界面20为电子设备100的主界面,该主界面包括图库图标201和其它应用图标。检测到针对图库图标201的输入操作(例如,单击),响应该操作,电子设备100显示如图2B所示的用户界面21。
如图2B所示,用户界面21为相册显示界面,该相册显示界面包括“相册”图标211、截屏图标212、视频图标213、图片图标214。其中,截屏图标212为电子设备100存储的截屏文件集合的图标,在检测到用户针对截屏图标212的单击操作的情况下,电子设备100会显示其存储的截屏图片的缩略图,视频图标213为电子设备存储的视频文件集合的图标,在检测到用户针对视频图标213的单击操作的情况下,电子设备100会显示其存储的视频文件的缩略图,图片图标214为电子设备100存储的关联图片文件集合的图标,电子设备100存储的关联图片文件为视频关联图片的文件,在电子设备100检测到针对图片图标214的单击操作后,电子设备100显示其存储的关联图片文件的缩略图。电子设备100检测到针对视频图标213的单击操作,响应该操作,电子设备100显示如图2C所示的用户界面22。
如图2C所示,用户界面22为视频显示界面,该视频显示界面用于显示电子设备100存储的视频文件的缩略图,如图2C所示,在该视频显示界面中,包括视频1的缩略图221。其中,缩略图221包括图片指示图标2221,该图片指示图标2221用于指示视频1包括通过“一录多得”功能获取的关联图片,缩略图221还可以显示视频1的时长,由缩略图221可知,视频1的时长为30秒。电子设备100检测到针对缩略图221的单击操作,响应该操作,电子设备100显示如图2D所示的用户界面23。
如图2D所示,用户界面23为图库APP的视频播放界面,该视频播放界面包括视频显示区域231、第一视频播放控件232、图片显示区域233、返回控件234、视频生成控件235、视频分享控件236、视频编辑控件237、视频删除控件238以及“更多”控件239。
其中,视频显示区域231用于显示视频。视频分享控件236用于触发电子设备100分享视频,当电子设备100检测到针对视频分享控件236的单击操作后,响应该操作,电子设备100将视频进行转发。第一视频播放控件232用于触发视频1的播放。视频编辑控件237用于触发电子设备编辑视频,在电子设备100检测到针对视频编辑控件237的单击操作的情况下,电子设备100可以编辑视频1(例如,对该视频进行剪辑等操作)。视频删除控件238用于用户删除该视频,在电子设备100检测到针对视频删除控件238的单击操作的情况下,电子设备100可以删除视频1。
图片显示区域233用于显示视频1关联图片的缩略图,该图片显示区域233可以包括图片1~图片4的缩略图,图片1为视频1的缩略图(可以为视频1的封面),图片2~图片4为视频1的关联图片,关联图片可以为视频1拍摄过程中的精彩瞬间图片。当然,图片显示区域233中视频1的关联图片可以更多或者更少。在图片显示区域233中还可以包括指示图标2331,指示图标2331用于定位图片显示区域233中的缩略图。如图2D所示,当 前指示图标2331在图片1的缩略图2332的上方,用于表示当前界面为视频1的播放界面。图片显示区域233还可以包括滑动控件2333。电子设备100检测到针对第一视频播放控件232的单击操作后,响应该操作,电子设备100播放视频1,并显示如图2E所示的用户界面24。
如图2E所示,用户界面24为视频播放界面,该视频播放界面用于播放视频。在该视频播放界面中,包括进度条241和暂停控件242。其中,进度条241用于显示视频1当前的播放进度,由进度条241可知,视频1的播放进度为15秒。暂停控件242用于暂停播放视频,在电子设备100检测到针对暂停控件242的单击操作的情况下,响应该操作,电子设备100可以暂停视频1的播放。电子设备100检测到针对滑动控件243的单击操作,电子设备100显示如图2F所示的用户界面25。
在一些实施例中,由于图片显示区域(例如上述图片显示区域233)的显示空间有限,只能显示固定数量的缩略图(例如,只能显示4张图片的缩略图),当视频关联图片的数量大于图片显示区域可显示缩略图的最大数量的情况下,用户可以通过单击滑动控件的方式来查看其它图片的缩略图。
如图2F所示,用户界面25为视频播放界面,在该视频播放界面中,图片显示区域251显示图片1、图片3~图片5的缩略图。相较于图2E中的图片显示区域,图2F中的图片显示区域显示图片5的缩略图,不显示图片2的缩略图。这样,用户可以查看图片5的缩略图。
在一些实施例中,用户还可以通过其它方式查看视频1的其他图片的缩略图。示例性的,在图2E中,当电子设备100检测到用户针对图片显示区域的输入操作(例如,向左滑动),响应该操作,电子设备100显示如图2F所示的用户界面25。
在图2F中,当电子设备100检测到针对指示图标252的输入操作(例如,向左滑动,使得指示图标252在图片2的缩略图253的上方),响应该操作,电子设备100显示如图2G所示的用户界面26。
如图2G所示,用户界面26为图片显示界面,该图片显示界面包括图片预览区域261和功能设置区域262、图片显示区域263、返回控件264。图片预览区域261用于显示上述图2F中图片3的缩略图,此时指示图标265在图片3的缩略图2631的上方,功能设置区域262包括分享控件2621、编辑控件2622、删除控件2623以及更多控件2624。分享控件2621用于用户分享图片3,当电子设备100检测到针对分享控件2621的单击操作后,响应该操作,电子设备100将图片3进行转发。编辑控件2622用于用户编辑图片3,当电子设备100检测到针对视频编辑控件2622的单击操作后,响应该操作,电子设备100编辑图片3(例如,调节图片3的亮度或对比度等操作)。删除控件2623用于用户删除图片3,当电子设备100检测到针对删除控件2623的单击操作后,响应该操作,电子设备100删除图片3。当电子设备100检测到针对返回控件264的单击操作后,响应该操作,电子设备100显示如图2H所示的用户界面27。
在一种可能实现的方式中,当电子设备100检测到针对指示图标265的输入操作(例如,指示图标被移动到图片1的缩略图上方时),响应该操作,电子设备100也可以显示如图2H所示的用户界面27。
如图2H所示,用户界面27为视频播放界面,该视频播放界面与上述图2F中的用户界面25状态一致,即:用户界面27播放的视频1的进度为15秒,用户界面27播放的视频1的进度也为15秒。当电子设备100检测到针对视频生成控件271的单击操作,响应该操作,电子设备100显示如图2I所示的用户界面28。
如图2I所示,用户界面28为视频播放界面,用户界面28包括第一提示框281,第一提示框281用于提示用户是否生成视频2,如图2I所示,第一提示框281显示文字信息“是否生成第二视频”。第一提示框281包括确定控件2811和取消控件2812,确定控件2811用于触发电子设备100生成视频2,当电子设备100检测到针对确定控件2811的单击操作后,响应该操作,电子设备100显示如图2J所示的用户界面29。
如图2J所示,用户界面29为视频播放界面,该视频播放界面包括视频预览区域291、保存控件292。其中,视频预览区域291用于显示视频2的画面。电子设备100检测到针对保存控件292的单击操作后,响应该操作,电子设备100保存视频2。
电子设备100保存视频2后,用户可以在图库中查找并浏览视频2。示例性的,在电子设备100保存视频2之后,用户可以在视频显示界面中查看视频2。例如,如图2K所示,用户界面30相较于用户界面22多了视频2的缩略图301。当电子设备100检测到针对缩略图301的输入操作的情况下,电子设备100可以播放视频2。
下面对电子设备100的结构进行介绍。请参阅图3,图3是本申请实施例提供的电子设备100的硬件结构示意图。
电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。
其中,传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个 或多个处理器中。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如Wi-Fi网络),蓝牙(BlueTooth,BT),BLE广播,全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图片处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图片,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图片。ISP还可以对图片的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
数字信号处理器用于处理数字信号,除了可以处理数字图片信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图片识别,人脸识别,语音识别,文本理解等。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。在另一些实施例中,电子设备100可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备100还可以设置三个,四个或更多麦克风170C,实现采集声音信号、降噪、还可以识别声音来源,实现定向录音功能等。
在一些实施例中,电子设备100还可以包括按键190、马达191、指示器192以及SIM卡接口195(或eSIM卡)等一项或多项,本申请实施例对此不作任何限制。
上述图1A-图2K对本申请实施例所述的图片显示方法所涉及的应用场景进行了介绍。下面,结合图4,对本申请实施例提供的一种图片显示方法的框架进行介绍。请参见图4,图4是本申请实施例提供的一种图片显示方法的框架流程图,下面,对该框架图进行简要介绍。
电子设备在开启相机APP的视频录制功能之后,进行视频录制(即步骤S401)。在视频录制过程中,电子设备可以通过AI算法识别当前的场景(即步骤S402),以便在拍摄过程中获取一张或多张与该视频相关的精彩瞬间图片(即步骤S403)。在视频录制结束后,电子设备保存录制的视频以及与该视频对应的精彩瞬间图片(即步骤S404)。
电子设备可以通过图库APP查找录制的视频,并触发电子设备进入视频播放界面,从而实现浏览录制视频(即步骤S405),在视频播放界面中,显示与该视频对应的精彩瞬间图片的缩略图(在其中一张缩略图的上方有一个指示控件)以及显示AI视频控件,在该控件被触发的情况下,电子设备可以生成AI视频。在浏览录制视频的过程中:
一方面,用户可以通过移动指示控件(指示控件可以为上述图2D中的指示图标2331)的位置来浏览任意一张精彩瞬间图片(即步骤S406),此时,电子设备会从视频播放界面切换到图片浏览界面,从而显示指示控件指示的缩略图对应的精彩瞬间图片。在图片浏览界面中,用户可以对该精彩瞬间图片进行分享、编辑、删除、复制、云同步等操作(即步骤S407)。
另一方面,用户可以通过点击视频播放界面中的AI视频控件,然后电子设备可以基于该视频生成一个AI视频(即步骤S408)。其中,该AI视频可以理解为电子设备基于该视频生成的一段自动匹配模板(有特效、音乐等)的视频。在电子设备生成AI视频后,电子设备可以在预览界面中展示(即步骤S409)该AI视频,在预览该AI视频的过程中,用户可以对该AI视频进行编辑(例如,裁剪、添加动效等),在点击预览界面的保存控件后,电子设备可以保存该AI视频(即步骤S410)。
下面,结合图5,对本申请实施例提供的一种图片显示方法的具体流程进行叙述。请参见图5,图5是本申请实施例提供的一种图片显示方法的流程图,具体流程如下:
步骤S501:显示第一拍摄预览界面,第一拍摄预览界面包括第一画面预览框和第一目标控件,第一预览框显示摄像头实时采集的画面。
具体地,第一拍摄预览界面为电子设备拍摄视频之前的预览界面。示例性的,第一拍摄预览界面可以为上述图1B所示的用户界面11。第一画面预览框用于显示摄像头实时采集的画面,即:拍摄环境的实时画面。示例性的,第一画面预览框可以为上述图1B中的预览区域111。第一控件用于用户启动电子设备开始录制视频。示例性的,第一目标控件可以为上述图1B中的开始控件116。
步骤S502:在检测到针对第一拍摄预览界面的第一操作后,电子设备开启一录多得功能。所述一录多得功能为电子设备在第一视频的录制过程中获取一张或多张关联图片的功能。
具体地,在第一拍摄预览界面中可以包括第二目标控件,第二目标控件用于触发电子设备开启“一录多得”的功能。其中,一录多得为:电子设备在第一视频的录制过程中获取一张或多张目标图片。
关联图片为在录制第一视频的过程中所拍摄的图片,例如在录制第一视频的过程中用户手动拍摄的图片,或者,在录制第一视频的过程中电子设备自动拍摄的图片。示例性的,第一视频的关联图片可以为第一视频的精彩瞬间图片,也可以为对第一视频的精彩瞬间图片进行AI处理的图片。例如,AI处理可以包括:识别拍摄场景,并为这些精彩瞬间图片匹配与拍摄场景相适应的滤镜等操作。
第一操作可以为用户针对第二目标控件的单击操作,电子设备检测到用户针对第二目标控件的第一操作后,响应该第一操作,电子设备开启一录多得功能。
在一些实施例中,第二目标控件可以为“设置选项”中的一个设置项,“一录多得”功能可以在电子设备录制视频时默认开启,若检测到针对第二目标控件的单击操作的情况下,电子设备可以关闭“一录多得”功能。
在一种可能实现的方式中,第一拍摄预览界面可以包括第一设置控件。示例性的,第 一设置控件可以为上述图1B中的“更多功能”控件119。当电子设备100检测到针对第一设置控件的单击操作后,电子设备显示第一功能显示框,该第一功能显示框中包括所述第二目标控件,第一功能显示框可以为上述图1D中的“更多功能”显示区域131,第二目标控件可以为上述图1D中的“一录多得”控件1312,第一操作可以为针对上述图1D中针对“一录多得”控件1312的输入操作(例如,单击)。
步骤S503:在检测到针对第一目标控件的第二操作后,显示第一拍摄界面,开始拍摄第一视频。
具体地,电子设备在检测到针对第一目标控件的第二操作后,显示第一拍摄界面,开始拍摄第一视频。其中,第一拍摄界面与第一拍摄预览界面不同,第一拍摄界面可以包括第二画面预览框和第三目标控件,第二画面预览框用于显示拍摄对象的画面。
示例性的,第二操作可以为上述图1F中针对开始控件的单击操作,第一拍摄界面可以为上述图1H中的用户界面16,第二画面预览框可以为上述图1H所示的预览图片显示区域161,第三目标控件可以为上述图1H中的停止录制控件166。
步骤S504:在开始拍摄第一视频的第一时刻,在第一拍摄界面的第二画面预览框中显示摄像头实时采集的画面。
步骤S505:在第二时刻,在检测到针对第一拍摄界面中第三目标控件的第三操作后,保存第一视频和其对应的关联图片。
具体地,第三操作可以为上述图1H中针对停止录制控件166的单击操作,当电子设备检测到针对第三目标控件的第三操作后,电子设备停止第一视频的录制,保存第一视频以及该视频对应的关联图片。其中,关联图片为在第一视频录制过程中,电子设备获取的图片。示例性的,该关联图片可以为第一视频录制过程中的精彩瞬间图片。
在电子设备检测到针对第三目标控件的第三操作后,相机会生成配置文件和第一标识,并将第一标识作为元信息插入到第一视频和第一视频对应的媒体文件中。第一视频对应的媒体文件在电子设备检测到针对第三目标控件的第三操作后生成的,第一视频对应的媒体文件包括:关联图片的缩略图、配置文件。其中,配置文件可以用于电子设备基于第一视频生成第二视频,第二视频可以为上述图4实施例中所述的AI视频。
示例性的,第一标识可以为分组ID,在视频录制结束后,相机会给视频生成分组ID,并把该分组ID插入到视频文件中,分组ID用于标识不同视频,在不同电子设备之间具备唯一性。例如,在视频1录制结束后,电子设备可以给视频1生成分组ID 0010,0010为视频1的唯一标识,在视频2录制结束后,电子设备可以给视频2生成分组ID 0020,0020为视频2的唯一标识。分组ID的格式可以为通用唯一识别码(Universally Unique Identifier,UUID)格式的二进制串。本申请实施例以第一标识为分组ID为例,进行说明。
电子设备可以将分组ID与第一视频存储路径的映射关系等信息写入媒体文件库。在录制第一视频的过程中,若电子设备开启“一录多得”的功能且检测到存在第一视频的关联图片,电子设备可以将分组ID、关联图片存储路径、关联图片的缩略图三者的映射关系写入媒体信息库(例如,MediaLibrary)。在录制第一视频的过程中,若电子设备检测到存在第一视频的配置文件,电子设备可以生成第二视频缩略图,电子设备可以将分组ID、配置文件的存储路径以及第二视频缩略图三者的映射关系写入媒体信息库(例如,MediaLibrary)。
示例性的,表1为视频分组ID与该视频文件(即录制的原始视频文件)存储路径的示例性映射关系表。表2为关联图片缩略图、视频分组ID、关联图片存储路径三者的映射关系表。表3为第二视频缩略图、视频配置文件存储路径以及视频分组ID三者的映射关系表。表1、表2和表3如下所示:
表1
视频 分组ID 存储路径
视频1 001 XXX/XXX/XXX
…… …… ……
表2
图片 分组ID 缩略图 存储路径
图片1 001 缩略图1 XXX/XXX/001
图片2 001 缩略图2 XXX/XXX/002
图片3 001 缩略图3 XXX/XXX/003
图片4 001 缩略图4 XXX/XXX/004
图片5 001 缩略图5 XXX/XXX/005
…… …… …… ……
表3
缩略图 分组ID 配置文件存储路径
缩略图A 001 XXX/XXX/005
…… …… ……
存储路径为第一视频或关联图片或配置文件在电子设备内存中的地址。电子设备可以通过读取存储路径,可以找到第一视频或关联图片或配置文件的存储位置,调取第一视频、配置文件或关联图片,使得第一视频或关联图片可以在电子设备屏幕上显示,或者,电子设备可以基于配置文件生成第二视频。
示例性的,若图片1的存储路径为“Download/email/Video”,说明图片1存储在电子设备Download文件夹下的email文件中,且在email文件下的Video文件中。
下面,对配置文件进行说明。第一视频的配置文件可以用于电子设备生成第二视频,该配置文件为视频描述文件,包括视频剪辑信息,该视频剪辑信息与第一视频相关。例如,该视频剪辑信息可以为用于指示电子设备剪辑第一视频的时间信息。示例性,视频剪辑信息的格式可以为“30-32、60-64、100-115”,该视频剪辑信息表示,剪辑第一视频的第30秒~32秒的视频片段、第60秒~64秒的视频片段、100秒~115秒的视频片段,并将这三个视频片段进行合成和处理,可以得到时长为21秒的第二视频。
可选地,该配置文件还可以包括动效处理信息,该动效处理信息用于指示电子设备给第二视频配置动效等操作。示例性的,电子设备给第二视频配置动效可以为:电子设备给第二视频添加动效场景(例如,下雨场景),这样,在第二视频播放过程中会显示下雨的场景。
步骤S506:响应第一输入操作,显示第一视频播放界面,所述第一视频播放界面包括第一显示框、第二显示框以及第一控件,所述第一显示框用于显示第一视频的画面,所述第二显示框用于显示第一目标图集中的缩略图,所述第一目标图集包括N张关联图片的缩略图。
具体地,第一输入操作可以为上述图2D针对缩略图221的输入操作,第一视频播放界面可以为上述图2D中图库APP的视频播放界面,第一显示框可以为上述图2D中的视频显示区域231,第二显示框可以为上述图2D中的图片显示区域233,第一控件可以为上述图2D中的视频生成控件235。其中,第一目标图集为所有关联图片的集合。
下面,结合图6,对电子设备在第一视频播放界面上显示第二显示框和第一控件的具体流程进行说明,请参见图6,图6为本申请实施例提供的一种第二显示框和第一控件显示的流程图,具体流程如下:
步骤S601:电子设备在媒体文件库中读取分组ID与第一视频存储路径的映射信息。
具体地,该映射信息用于表征第一视频与其分组ID对应的映射关系。示例性的,分组ID与第一视频存储路径的映射信息可以为上述步骤S505中的表1。如上述步骤S505所述,在第一视频录制结束后,相机会给第一视频生成一个分组ID,并将该分组ID插入到第一视频文件中,然后,电子设备会将该分组ID与第一视频文件存储路径之间的映射关系写入媒体文件库中。
在检测到第一输入操作后,电子设备读取媒体文件库中的第二信息,从而获取第一视频的分组ID。
步骤S602:在读取到第一视频的分组ID(第一标识)的情况下,电子设备在媒体信息库中读取第一信息。
具体地,第二信息可以为第一视频的关联图片缩略图、第一视频分组ID与第一视频的关联图片存储路径三者之间的映射关系,该映射关系可以为上述步骤S505中的表2。
步骤S603:在读取到第一视频关联图片存储路径的情况下,电子设备在第一视频播放界面上显示第二显示框,并在所述第二显示框内显示第一目标图集中的缩略图。
具体地,在电子设备基于第一视频的分组ID,在媒体信息库中读取到第一视频关联图片的存储路径的情况下,电子设备可以判断第一视频存在关联图片,在第一视频播放界面上显示第二显示框,并在所述第二显示框内显示第一目标图集中的缩略图。
其中,第二目标显示框显示缩略图的最大数量为M,第一目标图集中缩略图的数量为N,第二显示框显示第一目标图集中的缩略图可以分为以下两种情况:
第一种情况:当N大于或等于M时,第二显示框显示第一目标图集中M张缩略图。例如,如图2D所示,假设第一视频有5张关联图片,由于图片显示区域最多只能显示4张缩略图,因此,在图片显示区域只显示了图片1~图片4的缩略图。
第二种情况:当N小于M时,第二显示框显示第一目标图集中的所有缩略图。例如,如果第一视频只有3张关联图片,且第二显示框最多可以显示4张缩略图,那么,可以在第二显示框中显示这3张关联图片的缩略图。
步骤S604:在未读取到第一视频关联图片存储路径的情况下,电子设备在第一视频播放界面上不显示第二显示框。
具体地,在电子设备基于第一视频的分组ID,在媒体信息库中未读取到第一视频关联图片的存储路径的情况下,电子设备可以判断第一视频不存在关联图片。此时,电子设备在第一视频播放界面上不显示第二显示框。
示例性的,如图7A所示,图7A为在未读取到第一视频关联图片存储路径的情况下,第一视频播放界面的示意图。由图7A可知,相较于上述图2D中的用户界面23,图7A中的用户界面70未显示图片显示区域233。
步骤S605:在读取到第一视频的分组ID(第一标识)的情况下,电子设备在媒体信息库中读取第三信息。
具体地,第三信息可以为第一视频配置文件的存储路径、第一视频的分组ID以及第二视频的缩略图,三者之间的映射关系,第三信息可以为上述图5实施例中步骤S505的表3。
步骤S606:在读取到第一视频配置文件的存储路径的情况下,电子设备在第一视频播放界面中显示第一控件。
具体地,在电子设备读取到第一视频配置文件的存储路径的情况下,电子设备确定第一视频存在配置文件,可以基于该配置文件生成第二视频。因此,电子设备可以在第一视频播放界面上显示第一控件,该第一控件可以显示第二视频的缩略图。其中,第二视频的缩略图可以基于第一视频的配置文件得到。
例如,当第一视频拍摄完成后,若该配置文件用于指示生成的第二视频中包括第一视频中第30秒~32秒视频片段的内容,当电子设备可以获取第一视频中第30秒~第32秒这个视频片段内任意一帧图片,将该帧图片作为第二视频的封面,生成该帧图片的缩略图,将该帧图片的缩略图存储到媒体信息库中。当电子设备要在第一视频播放界面上显示第一控件时,电子设备从媒体信息库中调用该缩略图,并将其显示在第一控件上。
在一种可能实现的方式中,第二视频的缩略图也可以为任意一张关联图片的缩略图,也可以为电子设备相册中任意一张图片的缩略图,本申请实施例对于第二视频的缩略图来源不做限制。
步骤S607:在未读取到第一视频配置文件的存储路径的情况下,电子设备在第一视频播放界面中不显示第一控件。
具体地,在电子设备未读取到第一视频配置文件的存储路径的情况下,电子设备判断第一视频不存在配置文件,因此,电子设备不具备基于配置文件生成第二视频的功能,电子设备在第一视频播放界面中不显示第一控件。
示例性的,如图7B所示,图7B为在未读取到第一视频配置文件的存储路径的情况下,第一视频播放界面的示意图。由图7B所示,相较于图2D中的用户界面23,图7B中的用户界面71未显示视频生成控件235。
上述步骤S601-步骤S607介绍了电子设备响应第一输入操作后,在第一视频播放界面上显示第一控件和第二显示框的具体流程。
应当理解的是,步骤S506以电子设备检测到第一视频存在第一标识、第一视频的配置文件以及关联图片为例进行说明。可以理解的是,电子设备在响应第一输入操作后,电子设备还可以在第一视频播放界面上显示第一控件以及第二显示框。
可选地,在关联图片的数量N大于M的情况下,当电子设备检测到针对第二显示框的 第四输入操作后,响应该第四输入操作,电子设备切换第二显示框上的缩略图。
示例性的,第四输入操作可以为上述图2E中针对滑动控件243的单击操作,在响应第四输入操作之前,第二显示框显示的缩略图可以为上述图2F中图片显示区域中图片1的缩略图~图片4的缩略图,在响应第四输入操作之后,第二显示框显示的缩略图可以为上述图2G中图片显示区域中图片1的缩略图,图片3的缩略图~图片5的缩略图。
步骤S507:检测到针对所述第二显示框中第一定位控件的第二输入操作,所述电子设备显示第一预览界面,所述第一预览界面包括第一图片预览框,所述第一图片预览框用于显示第一关联图片,所述第一关联图片为所述第一缩略图对应的图片,所述第一预览界面与所述第一视频播放界面不同。
具体地,第一预览界面用于显示第一缩略图对应的关联图片,示例性的,第一缩略图可以为上述图2F中的缩略图253,第二输入操作可以为上述图2F针对指示图标252的输入操作(例如,向左滑动,使得指示图标252在图片2的缩略图253的上方),第一预览界面可以为上述图2G中的图片显示界面,第一图片预览框可以为上述图2G中的图片预览区域261,第一定位控件可以为上述图2F中的指示图标252。
电子设备在检测到针对第一定位控件的第二输入操作后,电子设备基于第一缩略图对应的关联图片的存储路径,在对应的文件夹中调取该关联图片的文件,并将该关联图片显示在第一预览界面上。
可选地,第一预览界面还可以包括第一缩略图,示例性的,第一缩略图可以为上述图2G中的图片1的缩略图2632,当电子设备检测到针对第一定位控件的第三输入操作后,电子设备显示第一视频播放界面,此时,第一视频播放界面的状态可以与被切换到第一预览界面之前的状态相同,第一视频播放界面的状态包括第一视频的播放进度。示例性的,第三输入操作可以为上述图2G中,针对指示图标265的输入操作(例如,指示图标被移动到图片1的缩略图2632上方)。
可选地,第一预览界面还可以包括第四控件。示例性的,第四控件可以为上述图2G中的返回控件264,当电子设备检测到针对第四控件的输入操作(例如,单击),响应该操作,电子设备可以由第一预览界面切换回第一视频播放界面。此时,第一视频播放界面的状态可以与被切换到第一预览界面之前的状态相同,第一视频播放界面的状态包括第一视频的播放进度。
步骤S508:检测到针对第一控件的第三输入操作后,显示第二视频播放界面,第二视频播放界面用于显示第二视频的画面,第二视频为基于第一视频得到的视频。
具体地,第二视频播放界面为播放第二视频的界面,第二视频为电子设备基于第一视频及其配置文件生成的视频。示例性的,第三输入操作可以为上述图2H中针对视频生成控件271的单击操作,第二视频播放界面可以为上述图2J中的视频播放界面。
当电子设备检测到针对第一控件的第三输入操作后,电子设备可以根据第一视频的配置文件的存储路径,在电子设备的内存中读取该配置文件,并基于第一视频和其对应的配置文件生成第二视频,并在第二视频播放界面上显示第二视频。其中,第二视频播放界面包括第一视频预览框,第一视频预览框用于显示所述第二视频的画面,示例性的,第一视频预览框可以为上述图2J中的视频预览区域291。
可选地,第二视频播放界面包括第三控件,当电子设备检测到针对第三控件的第五输入操作后,电子设备保存第二视频。示例性的,第三控件可以为上述图2J中的保存控件292,第五输入操作可以为上述图2J中,针对保存控件292的单击操作。
本申请实施例,通过将视频与该视频关联图片在同一用户界面上进行展示,用户浏览视频或播放视频的过程中可以通过单击视频播放界面上与该视频关联图片的缩略图,从视频播放界面切换到图片预览界面,可以浏览该缩略图对应的图片。通过上述方式,用户可以在观看视频的情况下,可以快速浏览该视频相关的图片,而不必执行“退出视频播放-进入图库-查找关联图片-浏览图片”等繁琐操作,大大节约了用户的时间,提高了用户体验。此外,在浏览图片时,可以通过单击电子设备上的Back键或者返回控件,由图片预览界面重新切换回视频播放界面,切换后的视频播放界面播放的视频的进度与该视频播放界面切换到图片预览界面之前的视频播放进度相同,从而避免在切换回视频播放界面后,电子设备重新播放视频,导致用户重新寻找视频原来的播放进度,进而浪费用户时间,降低用户体验的问题。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘Solid State Disk)等。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。
总之,以上所述仅为本发明技术方案的实施例,并非用于限定本发明的保护范围。凡根据本发明的揭露,所作的任何修改、等同替换、改进等,均应包含在本发明的保护范围之内。

Claims (10)

  1. 一种图片显示方法,其特征在于,应用于电子设备,所述方法包括:
    启动图库应用;
    显示所述图库的第一界面,所述第一界面包括第一缩略图,所述第一缩略图为第一视频对应的缩略图;
    检测到用户针对所述第一缩略图的第一输入操作;
    根据所述第一视频的分组ID获取与所述第一视频关联的N张关联图片的缩略图,所述关联图片为所述第一视频录制过程中抓拍图片,所述分组ID用于标识所述第一视频,N为大于0的整数;
    显示所述第一视频的视频播放界面,所述视频播放界面包括第一显示框和第二显示框,所述第一显示框用于显示所述第一视频的画面,所述第二显示框用于显示所述N张关联图片的缩略图。
  2. 如权利要求1所述的方法,其特征在于,所述检测到用户针对所述第一缩略图的第一输入操作之后,所述根据所述第一视频的分组ID获取与所述第一视频关联的N张关联图片的缩略图之前,还包括:
    在媒体文件库中读取第一信息,所述第一信息包括所述第一视频存储路径与所述第一视频的分组ID之间的映射关系;
    根据所述第一信息获取所述第一视频的分组ID。
  3. 如权利要求1-2任一项所述的方法,其特征在于,所述根据所述第一视频的分组ID获取与所述第一视频关联的N张关联图片的缩略图,包括:
    根据所述第一视频的分组ID在媒体信息库中读取第二信息;所述第二信息包括所述第一视频的分组ID、所述N张关联图片的存储路径以及所述N张关联图片的缩略图三者之间的映射关系;
    在读取到所述N张关联图片的存储路径的情况下,获取与所述第一视频关联的N张关联图片的缩略图。
  4. 如权利要求1-3任一项所述的方法,其特征在于,所述视频播放界面包括第一定位控件,所述显示所述第一视频的视频播放界面之后,还包括:
    检测到用户针对所述第一定位控件的第二输入操作;
    所述第一定位控件指示第一目标缩略图,所述第一目标缩略图为所述第二显示框中的缩略图;
    基于所述第一目标缩略图在第二信息中获取第一关联图片的存储路径;第二信息为媒体信息库中的信息,所述第二信息包括所述第一视频的分组ID、所述N张关联图片的存储路径以及所述N张关联图片的缩略图三者之间的映射关系,所述第一目标缩略图为所述第一关联图片的缩略图;
    根据所述第一关联图片的存储路径调取所述第一关联图片;
    显示第一预览界面,所述第一预览界面包括第一图片预览框和所述第二显示框;所述第一图片预览框用于显示第一关联图片,所述第一定位控件指示第一目标缩略图,所述第一目标缩略图为所述第二显示框中的缩略图,所述第一目标缩略图为所述第一关联图片的缩略图。
  5. 如权利要求1-4任一项所述的方法,其特征在于,所述视频播放界面还包括第一控件,所述第一控件为存在配置文件情况下显示的控件,所述第一控件用于触发生成第二视频,所述第二视频与所述第一视频不同。
  6. 如权利要求5所述的方法,其特征在于,所述显示所述第一视频的视频播放界面之后,还包括:
    检测到用户针对所述第一控件的第三输入操作;
    根据所述第一视频的分组ID读取第三信息,所述第一信息包括所述第一视频的配置文件存储路径、分组ID之间的映射关系;
    根据所述第一视频的配置文件存储路径调取所述第一视频的配置文件;
    基于所述第一视频的配置文件对所述第一视频进行处理,得到第二视频;
    显示第二视频的视频播放界面,所述第二视频的视频播放界面与所述第一视频的视频播放界面不同,所述第二视频的视频播放界面包括第一视频预览框,所述第一视频预览框用于显示所述第二视频的画面。
  7. The method according to claim 4, wherein the first preview interface comprises the second display frame, the second display frame comprises a second target thumbnail, and after the displaying a first preview interface, the method further comprises:
    detecting an input operation performed by the user on the second target thumbnail; and
    displaying the video play interface of the first video.
  8. The method according to any one of claims 1 to 7, wherein a maximum quantity of thumbnails of the associated pictures displayed in the second display frame is M, the second display frame comprises a first switching control, and after the displaying a video play interface of the first video, the method further comprises:
    detecting, when N is greater than M, an input operation performed by the user on the first switching control; and
    switching the thumbnails in the second display frame, wherein the thumbnails displayed in the second display frame before the switching are different from the thumbnails displayed in the second display frame after the switching.
  9. An electronic device, comprising a memory, a processor, and a touchscreen, wherein:
    the touchscreen is configured to display content;
    the memory is configured to store a computer program, and the computer program comprises program instructions; and
    the processor is configured to invoke the program instructions, so that the electronic device performs the method according to any one of claims 1 to 8.
  10. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the method according to any one of claims 1 to 8 is implemented.
PCT/CN2022/143658 2022-02-28 2022-12-29 Picture display method and related electronic device WO2023160238A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22924557.6A EP4276619A1 (en) 2022-02-28 2022-12-29 Image display method and related electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210188529.5 2022-02-28
CN202210188529.5A CN116700846B (zh) 2022-02-28 2022-02-28 一种图片显示方法及相关电子设备

Publications (1)

Publication Number Publication Date
WO2023160238A1 true WO2023160238A1 (zh) 2023-08-31

Family

ID=87764673

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/143658 WO2023160238A1 (zh) Picture display method and related electronic device

Country Status (3)

Country Link
EP (1) EP4276619A1 (zh)
CN (1) CN116700846B (zh)
WO (1) WO2023160238A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8078603B1 (en) * 2006-10-05 2011-12-13 Blinkx Uk Ltd Various methods and apparatuses for moving thumbnails
CN106792272A * 2016-11-28 2017-05-31 Vivo Mobile Communication Co., Ltd. Video thumbnail generation method and mobile terminal
CN108833787A * 2018-07-19 2018-11-16 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for generating short video
CN111061912A * 2018-10-16 2020-04-24 Huawei Technologies Co., Ltd. Video file processing method and electronic device
CN113810608A * 2021-09-14 2021-12-17 Honor Device Co., Ltd. Photographing method, electronic device, and storage medium
WO2022007724A1 * 2020-07-06 2022-01-13 Beijing ByteDance Network Technology Co., Ltd. Video processing method and apparatus, device, and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106303669B * 2016-08-17 2019-08-09 Shenzhen Xinlianxun Technology Co., Ltd. Video editing method and apparatus
CN111541936A * 2020-04-02 2020-08-14 Tencent Technology (Shenzhen) Co., Ltd. Video and image processing method and apparatus, electronic device, and storage medium

Also Published As

Publication number Publication date
EP4276619A1 (en) 2023-11-15
CN116700846B (zh) 2024-04-02
CN116700846A (zh) 2023-09-05

Similar Documents

Publication Publication Date Title
WO2021078284A1 (zh) Content continuation method and electronic device
WO2021164445A1 (zh) Notification processing method, electronic device, and system
JP7355941B2 (ja) Photographing method and terminal in long-focus scenario
WO2020155014A1 (zh) Smart home device sharing system and method, and electronic device
WO2021143269A1 (zh) Photographing method in long-focus scenario and mobile terminal
WO2020119464A1 (zh) Video splitting method and electronic device
WO2022135527A1 (zh) Video recording method and electronic device
CN112261624B (zh) Method for transferring files within an application, electronic device, and system
US20230208790A1 (en) Content sharing method, apparatus, and system
CN111741366A (zh) Audio playback method and apparatus, terminal, and storage medium
CN115529413A (zh) Photographing method and related apparatus
WO2022127670A1 (zh) Call method, related device, and system
US20230273902A1 (en) File Opening Method and Device
CN116724560A (zh) Cross-device collaborative photographing method, related apparatus, and system
CN113747056A (zh) Photographing method and apparatus, and electronic device
CN115514882A (zh) Distributed photographing method, electronic device, and medium
WO2023160238A1 (zh) Picture display method and related electronic device
WO2022222773A1 (zh) Photographing method, related apparatus, and system
CN114079691B (zh) Device identification method and related apparatus
JP2023544561A (ja) Message display method and electronic device
KR20170083905A (ko) Mobile terminal and control method thereof
EP4310646A1 (en) Device discovery method and system and electronic device
WO2022161058A1 (zh) Panoramic image photographing method and electronic device
WO2023202431A1 (zh) Directional sound pickup method and device
WO2022228010A1 (zh) Cover generation method and electronic device

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2022924557

Country of ref document: EP

Effective date: 20230807

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22924557

Country of ref document: EP

Kind code of ref document: A1