WO2020147521A1 - Method and apparatus for displaying images - Google Patents
Method and apparatus for displaying images
- Publication number: WO2020147521A1 (application PCT/CN2019/127598)
- Authority: WO — WIPO (PCT)
- Prior art keywords: time point, target, video, time, frame
Classifications
- G11B27/102 — Programmed access in sequence to addressed parts of tracks of operating record carriers
- H04N21/472 — End-user interface for requesting content, additional data or services, or for interacting with content
- H04N21/47217 — End-user interface for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
- G06F3/0482 — Interaction with lists of selectable items, e.g. menus
- G06F3/04855 — Interaction with scrollbars
- G06F3/04883 — Touch-screen or digitiser interaction techniques for inputting data by handwriting, e.g. gesture or text
- G06V20/46 — Extracting features or characteristics from video content, e.g. video fingerprints, representative shots or key frames
- G11B20/00007 — Time or data compression or expansion
- G11B2020/00072 — Time or data compression or expansion where the compressed signal includes a video signal
- H04M1/72403 — Mobile-telephone user interfaces with local support of applications that increase the functionality
- H04M1/72442 — Mobile-telephone user interfaces with local support of applications for playing music files
- H04N19/44 — Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
- H04N21/426 — Internal components of the client and characteristics thereof
- H04N21/44 — Processing of video elementary streams
- H04N21/4402 — Processing involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/442 — Monitoring of processes or resources
- H04N21/44213 — Monitoring of end-user related data
- H04N21/44222 — Analytics of user selections, e.g. selection of programs or purchase activity
- H04N21/8455 — Structuring of content involving pointers to the content, e.g. pointers to the I-frames of the video stream
- H04N5/76 — Television signal recording
Definitions
- Embodiments of the present disclosure relate to the field of computer technology, and in particular to methods and apparatuses for displaying images.
- An existing approach is to preprocess the video so that more key frames are set among the video frames it contains. Because key frames decode faster, setting more of them lets the user quickly preview the video by dragging the progress bar to locate the video frame that needs to be processed.
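The trade-off behind this preprocessing can be sketched with a simple calculation (illustrative, not from the patent): with a closed group of pictures (GOP) of size g, a random seek must decode up to g − 1 non-key frames after the nearest preceding key frame, so denser key frames buy faster seeking at the cost of a larger file.

```python
def worst_case_decode_frames(gop_size: int) -> int:
    """Frames that must be decoded after the key frame, in the worst case,
    to reach an arbitrary position inside one GOP."""
    return gop_size - 1

# Halving the GOP size roughly halves worst-case seek work,
# at the cost of more (and larger) I frames in the stream.
assert worst_case_decode_frames(250) == 249
assert worst_case_decode_frames(25) == 24
```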
- Embodiments of the present disclosure propose methods and apparatuses for displaying images.
- In a first aspect, an embodiment of the present disclosure provides a method for displaying an image.
- The method includes: determining a selected time point at which a user adjusts the playback progress of a target video, where the target video includes a set of key frames; determining a target key frame from the key frame set, where the difference between the time point corresponding to the target key frame and the selected time point satisfies a first preset condition; decoding, based on the target key frame, the video frame corresponding to the selected time point to obtain a decoded video frame; and displaying the decoded video frame in a first target display area.
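A hypothetical sketch of this flow (function names and the binary-search choice are illustrative, not from the patent): given the time points of the key-frame set, the target key frame for a selected time point can be found as the latest key frame at or before that point, after which decoding proceeds forward from it.

```python
import bisect

def find_target_keyframe(keyframe_times, selected_t):
    """Latest key frame at or before the selected time point.
    Assumes keyframe_times is sorted ascending."""
    i = bisect.bisect_right(keyframe_times, selected_t) - 1
    return keyframe_times[max(i, 0)]

keyframes = [0.0, 2.0, 4.0, 6.0]   # time points of the key-frame set
selected = 4.7                     # user's selected time point
target = find_target_keyframe(keyframes, selected)
assert target == 4.0               # decode forward from this key frame
```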
- In some embodiments, determining the selected time point at which the user adjusts the playback progress of the target video includes at least one of the following: in response to detecting that the dwell time of a control point used to adjust the playback progress of the target video at the current time point is greater than or equal to a preset time threshold, determining that the current time point is the selected time point; and in response to detecting that the user no longer manipulates the control point, determining that the current time point corresponding to the control point is the selected time point.
- In some embodiments, before determining the selected time point at which the user adjusts the playback progress of the target video, the method further includes: detecting the adjusted time point in real time while the user adjusts the playback progress of the target video; and determining a target time point based on the time point detected in real time, and displaying the video frame corresponding to the determined target time point in a second target display area.
- In some embodiments, determining the target time point based on the time point detected in real time includes: determining, from the time points respectively corresponding to the key frames in the key frame set, a time point whose distance to the detected time point satisfies a second preset condition as the target time point.
- In some embodiments, determining the target time point based on the time point detected in real time includes: selecting a time point from the target time period in which the detected time point falls as the target time point, where the target time period is one of a set of time periods obtained by dividing the playback time of the target video according to the key frame set.
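One plausible reading of this period-based variant (the choice of the period's start as the target is an assumption for illustration): divide the playback time into periods bounded by the key-frame time points, then map the detected time point to its containing period.

```python
def target_time_for(detected_t, keyframe_times, duration):
    """Return the start of the period (bounded by key-frame times and the
    video duration) that contains the detected time point."""
    bounds = list(keyframe_times) + [duration]
    for start, end in zip(bounds, bounds[1:]):
        if start <= detected_t < end:
            return start
    return bounds[-2]  # detected_t == duration: last period

# A point at 4.7 s falls in the period [4.0, 6.0) bounded by key frames.
assert target_time_for(4.7, [0.0, 2.0, 4.0, 6.0], 8.0) == 4.0
```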
- In some embodiments, determining the target time point based on the time point detected in real time includes: acquiring processing capability information of a target processor, where the target processor is used to process video frames included in the target video and the processing capability information characterizes the target processor's ability to process information; and periodically determining the time point detected in real time as the target time point at a preset time interval corresponding to the processing capability information.
- In some embodiments, determining the target time point based on the time point detected in real time, and displaying the video frame corresponding to the determined target time point in the second target display area, includes: determining the currently detected time point as the target time point; and performing the following display steps: decoding the video frame corresponding to the determined target time point to obtain a decoded video frame for display in the second target display area; determining whether the second target display area includes the obtained decoded video frame; and, in response to determining that it does, re-determining the most recently detected time point as the target time point and continuing the display steps with the newly determined target time point.
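These display steps amount to a catch-up loop: decode for the current target point, and only once that frame is shown re-target to the most recently detected point, so decoding never queues up behind a fast drag. A minimal sketch, with the decoder and display area as stand-ins:

```python
def preview_loop(detected_points, decode, display_area):
    """Decode and show one frame per iteration; after each frame is shown,
    re-determine the target as the latest detected time point."""
    target = detected_points[0]          # currently detected point
    shown = []
    for latest in detected_points[1:]:
        frame = decode(target)           # decode frame for the target point
        display_area.append(frame)       # show it in the second display area
        shown.append(target)
        target = latest                  # re-determine from latest detection
    return shown

area = []
shown = preview_loop([1.0, 2.0, 3.0],
                     decode=lambda t: f"frame@{t}",
                     display_area=area)
assert shown == [1.0, 2.0]
assert area == ["frame@1.0", "frame@2.0"]
```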
- In a second aspect, an embodiment of the present disclosure provides an apparatus for displaying an image.
- The apparatus includes: a first determining unit configured to determine a selected time point at which a user adjusts the playback progress of a target video, where the target video includes a set of key frames; a second determining unit configured to determine a target key frame from the key frame set, where the difference between the time point corresponding to the target key frame and the selected time point satisfies a first preset condition; a decoding unit configured to decode, based on the target key frame, the video frame corresponding to the selected time point to obtain a decoded video frame; and a display unit configured to display the decoded video frame in a first target display area.
- In some embodiments, the first determining unit includes at least one of the following: a first determining module configured to determine the current time point as the selected time point in response to detecting that the dwell time of the control point used to adjust the playback progress of the target video at the current time point is greater than or equal to a preset time threshold; and a second determining module configured to determine the current time point corresponding to the control point as the selected time point in response to detecting that the user no longer manipulates the control point.
- In some embodiments, the apparatus further includes a detection unit configured to detect the adjusted time point in real time while the user adjusts the playback progress of the target video; the second determining unit is further configured to determine the target time point based on the time point detected in real time, and to display the video frame corresponding to the determined target time point in the second target display area.
- In some embodiments, the second determining unit is further configured to determine, from the time points respectively corresponding to the key frames in the key frame set, a time point whose distance to the detected time point satisfies the second preset condition as the target time point.
- In some embodiments, the second determining unit is further configured to select a time point from the target time period in which the detected time point falls as the target time point, where the target time period is one of a set of time periods obtained by dividing the playback time of the target video according to the key frame set.
- In some embodiments, the second determining unit includes: an acquisition module configured to acquire processing capability information of the target processor, where the target processor is used to process video frames included in the target video and the processing capability information characterizes the target processor's ability to process information; and a third determining module configured to periodically determine the time point detected in real time as the target time point at a preset time interval corresponding to the processing capability information.
- In some embodiments, the second determining unit includes: a fourth determining module configured to determine the currently detected time point as the target time point; a display module configured to perform the following display steps: decoding the video frame corresponding to the determined target time point to obtain a decoded video frame for display in the second target display area, and determining whether the second target display area includes the obtained decoded video frame; and a fifth determining module configured to, in response to determining that the second target display area includes the obtained decoded video frame, re-determine the most recently detected time point as the target time point and continue the display steps with the newly determined target time point.
- In a third aspect, embodiments of the present disclosure provide a terminal device, which includes: one or more processors; and a storage device on which one or more programs are stored; when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method described in any implementation of the first aspect.
- In a fourth aspect, embodiments of the present disclosure provide a computer-readable medium on which a computer program is stored, where the computer program, when executed by a processor, implements the method described in any implementation of the first aspect.
- The method and apparatus for displaying images provided by embodiments of the present disclosure determine the selected time point at which the user adjusts the playback progress of the target video, determine a target key frame from the key frame set included in the target video, decode the video frame corresponding to the selected time point based on the target key frame, and finally display the decoded video frame in the first target display area. Thus, when the user adjusts the playback progress, the existing key frames can be used to preview the video frame corresponding to the selected time point without adding more key frames to the video in advance. This saves the time otherwise required to add key frames, improves the flexibility of displaying video frames, and helps improve the efficiency of locating and processing video frames.
- Fig. 1 is an exemplary system architecture diagram in which an embodiment of the present disclosure can be applied;
- Fig. 2 is a flowchart of an embodiment of a method for displaying images according to an embodiment of the present disclosure;
- Fig. 3 is a schematic diagram of an application scenario of the method for displaying images according to an embodiment of the present disclosure;
- Fig. 4 is a flowchart of another embodiment of a method for displaying images according to an embodiment of the present disclosure;
- Fig. 5 is a flowchart of determining a target time point of a method for displaying an image according to an embodiment of the present disclosure;
- Fig. 6 is a schematic structural diagram of an embodiment of an apparatus for displaying images according to an embodiment of the present disclosure;
- Fig. 7 is a schematic structural diagram of a computer system suitable for implementing a terminal device of an embodiment of the present disclosure.
- FIG. 1 shows an exemplary system architecture 100 of a method for displaying an image or an apparatus for displaying an image to which an embodiment of the present disclosure can be applied.
- the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105.
- the network 104 is a medium used to provide a communication link between the terminal devices 101, 102, 103 and the server 105.
- The network 104 may include various connection types, such as wired or wireless communication links, fiber-optic cables, and so on.
- the user can use the terminal devices 101, 102, 103 to interact with the server 105 through the network 104 to receive or send messages, and so on.
- Various communication client applications, such as video playback applications, web browser applications, and social platform software, may be installed on the terminal devices 101, 102, and 103.
- the terminal devices 101, 102, and 103 may be hardware or software.
- When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices with a display screen that support video playback, including but not limited to smartphones, tablet computers, laptop computers, desktop computers, and so on.
- When the terminal devices 101, 102, and 103 are software, they can be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (for example, software or software modules used to provide distributed services) or as a single piece of software or software module. No specific limitation is imposed here.
- the server 105 may be a server that provides various services, for example, a background video server that provides support for videos played on the terminal devices 101, 102, and 103.
- the background video server can send videos to terminal devices, and can also receive videos from terminal devices.
- The method for displaying images provided by the embodiments of the present disclosure is generally executed by the terminal devices 101, 102, 103; correspondingly, the apparatus for displaying images is generally disposed in the terminal devices 101, 102, 103.
- the server can be hardware or software.
- the server can be implemented as a distributed server cluster composed of multiple servers, or as a single server.
- When the server is software, it can be implemented as multiple pieces of software or software modules (for example, software or software modules for providing distributed services) or as a single piece of software or software module. No specific limitation is imposed here.
- The numbers of terminal devices, networks, and servers in Fig. 1 are merely illustrative; there can be any number of terminal devices, networks, and servers according to implementation needs.
- In some cases, the foregoing system architecture may not include the server and the network.
- The method for displaying images includes the following steps:
- Step 201: Determine the selected time point at which the user adjusts the playback progress of the target video.
- The execution subject of the method for displaying images may determine the selected time point at which the user adjusts the playback progress of the target video.
- The target video may be a video from whose video frames a frame is selected for display in the first target display area.
- The target video may be a video obtained remotely over a wireless or wired connection, or a video pre-stored locally (for example, a video recorded by the user using the above-mentioned execution subject).
- The target video in this embodiment is usually a compressed video, for example the original video compressed using an existing H.26X coding standard.
- The aforementioned target video includes a set of key frames.
- A key frame, also called an I frame, is a frame in the compressed video that fully retains the image data; when a key frame is decoded, only the image data of that frame itself is required to complete decoding.
- The compressed video can also include P frames and B frames.
- The data included in a P frame (also called a difference frame) characterizes the difference between this frame and the previous key frame (or P frame); a P frame does not include complete image data, only data characterizing the image difference from the previous frame. The data included in a B frame (also called a bidirectional difference frame) characterizes the differences between the current frame and both the previous and subsequent frames.
- A compressed video usually includes multiple key frames, with multiple P frames and B frames between them. For example, suppose the sequence of video frames in a certain video is IBBPBBPBBP; the images corresponding to the B frames and P frames can be decoded based on the I frame.
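The dependency structure in such a GOP can be sketched by counting how many frames must be decoded to reconstruct a given frame (a simplification: for clarity this counts only the backward I/P reference chain, ignoring the forward reference that B frames also use).

```python
def frames_to_decode(gop: str, index: int) -> int:
    """Frames decoded (inclusive of the target) to reconstruct gop[index],
    counting the chain of preceding I and P reference frames only."""
    refs = [i for i, f in enumerate(gop[:index]) if f in "IP"]
    return len(refs) + 1

gop = "IBBPBBPBBP"
assert frames_to_decode(gop, 0) == 1   # the I frame decodes alone
assert frames_to_decode(gop, 3) == 2   # I frame, then the first P frame
assert frames_to_decode(gop, 9) == 4   # I plus the chain of three P frames
```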
- The selected time point is the playback time point selected by the user while adjusting the playback progress of the target video.
- In practice, the user can drag the progress bar of the target video or slide on the displayed video picture to adjust the playback progress.
- The user can use an input device such as a mouse to adjust the playback progress of the target video; if the execution subject includes a touch screen, the user can also adjust the playback progress by sliding a finger on the screen.
- The above-mentioned execution subject may determine the selected time point in at least one of the following ways:
- Manner 1: In response to detecting that the dwell time of the control point used to adjust the playback progress of the target video at the current time point is greater than or equal to a preset time threshold, determine that the current time point is the selected time point.
- The above-mentioned control point may be a point displayed on the screen (for example, the point on the progress bar indicating the current playback position) or a point that is not displayed (for example, the point where the finger touches the screen when the user slides a finger over the displayed video picture). The user can drag the control point by touch, mouse operation, etc. to adjust the playback progress.
- The foregoing time threshold may be preset by a technician, for example 2 seconds or 5 seconds.
- Manner 2: In response to detecting that the user no longer manipulates the control point, determine that the current time point corresponding to the control point is the selected time point. Specifically, the execution subject can detect in real time whether the user is clicking or touching the control point; for example, when it detects that the user lifts a finger or releases the mouse button, it determines that the control point is no longer being manipulated and that the current playback time point is the selected time point.
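The two manners above can be sketched over a stream of control-point samples (the event tuple format and threshold value are illustrative assumptions, not from the patent):

```python
DWELL_THRESHOLD = 2.0  # seconds; stands in for the preset time threshold

def selected_time_point(events):
    """events: (timestamp, playback_position, released) samples of the
    control point. Returns the selected playback time point, or None."""
    last_t, last_pos, _ = events[0]
    for t, pos, released in events[1:]:
        if released:
            return pos                           # Manner 2: user let go
        if pos == last_pos and t - last_t >= DWELL_THRESHOLD:
            return pos                           # Manner 1: dwell exceeded
        if pos != last_pos:
            last_t, last_pos = t, pos            # control point moved
    return None

# Dwelling at the 12.5 s playback position for 2.1 s selects that point.
events = [(0.0, 12.5, False), (1.0, 12.5, False), (2.1, 12.5, False)]
assert selected_time_point(events) == 12.5
```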
- Step 202 Determine a target key frame from the key frame set.
- the execution subject of the method for displaying an image may determine the target key frame from the key frame set. Wherein, the difference between the time point corresponding to the target key frame and the selected time point meets the first preset condition.
- the first preset condition may include at least one of the following: the time point corresponding to the key frame precedes the selected time point and has the smallest difference from the selected time point; the time point corresponding to the key frame precedes the selected time point and its difference from the selected time point is less than or equal to a preset time difference threshold.
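The selection of the target key frame under the first preset condition can be sketched as below. This is an illustrative sketch only; the function name and the optional `max_gap_s` parameter (modeling the preset time difference threshold) are assumptions.

```python
def find_target_key_frame(key_frame_times, selected_s, max_gap_s=None):
    """Return the key-frame time point that precedes `selected_s` with the
    smallest difference (first preset condition).  When `max_gap_s` is given,
    the gap must additionally be <= max_gap_s; returns None if no key frame
    satisfies the condition."""
    candidates = [t for t in key_frame_times if t <= selected_s]
    if not candidates:
        return None
    best = max(candidates)  # closest key frame at or before the selected point
    if max_gap_s is not None and selected_s - best > max_gap_s:
        return None
    return best
```

The preceding-key-frame choice matters because P/B frames reference earlier frames, so decoding must start from a key frame at or before the selected time point.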
- Step 203 Based on the target key frame, decode the video frame corresponding to the selected time point to obtain the decoded video frame.
- the execution subject of the method for displaying images may decode the video frame corresponding to the selected time point to obtain the decoded video frame.
- the above-mentioned target video includes key frames, P frames, and B frames. If the video frame corresponding to the selected time point is a key frame, it can be decoded in the manner of decoding a key frame; if the video frame corresponding to the selected time point is a P frame or a B frame, it can be decoded based on the target key frame in the manner of decoding a P frame or a B frame.
- the foregoing methods for decoding I-frames, P-frames, and B-frames are well-known technologies that are currently widely studied and applied, and will not be repeated here.
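The decode-forward-from-the-key-frame step can be sketched with a deliberately simplified frame model (real decoders handle reference lists, reordering, etc.). Everything here — the dictionary layout and the string concatenation standing in for motion-compensated reconstruction — is a hypothetical illustration, not the patent's implementation.

```python
def decode_frame_at(frames, key_frame_time, selected_time):
    """Decode the frame at `selected_time` by decoding forward from the
    target key frame.  `frames` maps time -> (frame_type, data); in this toy
    model an I frame is self-contained and a P/B frame is reconstructed from
    the previously decoded reference frame."""
    reference = None
    decoded = None
    for t in sorted(frames):
        if t < key_frame_time or t > selected_time:
            continue  # frames outside [target key frame, selected point]
        ftype, data = frames[t]
        if ftype == "I":
            decoded = data                    # key frame: decode directly
        else:
            decoded = reference + "+" + data  # P/B frame: needs a reference
        reference = decoded
        if t == selected_time:
            return decoded
    return decoded
```

The point of the sketch is the control flow: no frame before the target key frame is touched, which is why picking the nearest preceding key frame minimizes decoding work.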
- Step 204 Display the decoded video frame in the first target display area.
- the above-mentioned execution subject may display the decoded video frame obtained in step 203 in the first target display area.
- the above-mentioned first target display area may be a display area for displaying decoded video frames.
- the first target display area may be an area where the video is played on the screen, or other areas on the screen (for example, for The window that allows the user to process the decoded video frame).
- the decoding time of a key frame is shorter than that of P frames and B frames. Therefore, to let the user quickly and accurately preview the video frame corresponding to the selected playback time point, existing methods usually preprocess the video in advance to add more key frames, or even set every video frame as a key frame, which consumes considerable time. With the above steps, there is no need to preprocess the video in advance; it suffices to determine the time point selected by the user and display the corresponding video frame, which helps improve the efficiency of video processing.
- FIG. 3 is a schematic diagram of an application scenario of the method for displaying images according to this embodiment.
- the user has previously recorded a video 302 through the terminal device 301, and the video 302 includes a set of predetermined key frames (for example, frames indicated by the symbol "*" in the figure).
- the user wants to extract a video frame from the video 302 for processing (for example, adding special effects).
- the user adjusts the playback progress of the video 302 by dragging the progress bar of the video 302.
- the terminal device 301 determines the last playback time point the user dragged to as the selected time point (for example, the playback time point corresponding to the dragged point 305 shown in the figure); the video frame corresponding to the selected time point is 3022 shown in the figure. Then, the terminal device 301 determines the target key frame from the key frame set, namely the key frame 3021 whose corresponding time point differs from the selected time point in a way that meets the first preset condition (for example, the corresponding time point precedes the selected time point with the smallest gap).
- based on the target key frame 3021, the terminal device 301 decodes the video frame 3022 corresponding to the selected time point to obtain the decoded video frame 303, and displays the decoded video frame 303 in the first target display area 304 (for example, the window for playing the video).
- the method provided by the above embodiment of the present disclosure determines the selected time point at which the user adjusts the playback progress of the target video, then determines the target key frame from the key frame set included in the target video, decodes the video frame corresponding to the selected time point based on the target key frame, and finally displays the decoded video frame in the first target display area. Thus, when the user adjusts the video playback progress, the existing key frames can be used to preview the video frame corresponding to the selected time point without adding more key frames to the video in advance, saving the time that would be required to add key frames, improving the flexibility of displaying video frames, and helping to improve the efficiency of locating and processing video frames.
- FIG. 4 shows a flow 400 of still another embodiment of a method for displaying an image.
- the process 400 of the method for displaying images includes the following steps:
- Step 401 During the process of the user adjusting the playback progress of the target video, the adjusted time point is detected in real time.
- the execution subject of the method for displaying images may detect the adjusted time point in real time when the user adjusts the playback progress of the target video.
- the target video may be a video obtained remotely using a wireless connection or a wired connection, or may be a video pre-stored locally (for example, a video recorded by the user using the above-mentioned execution subject).
- the manner in which the user adjusts the playback progress of the target video may be the same as the manner described in the embodiment in FIG. 2, and will not be repeated here.
- the aforementioned target video includes a set of key frames.
- a key frame (also called an I frame) is a frame in the compressed video that completely retains the image data; when a key frame is decoded, only the image data of the current frame is needed.
- the aforementioned adjusted point in time may be the playback time point detected in real time when the user adjusts the playback progress of the aforementioned target video.
- the user can drag the progress bar of the target video or slide on the displayed video screen to adjust the playback progress of the target video.
- the user can use an electronic device such as a mouse to adjust the playback progress of the target video.
- if the execution subject includes a touch screen, the user can also adjust the playback progress by sliding a finger on the screen.
- Step 402 Determine a target time point based on the real-time detected time point, and display a video frame corresponding to the determined target time point in a second target display area.
- the above-mentioned execution subject may determine the target time point in various ways based on the time point detected in real time, and display the video frame corresponding to the determined target time point in the second target display area.
- the second target display area may be a display area for previewing the video frame corresponding to the target time point. It should be noted that the second target display area may be a display area different from the first target display area (for example, a preview window) or the same display area as the first target display area (for example, the display area playing the target video).
- the above-mentioned execution subject may determine the target time point according to the following steps:
- the second preset condition may include at least one of the following: the distance between the time point corresponding to the key frame and the detected time point is the smallest; the distance between the time point corresponding to the key frame and the detected time point is less than or equal to a preset distance threshold.
- the above-mentioned distance may be the absolute value of the difference between the time point corresponding to the key frame and the detected time point; that is, the target time point may precede or follow the detected time point. Since key frames are decoded faster, a key frame can be displayed in real time while the user adjusts the playback progress of the target video, which helps the user judge the currently adjusted playback progress.
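The nearest-key-frame rule (second preset condition) can be sketched as follows; the function name and optional `max_distance_s` parameter (modeling the preset distance threshold) are illustrative assumptions.

```python
def nearest_key_frame_time(key_frame_times, detected_s, max_distance_s=None):
    """Second preset condition: pick the key-frame time point with the
    smallest absolute distance to the detected time point (it may lie before
    or after it); optionally require the distance to stay within a preset
    threshold, returning None otherwise."""
    best = min(key_frame_times, key=lambda t: abs(t - detected_s))
    if max_distance_s is not None and abs(best - detected_s) > max_distance_s:
        return None
    return best
```

Unlike the first preset condition (which only looks backward so that decoding can start there), this preview rule may pick a later key frame, because a key frame alone is enough to display.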
- the above-mentioned execution subject may determine the target time point according to the following steps:
- the target time period is a time period in a time period set obtained by dividing the playback time of the target video based on the key frame set.
- assuming the key frame set contains N key frames (N is a positive integer), the entire playback time of the target video can be divided at the time points corresponding to the key frames to obtain N time periods (that is, the time period set).
- the above-mentioned execution subject may determine the time period in which the detected time point is located as the target time period, and select the time point as the target time point in various ways from the target time period. For example, from the target time period, a time point in the middle position is selected as the target time point, or a time point is randomly selected as the target time point.
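The period-division approach can be sketched as below; a sketch under the assumption that periods are bounded by key-frame times plus the video's start and end, and the midpoint choice is just one of the "various ways" mentioned above.

```python
def split_into_periods(key_frame_times, total_duration_s):
    """Divide the playback time of the target video at each key-frame time,
    yielding the set of time periods as (start, end) pairs."""
    bounds = sorted(set(key_frame_times) | {0.0, total_duration_s})
    return list(zip(bounds, bounds[1:]))

def target_time_in_period(periods, detected_s):
    """Locate the target time period containing the detected time point and
    select its midpoint as the target time point (a randomly chosen point in
    the period would also satisfy the description above)."""
    for start, end in periods:
        if start <= detected_s < end:
            return (start + end) / 2.0
    return detected_s  # fall back to the detected point itself
```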
- the above-mentioned execution subject may determine the target time point according to the following steps:
- the above-mentioned execution subject may obtain the processing capability information of the target processor remotely or locally.
- the target processor may be a processor provided on the execution subject, and the target processor may be used to process the video frames included in the target video.
- the foregoing processing capability information may be used to characterize the capability of the target processor to process information (for example, including processing speed, cache size, etc.).
- the foregoing processing capability information may include, but is not limited to, at least one of the following: the model of the target processor, the clock frequency of the target processor, the number of cores of the target processor, etc.
- according to a preset time interval corresponding to the processing capability information, the time point detected in real time is periodically determined as the target time point.
- the correspondence between the processing capability information and the time interval may be characterized by a correspondence table including multiple processing capability information and multiple time intervals.
- the above-mentioned execution subject may search for the time interval corresponding to the determined processing capability information from the correspondence table, and periodically determine the detected time point as the target time point according to the time interval.
- generally, the weaker the processing capability represented by the processing capability information (for example, the lower the clock frequency), the larger the corresponding time interval; reducing the number of video frames processed when the processing capability of the target processor is low eases the burden on the target processor.
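The correspondence table and the periodic sampling it drives can be sketched as below. The table contents (clock-frequency tiers and interval values) are purely hypothetical stand-ins for whatever a technician would preset.

```python
# Hypothetical correspondence table: weaker processors (lower clock
# frequency) map to larger sampling intervals.
CAPABILITY_TO_INTERVAL_S = [
    (4.0, 0.10),  # clock >= 4.0 GHz -> sample every 100 ms
    (2.0, 0.25),  # clock >= 2.0 GHz -> sample every 250 ms
    (0.0, 0.50),  # weakest tier    -> sample every 500 ms
]

def sampling_interval(clock_ghz):
    """Look up the time interval corresponding to the processing capability
    information in the correspondence table."""
    for min_clock, interval_s in CAPABILITY_TO_INTERVAL_S:
        if clock_ghz >= min_clock:
            return interval_s
    return CAPABILITY_TO_INTERVAL_S[-1][1]

def periodic_targets(detections, interval_s):
    """From a stream of (wall_time_s, playback_time_s) detections, keep one
    target time point per interval, dropping the detections in between."""
    targets, next_due = [], 0.0
    for wall_s, playback_s in detections:
        if wall_s >= next_due:
            targets.append(playback_s)
            next_due = wall_s + interval_s
    return targets
```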
- step 402 can be performed as follows:
- Step 4021 Determine the current detected time point as the target time point.
- initially, the first detected time point is determined as the target time point.
- Step 4022 Perform the following display steps: decode the video frame corresponding to the determined target time point to obtain a decoded video frame for display in the second target display area; determine whether the second target display area includes the obtained decoded video frame.
- the above-mentioned execution subject may decode the video frame corresponding to the determined target time point to obtain the decoded video frame. If the video frame corresponding to the determined target time point is a key frame, it can be decoded in the manner of decoding a key frame; if it is a P frame or a B frame, it can be decoded based on the corresponding key frame in the manner of decoding a P frame or a B frame to obtain the decoded video frame.
- the execution subject may determine whether the decoded video frame is displayed in the second target display area, and if displayed in the second target display area, determine that the second target display area includes the video frame corresponding to the determined target time point .
- Step 4023 In response to determining that the second target display area includes the obtained decoded video frame, re-determine the most recently detected time point as the target time point, and continue to perform the above display step (i.e., step 4022) with the newly determined target time point.
- since the detected time point changes in real time, the above-mentioned execution subject may, in response to determining that the decoded video frame obtained in step 4022 is displayed in the second target display area, re-determine the most recently detected time point as the target time point, and then perform step 4022 again with the re-determined target time point.
- in response to determining that the second target display area does not include the video frame corresponding to the determined target time point (that is, the video frame corresponding to the target time point has not yet been decoded and displayed), the above-mentioned execution subject may wait for the decoding of that video frame to complete.
- this implementation can avoid the stuttering (for example, lag in the control point's movement) caused by processing a large number of video frames, and can adapt to processors with different processing capabilities.
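Steps 4021-4023 can be sketched as the display-gated loop below. The simulation is hypothetical: `detections` is a recorded list of real-time detected playback time points, and `arrivals_per_decode` models how many new detections arrive while one frame is decoded and displayed (in reality both are asynchronous).

```python
def preview_loop(detections, arrivals_per_decode):
    """Steps 4021-4023 over a recorded drag: decode and display the current
    target, then re-target the MOST RECENTLY detected time point, skipping
    the detections that arrived mid-decode.  `arrivals_per_decode` must be
    >= 1 and `detections` non-empty."""
    displayed = []
    i = 0  # step 4021: start from the first detected time point
    while True:
        displayed.append(detections[i])   # step 4022: decode and display
        next_i = i + arrivals_per_decode  # detections that arrived mid-decode
        if next_i >= len(detections):
            break
        i = next_i                        # step 4023: jump to the latest one
    return displayed
```

On a slow processor (`arrivals_per_decode` large) the loop previews fewer frames but never falls behind the drag, which is exactly the adaptivity the paragraph above describes.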
- Step 403 Determine the selected time point for the user to adjust the playback progress of the target video.
- step 403 is basically the same as step 201 in the embodiment corresponding to FIG. 2, and will not be repeated here.
- Step 404 Determine the target key frame from the key frame set.
- step 404 is basically the same as step 202 in the embodiment corresponding to FIG. 2, and will not be repeated here.
- Step 405 Based on the target key frame, decode the video frame corresponding to the selected time point to obtain the decoded video frame.
- step 405 is basically the same as step 203 in the embodiment corresponding to FIG. 2, and will not be repeated here.
- Step 406 Display the decoded video frame in the first target display area.
- step 406 is basically the same as step 204 in the embodiment corresponding to FIG. 2, and will not be repeated here.
- the process 400 of the method for displaying images in this embodiment highlights the step of previewing video frames while the user adjusts the playback progress of the target video. Therefore, the solution described in this embodiment can realize real-time preview of video frames during the adjustment of the playback progress without adding key frames to the target video in advance, which further improves the flexibility of displaying video frames and helps the user accurately locate the video frame corresponding to the selected time point.
- the present disclosure provides an embodiment of a device for displaying images.
- the device embodiment corresponds to the method embodiment shown in FIG.
- the device can be applied to various electronic devices.
- the apparatus 600 for displaying images in this embodiment includes: a first determining unit 601, configured to determine the selected time point at which the user adjusts the playback progress of the target video, where the target video includes a key frame set; a second determining unit 602, configured to determine the target key frame from the key frame set, where the difference between the time point corresponding to the target key frame and the selected time point meets the first preset condition; a decoding unit 603, configured to decode the video frame corresponding to the selected time point based on the target key frame to obtain the decoded video frame; and a display unit 604, configured to display the decoded video frame in the first target display area.
- the first determining unit 601 may determine the selected time point at which the user adjusts the playback progress of the target video.
- the target video may be a video from whose video frames a frame is selected for display in the first target display area.
- the target video may be a video obtained remotely using a wireless connection or a wired connection, or may be a video pre-stored locally (for example, a video recorded by the user using the above-mentioned device 600).
- the target video in this embodiment is usually a compressed video, for example, an original video compressed using an existing H.26X coding standard.
- the aforementioned target video includes a set of key frames.
- a key frame (also called an I frame) is a frame in the compressed video that completely retains the image data; when a key frame is decoded, only the image data of the current frame is needed.
- the selected time point is the play time point selected by the user when adjusting the play progress of the target video.
- the user can drag the progress bar of the target video or slide on the displayed video screen to adjust the playback progress of the target video.
- the user can use electronic equipment such as a mouse to adjust the playback progress of the target video.
- if the device 600 includes a touch screen, the user can also adjust the playback progress by sliding a finger on the screen.
- the second determining unit 602 may determine the target key frame from the key frame set, where the difference between the time point corresponding to the target key frame and the selected time point meets the first preset condition.
- the first preset condition may include at least one of the following: the time point corresponding to the key frame precedes the selected time point and has the smallest difference from the selected time point; the time point corresponding to the key frame precedes the selected time point and its difference from the selected time point is less than or equal to a preset time difference threshold.
- the decoding unit 603 may decode the video frame corresponding to the selected time point to obtain the decoded video frame.
- the above-mentioned target video includes key frames, P frames, and B frames. If the video frame corresponding to the selected time point is a key frame, it can be decoded in the manner of decoding a key frame; if the video frame corresponding to the selected time point is a P frame or a B frame, it can be decoded based on the target key frame in the manner of decoding a P frame or a B frame. It should be noted that the foregoing methods for decoding I frames, P frames, and B frames are well-known technologies that are widely studied and applied at present, and will not be repeated here.
- the display unit 604 may display the decoded video frame obtained by the decoding unit 603 in the first target display area.
- the above-mentioned first target display area may be a display area for displaying decoded video frames.
- the first target display area may be an area where the video is played on the screen, or other areas on the screen (for example, for The window that allows the user to process the decoded video frame).
- the decoding time of a key frame is shorter than that of P frames and B frames. Therefore, to let the user quickly and accurately preview the video frame corresponding to the selected playback time point, existing methods usually preprocess the video in advance to add more key frames, or even set every frame as a key frame, which consumes considerable time. With the above units, there is no need to preprocess the video in advance; it suffices to determine the time point selected by the user and display the corresponding video frame, which helps improve the efficiency of video processing.
- the first determining unit 601 may include at least one of the following: a first determining module (not shown in the figure), configured to determine the current time point as the selected time point in response to detecting that the dwell time of the control point used to adjust the playback progress of the target video at the current time point is greater than or equal to a preset time threshold; a second determining module (not shown in the figure), configured to determine the current time point corresponding to the control point as the selected time point in response to detecting that the user no longer manipulates the control point.
- the device 600 may further include: a detection unit (not shown in the figure), configured to detect the adjusted time point in real time while the user adjusts the playback progress of the target video; a determining unit (not shown in the figure), configured to determine the target time point based on the time point detected in real time, and display the video frame corresponding to the determined target time point in the second target display area.
- the second determining unit 602 may be further configured to: from the time points respectively corresponding to the key frames included in the key frame set, determine a time point whose distance to the detected time point meets the second preset condition as the target time point.
- the second determining unit 602 may be further configured to select the time point as the target time point from the target time period where the detected time point is located, where the target time A segment is a time segment in a time segment set obtained by dividing the playback time of the target video based on the key frame set.
- the second determining unit 602 may include: an acquisition module (not shown in the figure), configured to acquire the processing capability information of the target processor, where the target processor is used to process the video frames included in the target video, and the processing capability information is used to characterize the capability of the target processor to process information; a third determining module (not shown in the figure), configured to periodically determine the time point detected in real time as the target time point according to a preset time interval corresponding to the processing capability information.
- the second determining unit 602 may include: a fourth determining module (not shown in the figure), configured to determine the currently detected time point as the target time point; a display module (not shown in the figure), configured to perform the following display steps: decode the video frame corresponding to the determined target time point to obtain a decoded video frame for display in the second target display area, and determine whether the second target display area includes the obtained decoded video frame; a fifth determining module (not shown in the figure), configured to, in response to determining that the second target display area includes the obtained decoded video frame, re-determine the most recently detected time point as the target time point and continue the above display steps with the re-determined target time point.
- the device provided by the foregoing embodiment of the present disclosure determines the selected time point at which the user adjusts the playback progress of the target video, then determines the target key frame from the key frame set included in the target video, decodes the video frame corresponding to the selected time point based on the target key frame, and finally displays the decoded video frame in the first target display area. Thus, when the user adjusts the video playback progress, the existing key frames can be used to preview the video frame corresponding to the selected time point without adding more key frames to the video in advance, saving the time that would be required to add key frames, improving the flexibility of displaying video frames, and helping to improve the efficiency of locating and processing video frames.
- FIG. 7 shows a schematic structural diagram of a terminal device 700 suitable for implementing the embodiments of the present disclosure.
- the terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and in-vehicle terminals (for example, car navigation terminals), and fixed terminals such as digital TVs and desktop computers.
- the electronic device shown in FIG. 7 is only an example, and should not bring any limitation to the function and scope of use of the embodiments of the present disclosure.
- the terminal device 700 may include a processing device (such as a central processing unit, a graphics processor, etc.) 701, which may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 702 or a program loaded from a storage device 708 into a random access memory (RAM) 703. The RAM 703 also stores various programs and data required for the operation of the terminal device 700.
- the processing device 701, the ROM 702, and the RAM 703 are connected to each other through a bus 704.
- An input/output (I/O) interface 705 is also connected to the bus 704.
- the following devices can be connected to the I/O interface 705: input devices 706 such as a touch screen, touch panel, keyboard, mouse, camera, microphone, accelerometer, and gyroscope; output devices 707 such as a liquid crystal display (LCD), speakers, and vibrators; storage devices 708 such as magnetic tapes and hard disks; and a communication device 709.
- the communication device 709 may allow the terminal device 700 to perform wireless or wired communication with other devices to exchange data.
- although FIG. 7 shows a terminal device 700 having various devices, it should be understood that it is not required to implement or possess all the illustrated devices; more or fewer devices may be implemented or provided instead.
- embodiments of the present disclosure include a computer program product that includes a computer program carried on a computer-readable medium, the computer program containing program code for performing the method shown in the flowchart.
- the computer program may be downloaded and installed from the network through the communication device 709, or installed from the storage device 708, or installed from the ROM 702.
- when the computer program is executed by the processing device 701, the above-mentioned functions defined in the method of the embodiments of the present disclosure are executed.
- the computer-readable medium described in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two.
- the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
- the computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
- the computer-readable signal medium may include a data signal that is propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
- the computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium; the computer-readable signal medium may send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device.
- the program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to: wire, optical cable, RF (Radio Frequency), etc., or any suitable combination of the above.
- the above-mentioned computer-readable medium may be included in the above-mentioned terminal device; or it may exist alone without being assembled into the terminal device.
- the above-mentioned computer-readable medium carries one or more programs, and when the one or more programs are executed by the terminal device, the terminal device: determines the selected time point at which the user adjusts the playback progress of the target video, where the target video includes a key frame set; determines the target key frame from the key frame set, where the difference between the time point corresponding to the target key frame and the selected time point meets the first preset condition; decodes the video frame corresponding to the selected time point based on the target key frame to obtain the decoded video frame; and displays the decoded video frame in the first target display area.
- the computer program code used to perform the operations of the present disclosure can be written in one or more programming languages or a combination thereof.
- the programming languages include object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
- the program code may execute entirely on the user's computer, partly on the user's computer, as an independent software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
- each block in the flowcharts or block diagrams may represent a module, program segment, or portion of code that contains one or more executable instructions for implementing the specified logic functions.
- the functions noted in the blocks may also occur in an order different from that noted in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved.
- each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
- the units described in the embodiments of the present disclosure may be implemented in software or hardware. In some cases, the name of a unit does not constitute a limitation on the unit itself.
- for example, the first determining unit may also be described as "a unit that determines the selected time point at which the user adjusts the playback progress of the target video".
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Databases & Information Systems (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Social Psychology (AREA)
- Television Signal Processing For Recording (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
Description
Claims (16)
- A method for displaying images, comprising: determining a selected time point at which a user adjusts the playback progress of a target video, wherein the target video includes a key frame set; determining a target key frame from the key frame set, wherein the difference between the time point corresponding to the target key frame and the selected time point meets a first preset condition; decoding, based on the target key frame, a video frame corresponding to the selected time point to obtain a decoded video frame; and displaying the decoded video frame in a first target display area.
- The method according to claim 1, wherein determining the selected time point at which the user adjusts the playback progress of the target video comprises at least one of the following: in response to detecting that a control point for adjusting the playback progress of the target video stays at a current time point for a duration greater than or equal to a preset time threshold, determining the current time point as the selected time point; and in response to detecting that the user no longer operates the control point, determining the time point currently corresponding to the control point as the selected time point.
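The two triggers in the claim above (a dwell-time check on the drag control point, and a release check) can be sketched as follows. The `SeekDetector` class, the 0.4 s default threshold, and the use of a monotonic clock are illustrative assumptions, not details from the patent.

```python
import time

class SeekDetector:
    """Tracks a seek-bar control point and reports a selected time point
    either when the point dwells long enough or when the user releases it."""

    def __init__(self, dwell_threshold=0.4):
        self.dwell_threshold = dwell_threshold  # seconds, assumed preset value
        self._last_point = None
        self._since = None

    def on_drag(self, time_point, now=None):
        """Return time_point as the selected time point once the control
        point has stayed there at least dwell_threshold seconds; else None."""
        now = time.monotonic() if now is None else now
        if time_point != self._last_point:
            self._last_point, self._since = time_point, now
            return None
        if now - self._since >= self.dwell_threshold:
            return time_point
        return None

    def on_release(self):
        """The user stopped operating the control point: its current
        position becomes the selected time point."""
        return self._last_point
```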
- The method according to claim 1 or 2, wherein before determining the selected time point at which the user adjusts the playback progress of the target video, the method further comprises: detecting, in real time, the time point adjusted to during the process of the user adjusting the playback progress of the target video; and determining a target time point based on the time point detected in real time, and displaying, in a second target display area, a video frame corresponding to the determined target time point.
- The method according to claim 3, wherein determining a target time point based on the time point detected in real time comprises: determining, from the time points respectively corresponding to the key frames included in the key frame set, a time point whose distance from the detected time point meets a second preset condition as the target time point.
- The method according to claim 3, wherein determining a target time point based on the time point detected in real time comprises: selecting a time point from a target time segment in which the detected time point is located as the target time point, wherein the target time segment is a time segment in a time segment set obtained by dividing the playback time of the target video based on the key frame set.
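The segment-based variant above can be sketched like this: key-frame times split the playback timeline into segments, and the detected time point maps to the segment containing it. Choosing the segment start as the representative time point is one plausible reading, stated here as an assumption.

```python
import bisect

def build_segments(key_times, duration):
    """Split [0, duration) at each key-frame time into half-open segments."""
    bounds = sorted(set([0.0] + list(key_times) + [duration]))
    return list(zip(bounds[:-1], bounds[1:]))

def target_time_for(detected, segments):
    """Pick a target time point from the segment containing `detected`;
    using the segment start keeps decoding anchored at a key frame."""
    starts = [s for s, _ in segments]
    i = bisect.bisect_right(starts, detected) - 1
    return segments[i][0]
```

Because every segment starts at a key frame, the preview frame for any detected time point can be decoded without referencing frames outside its segment.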
- The method according to claim 3, wherein determining a target time point based on the time point detected in real time comprises: acquiring processing capability information of a target processor, wherein the target processor is configured to process the video frames included in the target video, and the processing capability information characterizes the information-processing capability of the target processor; and periodically determining the time point detected in real time as the target time point at a preset time interval corresponding to the processing capability information.
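A minimal sketch of this capability-driven sampling: detected time points are promoted to target time points at most once per interval, with slower processors sampled less often so preview decoding does not stutter. The capability grades and interval values below are illustrative assumptions.

```python
class CapabilityThrottle:
    """Promotes real-time detected time points to target time points at a
    rate matched to the (assumed) processing capability grade."""

    # assumed mapping from a coarse capability grade to a sampling interval (s)
    INTERVALS = {"high": 0.1, "medium": 0.25, "low": 0.5}

    def __init__(self, capability):
        self.interval = self.INTERVALS.get(capability, 0.5)
        self._last_emit = None

    def sample(self, detected_time, now):
        """Return detected_time as a target time point at most once per
        interval; otherwise return None (the detection is skipped)."""
        if self._last_emit is None or now - self._last_emit >= self.interval:
            self._last_emit = now
            return detected_time
        return None
```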
- The method according to claim 3, wherein determining a target time point based on the time point detected in real time, and displaying, in a second target display area, a video frame corresponding to the determined target time point, comprises: determining the currently detected time point as the target time point; and performing the following display step: decoding a video frame corresponding to the determined target time point to obtain a decoded video frame for display in the second target display area; determining whether the second target display area includes the obtained decoded video frame; and in response to determining that the second target display area includes the obtained decoded video frame, re-determining the most recently detected time point as the target time point, and continuing to perform the display step with the re-determined target time point.
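The feedback loop in this claim — decode and display a frame, then promote only the most recent detection to the next target — naturally skips intermediate detections when decoding is slow. A simplified model, with a constant decode duration and a list of (arrival time, time point) detections, both assumptions for illustration:

```python
from collections import deque

def preview_loop(detections, decode_time):
    """detections: list of (arrival_time, time_point) in arrival order.
    decode_time: assumed constant cost of one decode+display step.
    Returns the time points actually previewed; detections that arrive
    while a decode is in progress are superseded by the newest one."""
    previewed = []
    pending = deque(detections)
    if not pending:
        return previewed
    now, target = pending.popleft()
    while target is not None:
        now += decode_time            # decode and display the target's frame
        previewed.append(target)
        latest = None
        while pending and pending[0][0] <= now:
            _, latest = pending.popleft()  # keep only the newest detection
        target = latest               # re-determine the target time point
    return previewed
```

Only checking for new detections after the display area holds the frame is what prevents a backlog: the loop always decodes the freshest position rather than every intermediate one.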
- An apparatus for displaying images, comprising: a first determining unit configured to determine a selected time point at which a user adjusts the playback progress of a target video, wherein the target video includes a key frame set; a second determining unit configured to determine a target key frame from the key frame set, wherein the difference between the time point corresponding to the target key frame and the selected time point meets a first preset condition; a decoding unit configured to decode, based on the target key frame, a video frame corresponding to the selected time point to obtain a decoded video frame; and a display unit configured to display the decoded video frame in a first target display area.
- The apparatus according to claim 8, wherein the first determining unit comprises at least one of the following: a first determining module configured to, in response to detecting that a control point for adjusting the playback progress of the target video stays at a current time point for a duration greater than or equal to a preset time threshold, determine the current time point as the selected time point; and a second determining module configured to, in response to detecting that the user no longer operates the control point, determine the time point currently corresponding to the control point as the selected time point.
- The apparatus according to claim 8 or 9, further comprising: a detection unit configured to detect, in real time, the time point adjusted to during the process of the user adjusting the playback progress of the target video; and a second determining unit configured to determine a target time point based on the time point detected in real time, and display, in a second target display area, a video frame corresponding to the determined target time point.
- The apparatus according to claim 10, wherein the second determining unit is further configured to: determine, from the time points respectively corresponding to the key frames included in the key frame set, a time point whose distance from the detected time point meets a second preset condition as the target time point.
- The apparatus according to claim 10, wherein the second determining unit is further configured to: select a time point from a target time segment in which the detected time point is located as the target time point, wherein the target time segment is a time segment in a time segment set obtained by dividing the playback time of the target video based on the key frame set.
- The apparatus according to claim 10, wherein the second determining unit comprises: an acquisition module configured to acquire processing capability information of a target processor, wherein the target processor is configured to process the video frames included in the target video, and the processing capability information characterizes the information-processing capability of the target processor; and a third determining module configured to periodically determine the time point detected in real time as the target time point at a preset time interval corresponding to the processing capability information.
- The apparatus according to claim 10, wherein the second determining unit comprises: a fourth determining module configured to determine the currently detected time point as the target time point; a display module configured to perform the following display step: decoding a video frame corresponding to the determined target time point to obtain a decoded video frame for display in the second target display area, and determining whether the second target display area includes the obtained decoded video frame; and a fifth determining module configured to, in response to determining that the second target display area includes the obtained decoded video frame, re-determine the most recently detected time point as the target time point, and continue to perform the display step with the re-determined target time point.
- A terminal device, comprising: one or more processors; and a storage device storing one or more programs thereon, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-7.
- A computer-readable medium storing a computer program thereon, wherein the program, when executed by a processor, implements the method according to any one of claims 1-7.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2110265.2A GB2594214B (en) | 2019-01-15 | 2019-12-23 | Image display method and apparatus |
US17/422,965 US11482257B2 (en) | 2019-01-15 | 2019-12-23 | Image display method and apparatus |
JP2021540877A JP7273163B2 (ja) | 2019-01-15 | 2019-12-23 | 画像表示方法と装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910037110.8 | 2019-01-15 | ||
CN201910037110.8A CN111436005B (zh) | 2019-01-15 | 2019-01-15 | 用于显示图像的方法和装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020147521A1 true WO2020147521A1 (zh) | 2020-07-23 |
Family
ID=71580845
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/127598 WO2020147521A1 (zh) | 2019-01-15 | 2019-12-23 | 用于显示图像的方法和装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US11482257B2 (zh) |
JP (1) | JP7273163B2 (zh) |
CN (1) | CN111436005B (zh) |
GB (1) | GB2594214B (zh) |
WO (1) | WO2020147521A1 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114245231A (zh) * | 2021-12-21 | 2022-03-25 | 威创集团股份有限公司 | 一种多视频同步跳转方法、装置、设备及可读存储介质 |
WO2024041406A1 (zh) * | 2022-08-26 | 2024-02-29 | 广州市百果园信息技术有限公司 | 视频目标帧确定方法、装置、设备及存储介质 |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113225613B (zh) * | 2020-01-21 | 2022-07-08 | 北京达佳互联信息技术有限公司 | 图像识别、视频直播方法和装置 |
CN113259780B (zh) * | 2021-07-15 | 2021-11-05 | 中国传媒大学 | 全息多维音视频播放进度条生成、显示和控制播放方法 |
CN113591743B (zh) * | 2021-08-04 | 2023-11-24 | 中国人民大学 | 书法视频识别方法、系统、存储介质及计算设备 |
CN113630649B (zh) * | 2021-08-05 | 2024-04-19 | Vidaa(荷兰)国际控股有限公司 | 一种显示设备及视频播放进度的调整方法 |
CN113423009B (zh) * | 2021-08-23 | 2021-12-24 | 北京拓课网络科技有限公司 | 一种视频进度调整方法、装置及电子设备 |
CN114489458A (zh) * | 2022-01-28 | 2022-05-13 | 维沃移动通信有限公司 | 参数调节方法及装置 |
CN116248963B (zh) * | 2023-02-23 | 2024-07-12 | 北京奇艺世纪科技有限公司 | 视频播放方法、装置、电子设备及存储介质 |
CN116527914A (zh) * | 2023-04-28 | 2023-08-01 | 北京沃东天骏信息技术有限公司 | 适用于空间图像的解码方法及装置 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110218997A1 (en) * | 2010-03-08 | 2011-09-08 | Oren Boiman | Method and system for browsing, searching and sharing of personal video by a non-parametric approach |
CN102780919A (zh) * | 2012-08-24 | 2012-11-14 | 乐视网信息技术(北京)股份有限公司 | 通过关键帧进行视频定位和播放的方法 |
CN104618794A (zh) * | 2014-04-29 | 2015-05-13 | 腾讯科技(北京)有限公司 | 播放视频的方法和装置 |
CN104918120A (zh) * | 2014-03-12 | 2015-09-16 | 联想(北京)有限公司 | 一种播放进度调节方法及电子设备 |
CN108337546A (zh) * | 2017-01-20 | 2018-07-27 | 杭州海康威视数字技术股份有限公司 | 一种目标对象显示方法及装置 |
CN108401188A (zh) * | 2018-03-05 | 2018-08-14 | 青岛海信传媒网络技术有限公司 | 一种媒体播放的方法及装置 |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1090574A (ja) * | 1996-09-18 | 1998-04-10 | Hitachi Cable Ltd | 自己支持型光ケーブルの製造方法 |
JPH11146326A (ja) * | 1997-08-11 | 1999-05-28 | Casio Comput Co Ltd | 画像検索方法および画像検索装置 |
JPH11213642A (ja) * | 1998-01-21 | 1999-08-06 | Fujitsu Ten Ltd | Dab受信装置 |
JP2006148839A (ja) | 2004-11-25 | 2006-06-08 | Sharp Corp | 放送装置、受信装置、及びこれらを備えるデジタル放送システム |
KR20060090481A (ko) * | 2005-02-07 | 2006-08-11 | 엘지전자 주식회사 | 단말기에서 빠른 검색과 역방향 검색 수행 시 수행 간격의정확성 개선 방법 |
JP4244051B2 (ja) | 2005-04-15 | 2009-03-25 | ソニー株式会社 | プログラム、復号装置、復号方法、並びに、記録媒体 |
JP2007250048A (ja) | 2006-03-14 | 2007-09-27 | Sony Corp | 画像処理装置、画像処理方法、画像処理プログラム及びプログラム格納媒体 |
JP4829800B2 (ja) | 2007-01-17 | 2011-12-07 | 株式会社日立製作所 | 記録再生装置 |
JP5089544B2 (ja) | 2008-09-16 | 2012-12-05 | キヤノン株式会社 | 画像再生装置及びその制御方法 |
JP2012043207A (ja) | 2010-08-19 | 2012-03-01 | Sony Corp | 情報処理装置、情報処理方法、およびプログラム |
CN103024561B (zh) * | 2011-09-28 | 2016-05-25 | 深圳市快播科技有限公司 | 一种拖拽进度条的显示方法及装置 |
JP6135524B2 (ja) * | 2014-01-27 | 2017-05-31 | ブラザー工業株式会社 | 画像情報処理プログラム、画像情報処理方法及び画像情報処理装置 |
CN103957471B (zh) | 2014-05-05 | 2017-07-14 | 华为技术有限公司 | 网络视频播放的方法和装置 |
CN104918136B (zh) * | 2015-05-28 | 2018-08-31 | 北京奇艺世纪科技有限公司 | 视频定位方法和装置 |
CN105208463B (zh) * | 2015-08-31 | 2017-12-15 | 暴风集团股份有限公司 | 针对m3u8文件进行帧确定的方法和系统 |
CN105898588A (zh) * | 2015-12-07 | 2016-08-24 | 乐视云计算有限公司 | 视频定位方法和装置 |
CN105635844A (zh) * | 2016-01-12 | 2016-06-01 | 腾讯科技(深圳)有限公司 | 在浏览器中播放视频的方法和装置 |
WO2018058368A1 (zh) * | 2016-09-28 | 2018-04-05 | 深圳市柔宇科技有限公司 | 系统性能提升方法、系统性能提升装置及显示装置 |
WO2018082213A1 (zh) * | 2016-11-07 | 2018-05-11 | 华为技术有限公司 | 一种显示方法以及电子终端 |
CN107197182B (zh) * | 2017-06-06 | 2020-05-05 | 青岛海信电器股份有限公司 | 一种在电视上显示屏幕菜单的方法、装置及电视 |
CN108737908B (zh) * | 2018-05-21 | 2021-11-30 | 腾讯科技(深圳)有限公司 | 一种媒体播放方法、装置及存储介质 |
CN108965907B (zh) * | 2018-07-11 | 2020-03-13 | 北京字节跳动网络技术有限公司 | 用于播放视频的方法、装置和系统 |
-
2019
- 2019-01-15 CN CN201910037110.8A patent/CN111436005B/zh active Active
- 2019-12-23 JP JP2021540877A patent/JP7273163B2/ja active Active
- 2019-12-23 GB GB2110265.2A patent/GB2594214B/en active Active
- 2019-12-23 WO PCT/CN2019/127598 patent/WO2020147521A1/zh active Application Filing
- 2019-12-23 US US17/422,965 patent/US11482257B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110218997A1 (en) * | 2010-03-08 | 2011-09-08 | Oren Boiman | Method and system for browsing, searching and sharing of personal video by a non-parametric approach |
CN102780919A (zh) * | 2012-08-24 | 2012-11-14 | 乐视网信息技术(北京)股份有限公司 | 通过关键帧进行视频定位和播放的方法 |
CN104918120A (zh) * | 2014-03-12 | 2015-09-16 | 联想(北京)有限公司 | 一种播放进度调节方法及电子设备 |
CN104618794A (zh) * | 2014-04-29 | 2015-05-13 | 腾讯科技(北京)有限公司 | 播放视频的方法和装置 |
CN108337546A (zh) * | 2017-01-20 | 2018-07-27 | 杭州海康威视数字技术股份有限公司 | 一种目标对象显示方法及装置 |
CN108401188A (zh) * | 2018-03-05 | 2018-08-14 | 青岛海信传媒网络技术有限公司 | 一种媒体播放的方法及装置 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114245231A (zh) * | 2021-12-21 | 2022-03-25 | 威创集团股份有限公司 | 一种多视频同步跳转方法、装置、设备及可读存储介质 |
CN114245231B (zh) * | 2021-12-21 | 2023-03-10 | 威创集团股份有限公司 | 一种多视频同步跳转方法、装置、设备及可读存储介质 |
WO2024041406A1 (zh) * | 2022-08-26 | 2024-02-29 | 广州市百果园信息技术有限公司 | 视频目标帧确定方法、装置、设备及存储介质 |
Also Published As
Publication number | Publication date |
---|---|
US11482257B2 (en) | 2022-10-25 |
CN111436005B (zh) | 2022-03-08 |
GB2594214A (en) | 2021-10-20 |
US20220148624A1 (en) | 2022-05-12 |
JP2022519172A (ja) | 2022-03-22 |
GB2594214B (en) | 2023-05-03 |
CN111436005A (zh) | 2020-07-21 |
GB202110265D0 (en) | 2021-09-01 |
JP7273163B2 (ja) | 2023-05-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020147521A1 (zh) | 用于显示图像的方法和装置 | |
CN109640188B (zh) | 视频预览方法、装置、电子设备及计算机可读存储介质 | |
US10182095B2 (en) | Method and system for video call using two-way communication of visual or auditory effect | |
WO2020233142A1 (zh) | 多媒体文件播放方法、装置、电子设备和存储介质 | |
CN108965907B (zh) | 用于播放视频的方法、装置和系统 | |
WO2021160143A1 (zh) | 用于显示视频的方法、装置、电子设备和介质 | |
WO2021218556A1 (zh) | 信息展示方法、装置和电子设备 | |
CN111258736B (zh) | 信息处理方法、装置和电子设备 | |
CN109462779B (zh) | 视频预览信息的播放控制方法、应用客户端及电子设备 | |
JP2023522092A (ja) | インタラクション記録生成方法、装置、デバイス及び媒体 | |
CN110958481A (zh) | 视频页面显示方法、装置、电子设备和计算机可读介质 | |
US20230421857A1 (en) | Video-based information displaying method and apparatus, device and medium | |
CN110673886B (zh) | 用于生成热力图的方法和装置 | |
US10872356B2 (en) | Methods, systems, and media for presenting advertisements during background presentation of media content | |
CN109788333A (zh) | 用于全屏显示视频的方法及装置 | |
CN112000251A (zh) | 用于播放视频的方法、装置、电子设备和计算机可读介质 | |
CN111385599B (zh) | 视频处理方法和装置 | |
WO2023098576A1 (zh) | 图像处理方法、装置、设备及介质 | |
CN111027495A (zh) | 用于检测人体关键点的方法和装置 | |
CN115379245B (zh) | 信息显示方法、装置和电子设备 | |
WO2020143556A1 (zh) | 用于展示页面的方法和装置 | |
CN114554292A (zh) | 视角的切换方法、装置、电子设备、存储介质和程序产品 | |
WO2021031909A1 (zh) | 数据内容的输出方法、装置、电子设备及计算机可读介质 | |
CN111385638B (zh) | 视频处理方法和装置 | |
CN110366002B (zh) | 视频文件合成方法、系统、介质和电子设备 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19910555 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021540877 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 202110265 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20191223 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 09.11.2021) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19910555 Country of ref document: EP Kind code of ref document: A1 |