CN113453069A - Display device and thumbnail generation method - Google Patents

Display device and thumbnail generation method

Info

Publication number
CN113453069A
CN113453069A (application CN202110678680.2A)
Authority
CN
China
Prior art keywords
picture
ith picture
ith
video file
thumbnail
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110678680.2A
Other languages
Chinese (zh)
Other versions
CN113453069B (en)
Inventor
贾桂丽
邵肖明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202110678680.2A priority Critical patent/CN113453069B/en
Publication of CN113453069A publication Critical patent/CN113453069A/en
Priority to PCT/CN2022/072894 priority patent/WO2022156729A1/en
Application granted granted Critical
Publication of CN113453069B publication Critical patent/CN113453069B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8549Creating video summaries, e.g. movie trailer

Abstract

The embodiments of the present application disclose a display device and a thumbnail generation method. The display device includes a display and a controller. The controller is configured to capture an ith picture of a video file and to determine whether the captured ith picture is a pure-color picture by checking whether the color parameters of N pixel points on the ith picture are all equal. If the ith picture is not a pure-color picture, its content is, to some extent, representative of the content of the video file, so the ith picture can be used as the thumbnail. The display device of these embodiments therefore avoids generating a pure-color picture as the thumbnail, which improves the user experience.

Description

Display device and thumbnail generation method
Technical Field
The present disclosure relates to the field of file display technologies, and in particular, to a display device and a thumbnail generation method.
Background
The display of various types of files on display devices is receiving wide attention from users. In general, a display device may be a TV (television), a VR (Virtual Reality) device, a mobile terminal, or the like; the files displayed by the display device may be audio, video, picture, Word, and other files. The display device is provided with internal storage for local files, so that the locally stored files can be displayed anytime and anywhere.
When files are displayed on a display device, showing only the file type and name is inconvenient for the user. In particular, for picture-type files, displaying only the file name does not help the user recognize the content of each picture. The display device may therefore present a picture-type file in the form of a picture. There are generally two ways to do so: one is to scale down and display the original image, which occupies a large amount of memory; the other is to generate a thumbnail for each picture contained in the folder opened by the user and display the thumbnails.
The production of a video-type file often involves scene transitions, and a pure-color picture is inserted between the two frames of a scene transition to improve the viewing experience. As a result, the frame captured as the thumbnail of a video-type file may be a pure-color picture, which leads to a poor user experience.
Disclosure of Invention
In order to solve the above technical problems in the prior art, embodiments of the present application provide a display device and a thumbnail generation method.
A first aspect of the embodiments of the present application provides a display device, including:
a display;
a controller configured to:
capture an ith picture, wherein the ith picture is a frame picture in a video file, i is the number of times a picture has been captured from the video file during thumbnail generation, and i is a positive integer greater than or equal to 1;
select N pixel points on the ith picture, wherein N is a positive integer greater than or equal to 2;
read the color parameters of the N pixel points respectively, wherein a color parameter is a characteristic value representing the color of a pixel point;
and if at least 2 of the color parameters are different, scale down the ith picture to obtain a thumbnail of the video file.
The display device shown in the embodiments of the present application includes a display and a controller. The controller is configured to capture an ith picture of the video file and to determine whether the captured ith picture is a pure-color picture by checking whether the color parameters of N pixel points on the ith picture are all equal. If the ith picture is not a pure-color picture, its content is, to some extent, representative of the content of the video file, so the ith picture can be used as the thumbnail. The display device of this embodiment therefore avoids generating a pure-color picture as the thumbnail, which improves the user experience.
A second aspect of the embodiments of the present application provides a thumbnail generation method, including:
capturing an ith picture, wherein the ith picture is a frame picture in a video file, i is the number of times a picture has been captured from the video file during thumbnail generation, and i is a positive integer greater than or equal to 1;
selecting N pixel points on the ith picture, wherein N is a positive integer greater than or equal to 2;
reading the color parameters of the N pixel points respectively, wherein a color parameter is a characteristic value representing the color of a pixel point;
and if at least 2 of the color parameters are different, scaling down the ith picture to obtain a thumbnail of the video file.
The embodiments of the present application further show a thumbnail generation method. In the method, an ith picture of the video file is captured, and whether the captured ith picture is a pure-color picture is determined by checking whether the color parameters of N pixel points on the ith picture are all equal. If the ith picture is not a pure-color picture, its content is, to some extent, representative of the content of the video file, so the ith picture can be used as the thumbnail. The thumbnail generation method of this embodiment therefore avoids generating a pure-color picture as the thumbnail, which improves the user experience.
Drawings
In order to more clearly illustrate the embodiments of the present application or the implementations in the related art, the drawings required for the description of the embodiments or the related art are briefly introduced below. It is apparent that the drawings in the following description relate to some embodiments of the present application, and that those skilled in the art can obtain other drawings from them.
FIG. 1 illustrates a usage scenario of a display device according to some embodiments;
fig. 2 illustrates a hardware configuration block diagram of the control apparatus 100 according to some embodiments;
fig. 3 illustrates a hardware configuration block diagram of the display apparatus 200 according to some embodiments;
FIG. 4 illustrates a software configuration diagram in the display device 200 according to some embodiments;
FIG. 5 is a flow chart of interaction between a display device and a user provided in accordance with one possible embodiment;
FIG. 6 is a schematic diagram of an ith picture provided in accordance with a possible embodiment;
FIG. 7 is a diagram illustrating an ith picture according to one possible embodiment;
FIG. 8 is a diagram illustrating an ith picture according to one possible embodiment;
FIG. 9 is a schematic diagram of an ith picture provided in accordance with a possible embodiment;
FIG. 10 is a flowchart illustrating a method for determining whether an ith picture is a pure color picture according to one possible embodiment;
fig. 11 is a flowchart illustrating a method for capturing an ith picture according to a possible embodiment.
Detailed Description
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described exemplary embodiments are only a part, and not all, of the embodiments of the present application.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in fig. 1, the display apparatus 200 is also in data communication with a server 400, and a user can operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes at least one of infrared protocol communication, Bluetooth protocol communication, and other short-distance communication methods, and the display device 200 is controlled wirelessly or by other wired means. The user may control the display apparatus 200 by inputting user instructions through at least one of keys on the remote controller, voice input, control panel input, and the like.
In some embodiments, the smart device 300 may include any of a mobile terminal, a tablet, a computer, a laptop, an AR/VR device, and the like.
In some embodiments, the smart device 300 may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on the smart device.
In some embodiments, the smart device 300 and the display device may also be used for communication of data.
In some embodiments, the display device 200 may also be controlled in manners other than through the control apparatus 100 and the smart device 300. For example, a voice instruction from the user may be received directly by a module for obtaining voice instructions configured inside the display device 200, or by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with the server 400. The display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various content and interactions to the display device 200. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers.
In some embodiments, software steps executed by one step execution agent may be migrated on demand to another step execution agent in data communication therewith for execution. Illustratively, software steps performed by the server may be migrated to be performed on a display device in data communication therewith, and vice versa, as desired.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200.
In some embodiments, the communication interface 130 is used for external communication, and includes at least one of a WIFI chip, a bluetooth module, NFC, or an alternative module.
In some embodiments, the user input/output interface 140 includes at least one of a microphone, a touchpad, a sensor, a key, or an alternative module.
Fig. 3 shows a hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment.
In some embodiments, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, a user interface.
In some embodiments the controller comprises a central processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, a first interface to an nth interface for input/output.
In some embodiments, the display 260 includes a display screen component for presenting pictures and a driving component for driving image display, and is used for receiving image signals output by the controller and displaying video content, image content, menu manipulation interfaces, user manipulation UI interfaces, and the like.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the tuner demodulator 210 receives broadcast television signals by wired or wireless means, and demodulates audio/video signals as well as EPG data signals from a plurality of wireless or wired broadcast television signals.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in separate devices; that is, the tuner demodulator 210 may also be located in an external device of the main device in which the controller 250 is located, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any selectable object, such as a hyperlink, an icon, or another operable region. The operation related to the selected object is, for example, displaying the linked hyperlink page, document, or image, or launching the program corresponding to the icon.
In some embodiments, the controller includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), a random access memory (RAM), a read-only memory (ROM), first to nth interfaces for input/output, a communication bus (Bus), and the like.
The CPU is used for executing the operating system and application instructions stored in the memory, and for executing various applications, data, and content in response to various externally input interactive instructions, so as to finally display and play various audio and video content. The CPU may include a plurality of processors, for example a main processor and one or more sub-processors.
In some embodiments, the graphics processor is used for generating various graphics objects, such as at least one of icons, operation menus, and graphics displayed in response to user input instructions. The graphics processor includes an arithmetic unit, which performs operations by receiving the various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for display on the display.
In some embodiments, the video processor is configured to receive an external video signal and, according to the standard codec protocol of the input signal, perform at least one of video processing operations such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, the video processor includes at least one of a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like. The demultiplexing module demultiplexes the input audio/video data stream. The video decoding module processes the demultiplexed video signal, including decoding, scaling, and the like. The image synthesis module superimposes and mixes the GUI signal, input by the user or generated by the graphics generator, with the scaled video image to generate an image signal for display. The frame rate conversion module converts the frame rate of the input video. The display formatting module converts the received frame-rate-converted video output signal into a signal conforming to the display format, for example an output RGB data signal.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform at least one of noise reduction, digital-to-analog conversion, and amplification processing to obtain a sound signal that can be played in the speaker.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on display 260, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, an operation area, etc. displayed in the display screen of the electronic device, where the operation area may include at least one of visual interface elements such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
In some embodiments, user interface 280 is an interface that may be used to receive control inputs (e.g., physical buttons on the body of the display device, or the like).
As shown in fig. 4, the system of the display device may include a kernel (Kernel), a command parser (shell), a content system, and application programs. The kernel, the shell, and the content system together form the basic operating system architecture that allows users to manage content, run programs, and use the system. After power-on, the kernel is started, the kernel space is activated, the hardware is abstracted, the hardware parameters are initialized, and virtual memory, the scheduler, signals, and inter-process communication (IPC) are run and maintained. After the kernel is started, the shell and the user application programs are loaded. An application program is compiled into machine code after being started, forming a process.
As shown in fig. 4, the system of the display device is divided into three layers, i.e., an application layer, a middleware layer and a hardware layer from top to bottom.
The application layer mainly includes the common applications on the television and an application framework (Application Framework). The common applications are mainly applications developed based on a browser, such as HTML5 apps, together with native apps (Native APPs).
The application framework (Application Framework) is a complete program model that has all the basic functions required by standard application software, such as content access and data exchange, together with the interfaces for using these functions (toolbars, status lists, menus, dialog boxes).
Native apps (Native APPs) may support online or offline operation, message push, or local resource access.
The middleware layer comprises various television protocols, multimedia protocols, system components and other middleware. The middleware can use basic service (function) provided by system software to connect each part of an application system or different applications on a network, and can achieve the purposes of resource sharing and function sharing.
The hardware layer mainly includes the HAL interface, the hardware, and the drivers. The HAL interface is a unified interface for adapting all television chips, and the specific logic is implemented by each chip. The drivers mainly include an audio driver, a display driver, a Bluetooth driver, a camera driver, a WiFi driver, a USB driver, an HDMI driver, sensor drivers (such as a fingerprint sensor, a temperature sensor, and a pressure sensor), a power driver, and the like.
In some embodiments, after being started, the display device may directly enter the display interface of the signal source selected last time, or a signal source selection interface. The signal source may be a preset video-on-demand program, or at least one of an HDMI interface, a live television interface, and the like. After the user selects a signal source, the display can present the content obtained from that source.
When files are displayed on a display device, showing only the file type and name is inconvenient for the user. In particular, for picture-type and video-type files, displaying only the file name does not help the user recognize the content of each file. The display device may present such files in the form of pictures. There are generally two ways to do so: one is to scale down and display the original image, which occupies a large amount of memory; the other is to generate a thumbnail for each picture contained in the folder opened by the user and display the thumbnails. The production of a video-type file often involves scene transitions, and a pure-color picture is inserted between the two frames of a scene transition to improve the viewing experience. As a result, the frame captured as the thumbnail of a video-type file may be a pure-color picture, which leads to a poor user experience.
In order to solve the above technical problem, an embodiment of the present application provides a display device that includes at least a display and a controller. For the structure and functions of the display and the controller, reference may be made to the above embodiments. The newly added functions of the display and the controller are described below with reference to the drawings.
Fig. 5 is a flowchart of interaction between a display device and a user according to an embodiment.
The user performs step S51 to trigger the thumbnail generation function.
There are various ways in which the user may trigger the thumbnail generation function. In one feasible embodiment, the display device is configured with a thumbnail generation control; after selecting a video file, the user generates a thumbnail of the video file by touching the thumbnail generation control. In some feasible embodiments, the user may select multiple video files at the same time and then generate thumbnails of all of them by touching the thumbnail generation control. In practical applications, the manner of triggering the thumbnail generation function may be, but is not limited to, the above, and is not restricted here.
The controller is configured to execute step S52 to capture the ith picture.
In the present application, the ith picture is a frame picture in a video file, i is the number of times a picture has been captured from the video file during thumbnail generation, and i is a positive integer greater than or equal to 1.
When the controller captures a video frame picture, the captured picture may turn out to be a pure-color picture, in which case the controller needs to capture another video frame. Therefore, in the solution of this embodiment, up to i pictures may need to be captured, where i is a positive integer greater than or equal to 1. The process of capturing the ith picture is described below with reference to a specific example.
The video frame captured first by the controller is referred to as picture 1, the video frame captured second is referred to as picture 2, and so on. If picture 1 is not a pure-color picture, the controller does not capture picture 2; if picture 1 is a pure-color picture, the controller goes on to capture picture 2.
In order to ensure that, from the user's point of view, the picture shown on the display transitions smoothly from the thumbnail to the video when the video file is played, in some feasible embodiments the controller is further configured to: in response to the user triggering the thumbnail, control the display to play the video file, where playback starts from the video frame corresponding to the thumbnail.
According to this solution, when the video file is played, the picture shown on the display changes from the thumbnail to the frame pictures of the video file.
Further, in the technical solution of this embodiment, playback of the video file starts from the thumbnail. Therefore, when selecting the ith picture, the controller takes the first frame of the video file as picture 1 on the first capture, so that no picture of the video file is skipped when playback starts from the thumbnail, which improves the user's viewing experience.
The controller is configured to execute step S53 to select N pixel points on the ith picture, where N is a positive integer greater than or equal to 2;
the embodiment does not limit the selection mode of the N pixel points; for example, in some feasible embodiments, the controller may randomly select N pixel points on the ith picture; for another example, in some feasible embodiments, the controller may select N pixel points on the ith picture according to a certain rule. Without being limited by the applicant, the following description will be made on the selection process of N pixels with reference to specific examples.
Fig. 6 is a schematic diagram of an ith picture according to a feasible embodiment. In this embodiment N equals 4: the controller divides the ith picture transversely into 5 equal parts, the dividing lines being the first dividing line 61, the second dividing line 62, the third dividing line 63, and the fourth dividing line 64, and the 4 pixel points are the center point 65 of the first dividing line 61, the center point 66 of the second dividing line 62, the center point 67 of the third dividing line 63, and the center point 68 of the fourth dividing line 64, respectively.
Fig. 7 is a schematic diagram of an ith picture according to a feasible embodiment. In this embodiment N equals 3: the controller divides the ith picture transversely into 4 equal parts, the dividing lines being the first dividing line 71, the second dividing line 72, and the third dividing line 73, and the 3 pixel points are the center point 74 of the first dividing line 71, the center point 75 of the second dividing line 72, and the center point 76 of the third dividing line 73, respectively.
Fig. 8 is a schematic diagram of an ith picture according to a feasible embodiment. In this embodiment N equals 5, and the controller randomly selects 5 pixel points on the ith picture: pixel point 81, pixel point 82, pixel point 83, pixel point 84, and pixel point 85.
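As an illustration only (the patent does not prescribe an implementation language or API), the evenly-divided selection of Figs. 6 and 7 could be sketched in Python as follows; the function name and the (width, height) pixel-coordinate convention are assumptions made for this example.

```python
def evenly_divided_points(width, height, n):
    """Divide the picture transversely into n + 1 equal parts and return the
    centre point of each of the n dividing lines (cf. Figs. 6 and 7)."""
    step = width / (n + 1)
    return [(round(step * k), height // 2) for k in range(1, n + 1)]

# For a 1920x1080 picture and N = 4 this gives the four centre points of Fig. 6:
# [(384, 540), (768, 540), (1152, 540), (1536, 540)]
print(evenly_divided_points(1920, 1080, 4))
```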
Optionally, in some feasible embodiments, the controller is further configured to select the N pixel points on the ith picture by a binary search method. If the ith picture contains pixel points with different color parameters, selecting the N pixel points in this way can find such pixel points within a short time.
The following describes an implementation manner of selecting N pixel points on the ith picture by using a binary search method in combination with a specific drawing. Fig. 9 is a schematic diagram of an ith picture according to a possible embodiment. In this embodiment, N is equal to 5, and 5 pixels are respectively: a first pixel 91, a second pixel 92, a third pixel 93, a fourth pixel 94 and a fifth pixel 95; the first pixel point is a pixel point corresponding to the central position of the ith picture; the second pixel point is a pixel point corresponding to the center position of a first connecting line, and the first connecting line is a connecting line between the center position of the ith picture and the upper left corner position of the ith picture; the third pixel point is a pixel point corresponding to the center position of a second connecting line, and the second connecting line is a connecting line between the center position of the ith picture and the upper right corner position of the ith picture; the fourth pixel point is a pixel point corresponding to the central position of a third connecting line, and the third connecting line is a connecting line between the central position of the ith picture and the lower left corner position of the ith picture; the fifth pixel point is a pixel point corresponding to the center position of a fourth connecting line, and the fourth connecting line is a connecting line between the center position of the ith picture and the lower right corner position of the ith picture.
It should be noted that this embodiment is merely an exemplary way of selecting N pixel points on the ith picture by a binary search method. The selection is not limited to this; other binary-search schemes may be used to select the N pixel points in practical applications, and no further limitation is imposed here.
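A minimal sketch of the five-point selection of Fig. 9 follows, again assuming Python and (x, y) pixel coordinates; the function name is illustrative, and this is only one way the described points could be computed.

```python
def binary_search_points(width, height):
    """Return the five sample points of Fig. 9: the picture centre and the
    midpoints of the lines joining the centre to the four corners."""
    cx, cy = width // 2, height // 2
    corners = [(0, 0), (width - 1, 0), (0, height - 1), (width - 1, height - 1)]
    midpoints = [((cx + x) // 2, (cy + y) // 2) for x, y in corners]
    return [(cx, cy)] + midpoints
```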
The controller is configured to execute step S54 to read the color parameters of the N pixel points respectively.
In this application, the color parameter is a characteristic value representing the color of a pixel point, and may be, but is not limited to, an RGB (Red, Green, Blue) value or an HSV (Hue, Saturation, Value) value.
The RGB color scheme is an industry color standard in which various colors are obtained by varying the three color channels red (R), green (G), and blue (B) and superimposing them on one another. RGB represents the colors of the red, green, and blue channels; the standard covers almost all colors perceivable by human vision and is one of the most widely used color systems. If the RGB values of two pixel points are the same, the two pixel points have the same color.
Compared with RGB, HSV corresponds more closely to how humans perceive color. It expresses the hue, saturation, and brightness of a color intuitively, which makes color comparison convenient. In the HSV color space it is easier to track objects of a certain color than in BGR, so HSV is often used to segment objects of a given color. HSV expresses a color image in three components: Hue, Saturation (color purity), and Value (lightness). If the HSV values of two pixel points are the same, the two pixel points have the same color.
The controller is configured to execute step S55: if at least 2 of the color parameters are different, scale down the ith picture to obtain the thumbnail of the video file.
If at least 2 of the color parameters of the N pixel points are different, then at least two pixel points in the ith picture have different colors, which shows that the ith picture is not a pure-color picture. For an ith picture that is not a pure-color picture, the controller scales it down to obtain the thumbnail of the video file.
If the color parameters of the N pixel points are all the same, then the N pixel points of the ith picture have the same color, which indicates that the ith picture may be a pure-color picture.
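Steps S54-S55 can be illustrated with the following hedged sketch; it assumes the frame is an OpenCV/NumPy image indexed as frame[row, column] and uses the BGR triple as the color parameter, and the function name is illustrative; none of this is mandated by the patent.

```python
def is_pure_color(frame, points):
    """Steps S54-S55: read the colour parameter (here the BGR triple of an
    OpenCV frame) at every sample point; the picture is treated as a
    pure-color picture only if all triples are identical."""
    colors = {tuple(frame[y, x]) for (x, y) in points}  # frame is indexed [row, col]
    return len(colors) == 1  # at least 2 different values -> not pure color
```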
The display device shown in the embodiments of the present application includes a display and a controller. The controller is configured to capture an ith picture of the video file and to determine whether the captured ith picture is a pure-color picture by checking whether the color parameters of N pixel points on the ith picture are all equal. If the ith picture is not a pure-color picture, its content is, to some extent, representative of the content of the video file, so the ith picture can be used as the thumbnail. The display device of this embodiment therefore avoids generating a pure-color picture as the thumbnail, which improves the user experience.
In order to improve the accuracy of the determination of whether the ith picture is a pure-color picture, in a feasible embodiment, if the color parameters of the N pixel points are all the same, M further pixel points of the picture may be sampled and their color parameters compared to determine whether the ith picture is a pure-color picture.
An implementation of determining whether the ith picture is a pure-color picture is described below with reference to a specific drawing. Fig. 10 is a flowchart of a method for determining whether the ith picture is a pure-color picture according to a feasible embodiment. On the basis of the display device of the above embodiments, the controller is further configured to perform the following steps:
S101: if the N color parameters are all the same, select M pixel points on the ith picture, where M is a positive integer greater than or equal to 2.
The value of M is not limited in this embodiment; M may or may not be equal to N. It must, however, be ensured that the M pixel points are selected in a manner different from the manner in which the N pixel points are selected.
For example, if the N pixel points are selected as shown in Fig. 9, the M pixel points may be selected as shown in Fig. 8 or Fig. 7.
If the N pixel points are selected as shown in Fig. 8, the M pixel points may be selected as shown in Fig. 9 or Fig. 7.
It should be noted that the pixel point selection manners introduced in the embodiments of the present application are only examples; the selection manner is not limited, and an appropriate selection manner may be adopted as required in practical applications, without further limitation here.
S102: read the color parameters of the M pixel points respectively.
For the implementation of reading the color parameters of the M pixel points, reference may be made to the above embodiments, and it is not repeated here.
S103: if the M color parameters are all the same, capture a new ith picture.
If at least 2 of the color parameters of the M pixel points are different, then at least two pixel points in the ith picture have different colors, which shows that the ith picture is not a pure-color picture. For an ith picture that is not a pure-color picture, the controller scales it down to obtain the thumbnail of the video file.
If the color parameters of the M pixel points are all the same, then the M pixel points of the ith picture have the same color, which indicates that the ith picture may be a pure-color picture.
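The two-stage check of Fig. 10 could then be sketched, under the same assumptions, by combining the helper sketches above; the choice of N = 5 binary-search points followed by M = 4 evenly-divided points is just one example of selecting the two sets in different manners.

```python
def looks_pure_color(frame, width, height):
    """Two-stage check of Fig. 10: first the N = 5 binary-search points of
    Fig. 9; only if they all agree, re-check with M = 4 points selected in a
    different manner (the evenly-divided points of Fig. 6)."""
    if not is_pure_color(frame, binary_search_points(width, height)):
        return False
    return is_pure_color(frame, evenly_divided_points(width, height, 4))
```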
This embodiment does not limit the sampling interval of the ith picture. For example, in a feasible embodiment, the controller may capture a candidate picture every 10 video frames of the video file: the first capture takes the 1st frame of the video file, the second capture takes the 11th frame, the third capture takes the 21st frame, and so on.
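A hedged sketch of this fixed-interval sampling; the step of 10 frames is only the example value given above, and the function name is illustrative.

```python
def fixed_interval_indices(count, step=10):
    """One candidate frame every `step` frames: 1, 11, 21, ..."""
    return [1 + step * k for k in range(count)]
```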
According to the technical solution of the embodiments of the present application, in order to ensure that the user misses as few pictures as possible when watching the video, the ith picture should be captured from the first frames of the video file as far as possible; however, the successively captured pictures may all be pure-color pictures when the video begins with multiple consecutive pure-color frames. In order to quickly find a non-pure-color picture in such a scenario, this embodiment restricts the way pictures are captured. Specifically, the ith picture is the picture of the jth frame in the video file, where:
when i = 1, j = 1;
when i = 2, j = 2;
when i > 2, j(i) = j(i-1) + j(i-2), i.e. the frame index of the ith capture is the sum of the frame indices of the two preceding captures.
For example, the controller may capture the frame pictures of the 1st, 2nd, 3rd, 5th, 8th, 13th, and 21st frames as the successive candidate ith pictures.
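In other words, the frame index follows a Fibonacci-style recurrence. A short sketch (Python assumed, illustrative function name) reproduces the example sequence:

```python
def frame_indices(count):
    """Frame numbers j sampled on successive captures: j(1) = 1, j(2) = 2, and
    j(i) = j(i-1) + j(i-2) for i > 2, i.e. 1, 2, 3, 5, 8, 13, 21, ..."""
    indices = [1, 2]
    while len(indices) < count:
        indices.append(indices[-1] + indices[-2])
    return indices[:count]

print(frame_indices(7))  # [1, 2, 3, 5, 8, 13, 21]
```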
In some application scenarios, the video file is a screen test-card video, which is composed of a series of pure-color images. If the video file is such a test-card video, the controller would otherwise keep capturing ith pictures indefinitely. In order to avoid this problem, an embodiment of the present application provides the following method of capturing the ith picture.
Fig. 11 is a flowchart of a method for capturing the ith picture according to a feasible embodiment. On the basis of the display device of the above embodiments, the controller is further configured to perform the following steps:
S111: determine whether i is equal to a count threshold.
In this embodiment, the count threshold is a preset positive integer greater than or equal to 2, and its value can be set according to actual requirements. For example, in a feasible embodiment the count threshold may be equal to 10.
In this embodiment, the controller determines whether i is equal to the count threshold after determining that the ith picture is a pure-color picture.
S112: if i is smaller than the count threshold, capture a new ith picture.
S113: if i is equal to the count threshold, scale down the ith picture to obtain the thumbnail of the video file.
If the pictures captured i times from the video file are all pure-color pictures, the video file is likely to be a screen test-card video. In that case, even though the ith picture is determined to be a pure-color picture, the ith picture is scaled down to obtain the thumbnail of the video file.
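Putting the pieces together, a hedged end-to-end sketch of the capture loop of Fig. 11 might look as follows; it reuses the helper sketches above, and the use of OpenCV (cv2.VideoCapture, cv2.resize), the thumbnail size, and the default threshold of 10 are assumptions, not the patent's required implementation.

```python
import cv2  # assumption: OpenCV is used for frame capture and scaling

def generate_thumbnail(video_path, size=(320, 180), count_threshold=10):
    """Capture candidate frames at the positions given by frame_indices();
    return the first non-pure-color frame scaled down, or the last candidate
    once the count threshold is reached (e.g. for a screen test-card video)."""
    cap = cv2.VideoCapture(video_path)
    frame = None
    for j in frame_indices(count_threshold):
        cap.set(cv2.CAP_PROP_POS_FRAMES, j - 1)  # frame numbers above are 1-based
        ok, candidate = cap.read()
        if not ok:
            break  # no more frames to try
        frame = candidate
        h, w = frame.shape[:2]
        if not looks_pure_color(frame, w, h):
            break  # non-pure-color frame found: use it as the thumbnail source
    cap.release()
    return cv2.resize(frame, size) if frame is not None else None
```

If every candidate up to the count threshold is a pure-color frame, the last captured frame is scaled down anyway, which corresponds to step S113.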
A second aspect of the embodiments of the present application provides a thumbnail generation method, including:
capturing an ith picture, wherein the ith picture is a frame picture in a video file, i is the number of times a picture has been captured from the video file during thumbnail generation, and i is a positive integer greater than or equal to 1;
selecting N pixel points on the ith picture, wherein N is a positive integer greater than or equal to 2;
reading the color parameters of the N pixel points respectively, wherein a color parameter is a characteristic value representing the color of a pixel point;
and if at least 2 of the color parameters are different, scaling down the ith picture to obtain a thumbnail of the video file.
The embodiments of the present application thus show a thumbnail generation method. In the method, an ith picture of the video file is captured, and whether the captured ith picture is a pure-color picture is determined by checking whether the color parameters of N pixel points on the ith picture are all equal. If the ith picture is not a pure-color picture, its content is, to some extent, representative of the content of the video file, so the ith picture can be used as the thumbnail. The thumbnail generation method of this embodiment therefore avoids generating a pure-color picture as the thumbnail, which improves the user experience.
In a specific implementation, the present invention further provides a computer storage medium that may store a program which, when executed, may include some or all of the steps of the embodiments of the thumbnail generation method provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of software products, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and include instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method in the embodiments or some parts of the embodiments of the present invention.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, comprising:
a display;
a controller configured to:
capture an ith picture, wherein the ith picture is a frame picture in a video file, i is the number of times a picture has been captured from the video file during thumbnail generation, and i is a positive integer greater than or equal to 1;
select N pixel points on the ith picture, wherein N is a positive integer greater than or equal to 2;
read the color parameters of the N pixel points respectively, wherein a color parameter is a characteristic value representing the color of a pixel point;
and if at least 2 of the color parameters are different, scale down the ith picture to obtain a thumbnail of the video file.
2. The display device of claim 1, wherein the controller is further configured to:
if the N color parameters are all the same, select M pixel points on the ith picture, wherein M is a positive integer greater than or equal to 2, and the manner of selecting the M pixel points is different from the manner of selecting the N pixel points;
read the color parameters of the M pixel points respectively;
and if the M color parameters are all the same, capture a new ith picture.
3. The display device of claim 2, wherein if the M color parameters are all the same, the controller is further configured to:
determine whether i is equal to a count threshold, wherein the count threshold is a preset positive integer greater than or equal to 2;
if i is smaller than the count threshold, capture a new ith picture;
and if i is equal to the count threshold, scale down the ith picture to obtain the thumbnail of the video file.
4. The display device according to any one of claims 1-3, wherein the controller is further configured to:
in response to the user triggering the thumbnail, control the display to play the video file, wherein playback of the video file starts from the video frame corresponding to the thumbnail.
5. The display device according to claim 4, wherein when i is equal to 1, the ith picture is the frame picture of the first frame in the video file.
6. The display device according to any one of claims 1-3, wherein the controller is further configured to:
select the N pixel points on the ith picture by a binary search method.
7. The display device according to any one of claims 1 to 3, wherein M pixel points are randomly selected on the ith picture.
8. The display device according to claim 6, wherein N is equal to 5, and the 5 pixels are respectively: a first pixel point, a second pixel point, a third pixel point, a fourth pixel point and a fifth pixel point;
the first pixel point is a pixel point corresponding to the central position of the ith picture;
the second pixel point is a pixel point corresponding to the center position of a first connecting line, and the first connecting line is a connecting line between the center position of the ith picture and the upper left corner position of the ith picture;
the third pixel point is a pixel point corresponding to the center position of a second connecting line, and the second connecting line is a connecting line between the center position of the ith picture and the upper right corner position of the ith picture;
the fourth pixel point is a pixel point corresponding to the central position of a third connecting line, and the third connecting line is a connecting line between the central position of the ith picture and the lower left corner position of the ith picture;
the fifth pixel point is a pixel point corresponding to the center position of a fourth connecting line, and the fourth connecting line is a connecting line between the center position of the ith picture and the lower right corner position of the ith picture.
9. The display device according to claim 4, wherein the ith picture is a picture of a jth frame in the video file,
when i = 1, j = 1;
when i = 2, j = 2;
when i > 2, j(i) = j(i-1) + j(i-2).
10. A thumbnail generation method, comprising:
capturing an ith picture, wherein the ith picture is a frame picture in a video file, i is the number of times a picture has been captured from the video file during thumbnail generation, and i is a positive integer greater than or equal to 1;
selecting N pixel points on the ith picture, wherein N is a positive integer greater than or equal to 2;
reading the color parameters of the N pixel points respectively, wherein a color parameter is a characteristic value representing the color of a pixel point;
and if at least 2 of the color parameters are different, scaling down the ith picture to obtain a thumbnail of the video file.
CN202110678680.2A 2021-01-22 2021-06-18 Display device and thumbnail generation method Active CN113453069B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110678680.2A CN113453069B (en) 2021-06-18 2021-06-18 Display device and thumbnail generation method
PCT/CN2022/072894 WO2022156729A1 (en) 2021-01-22 2022-01-20 Display device and display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110678680.2A CN113453069B (en) 2021-06-18 2021-06-18 Display device and thumbnail generation method

Publications (2)

Publication Number Publication Date
CN113453069A true CN113453069A (en) 2021-09-28
CN113453069B CN113453069B (en) 2022-11-11

Family

ID=77811783

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110678680.2A Active CN113453069B (en) 2021-01-22 2021-06-18 Display device and thumbnail generation method

Country Status (1)

Country Link
CN (1) CN113453069B (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160182577A1 (en) * 2014-12-19 2016-06-23 Yahoo!, Inc. Content selection
CN107851330A (en) * 2015-07-21 2018-03-27 高通股份有限公司 Zero pixel for graphics process is rejected
CN105824532A (en) * 2016-03-16 2016-08-03 无锡科技职业学院 Recognizing and hash locating method of rectangular graphs
CN105847816A (en) * 2016-04-01 2016-08-10 广州华多网络科技有限公司 Video file thumbnail creation method and electronic device
CN106027893A (en) * 2016-05-30 2016-10-12 广东欧珀移动通信有限公司 Method and device for controlling Live Photo generation and electronic equipment
CN106101868A (en) * 2016-07-18 2016-11-09 乐视控股(北京)有限公司 Reduced graph generating method and generating means
CN107528897A (en) * 2017-08-22 2017-12-29 上海斐讯数据通信技术有限公司 A kind of cloud disk reduced graph generating method and device
CN108334531A (en) * 2017-09-19 2018-07-27 平安普惠企业管理有限公司 Picture tone extracting method, equipment and computer readable storage medium
CN108470334A (en) * 2018-03-20 2018-08-31 上海顺久电子科技有限公司 A kind of method and device of acquisition screen intensity and coloration
CN108600781A (en) * 2018-05-21 2018-09-28 腾讯科技(深圳)有限公司 A kind of method and server of the generation of video cover
CN110209978A (en) * 2019-01-28 2019-09-06 腾讯科技(深圳)有限公司 A kind of data processing method and relevant apparatus
CN110324665A (en) * 2019-07-25 2019-10-11 深圳创维-Rgb电子有限公司 A kind of method, terminal and the storage medium of the automatic review of a film by the censor
CN111901679A (en) * 2020-08-10 2020-11-06 广州繁星互娱信息科技有限公司 Method and device for determining cover image, computer equipment and readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WEIXIN_34407348: "java 切图 判断图片是否是纯色/彩色图片" (Java image cropping: determining whether a picture is a pure-color or colorful picture), 《HTTPS://BLOG.CSDN.NET/WEIXIN_34407348/ARTICLE/DETAILS/93505496》 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116107479A (en) * 2023-03-02 2023-05-12 优视科技有限公司 Picture display method, electronic device and computer storage medium
CN116107479B (en) * 2023-03-02 2024-02-13 优视科技有限公司 Picture display method, electronic device and computer storage medium

Also Published As

Publication number Publication date
CN113453069B (en) 2022-11-11

Similar Documents

Publication Publication Date Title
CN114302190B (en) Display equipment and image quality adjusting method
CN112667184A (en) Display device
CN112601117B (en) Display device and content presentation method
CN112887778A (en) Switching method of video resource playing modes on display equipment and display equipment
CN113630649A (en) Display device and video playing progress adjusting method
CN113535019A (en) Display device and display method of application icons
CN112653906A (en) Video hotspot playing method on display device and display device
CN113360066B (en) Display device and file display method
WO2022161401A1 (en) Screen-projection data processing method and display device
CN113490032A (en) Display device and medium resource display method
CN113111214A (en) Display method and display equipment for playing records
CN113453069B (en) Display device and thumbnail generation method
CN113163258A (en) Channel switching method and display device
CN112580625A (en) Display device and image content identification method
CN113573149B (en) Channel searching method and display device
CN112905008B (en) Gesture adjustment image display method and display device
CN113453052B (en) Sound and picture synchronization method and display device
CN113064691B (en) Display method and display equipment for starting user interface
CN112911371A (en) Double-channel video resource playing method and display equipment
CN113573112A (en) Display device and remote controller
CN112601116A (en) Display device and content display method
CN113596559A (en) Method for displaying information in information bar and display equipment
CN114302203A (en) Image display method and display device
CN112367550A (en) Method for realizing multi-title dynamic display of media asset list and display equipment
CN113350781B (en) Display device and game mode switching method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant