CN112672199B - Display device and multi-layer overlapping method - Google Patents


Info

Publication number
CN112672199B
Authority
CN
China
Prior art keywords
pattern
layer
touch
display
background
Prior art date
Legal status
Active
Application number
CN202011528031.6A
Other languages
Chinese (zh)
Other versions
CN112672199A (en
Inventor
王敏
张振宝
李保成
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202011528031.6A priority Critical patent/CN112672199B/en
Priority to CN202210821867.8A priority patent/CN115243094A/en
Publication of CN112672199A publication Critical patent/CN112672199A/en
Priority to PCT/CN2021/117796 priority patent/WO2022089043A1/en
Priority to CN202180066094.0A priority patent/CN116324689A/en
Application granted granted Critical
Publication of CN112672199B publication Critical patent/CN112672199B/en
Priority to US18/157,324 priority patent/US20230162704A1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312: Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N 21/44016: Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for substituting a video clip
    • H04N 21/4402: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N 21/440263: Reformatting operations by altering the spatial resolution, e.g. for displaying on a connected PDA

Abstract

The present application provides a display device and a multi-layer superposition method. While the display presents a user interface, a touch component detects the touch track input by the user, and the touch track pattern is displayed in a first layer in real time. After the touch track pattern is obtained, an interpolation operation is performed on it according to the background pattern in a second layer, raising the resolution of the touch track pattern; the resulting conversion pattern is then superposed on the background pattern and displayed in real time. Because the interpolation is guided by the background pattern, the display device mitigates the influence of the first layer's transparency on the edge interpolation result of the touch track pattern, alleviates jagged or black edges when the touch track pattern is superposed, and improves the real-time display effect.

Description

Display device and multi-layer overlapping method
Technical Field
The application relates to the technical field of touch televisions, in particular to a display device and a multi-layer superposition method.
Background
A smart television is a television product based on Internet application technology, equipped with an open operating system and chip and with an open application platform, and capable of bidirectional human-computer interaction. It integrates audio/video, entertainment, data and other functions to meet users' diversified and personalized needs. A smart television may support multiple interaction modes; for example, a touch component may be integrated on the display screen to form a touch television that supports a user's touch interaction.
Through touch operations, the user can input various types of interactive instructions to the touch television. For example, the user may trace a touch action track on the display screen, and the touch television displays the corresponding track picture in real time. When displaying the track, the touch television renders in real time by superposing layers. The currently common superposition method sets a transparency for each layer and displays the contents of the different layers by adjusting those transparencies. In general, all pixels of a layer share the same transparency, so changing the transparency treatment of one layer during superposition requires changing the transparency ratio of the whole layer at once.
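For illustration only (a sketch, not part of the claimed solution), the whole-layer transparency scheme described above can be expressed in a few lines of Python: a single alpha value applies uniformly to every pixel of the layer, which is why changing the transparency treatment of part of a layer forces a change to the ratio of the whole layer.

```python
def blend_layers(bottom, top, layer_alpha):
    """Superpose `top` on `bottom` with one shared transparency for the
    whole layer: every pixel of `top` is mixed with the same ratio."""
    return [[tuple(t * layer_alpha + b * (1 - layer_alpha)
                   for t, b in zip(tp, bp))
             for tp, bp in zip(trow, brow)]
            for trow, brow in zip(top, bottom)]

video = [[(0, 0, 255)] * 2] * 2      # bottom layer: blue video frame
osd = [[(255, 255, 255)] * 2] * 2    # top layer: white UI layer

mixed = blend_layers(video, osd, 0.5)
print(mixed[0][0])  # (127.5, 127.5, 255.0) -- and every other pixel alike
```

There is no way, in this scheme, to make one pixel of the UI layer more opaque than another without changing `layer_alpha` for the entire layer.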
As a result, when multiple pictures are stacked, the transparency ratio of individual pixels within a layer cannot be changed flexibly. Moreover, when the layers have different resolutions, they cannot be superposed accurately. To superpose layers accurately and improve the display effect, a low-resolution layer picture can be raised to high resolution by interpolation before superposition. However, when elements of the same layer have different transparencies, raising the resolution by interpolation produces color differences at the boundaries between regions of different transparency, degrading the final display. For example, in a line-drawing operation the interpolation algorithm must interpolate between the color of the drawn line and the transparent color, so jagged or black edges appear at the edges of the drawn line pattern, affecting the display effect.
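The black-edge mechanism described above can be reproduced in a minimal Python sketch (illustrative only, not taken from the patent). Interpolating straight, non-premultiplied RGBA mixes the stroke color with the RGB of fully transparent pixels, which is conventionally black; interpolating with premultiplied alpha, a well-known remedy shown here only for contrast, preserves the stroke color and fades only the alpha.

```python
def lerp_rgba(p, q, t):
    """Linear interpolation on straight (non-premultiplied) RGBA pixels."""
    return tuple(p[i] * (1 - t) + q[i] * t for i in range(4))

def lerp_rgba_premultiplied(p, q, t):
    """Interpolate with alpha-premultiplied color, then un-premultiply."""
    pm = lambda c: (c[0] * c[3] / 255, c[1] * c[3] / 255, c[2] * c[3] / 255, c[3])
    r = lerp_rgba(pm(p), pm(q), t)
    a = r[3]
    if a == 0:
        return (0, 0, 0, 0)
    return (r[0] * 255 / a, r[1] * 255 / a, r[2] * 255 / a, a)

stroke = (255, 255, 255, 255)  # opaque white drawn line
transparent = (0, 0, 0, 0)     # fully transparent layer pixel (RGB is black)

naive = lerp_rgba(stroke, transparent, 0.5)
fixed = lerp_rgba_premultiplied(stroke, transparent, 0.5)
print(naive)  # (127.5, 127.5, 127.5, 127.5) -- gray edge: the "black edge" artifact
print(fixed)  # (255.0, 255.0, 255.0, 127.5) -- stroke color kept, only alpha fades
```

In the naive case the edge pixel comes out gray at half alpha, which reads as a dark fringe once composited; this is the defect the background-guided interpolation of the present application targets.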
Disclosure of Invention
The present application provides a display device and a multi-layer superposition method, aiming to solve the problem that the interpolation algorithm used in the traditional superposition method degrades the final display effect.
In a first aspect, the present application provides a display device comprising a display, a touch component and a controller. The display is configured to display a user interface, the touch component is configured to detect a touch track input by a user, and the controller is configured to perform the following program steps:
acquiring a touch track pattern in a first layer and acquiring a background pattern in a second layer, wherein the second layer is a layer positioned below the first layer;
according to the background pattern, performing interpolation operation on the touch track pattern to generate a conversion pattern, wherein the resolution of the conversion pattern is equal to that of the background pattern;
and superposing the conversion pattern on the background pattern, and controlling the display to display the superposition result in real time.
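As one illustrative reading of the three steps above (a sketch under simplifying assumptions: a nearest-neighbor upscale stands in for the patented background-guided interpolation, and standard source-over blending is assumed for the superposition):

```python
def upscale_nearest(pattern, target_w, target_h):
    """Step 2 (simplified): raise the touch track pattern to the background
    resolution. Nearest-neighbor keeps the sketch short; the patent instead
    interpolates according to the background pattern."""
    src_h, src_w = len(pattern), len(pattern[0])
    return [[pattern[y * src_h // target_h][x * src_w // target_w]
             for x in range(target_w)] for y in range(target_h)]

def composite_over(fg, bg):
    """Step 3: superpose the conversion pattern on the background via
    per-pixel source-over alpha blending."""
    out = []
    for frow, brow in zip(fg, bg):
        row = []
        for (r, g, b, a), (br, bgc, bb, _) in zip(frow, brow):
            t = a / 255
            row.append((round(r * t + br * (1 - t)),
                        round(g * t + bgc * (1 - t)),
                        round(b * t + bb * (1 - t)), 255))
        out.append(row)
    return out

# Step 1: first layer holds a 2x2 touch track pattern, second layer a 4x4 background
touch = [[(255, 0, 0, 255), (0, 0, 0, 0)],
         [(0, 0, 0, 0), (255, 0, 0, 255)]]
background = [[(0, 0, 255, 255)] * 4 for _ in range(4)]

converted = upscale_nearest(touch, 4, 4)       # resolution now equals the background's
frame = composite_over(converted, background)  # superposition result, displayed in real time
print(frame[0][0])  # opaque red stroke pixel over the blue background
```

The patented method would additionally consult the background pattern during the interpolation of step two, which this sketch deliberately omits.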
As can be seen from the above technical solution, the display device provided in the first aspect of the present application can detect, through the touch component, the touch track input by the user while the display presents the user interface, and can present the touch track pattern in the first layer in real time. After the touch track pattern is obtained, an interpolation operation is performed on it according to the background pattern in the second layer to raise its resolution, and the interpolated conversion pattern is superposed on the background pattern and displayed in real time. Because the interpolation is guided by the background pattern, the influence of the first layer's transparency on the edge interpolation result of the touch track pattern is mitigated, jagged or black edges during superposition are alleviated, and the real-time display effect is improved.
In a second aspect, the present application further provides a multi-layer superposition method applied to a display device, where the display device includes a display, a touch component and a controller, the touch component being configured to detect a touch track input by a user. The multi-layer superposition method includes:
acquiring a touch track pattern in a first layer and acquiring a background pattern in a second layer, wherein the second layer is located below the first layer;
according to the background pattern, performing interpolation operation on the touch track pattern to generate a conversion pattern, wherein the resolution of the conversion pattern is equal to that of the background pattern;
and superposing the conversion pattern on the background pattern, and controlling the display to display the superposition result in real time.
As can be seen from the above technical solution, the multi-layer superposition method provided in the second aspect of the present application can be configured in the controller of a display device, and displays the touch track pattern in real time by multi-layer superposition while the user inputs a touch track. Because the method interpolates the touch track pattern according to the background pattern, it mitigates the influence of the first layer's transparency on the edge interpolation result of the touch track pattern, alleviates jagged or black edges when the touch track pattern is superposed, and improves the real-time display effect.
In a third aspect, the present application further provides a multi-layer superposition method applied to a display device, where the display device includes a display, a touch component and a controller, the touch component being configured to detect a touch track input by a user. The multi-layer superposition method includes:
controlling the display to display a drawing interface, wherein the drawing interface includes a first layer and a second layer, and the second layer is located below the first layer;
in response to a touch action of the user, displaying a conversion pattern in the first layer and displaying a background pattern in the second layer, wherein the conversion pattern is generated by performing an interpolation operation on the touch track pattern, and the resolution of the conversion pattern is equal to that of the background pattern;
and controlling the display to display a superposed pattern in real time, wherein the superposed pattern is obtained by superposing the conversion pattern and the background pattern.
As can be seen from the above technical solution, the multi-layer superposition method provided in the third aspect of the present application controls the display to present a drawing interface while the user performs a demonstration action, presents the conversion pattern and the background pattern in the first and second layers of the drawing interface respectively, and superposes and displays the patterns of the two layers in real time. This mitigates the influence of the first layer's transparency on the edge interpolation result of the touch track pattern, alleviates jagged or black edges during superposition, and improves the real-time display effect.
Drawings
To explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below. Obviously, those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus in an embodiment of the present application;
Fig. 2 is a block diagram of a configuration of a control apparatus in an embodiment of the present application;
Fig. 3 is a block diagram of a hardware configuration of a display device in an embodiment of the present application;
Fig. 4 is a schematic diagram of a software configuration of a display device in an embodiment of the present application;
Fig. 5 is a schematic diagram of multiple layers of a display device in an embodiment of the present application;
Fig. 6 is a schematic diagram of a multi-layer superposition display effect in an embodiment of the present application;
Fig. 7 is a schematic diagram of a superposition effect generated by performing an interpolation operation according to a background pattern in an embodiment of the present application;
Fig. 8 is a schematic flowchart of a multi-layer superposition method in an embodiment of the present application;
Fig. 9 is a schematic diagram of a superposition effect of a second layer and a third layer in an embodiment of the present application;
Fig. 10 is a schematic diagram of an interpolation effect of a second layer in an embodiment of the present application;
Fig. 11 is a schematic diagram of a superposition effect of a transparent or semi-transparent second layer in an embodiment of the present application;
Fig. 12 is a schematic diagram of a top-layer display superposition effect in an embodiment of the present application.
Detailed Description
To make the purpose and embodiments of the present application clearer, the following will clearly and completely describe the exemplary embodiments of the present application with reference to the attached drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, and not all embodiments.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in Fig. 1, the display device 200 is in data communication with a server 400, and a user can operate the display device 200 through the smart device 300 or the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote controller, and the communication between the remote controller and the display device includes at least one of an infrared protocol communication or a bluetooth protocol communication, and other short-distance communication methods, and controls the display device 200 in a wireless or wired manner. The user may control the display apparatus 200 by inputting a user instruction through at least one of a key on a remote controller, a voice input, a control panel input, and the like.
In some embodiments, the smart device 300 may include any of a mobile terminal, a tablet, a computer, a notebook, an AR/VR device, and the like.
In some embodiments, the smart device 300 may also be used to control the display device 200. The display device 200 is controlled, for example, using an application running on a smart device.
In some embodiments, the smart device 300 and the display device may also be used for communication of data.
In some embodiments, the display device 200 may also be controlled in ways other than through the control apparatus 100 and the smart device 300. For example, a user's voice instruction may be received directly by a module configured inside the display device 200 for obtaining voice instructions, or by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with the server 400. The display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers.
In some embodiments, software steps executed by one step execution agent may be migrated on demand to another step execution agent in data communication therewith for execution. Illustratively, software steps performed by the server may be migrated to be performed on a display device in data communication therewith, and vice versa, as desired.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in Fig. 2, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 can receive input operation instructions from the user and convert them into instructions that the display device 200 can recognize and respond to, serving as an interaction intermediary between the user and the display device 200.
In some embodiments, the communication interface 130 is used for external communication, and includes at least one of a WIFI chip, a bluetooth module, NFC, or an alternative module.
In some embodiments, the user input/output interface 140 includes at least one of a microphone, a touchpad, a sensor, a key, or an alternative module.
Fig. 3 shows a hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment.
In some embodiments, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments, the controller includes a central processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, and first to nth interfaces for input/output.
In some embodiments, the display 260 includes a display screen component for presenting pictures and a driving component for driving image display. The display 260 receives image signals output from the controller and displays video content, image content, a menu manipulation interface, a user manipulation UI interface, and the like.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the tuner demodulator 210 receives broadcast television signals via wired or wireless reception, and demodulates audio/video signals and EPG data signals from among the plurality of wireless or wired broadcast television signals.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, the detector 230 includes a light receiver, a sensor for collecting the intensity of ambient light; alternatively, the detector 230 includes an image collector, such as a camera, which can be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in separate devices; that is, the tuner demodulator 210 may also be located in a device external to the main device containing the controller 250, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any of a number of selectable objects, such as a hyperlink, an icon, or other operable control. Operations related to the selected object are: displaying an operation of connecting to a hyperlink page, document, image, etc., or performing an operation of a program corresponding to the icon.
In some embodiments, the controller includes at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a Random Access Memory (RAM), a Read-Only Memory (ROM), first to nth interfaces for input/output, a communication bus (Bus), and the like.
The CPU is used to execute operating system and application program instructions stored in the memory, and to run various applications, data and contents according to the interactive instructions received from the outside, so as to finally display and play various audio/video contents. The CPU may include a plurality of processors, for example a main processor and one or more sub-processors.
In some embodiments, a graphics processor for generating various graphics objects, such as: at least one of icons, operation menus, and user input instruction display graphics. The graphics processor comprises an arithmetic unit which carries out operation by receiving various interactive instructions input by a user and displays various objects according to display attributes; the system also comprises a renderer for rendering various objects obtained based on the arithmetic unit, wherein the rendered objects are used for being displayed on a display.
In some embodiments, the video processor is configured to receive an external video signal and, according to the standard codec protocol of the input signal, perform at least one of decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, so as to obtain a signal that can be directly displayed or played on the display device 200.
In some embodiments, the video processor includes at least one of a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like. The demultiplexing module demultiplexes the input audio/video data stream. The video decoding module processes the demultiplexed video signal, including decoding, scaling and the like. The image synthesis module superposes and mixes the GUI signal, input by the user or generated by the graphics generator, with the scaled video image to generate an image signal for display. The frame rate conversion module converts the frame rate of the input video. The display formatting module converts the frame-rate-converted video into an output signal conforming to the display format, such as an RGB data signal.
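The module chain described above can be sketched as a simple pipeline (illustrative stubs only; the function name, data shapes and pass-through processing are assumptions, not the patent's implementation):

```python
# Stages of the video processor described above, chained in order.
def process(av_stream, display_size, display_fps):
    video, audio = av_stream["video"], av_stream["audio"]  # demultiplexing
    frames = [f for f in video]                            # video decoding (stubbed)
    scaled = {"frames": frames, "size": display_size}      # scaling to display size
    composed = {**scaled, "gui": av_stream.get("gui")}     # image synthesis (GUI over video)
    timed = {**composed, "fps": display_fps}               # frame rate conversion
    return {**timed, "format": "RGB"}                      # display formatting

out = process({"video": ["f0", "f1"], "audio": [], "gui": "menu"},
              display_size=(3840, 2160), display_fps=60)
print(out["format"], out["fps"])  # RGB 60
```

The ordering matters: the GUI is superposed only after the video has been scaled, which is the stage at which the multi-layer superposition of the present application operates.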
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform at least one of noise reduction, digital-to-analog conversion, and amplification processing to obtain a sound signal that can be played in the speaker.
In some embodiments, the user may input a user command on a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A common presentation form of a User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include at least one of an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc. visual interface elements.
In some embodiments, user interface 280 is an interface that may be used to receive control inputs (e.g., physical buttons on the body of the display device, or the like).
In some embodiments, the system of the display device may include a kernel (Kernel), a command parser (shell), a file system, and applications. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals and inter-process communication (IPC) are operated and maintained. After the kernel starts, the shell and user applications are loaded. An application is compiled into machine code after being started, forming a process.
Referring to fig. 4, in some embodiments, the system is divided into four layers, which are an Application (Applications) layer (abbreviated as "Application layer"), an Application Framework (Application Framework) layer (abbreviated as "Framework layer"), an Android runtime (Android runtime) and system library layer (abbreviated as "system runtime library layer"), and a kernel layer from top to bottom.
In some embodiments, at least one application program runs in the application program layer, and the application programs can be a Window (Window) program of an operating system, a system setting program or a clock program, and the like; or an application developed by a third party developer. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an Application Programming Interface (API) and a programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions and acts as a processing center that coordinates the actions of the applications in the application layer. Through the API interface, a running application can access resources in the system and obtain system services.
As shown in fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on the user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the various applications as well as general navigational fallback functions, such as controlling the exit, opening, and fallback of applications. The window manager is used to manage all window programs, for example obtaining the size of the display screen, determining whether a status bar exists, locking the screen, capturing the screen, and controlling changes to the display window (for example, shrinking the display window, displaying a shake, displaying a distortion deformation, and the like).
In some embodiments, the system runtime library layer provides support for the upper layer, i.e., the framework layer. When the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer comprises at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, Wi-Fi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, pressure sensor, etc.), power driver, and the like.
The display device 200 may support a touch interaction function by adding the touch component 276. In general, the touch component 276 may constitute a touch screen together with the display 260. On the touch screen, a user can input different control instructions through touch operations. For example, the user may input click, slide, long-press, double-click, and other touch instructions, and different touch instructions may represent different control functions.
To distinguish the different touch actions, the touch component 276 may generate different electrical signals when a user inputs different touch actions and transmit the generated electrical signals to the controller 250. The controller 250 may perform feature extraction on the received electrical signals to determine, based on the extracted features, the control function the user intends to perform.
For example, when a user enters a click touch action at any program icon location in the application program interface, the touch component 276 will sense the touch action and generate an electrical signal. After receiving the electrical signal, the controller 250 may first determine a duration of a level corresponding to a touch action in the electrical signal, and when the duration is less than a preset time threshold, recognize that a click touch instruction is input by the user. The controller 250 then extracts the positional features generated by the electrical signals to determine the touch position. And when the touch position is within the display range of the application icon, determining that the user inputs a click touch instruction at the position of the application icon. Accordingly, the click touch command is used to execute a function of running a corresponding application program in the current scene, so that the controller 250 may start running the corresponding application program.
For another example, when the user inputs a slide action on the media asset presentation page, the touch component 276 also transmits the sensed electrical signal to the controller 250. The controller 250 determines the duration of the signal corresponding to the touch action in the electrical signal. When the determined duration is longer than the preset time threshold, the controller 250 examines the change in the position at which the signal is generated; for a slide action, the generation position of the signal clearly changes, so it is determined that the user has input a sliding touch instruction. The controller 250 then determines the sliding direction of the sliding touch instruction from the change in the signal position, and controls the display frame in the media asset presentation page to turn pages so as to display more media asset options. Further, the controller 250 may extract features such as the sliding speed and sliding distance of the sliding touch instruction, and control page turning according to the extracted features so as to achieve a finger-following effect.
Similarly, for touch instructions such as double click and long press, the controller 250 may extract different features, determine the type of the touch instruction through feature judgment, and execute the corresponding control function according to the preset interaction rules. In some embodiments, the touch component 276 also supports multi-touch, so that a user can input touch actions on the touch screen with multiple fingers, e.g., multi-finger clicks, multi-finger long presses, multi-finger slides, and the like.
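The duration-and-position classification described above can be sketched as follows. This is a hypothetical helper, not the embodiment's implementation: the sample format, the 500 ms hold threshold, and the 10 px movement threshold are all illustrative assumptions.

```python
def classify_touch(samples, hold_threshold_ms=500, move_threshold_px=10):
    """Classify a touch gesture from sampled (t_ms, x, y) points.

    Mirrors the logic described in the text: a short press is a click;
    a longer press that moves is a slide; a longer press that stays
    put is a long press. Thresholds are assumed values.
    """
    duration = samples[-1][0] - samples[0][0]
    dx = samples[-1][1] - samples[0][1]
    dy = samples[-1][2] - samples[0][2]
    moved = abs(dx) > move_threshold_px or abs(dy) > move_threshold_px
    if duration < hold_threshold_ms:
        return "click"
    return "slide" if moved else "long press"
```

A slide's direction and speed could then be derived from the same `dx`, `dy`, and `duration` values to drive the page-turning control.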
The touch action can cooperate with a specific application program to realize a specific function. For example, when the user opens a "whiteboard" application, the display 260 may present a drawing area, the user may draw a specific touch trajectory in the drawing area through sliding touch instructions, and the controller 250 determines the touch pattern from the touch trajectory detected by the touch component 276 and controls the display 260 to display it in real time so as to achieve the demonstration effect.
In order to achieve a real-time display effect, the display device 200 may display the hand-drawing process in a manner of overlapping a plurality of layers. Generally, the display device 200 may use one layer to display the sliding touch trajectory corresponding to the user's hand drawing in real time, and may use another layer to display the whiteboard interface, and the final picture displayed on the display 260 is formed by superimposing the two layers. For convenience of distinguishing, in the embodiment of the present application, a layer for displaying a touch track pattern in real time is referred to as a first layer, and a layer for displaying a whiteboard interface is referred to as a second layer. Obviously, in order to present the final picture, the layers that can be presented by the display device 200 include not only the above two layers, but also other layers for presenting different picture contents.
For example, as shown in fig. 5, the display device 200 may include three layers, which are respectively a first layer: group of Pictures (GOP), second layer: on-screen display (OSD) and a third layer: video layer (Video). The GOP layer, also called GOP2 layer or acceleration layer, can be used to display temporarily drawn content that is displayed on top of a menu. The OSD layer is also called an intermediate layer or a menu layer, and is used for displaying contents such as an application interface, an application menu, a toolbar and the like. The Video layer is also called as a bottom layer, and can be generally used for displaying picture contents corresponding to external signals connected with a television.
A hierarchical relationship can be set among the different layers to achieve a specific display effect. For example, the hierarchical relationship of the GOP layer, the OSD layer, and the Video layer may be: GOP layer - OSD layer - Video layer. That is, the Video layer is displayed at the bottom to show the external signal picture content, the OSD layer is displayed on the Video layer so that the application menu floats over the external signal picture, and the GOP layer is displayed on the OSD layer so that the picture drawn by the user is presented on top.
For the GOP layer, since it is used to display temporarily drawn content, the picture displayed in the GOP layer presents different content according to the user's drawing input. Therefore, in practical applications, in order to meet the drawing requirement, after one sliding touch input is completed, the display device 200 may update the drawn pattern to the OSD layer for display and continue to display other touch trajectory content through the GOP layer. In such a display mode, the pattern generated by a new drawing action can be overlaid on the pattern generated by the previous drawing action so as to fit the user's operation habits.
It should be noted that, for the pattern in the multiple layers that can be presented by the display device 200, the representation form of the pattern may be an ARGB form, that is, on the basis of the conventional RGB form, the pattern is provided with transparency information to facilitate the superposition of the pictures of the multiple layers. For example, for a picture drawn by a user, a part drawn by a brush of the picture is a specific touch track pattern, and other parts are completely transparent patterns, so that the situation that the part not drawn by the user shields the content in the bottom layer is avoided. Therefore, based on the above-described plurality of layers, the display apparatus 200 may present a final picture according to the specific pattern content and transparency in each layer.
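The ARGB superposition described here is commonly realized with the "over" operator. The following is a minimal sketch for single 8-bit (A, R, G, B) pixels; the straight (non-premultiplied) alpha convention is an assumption, since the embodiment does not specify the blending formula.

```python
def over(top, bottom):
    """Composite one ARGB pixel over another ("over" operator).

    Pixels are (A, R, G, B) tuples with 8-bit channels. A fully
    transparent top pixel leaves the bottom pixel visible, as the
    text describes for the undrawn parts of the top layer.
    """
    a_t = top[0] / 255.0
    a_b = bottom[0] / 255.0
    a_out = a_t + a_b * (1 - a_t)
    if a_out == 0:
        return (0, 0, 0, 0)  # both pixels fully transparent
    rgb = tuple(
        round((t * a_t + b * a_b * (1 - a_t)) / a_out)
        for t, b in zip(top[1:], bottom[1:])
    )
    return (round(a_out * 255),) + rgb
```

Applied per pixel across the layers, this yields the final picture once all layers share the same resolution.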
Since the specific content for display of each layer is different, the patterns in each layer may have different picture resolutions. For example, if the resolution of the GOP layer is 2k level and the resolutions of the OSD layer and the video layer are 4k level, it is difficult to align the patterns in the layers due to the difference in resolution when the pictures are superimposed, resulting in display deviation or error.
In order to enable the overlay display, when there is a difference in resolution between different layers, an interpolation operation may be performed on a pattern in a layer with a lower resolution to improve the resolution of the picture of the layer. For example, as shown in fig. 6, when the GOP2 layer, the OSD layer, and the Video layer are stacked, since the resolution of the GOP2 layer is 2K, and the resolution of the OSD layer and the Video layer is 4K, the GOP2 layer is first increased to 4K by the interpolation algorithm, and then stacked with other two layers.
The interpolation operation is an image interpolation algorithm: the content of a pixel to be inserted is calculated from the content of several adjacent pixels in the image, thereby increasing the picture resolution. However, because the superimposed layers carry transparency information and different layers are often set with different transparencies, the contents of adjacent pixels are affected by transparency when the interpolation algorithm is performed, so that interpolation at the edge positions of the drawn pattern causes display errors.
As shown in fig. 6, taking the electronic whiteboard application of the display device 200 as an example, the writing process of the electronic whiteboard is generally displayed on the GOP layer, the finished drawn lines are displayed on the OSD layer, and the actual electronic whiteboard interface displays the superposition of the GOP layer and the OSD layer. During superposition, if the layer resolutions differ, the low-resolution pattern is generally enlarged to the high resolution by interpolation and then superimposed. When the GOP2 layer (2K) is to be superimposed on the OSD layer (4K), the GOP2 layer needs to be raised to 4K resolution, which requires interpolating pixel points. If the background of the GOP2 layer is transparent (i.e., the background color is 0x00000000), then at the boundary of a drawn line the interpolation is performed between the line color and the transparent color. If the transparent color contributes nothing to the interpolation, jaggies appear after the 2K picture is converted to 4K; if the transparent color's color value is taken as 000000 (black), the interpolation is performed between the line color and transparent black, semitransparent black pixels appear after interpolation, and a black edge shows at the boundary of the drawn line.
In order to improve the problem of edge display errors of the touch track pattern, some embodiments of the present application provide a multi-layer overlay method, which may be applied to a display device 200. To implement the multi-layer overlay method, the display device 200 includes a display 260, a touch component 276, and a controller 250, where the touch component 276 is configured to detect a touch track input by a user. As shown in fig. 7 and 8, the multi-layer overlay method includes the following steps:
Acquiring a touch track pattern in the first layer and acquiring a background pattern in the second layer.
The first layer is used for displaying touch track patterns, and the second layer is used for displaying interface contents such as application interfaces, application menus and toolbars. Therefore, the second layer is a layer located one layer below the first layer. For example, the first layer is a GOP2 layer, and the second layer is an OSD layer.
The user can click the application icon to start the related application through the application starting interface. If the application started by the user is an application capable of using the first layer, an application interface can be displayed in the second layer. Meanwhile, the touch track input by the user is detected in real time through the touch component 276, and the touch track pattern is presented in the first layer according to the user input action. In the embodiment of the application, the content displayed in the second layer not only includes application contents such as an application interface, an application menu, a toolbar, and the like, but also includes drawing contents synchronized to the second layer after one touch action is finished, and therefore, for convenience of description, the application interface content displayed in the second layer is referred to as a background pattern.
After obtaining the touch track pattern in the first layer and the background pattern in the second layer, the controller 250 may further perform an interpolation algorithm on the touch track pattern according to the background pattern, so as to convert the touch track pattern into a conversion pattern with a resolution equal to that of the background pattern.
The interpolation algorithm is used to change the resolution of the touch track pattern, and may take different forms according to the requirements on the processed image, such as nearest-neighbor interpolation, bilinear interpolation, bicubic interpolation, directional interpolation, and the like. Taking interpolation between adjacent pixels as an example, when a 2k image needs to be converted into a 4k image, each pixel in the 2k image can be traversed and the average of the values of two adjacent pixels calculated to obtain the value of the pixel point to be inserted. That is, when two adjacent pixels are (0, 255, 0) and (255, 255, 255), the values in the RGB channels may be calculated as follows: the value of the R-channel interpolated pixel point is (0 + 255)/2 = 128, the value of the G-channel interpolated pixel point is (255 + 255)/2 = 255, and the value of the B-channel interpolated pixel point is (0 + 255)/2 = 128.
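The channel-wise averaging above can be sketched as follows. `midpoint` is a hypothetical helper name; it rounds half up so that (0 + 255)/2 yields 128 as in the example.

```python
def midpoint(p, q):
    """Average two RGB pixels channel-wise, rounding half up.

    This is the inserted-pixel computation from the example:
    midpoint((0, 255, 0), (255, 255, 255)) -> (128, 255, 128).
    """
    return tuple((a + b + 1) // 2 for a, b in zip(p, q))
```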
When the interpolation algorithm is executed, pixel data can be extracted at the edge of the touch track pattern and at the adjacent position in the background pattern, so that the data of the interpolated pixel can be calculated from the pixel data extracted from both the background pattern and the touch track pattern.
For example, after acquiring the touch track pattern, the color of the touch track pattern may be extracted to obtain image data (192, 0, 255, 0), that is, the user draws by hand a pure green image with 75% opacity; at the same time, the color is extracted from the background pattern to obtain image data (255, 255, 255, 255) of the background pattern, that is, the background pattern is a pure white interface. Therefore, from the extracted image data, the interpolated pixel point data may be calculated as follows: the transparency channel value remains 192 (i.e., 75% opacity), the value of the R-channel interpolated pixel point is (0 + 255)/2 = 128, the value of the G-channel interpolated pixel point is (255 + 255)/2 = 255, and the value of the B-channel interpolated pixel point is (0 + 255)/2 = 128; that is, the interpolated pixel point is (192, 128, 255, 128). In other words, when the interpolation algorithm is executed to increase the resolution, a pixel point of (192, 128, 255, 128) is inserted at the edge of the touch track pattern.
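The worked example can be sketched as a small helper that keeps the stroke's transparency channel and averages only the color channels with the background. `edge_interpolate` is a hypothetical name; ARGB pixels are (A, R, G, B) tuples, and rounding half up matches the example's figures.

```python
def edge_interpolate(stroke, background):
    """Compute the pixel inserted at a stroke edge.

    The stroke's alpha is kept unchanged and the R, G, B channels
    are averaged with the background pattern's color, as in the
    worked example above.
    """
    alpha = stroke[0]  # transparency channel is preserved
    rgb = tuple((s + b + 1) // 2 for s, b in zip(stroke[1:], background[1:]))
    return (alpha,) + rgb
```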
Superimposing the conversion pattern and the background pattern, and controlling the display to display the superposition result in real time.
After performing the interpolation operation on the touch track pattern, the controller 250 may superimpose the result of the interpolation operation with the background pattern and display the superposition result in real time on the display 260. It can be seen that, in the above embodiment, the interpolated pixel data computed during the interpolation operation is determined by both the background pattern and the touch track pattern, so when the two layer pictures are superimposed, no black edge or jaggies appear at the edge of the touch track pattern, and the display effect of the layer superposition process is improved.
In the above embodiment, when performing the interpolation operation on the touch track pattern, pattern data needs to be extracted from the first layer and the second layer. For the picture displayed in the first layer, since the pattern is generated by the user's touch input, the pattern in the first layer can be obtained directly from the user's touch operation. That is, in some embodiments, the step of obtaining the touch track pattern in the first layer further includes: receiving a touch track input by a user in real time; extracting the foreground color in response to the touch track; and presenting the touch track in the first layer according to the foreground color so as to generate the touch track pattern.
After the display device 200 starts running an application such as a presentation whiteboard application, a user may input a touch trajectory through a touch action, and after the touch trajectory is detected by the touch component 276, the touch trajectory may be sent to the controller 250, so that the controller 250 extracts a foreground color in response to the touch trajectory.
In the embodiment of the application, the foreground color is the color of the brush selected by the user in the drawing demonstration process. For example, a user may select a brush shape and set the foreground color green in a toolbar window that demonstrates a whiteboard application interface. The user may form a green touch trajectory in the whiteboard interface after a subsequent input of a sliding touch operation. Therefore, in order to obtain the pixel point data corresponding to the touch track pattern, the controller 250 may directly extract the foreground color and present the touch track in the first layer according to the foreground color to generate the touch track pattern.
While generating the touch trajectory pattern, the controller 250 may also reserve the extracted foreground color as pixel point data corresponding to the touch trajectory pattern. That is, when the interpolation algorithm is executed, the interpolation calculation can be directly performed through foreground color data and background color data extracted from the second layer.
For example, a transparent color extractor may be provided in the presentation whiteboard program of the display device 200, and the GOP layer background portion color may be set to the color of the transparent color extractor, so that the color of the drawn line and the color of the transparent color extractor are interpolated to prevent the occurrence of a boundary line.
When the brush color or interface color in the transparent layer is a single color, the transparent color picker selects that brush or interface color and sets it to fully transparent, so that the color at the boundary between the drawn line (or interface) and the transparent layer is an interpolated semitransparent brush color, and the boundary shows no obvious black or other boundary color.
However, during part of the demonstration or drawing process, the brush used by the user may not be a fixed color; that is, it may be a multi-color brush, whose output appears as a combination of colors as the touch trajectory extends. If, when the user draws with such a brush, the controller 250 still used a single foreground color as the pixel data of the touch track pattern, the extracted pixel data would not match the actual touch track pattern, affecting the result of the interpolation algorithm. To this end, in some embodiments, the step of performing the interpolation operation on the touch track pattern according to the background pattern further includes: first, extracting the boundary color and boundary position of the touch track pattern; then extracting, from the second layer, the background color in the area associated with the boundary position, solving the interpolation result from the boundary color and the background color, and performing the interpolation operation on the touch track pattern.
The controller 250 may extract the boundary of the touch track pattern by executing an image analysis program while the first layer presents the touch track pattern, obtaining the boundary color and the positions of the boundary pixel points. The image boundary can be determined by traversing all pixel points in the image and computing the color value difference between adjacent pixel points; when the difference is large, the position of the two adjacent pixel points is determined to be on the boundary of the touch track pattern. Moreover, because the first layer is displayed topmost, for superposition purposes the pixel points in the first layer that do not belong to the touch track pattern have an opacity of 0%, so the boundary of the touch track pattern can also be determined from the opacity.
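The opacity-based boundary detection described above can be sketched as follows. This is a hypothetical helper: it marks a stroke pixel as boundary when any 4-connected neighbor is fully transparent, and the choice of 4-connectivity is an assumption not stated in the embodiment.

```python
def boundary_pixels(layer):
    """Return (row, col) positions on the stroke boundary.

    `layer` is a 2-D list of (A, R, G, B) pixels. A pixel belongs
    to the boundary when it is part of the stroke (alpha > 0) and
    at least one 4-connected neighbour is fully transparent.
    """
    h, w = len(layer), len(layer[0])
    edge = []
    for r in range(h):
        for c in range(w):
            if layer[r][c][0] == 0:
                continue  # transparent background, not part of the stroke
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < h and 0 <= nc < w and layer[nr][nc][0] == 0:
                    edge.append((r, c))
                    break
    return edge
```

The returned positions form the two-dimensional array of boundary pixel points used to look up the corresponding background colors in the second layer.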
After obtaining the boundary color and the boundary position, the controller 250 may further extract a background color in the second image layer. If the background pattern in the second image layer is a solid background, the background color can be extracted from any pixel point in the background pattern; if the background pattern in the second image layer is not a solid background, searching in the second image layer according to the boundary position is needed, and the color of the pixel point of the boundary position corresponding to the background pattern position is determined as the background color.
Obviously, the boundary of the touch action track pattern is a two-dimensional array formed by a plurality of pixel points, so the background color extracted from the background pattern is also a two-dimensional array formed by a plurality of pixel points. The controller 250 then solves the interpolation result according to the boundary color and the background color, so as to convert the touch track pattern into a conversion pattern with higher resolution, so as to perform the superposition operation between multiple image layers.
For example, in the case of a multi-colored drawn line, the colors in the line are not fixed, and if a single one of them were selected by the transparent color picker, the colors at the boundary could still differ. For this case, the transparent color picker can select the color of the OSD layer. The color of the next layer down is selected because, after the resolution is increased, the layers are finally superimposed together. If the background color is a single color, the transparent color picker takes the fully transparent value of that single color; if the background is not a single color, the transparent color picker is a two-dimensional array. For the area of the GOP layer where content is to be displayed, the color array of the corresponding area in the OSD layer is obtained, and the fully transparent values of the colors in that array are taken as the colors of the transparent color picker.
When the transparent color picker is used as the background of the content to be displayed and superposition is performed, since the colors in the transparent color picker are the color values of the next layer in the boundary area to be superimposed, the interpolated border color is a semitransparent value of the color to be superimposed, and no boundary line or abnormal boundary color appears after superposition.
According to the technical scheme provided by the embodiment, the pixel points located at the boundary position can be directly determined by the boundary color and the boundary position of the touch track pattern, and the background color associated with the boundary pixel points is determined at the same time to adapt to the color change of the touch track in the first layer and the color change of the background pattern in the second layer, so that when the interpolation operation is performed on the boundary of the touch track pattern, the interpolation result adapted to the colors of the two layers can be obtained, and the image quality of the boundary area can be improved.
In practical applications, the interpolation algorithm is an operation performed when the plurality of layers have different resolutions, and when the resolutions between the layers are the same, it may not be necessary to perform an interpolation algorithm process on the touch track pattern, that is, in some embodiments, the step of performing the interpolation operation on the touch track pattern according to the background pattern further includes: detecting the resolution of the touch track pattern and the background pattern; executing different program steps according to the detection result, and if the resolution of the touch track pattern is smaller than that of the background pattern, executing the step of extracting the boundary color and the boundary position of the touch track pattern; and if the resolution of the touch track pattern is equal to that of the background pattern, performing superposition on the touch track pattern and the background pattern.
The resolutions of the touch trajectory pattern and the background pattern may be acquired by a screen resolution supported by the display 260 of the display device 200 or a resolution supported by an application currently running. After the resolutions of the touch track pattern and the background pattern are detected, the resolutions of the two image layers can be compared, and the superposition mode is determined according to the comparison result.
When the resolution of the touch track pattern is smaller than the resolution of the background pattern, that is, the resolution of the content displayed in the first layer is smaller than that of the content displayed in the second layer, the pattern with the lower resolution needs to be upscaled; that is, the interpolation algorithm is performed on the touch track pattern, so the step of extracting the boundary color and boundary position of the touch track pattern is executed.
Obviously, when the interpolation algorithm is executed, the number of pixel points to be inserted also needs to be determined according to the resolution of the background pattern. For example, if the GOP layer has a resolution of 2k and the OSD layer a resolution of 4k, the number of pixel points in the touch track pattern in the GOP layer needs to be doubled, so that the touch track pattern is also converted into a 4k pattern.
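Doubling the pixel count along one row can be sketched as follows. This is a hypothetical helper on RGB tuples; duplicating the last pixel at the row edge is an assumed convention, not one specified by the embodiment.

```python
def upscale_row(row):
    """Double a row's horizontal resolution by inserting midpoints.

    Each inserted pixel is the channel-wise average (rounding half
    up) of its two neighbours, matching the 2k-to-4k example.
    """
    out = []
    for i, px in enumerate(row):
        out.append(px)
        nxt = row[i + 1] if i + 1 < len(row) else px  # edge: repeat last pixel
        out.append(tuple((a + b + 1) // 2 for a, b in zip(px, nxt)))
    return out
```

Applying the same step to columns as well converts a 2k pattern into a 4k pattern.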
When the resolution of the touch track pattern is equal to the resolution of the background pattern, that is, the resolution of the patterns in the first layer and the second layer is the same, the touch track pattern and the background pattern may be directly superimposed without performing interpolation processing on the touch track pattern.
The display device 200 may also display different types of application program interfaces by superimposing more layers; that is, in addition to the first layer and the second layer, a third layer such as the video layer may be superimposed. For example, as shown in fig. 9, while the display device 200 displays the content of an external signal through the video layer, the program interface is displayed through the OSD layer, and the demonstration function is simultaneously performed through the GOP layer. At this time, not only the first layer but also the second layer has a transparency setting, so when the controller 250 extracts the background pattern in the second layer, a transparent area may be extracted, affecting the interpolation algorithm result and the superposition result.
Therefore, in some embodiments, when the display displays content from an external signal, the step of performing the interpolation operation on the touch track pattern according to the background pattern further includes: detecting the transparency of the background pattern; if, according to the detection result, the transparency of the background pattern is fully transparent or semitransparent, acquiring the bottom pattern in the third layer and then superimposing the background pattern and the bottom pattern, thereby presenting the superimposed background pattern in the second layer.
In order to alleviate the influence of the transparent area in the second layer on the interpolation algorithm result, before the interpolation algorithm is executed, the transparency of the background pattern may also be detected to determine whether the background pattern in the second layer is a fully transparent or semi-transparent type pattern. The specific detection process can be implemented by traversing the opacity value of each pixel in the background pattern, and if a pixel or an area with an opacity value of 0 exists in the background pattern, or the proportion of the pixel with an opacity value of 0 to the number of all pixels is greater than a set value, the transparency of the background pattern is determined to be fully transparent or semi-transparent.
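The transparency check described above can be sketched as follows. This is a hypothetical helper; the 5% threshold stands in for the "set value" mentioned in the text and is an assumption.

```python
def is_translucent(pattern, ratio_threshold=0.05):
    """Decide whether a background pattern is fully or semi-transparent.

    `pattern` is a flat list of (A, R, G, B) pixels. The pattern is
    treated as transparent when the share of alpha == 0 pixels
    exceeds `ratio_threshold`.
    """
    transparent = sum(1 for px in pattern if px[0] == 0)
    return transparent / len(pattern) > ratio_threshold
```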
When the background pattern is fully transparent or semi-transparent, it is determined that the interpolation algorithm would be influenced by the transparent pattern in the second layer, producing partial or complete boundary defects. In this case, the second layer and the third layer, the third layer being the layer located below the second layer, may be superimposed first, and the pattern in the first layer interpolated afterwards. For example, when the pattern displayed on the OSD layer is detected to be transparent or semi-transparent, the bottom pattern displayed on the video layer may be extracted and superimposed with the background pattern in the second layer. This eliminates the influence of the transparent area in the second-layer background pattern, so that no transparent color is extracted when the background color is sampled from the second layer, ensuring the display quality of the touch trajectory boundary.
Similarly, in the multi-layer case, the resolutions of the second layer and the third layer may be detected before the interpolation algorithm is performed on the touch track pattern, so that they can be adjusted to be consistent before superimposition. That is, the step of superimposing the background pattern and the bottom pattern may further include: detecting the resolutions of the background pattern and the bottom pattern; if the resolution of the background pattern is smaller than that of the bottom pattern, extracting the bottom color from the third layer and performing an interpolation algorithm on the background pattern according to the bottom color; and superimposing the interpolated background pattern with the bottom pattern.
For example, as shown in fig. 10, if detecting the resolutions of the background pattern and the bottom pattern determines that the background pattern in the OSD layer is 2K while the bottom pattern in the video layer is 4K, the bottom color can be extracted from the video layer, using the same extraction method as when the first layer is superimposed, and an interpolation algorithm performed on the background pattern according to the extracted bottom color to obtain a 4K background pattern. The interpolated high-resolution background pattern is then superimposed with the bottom pattern to obtain the pattern finally displayed in the OSD layer.
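The 2K-to-4K step can be illustrated roughly as follows. Nearest-neighbour doubling stands in for the patent's interpolation algorithm, and the RGBA-tuple representation and function name are assumptions; the point of the sketch is only that transparent OSD pixels take the underlying video colour, so the boundary never samples a "transparent colour".

```python
def upscale_and_composite(osd, video):
    """Upscale a low-resolution OSD pattern 2x (nearest neighbour) and
    composite it over the video layer, substituting the underlying
    video colour wherever the OSD pixel is fully transparent.

    Both images are 2D lists of (R, G, B, A); `video` must be exactly
    twice the size of `osd`.  A hypothetical sketch of the scheme in
    the description, not the patented algorithm itself.
    """
    out = []
    for y, vrow in enumerate(video):
        row = []
        for x, vpix in enumerate(vrow):
            opix = osd[y // 2][x // 2]
            if opix[3] == 0:          # fully transparent: take backdrop
                row.append(vpix)
            elif opix[3] == 255:      # opaque: keep OSD colour
                row.append(opix)
            else:                     # semi-transparent: alpha blend
                a = opix[3] / 255.0
                row.append(tuple(
                    round(a * o + (1 - a) * v)
                    for o, v in zip(opix[:3], vpix[:3])) + (255,))
        out.append(row)
    return out
```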
It should be noted that, because the third layer serving as the bottom layer may display picture content from an external signal, the display device 200 may not be able to obtain the bottom pattern directly. In that case, when extracting colors from the bottom pattern, a screenshot of the video layer may be captured first, and the bottom color then extracted from the screenshot image.
As can be seen, in the above embodiment, before the interpolation algorithm and the overlay display are performed on the image in the first layer, the background pattern in the second layer may be processed first, so that an effective background color can always be extracted from it. As a result, when the interpolation algorithm is performed on the first layer, the boundary of the touch trajectory pattern can be interpolated reasonably, alleviating the boundary display defect.
Based on the above embodiment, the actual layer superimposition proceeds as follows. If the OSD layer is not transparent over the full screen, the Video layer is completely covered by the OSD layer after superimposition, and the user sees only the content of the OSD layer and the GOP2 layer. If transparency exists in the OSD layer, the superimposed result combines the GOP2 layer, the OSD layer, and the Video layer.
As shown in fig. 11, when the OSD layer is transparent or semi-transparent, the controller 250 may superimpose the OSD layer and the Video layer. If their resolutions differ, the lower-resolution layer is upscaled; that is, a 2K OSD layer pattern is raised to 4K by interpolation.
If the background of the OSD layer is semi-transparent, a transparent color selector is needed during interpolation to blend the Video layer color with the corresponding semi-transparent color. If the background of the OSD layer is fully transparent, the transparent color selector selects the Video layer color directly during interpolation. After the OSD layer and the Video layer are superimposed, the GOP2 layer is superimposed on top. Since the background of the GOP2 layer is transparent, the color held by the transparent color selector can be updated to the superimposed color of the OSD and Video layers, and the GOP2 layer is raised to 4K before superimposition.
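The selector's per-pixel choice might look like this. This is an illustrative sketch: the function name and the 0-255 alpha convention are assumptions, and the blend used is the standard "over" operator standing in for the patented transparent color selector.

```python
def selector_color(osd_pix, video_pix):
    """Colour the 'transparent colour selector' would feed to the
    interpolator for one pixel: for a fully transparent OSD background
    take the Video layer colour; for a semi-transparent one take the
    alpha blend ('over' operator) of the OSD colour on the Video
    colour.  Pixels are (R, G, B, A) tuples; a sketch, not the
    patented selector.
    """
    alpha = osd_pix[3]
    if alpha == 0:                      # fully transparent background
        return video_pix[:3]
    w = alpha / 255.0                   # semi-transparent: blend
    return tuple(round(w * o + (1 - w) * v)
                 for o, v in zip(osd_pix[:3], video_pix[:3]))
```

Updating the selector after the OSD/Video merge then amounts to calling it again with the merged pixel as the new backdrop.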
In addition, the annotation function under a channel corresponds to the case where the OSD layer is a 2K transparent layer, and annotation is generally performed on a static picture in the Video layer. For superimposed display, a screenshot of the Video layer is captured first, and the transparent color selector picks colors from the two-dimensional array of the screenshot. When interpolating the transparent OSD layer, the selector color is used as the background; the selector color is then updated to the color obtained after superimposing the OSD and Video layers, and when the GOP2 layer is superimposed, the updated transparent color selector is used as the background, yielding the superimposition of the GOP2, OSD, and Video layers.
Since the first layer is usually used to display the pattern being drawn, that is, the display device 200 shows the drawing content through the first layer while the user is drawing, and since that content usually consists of geometric figures such as points and lines whose boundary area is small, the controller 250 does not need to extract colors from the whole background pattern; extracting the background color in a specific area is sufficient for the interpolation algorithm. That is, in some embodiments, the step of performing the interpolation operation on the touch trajectory pattern further includes: traversing the feature points in the touch trajectory; defining a color-sampling area in the second layer according to the positions of the feature points; and extracting pixel values and transparency values in the color-sampling area to obtain the background pattern.
As the user draws, a number of feature points are generated in the touch trajectory. The controller 250 may locate one or more feature points according to the touch trajectory; a feature point may be a single point or a line-drawing point in the trajectory, determined by the shape of the trajectory the user inputs.
After the feature points are located, the color-sampling area can be defined in the background pattern according to their positions. How the area is divided may depend on the type of feature points, the distance between them, and the specific interpolation method; for example, the color-sampling area may be a rectangular region containing a predetermined number of pixels. In any case, the color-sampling area must cover the boundary of the touch trajectory pattern.
The controller 250 extracts the color value and transparency value of each pixel in the defined color-sampling area to obtain the background pattern. That is, in this embodiment the background pattern may be only the part of the second layer inside the color-sampling area; since that area is smaller than the whole pattern displayed in the second layer, the amount of data processed during the interpolation operation is reduced.
For example, the writing process of an electronic whiteboard continuously connects line-drawing points to form a drawn line, so during writing only the area covered by the line changes: each time two line-drawing points are connected, only the region between the previous point and the current point is added. Only the rectangle connecting the two points therefore needs to be drawn and superimposed for the OSD layer, and the color area acquired by the color selector at superimposition time is a set of rectangular frames covering the drawn line. Acquiring only these local areas reduces the amount of calculation and increases the writing speed.
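The set of rectangular sampling frames can be sketched as follows. The function name and the padding margin are hypothetical; the patent only says the frames must cover the drawn line's boundary.

```python
def segment_rects(points, pad=2):
    """Rectangular colour-sampling regions covering a drawn line: one
    axis-aligned rectangle per pair of consecutive line-drawing
    points, padded by `pad` pixels so it covers the stroke boundary.
    Returns (x0, y0, x1, y1) tuples, inclusive of both corners.
    """
    rects = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        rects.append((min(x0, x1) - pad, min(y0, y1) - pad,
                      max(x0, x1) + pad, max(y0, y1) + pad))
    return rects
```

The colour selector would then sample only the second-layer pixels inside these rectangles instead of the full frame.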
Since the touch trajectory pattern in the first layer is synchronized to the second layer after each single touch action ends, which changes the background pattern in the second layer, in some embodiments of the present application the step of superimposing the conversion pattern and the background pattern further includes: detecting an end point in the touch trajectory input by the user; adding the touch trajectory pattern to the second layer if the trajectory contains an end point; and updating the background pattern in the second layer.
The controller 250 may detect the end point of the touch trajectory in real time while the user inputs it. In general, when the touch trajectory contains a breakpoint, the user has ended a single drawing action, so the controller 250 may detect the end point by detecting such a breakpoint. When an end point is detected in the trajectory, it is judged that the user has completed one drawing action, and the touch trajectory pattern is added to the second layer to update its background pattern.
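The end-of-stroke hand-off might be sketched as follows. All class and method names are hypothetical, and the "layer" is reduced to a 2D list of RGBA tuples; the sketch shows only the synchronization step, not the rendering.

```python
class TrackCompositor:
    """Minimal sketch of the end-of-stroke hand-off: while a stroke is
    live it stays in the first (GOP) layer; when a breakpoint (touch
    up) is detected the track is stamped into the second (OSD) layer
    and the cached background pattern is refreshed.
    """
    def __init__(self, background):
        self.background = background        # second-layer RGBA pixels
        self.live_track = []                # [(x, y, rgba), ...]

    def on_touch(self, x, y, rgba=(0, 0, 0, 255)):
        # Point arrives while the stroke is still in the first layer.
        self.live_track.append((x, y, rgba))

    def on_touch_end(self):
        # Breakpoint detected: sync the track into the second layer.
        for x, y, rgba in self.live_track:
            self.background[y][x] = rgba
        self.live_track = []                # first layer is cleared
        return self.background              # refreshed background
```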
For example, the writing process of an electronic drawing board is presented at the GOP2 level while the written line is displayed in the OSD layer; during interpolation of the GOP2 level, the color selector takes the OSD layer color as the background. Because the OSD layer interface keeps changing as lines are added, the color selector must be refreshed after each line is finished, ensuring that the transparent background color used for the next line matches the current display. Similarly, when erasing on the writing board, the eraser path is displayed at the GOP2 level, and after erasing finishes the OSD layer is refreshed again, so that the color selector acquires the latest OSD layer background at the next writing or erasing action.
For another example, the erasing process of an electronic whiteboard continuously replaces background elements while drawing the eraser: the background along the eraser's path and the eraser itself are displayed at the GOP2 level, and since the GOP2 background is transparent, a black edge line appears at the boundary between background and transparency when interpolating from 2K to 4K. Besides the method of selecting the full color array of the lower layer with the transparent color picker described above, the picker may instead sample only a partial area: according to the eraser's position, a range 16 pixels larger than the bounding frame of the eraser area is selected, and the colors of the corresponding region in the lower layer form the array range of the transparent color picker. This reduces the amount of calculation and increases the erasing speed.
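The 16-pixel margin selection can be sketched like this; the margin value comes from the description, while the 4K clamp bounds and the function name are assumptions for illustration.

```python
def eraser_region(x0, y0, x1, y1, margin=16, width=3840, height=2160):
    """Sampling region for the transparent colour picker during
    erasing: the eraser's bounding box (x0, y0)-(x1, y1) grown by a
    16-pixel margin, clamped to the layer bounds (4K assumed here).
    Returns inclusive (left, top, right, bottom) coordinates.
    """
    return (max(0, x0 - margin), max(0, y0 - margin),
            min(width - 1, x1 + margin), min(height - 1, y1 + margin))
```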
Further, the application interface displayed in the second layer may include some content pinned to the top. For example, during a drawing operation, a toolbar may need to be displayed above the touch trajectory after a drawing action ends, so that the user can click it during subsequent drawing. To achieve this effect, as shown in fig. 12, in some embodiments the step of updating the background pattern in the second layer further includes: traversing the touch trajectory to extract the pattern content covered by it from the background pattern; and, if that content is top-displayed content, displaying it above the touch trajectory.
For example, the toolbar position is a special case: during writing it must be judged whether a drawn line enters the toolbar area, and if so, the toolbar must be refreshed each time a line is drawn, guaranteeing that both the toolbar and the drawn line are refreshed in real time.
As can be seen from the foregoing, the multi-layer superimposition method provided in the above embodiments may be configured in the controller of a display device to display the touch trajectory pattern in real time, by multi-layer superimposition, while the user inputs a touch trajectory. By performing the interpolation operation on the touch trajectory pattern according to the background pattern, the method alleviates the influence of the first layer's transparency on the result of the edge interpolation algorithm, reduces jagged or black edges when the touch trajectory pattern is superimposed, and improves the real-time display effect.
Based on the multi-layer superimposition method provided in the foregoing embodiments, some embodiments of the present application further provide a display device 200 including a display 260, a touch component 276, and a controller 250. The display 260 is configured to display a user interface, the touch component 276 is configured to detect a touch trajectory input by a user, and the controller 250 is configured to perform the following program steps:
acquiring a touch track pattern in a first layer and acquiring a background pattern in a second layer, wherein the second layer is a layer positioned below the first layer;
according to the background pattern, performing interpolation operation on the touch track pattern to generate a conversion pattern, wherein the resolution of the conversion pattern is equal to that of the background pattern;
and superposing the conversion pattern and the background pattern to control the display to display a superposition result in real time.
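Taken together, the three steps might be sketched as follows. Nearest-neighbour upscaling and a simple binary "over" composite stand in for the patented interpolation and superimposition, and all names and the RGBA-tuple representation are illustrative.

```python
def overlay_pipeline(track, background):
    """End-to-end sketch of the three controller steps: upscale the
    first-layer touch-track pattern to the background resolution
    (nearest neighbour, standing in for the boundary-aware
    interpolation), then composite it over the second-layer
    background.  Images are 2D lists of RGBA tuples; the background
    dimensions must be an integer multiple of the track's.
    """
    sy = len(background) // len(track)
    sx = len(background[0]) // len(track[0])
    out = []
    for y, brow in enumerate(background):
        row = []
        for x, bpix in enumerate(brow):
            tpix = track[y // sy][x // sx]
            row.append(tpix if tpix[3] != 0 else bpix)  # simple 'over'
        out.append(row)
    return out
```

In the device, the result of this pipeline is what the display is driven with on every refresh while the user draws.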
As can be seen, the display device 200 provided in the above embodiment may, after obtaining the touch trajectory pattern, perform the interpolation operation on the pattern in the first layer according to the background pattern in the second layer so as to raise its resolution, and finally superimpose the interpolated conversion pattern on the background pattern and display the result in real time through the display. By interpolating the touch trajectory pattern according to the background pattern, the display device alleviates the influence of the first layer's transparency on the edge interpolation result, reduces jagged or black edges during superimposition, and improves the real-time display effect.
The display device 200 may present the superimposition effect in real time to improve display quality. To present the final display effect, some embodiments of the present application further provide a multi-layer superimposition method applicable to the display device 200, including the following steps:
first, the display 260 is controlled to display a drawing interface including a first layer and a second layer located below the first layer. Next, in response to the user's touch action, the controller 250 may display the conversion pattern in the first layer and the background pattern in the second layer, where the conversion pattern is generated by performing the interpolation operation on the touch trajectory pattern and has a resolution equal to that of the background pattern. Finally, the controller 250 may control the display 260 to display, in real time, the superimposed pattern obtained by superimposing the conversion pattern with the background pattern.
In this way, the multi-layer superimposition method controls the display to show a drawing interface, presents the conversion pattern and the background pattern in its first and second layers respectively, and superimposes and displays the two layers in real time while the user performs a presentation action. This alleviates the influence of the first layer's transparency on the result of the edge interpolation algorithm, reduces jagged or black edges when the touch trajectory pattern is superimposed, and improves the real-time display effect.
The embodiments provided in the present application are only a few examples of the general concept of the present application, and do not limit the scope of the present application. Any other embodiments extended according to the solution of the present application without inventive efforts will be within the scope of protection of the present application for a person skilled in the art.

Claims (9)

1. A display device, comprising:
a display;
a touch component configured to detect a touch trajectory input by a user;
a controller configured to:
acquiring a touch track pattern in a first layer and acquiring a background pattern in a second layer, wherein the second layer is a layer positioned below the first layer;
extracting boundary colors and boundary positions of the touch track pattern;
in the second layer, extracting a background color in the area associated with the boundary position;
solving an interpolation result according to the boundary color and the background color, and performing interpolation operation on the touch track pattern to generate a conversion pattern, wherein the resolution of the conversion pattern is equal to that of the background pattern;
And superposing the conversion pattern and the background pattern to control the display to display a superposition result in real time.
2. The display device according to claim 1, wherein in the step of obtaining the touch track pattern in the first layer, the controller is further configured to:
receiving a touch track input by a user in real time;
extracting a foreground color in response to the touch trajectory;
and presenting the touch track in the first layer according to the foreground color so as to generate the touch track pattern.
3. The display device according to claim 1, wherein in the step of performing the interpolation operation on the touch trajectory pattern according to the background pattern, the controller is further configured to:
detecting the resolution of the touch track pattern and the background pattern;
if the resolution of the touch track pattern is smaller than that of the background pattern, executing a step of extracting a boundary color and a boundary position of the touch track pattern;
and if the resolution of the touch track pattern is equal to that of the background pattern, performing superposition on the touch track pattern and the background pattern.
4. The display device according to claim 1, wherein in the step of performing the interpolation operation on the touch trajectory pattern according to the background pattern, the controller is further configured to:
detecting the transparency of the background pattern;
if the transparency of the background pattern is fully transparent or semitransparent, obtaining a bottom pattern in a third layer, wherein the third layer is a layer positioned below the second layer;
performing superposition on the background pattern and the bottom layer pattern;
presenting the superimposed background pattern in the second layer.
5. The display device according to claim 4, wherein in the step of performing the superimposition of the background pattern and the underlying pattern, the controller is further configured to:
detecting the resolution of the background pattern and the underlying pattern;
if the resolution of the background pattern is smaller than that of the bottom pattern, extracting bottom color in the third layer;
performing an interpolation algorithm on the background pattern according to the underlying color;
and performing superposition on the background pattern and the bottom layer pattern after the interpolation algorithm processing.
6. The display device according to claim 1, wherein in the step of performing the interpolation operation on the touch trajectory pattern, the controller is further configured to:
traversing feature points in the touch track;
defining a color-taking area in the second layer according to the positions of the feature points;
and extracting pixel values and transparency values in the color-taking area to obtain a background pattern.
7. The display device according to claim 1, wherein in the step of superimposing the conversion pattern with the background pattern, the controller is further configured to:
detecting an end point in a touch track input by a user;
adding the touch track pattern to the second layer if the touch track contains the end point;
and updating the background pattern in the second layer.
8. The multi-layer superposition method is applied to a display device, wherein the display device comprises a display, a touch control assembly and a controller, the touch control assembly is configured to detect a touch control track input by a user, and the multi-layer superposition method comprises the following steps:
acquiring a touch track pattern in a first layer and acquiring a background pattern in a second layer, wherein the second layer is a layer positioned below the first layer;
extracting boundary colors and boundary positions of the touch track pattern;
in the second layer, extracting a background color in the area associated with the boundary position;
solving an interpolation result according to the boundary color and the background color, and performing interpolation operation on the touch track pattern to generate a conversion pattern, wherein the resolution of the conversion pattern is equal to that of the background pattern;
and superposing the conversion pattern and the background pattern to control the display to display a superposition result in real time.
9. The multi-layer superposition method is applied to a display device, wherein the display device comprises a display, a touch control assembly and a controller, the touch control assembly is configured to detect a touch control track input by a user, and the multi-layer superposition method comprises the following steps:
controlling the display to display a drawing interface, wherein the drawing interface comprises a first layer and a second layer, and the second layer is a layer positioned below the first layer;
in response to a touch action of a user, extracting boundary colors and boundary positions of the touch track pattern;
displaying a conversion pattern on the first layer and displaying a background pattern on the second layer, wherein the conversion pattern is generated by extracting a background color in the boundary position association area in the second layer and solving an interpolation result according to the boundary color and the background color, and the resolution of the conversion pattern is equal to that of the background pattern;
And controlling the display to display a superimposed pattern in real time, wherein the superimposed pattern is obtained by superimposing a conversion pattern and the background pattern.





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant