CN114089852A - Display equipment, electronic whiteboard device and whiteboard erasing method - Google Patents

Display equipment, electronic whiteboard device and whiteboard erasing method

Info

Publication number
CN114089852A
Authority
CN
China
Prior art keywords
erasing
layer
whiteboard
area
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010680573.9A
Other languages
Chinese (zh)
Inventor
李保成 (Li Baocheng)
王敏 (Wang Min)
吴汉勇 (Wu Hanyong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202010680573.9A priority Critical patent/CN114089852A/en
Publication of CN114089852A publication Critical patent/CN114089852A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present application relates to the field of image processing technologies, and in particular to a display device, an electronic whiteboard device, and a whiteboard erasing method, which can alleviate, to a certain extent, the problems of whiteboard erasing stutter and of the erasing effect lagging behind the user's motion. The display device includes: a display configured to display a user interface and receive an erase path; a first controller configured to: record the erase path into a first layer, the first layer being set to transparent; determine an erase region in the first layer based on the initial position and final position of the eraser; fill the erase region with an opaque background picture; and superimpose the first layer on the other layers for display on the display.

Description

Display equipment, electronic whiteboard device and whiteboard erasing method
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a display device, an electronic whiteboard device, and a whiteboard erasing method.
Background
With the support of an application program, the electronic whiteboard enables writing on the display device by touching the touch-control all-in-one machine with a pen, a hand, or another touch object. Electronic whiteboards typically use two superimposed layers: a drawn-line layer displayed on top and a background layer displayed underneath. Whiteboard erasing refers to removing written content with a board eraser, a hand, or another touch object.
In some whiteboard erasing implementations, the Canvas and drawBitmap services of the Android system are generally adopted: the area to be erased is calculated through the standard drawing mechanism, the erasing track and the intersection points between the erasing track (and erase area) and the existing lines are obtained, the lines to be erased are split into several line segments, and the erasing effect is achieved by adjusting the color and transparency attributes of the line pixels.
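As a rough illustration only, the following Java sketch shows the kind of per-stroke splitting this conventional approach implies; the class and method names are hypothetical and not taken from the patent:

    // Hypothetical sketch of the conventional intersection-based erase: a stroke
    // is split at the points where it enters the eraser rectangle, and the
    // segments inside the rectangle are discarded before the layer is redrawn.
    import android.graphics.RectF;
    import java.util.ArrayList;
    import java.util.List;

    final class Stroke {
        final List<float[]> points = new ArrayList<>(); // {x, y} touch samples
    }

    final class IntersectionEraser {
        /** Splits 'stroke' against 'eraser'; keeps only the parts outside it. */
        static List<Stroke> erase(Stroke stroke, RectF eraser) {
            List<Stroke> kept = new ArrayList<>();
            Stroke current = new Stroke();
            for (float[] p : stroke.points) {
                if (eraser.contains(p[0], p[1])) {
                    // Sample falls inside the erase area: close the current segment.
                    if (!current.points.isEmpty()) {
                        kept.add(current);
                        current = new Stroke();
                    }
                } else {
                    current.points.add(p);
                }
            }
            if (!current.points.isEmpty()) kept.add(current);
            return kept;
        }
    }

Every stroke must be re-tested and re-split on every eraser movement, which is the per-frame cost described next.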
However, while the user erases on the electronic whiteboard, the whiteboard must calculate the intersection points of the drawn lines inside the erase area in real time. When the drawn lines on the whiteboard are complex, the electronic whiteboard spends a lot of time on calculation, rendering, and layer merging; whiteboard erasing may stutter, and the display of the erasing effect lags behind the user's erasing action.
Disclosure of Invention
To solve the problems of unsmooth whiteboard erasing and of the erasing effect being out of sync with the user's action, the present application provides a display device, an electronic whiteboard device, and a whiteboard erasing method.
The embodiment of the application is realized as follows:
A first aspect of an embodiment of the present application provides a display device, including: a display configured to display a user interface and receive an erase path; and a first controller configured to: record the erase path into a first layer, the first layer being set to transparent; determine an erase region in the first layer based on the initial position and final position of the eraser; fill the erase region with an opaque background picture; and superimpose the first layer on the other layers for display on the display.
A second aspect of the embodiments of the present application provides a whiteboard erasing method, including: recording an erase path input by a user into a first layer, the first layer being set to transparent; determining an erase region in the first layer based on the initial position and final position of the eraser; filling the erase region with an opaque background picture; and superimposing the first layer on the other layers for display.
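A minimal Java sketch of the first three steps follows, assuming Android's Bitmap/Canvas API; all class and method names are illustrative, not prescribed by the patent. Layer composition is sketched further below, at the discussion of fig. 6C.

    // Hedged sketch of the claimed erase flow: a transparent first layer records
    // the erase path, and the erase region is filled with the opaque background color.
    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.graphics.Paint;
    import android.graphics.Rect;

    final class WhiteboardEraser {
        private final Bitmap firstLayer;   // dedicated layer for the erase path
        private final Canvas firstCanvas;
        private final Paint backgroundPaint = new Paint();

        WhiteboardEraser(int width, int height, int backgroundColor) {
            // Step 1: the first layer starts fully transparent.
            firstLayer = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
            firstLayer.eraseColor(Color.TRANSPARENT);
            firstCanvas = new Canvas(firstLayer);
            backgroundPaint.setColor(backgroundColor); // opaque background fill
        }

        /** Steps 2-3: fill the computed erase region with the opaque background color. */
        void applyErase(Rect eraseRegion) {
            firstCanvas.drawRect(eraseRegion, backgroundPaint);
        }

        Bitmap layer() { return firstLayer; } // composited on top of the other layers
    }

No intersection with existing strokes is computed at any point: the opaque patches simply cover them.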
A third aspect of the embodiments of the present application provides an electronic whiteboard device, including a memory, a processor, and a computer program stored on the memory, where the processor executes the computer program to perform the whiteboard erasing method according to the second aspect of the present disclosure.
Beneficial effects of the present application: by constructing the first layer, a dedicated layer is established for the erase path; further, by filling the erase region with the background picture, line erasing is accelerated; further, by displaying the first layer as an overlay, the whiteboard erase calculation is simplified; further, by determining the initial position of the eraser, misoperation on the screen outside the whiteboard application window is avoided; further, by suspending the native system's refresh mechanism during erasing, interference from the native system's UI with the whiteboard erase operation is avoided; and further, by locally refreshing the erase region, the erasing speed is improved, stutter during whiteboard erasing is reduced, and the problem of the erasing effect being out of sync with the user's erasing action is solved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment;
fig. 2 is a block diagram exemplarily showing a hardware configuration of a display device 200 according to an embodiment;
fig. 3 is a block diagram exemplarily showing a hardware configuration of the control apparatus 100 according to the embodiment;
fig. 4 is a diagram exemplarily showing a functional configuration of the display device 200 according to the embodiment;
fig. 5a schematically shows a software configuration in the display device 200 according to an embodiment;
fig. 5b schematically shows a configuration of an application in the display device 200 according to an embodiment;
FIG. 6A is a schematic diagram of a UI interface for erasing a drawn line in a television whiteboard application according to an embodiment of the present application;
FIG. 6B is a schematic diagram of a UI interface for erasing a drawn line in a television whiteboard application according to another embodiment of the present application;
FIG. 6C is a schematic diagram of a UI interface for erasing a drawn line in a television whiteboard application according to another embodiment of the present application;
FIG. 6D is a schematic diagram of a UI interface for erasing a drawn line in a television whiteboard application according to another embodiment of the present application;
FIG. 6E is a schematic diagram of a UI interface for erasing a drawn line in a television whiteboard application according to another embodiment of the present application;
FIG. 6F is a schematic diagram of a UI interface for erasing a drawn line in a television whiteboard application according to another embodiment of the present application;
FIG. 6G is a schematic diagram of a UI interface for erasing a drawn line in a television whiteboard application according to another embodiment of the present application;
fig. 7 is a logic diagram illustrating erasing of a drawn line in a tv whiteboard application according to an embodiment of the present application;
fig. 8 shows a flowchart of a whiteboard erasing method according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
All other embodiments obtained by a person skilled in the art from the exemplary embodiments shown in the present application without inventive effort shall fall within the scope of protection of the present application. Moreover, while the disclosure herein has been presented in terms of one or more exemplary examples, it is to be understood that each aspect of the disclosure can also be utilized independently and separately from the other aspects to form a complete solution.
It should be understood that the terms "first," "second," "third," and the like in the description, in the claims, and in the drawings of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be implemented, for example, in sequences other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module" as used herein refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Reference throughout this specification to "embodiments," "some embodiments," "one embodiment," or "an embodiment," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases "in various embodiments," "in some embodiments," "in at least one other embodiment," or "in an embodiment" or the like throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Thus, the particular features, structures, or characteristics shown or described in connection with one embodiment may be combined, in whole or in part, with the features, structures, or characteristics of one or more other embodiments, without limitation. Such modifications and variations are intended to be included within the scope of the present application.
The term "remote control" as used in this application refers to a component of an electronic device, such as the display device disclosed in this application, that is typically wirelessly controllable over a short range of distances. Typically using infrared and/or Radio Frequency (RF) signals and/or bluetooth to connect with the electronic device, and may also include WiFi, wireless USB, bluetooth, motion sensor, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in the common remote control device with the user interface in the touch screen.
The term "gesture" as used in this application refers to a user's behavior through a change in hand shape or an action such as hand motion to convey a desired idea, action, purpose, or result.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display device 200 through the mobile terminal 300 and the control apparatus 100.
The control apparatus 100 may be a remote controller that controls the display device 200 wirelessly or by other wired means, including infrared protocol communication, Bluetooth protocol communication, and other short-distance communication. The user may input user commands through keys on the remote controller, voice input, control panel input, etc. to control the display device 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power key, etc. on the remote controller to control the display device 200.
In some embodiments, mobile terminals, tablets, computers, laptops, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device. The application, through configuration, may provide the user with various controls in an intuitive User Interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 300 and the display device 200 may each install a software application, so that connection and communication are implemented through a network communication protocol, achieving one-to-one control operation and data communication. For example, the mobile terminal 300 and the display device 200 can establish a control instruction protocol, the remote-control keyboard can be synchronized to the mobile terminal 300, and the display device 200 can be controlled through the user interface on the mobile terminal 300. Audio and video content displayed on the mobile terminal 300 can also be transmitted to the display device 200 to implement a synchronous display function.
As also shown in fig. 1, the display device 200 also performs data communication with the server 400 through various communication means. The display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. Illustratively, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and through electronic program guide (EPG) interactions. The server 400 may be one group or multiple groups of servers, and may be one or more types of servers. The server 400 provides other web service contents such as video on demand and advertisement services.
The display device 200 may be a liquid crystal display, an OLED display, or a projection display device. The specific display device type, size, resolution, etc. are not limited; those skilled in the art will appreciate that the display device 200 may be varied in performance and configuration as desired.
The display device 200 may additionally provide an intelligent network TV function that provides a computer-support function in addition to the broadcast-receiving TV function, for example a network TV, a smart TV, an Internet Protocol TV (IPTV), and the like.
A hardware configuration block diagram of a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 2. As shown in fig. 2, the display device 200 includes a controller 210, a tuning demodulator 220, a communication interface 230, a detector 240, an input/output interface 250, a video processor 260-1, an audio processor 260-2, a display 280, an audio output 270, a memory 290, a power supply, and an infrared receiver.
The display 280 receives the image signal from the video processor 260-1 and displays video content, images, and components of the menu manipulation interface. The display 280 includes a display screen assembly for presenting a picture, and a driving assembly for driving the display of images. The displayed video content may come from broadcast television content, or from various broadcast signals received via wired or wireless communication protocols. Alternatively, various image contents sent from a network server via a network communication protocol can be displayed.
Meanwhile, the display 280 also displays a user manipulation UI interface generated in the display device 200 and used to control the display device 200.
The display 280 also includes a driving component for driving the display, depending on the type of the display 280. Alternatively, if the display 280 is a projection display, it may also include a projection device and a projection screen.
The communication interface 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communication interface 230 may be a Wifi chip 231, a bluetooth communication protocol chip 232, a wired ethernet communication protocol chip 233, or other network communication protocol chips or near field communication protocol chips, and an infrared receiver (not shown).
The display device 200 may send and receive control signals and data signals to and from an external control apparatus or content providing apparatus through the communication interface 230. The infrared receiver is an interface device for receiving infrared control signals from the control apparatus 100 (e.g., an infrared remote controller).
The detector 240 is a component used by the display device 200 to collect signals from the external environment or from interaction with the outside. The detector 240 includes a light receiver 242, a sensor for collecting ambient light intensity, so that display parameters can be changed adaptively according to the collected ambient light.
The image collector 241, such as a camera, may be used to collect the external environment scene, to collect user attributes or interaction gestures, to adaptively change display parameters, and to recognize user gestures so as to implement interaction with the user.
In some other exemplary embodiments, the detector 240 may also include a temperature sensor; for example, by sensing the ambient temperature, the display device 200 may adaptively adjust the display color temperature of the image. For example, the display device 200 may be adjusted to display a cooler color tone in a high-temperature environment, or a warmer color tone in a low-temperature environment.
In other exemplary embodiments, the detector 240 may also include a sound collector, such as a microphone, which may be used to receive the user's voice, including voice signals carrying the user's control instructions for the display device 200, or to collect ambient sounds for identifying the ambient scene type, so that the display device 200 can adapt to ambient noise.
Under the control of the controller 210, the input/output interface 250 handles data transmission between the display device 200 and other external devices, such as receiving video and audio signals or command instructions from an external device.
Input/output interface 250 may include, but is not limited to, the following: any one or more of high definition multimedia interface HDMI interface 251, analog or data high definition component input interface 253, composite video input interface 252, USB input interface 254, RGB ports (not shown in the figures), etc.
In some other exemplary embodiments, the input/output interface 250 may also form a composite input/output interface with the above-mentioned plurality of interfaces.
The tuning demodulator 220 receives the broadcast television signals in a wired or wireless receiving manner, may perform modulation and demodulation processing such as amplification, frequency mixing, resonance, and the like, and demodulates the television audio and video signals carried in the television channel frequency selected by the user and the EPG data signals from a plurality of wireless or wired broadcast television signals.
The tuning demodulator 220 responds to the television signal frequency selected by the user and the television signal carried by that frequency, under user selection and under the control of the controller 210.
The tuner-demodulator 220 may receive signals in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcast, cable broadcast, satellite broadcast, or internet broadcast signals, etc.; and according to different modulation types, the modulation mode can be digital modulation or analog modulation. Depending on the type of television signal received, both analog and digital signals are possible.
In other exemplary embodiments, the tuner/demodulator 220 may be in an external device, such as an external set-top box. In this way, the set-top box outputs television audio/video signals after modulation and demodulation, and the television audio/video signals are input into the display device 200 through the input/output interface 250.
The video processor 260-1 is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
Illustratively, the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio/video data stream; for example, for an input MPEG-2 stream, the demultiplexing module demultiplexes it into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
The image synthesis module is used for superimposing and mixing the GUI signal, generated by the graphics generator according to user input, with the scaled video image, to generate an image signal for display.
The frame rate conversion module is configured to convert the input video frame rate, for example from 60 Hz to 120 Hz or 240 Hz, typically by means of frame interpolation.
The display formatting module is used to convert the frame-rate-converted video output signal into a signal conforming to the display format, such as an RGB data signal.
The audio processor 260-2 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, amplification processing, and the like to obtain an audio signal that can be played in the speaker.
In other exemplary embodiments, video processor 260-1 may comprise one or more chips. The audio processor 260-2 may also comprise one or more chips.
And, in other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or may be integrated together with the controller 210 in one or more chips.
The audio output 270 receives the sound signal output by the audio processor 260-2 under the control of the controller 210. Besides the speaker 272 carried by the display device 200 itself, it may include an external sound output terminal 274 that outputs to a sound-generating device of an external device, such as an external sound interface or an earphone interface.
The power supply provides power supply support for the display device 200 from the power input from the external power source under the control of the controller 210. The power supply may include a built-in power supply circuit installed inside the display device 200, or may be a power supply interface installed outside the display device 200 to provide an external power supply in the display device 200.
A user input interface for receiving an input signal of a user and then transmitting the received user input signal to the controller 210. The user input signal may be a remote controller signal received through an infrared receiver, and various user control signals may be received through the network communication module.
For example, when the user inputs a user command through the remote controller 100 or the mobile terminal 300, the user input interface passes the input to the controller 210, and the display device 200 responds to the user input.
In some embodiments, a user may enter a user command on a Graphical User Interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
The controller 210 controls the operation of the display apparatus 200 and responds to the user's operation through various software control programs stored in the memory 290.
As shown in fig. 2, the controller 210 includes a RAM 213 and a ROM 214, a graphics processor 216, a CPU processor 212, a communication interface 218 (such as a first interface 218-1 through an nth interface 218-n), and a communication bus. The RAM 213, the ROM 214, the graphics processor 216, the CPU processor 212, and the communication interface 218 are connected via the bus.
The ROM 214 stores instructions for various system boots. When the display device 200 is powered on upon receiving a power-on signal, the CPU processor 212 executes the system boot instructions in the ROM 214, copies the operating system stored in the memory 290 into the RAM 213, and starts running the boot operating system. After the operating system has started, the CPU processor 212 copies the various application programs in the memory 290 into the RAM 213 and then starts running the various applications.
The graphics processor 216 generates various graphics objects, such as icons, operation menus, and graphics displayed in response to user input commands. It includes an arithmetic unit, which performs operations by receiving the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit and displays the rendered result on the display 280.
The CPU processor 212 executes operating system and application program instructions stored in the memory 290, and executes various applications, data, and content according to the various interactive instructions received from outside, so as to finally display and play various audio and video content.
In some exemplary embodiments, the CPU processor 212 may include a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor performs some operations of the display device 200 in the pre-power-up mode and/or operations of displaying a picture in the normal mode. The one or more sub-processors handle operations in standby mode and the like.
The controller 210 may control the overall operation of the display device 200. For example: in response to receiving a user command for selecting a UI object displayed on the display 280, the controller 210 may perform the operation related to the object selected by the user command.
The object may be any selectable object, such as a hyperlink or an icon. Operations related to the selected object include, for example, displaying the page, document, or image connected to a hyperlink, or launching the program corresponding to the icon. The user command for selecting the UI object may be a command input through various input means connected to the display device 200 (e.g., a mouse, a keyboard, a touch pad, etc.) or a voice command corresponding to a voice spoken by the user.
The memory 290 stores various software modules for driving the display device 200, including: a basic module, a detection module, a communication module, a display control module, a browser module, various service modules, and the like.
The basic module is a bottom-layer software module for signal communication between the various hardware components of the display device 200 and for sending processing and control signals to the upper-layer modules. The detection module is used for collecting various information from the various sensors or the user input interface, and for performing digital-to-analog conversion and analysis management.
For example: the voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module is used for controlling the display 280 to display image content, and may be used to play information such as multimedia image content and UI interfaces. The communication module is used for control and data communication with external devices. The browser module is used for data communication with browsing servers. The service module is used for providing various services, and includes the various application programs.
Meanwhile, the memory 290 is also used to store received external data and user data, images of the items in the various user interfaces, visual effect maps, focus objects, and the like.
A block diagram of the configuration of the control apparatus 100 according to an exemplary embodiment is exemplarily shown in fig. 3. As shown in fig. 3, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control apparatus 100 is configured to control the display device 200: it receives the user's input operation instructions and converts them into instructions that the display device 200 can recognize and respond to, serving as an intermediary for interaction between the user and the display device 200. For example, when the user operates the channel up/down keys on the control apparatus 100, the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications that control the display apparatus 200 according to user demands.
In some embodiments, as shown in fig. 1, a mobile terminal 300 or other intelligent electronic device may serve a function similar to that of the control apparatus 100 after installing an application that manipulates the display device 200. For example, by installing the application, the user can use the various function keys or virtual buttons of the graphical user interface available on the mobile terminal 300 or other intelligent electronic device to implement the functions of the physical keys of the control apparatus 100.
The controller 110 includes a processor 112, a RAM 113 and a ROM 114, a communication interface, and a communication bus. The controller 110 is used to control the operation of the control apparatus 100, communication and coordination among its internal components, and external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communication interface 130 may include at least one of a WiFi chip, a bluetooth module, an NFC module, and other near field communication modules.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, and other input interfaces. Such as: the user can realize a user instruction input function through actions such as voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display device 200. In some embodiments, the interface may be an infrared interface or a radio frequency interface. For example: when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and sent to the display device 200 through the infrared sending module. As another example: when the radio frequency signal interface is used, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 through the radio frequency sending terminal.
In some embodiments, the control apparatus 100 includes at least one of a communication interface 130 and an output interface. When the control apparatus 100 is provided with a communication interface 130, such as a WiFi, Bluetooth, or NFC module, the user input command can be encoded according to the WiFi protocol, the Bluetooth protocol, or the NFC protocol and sent to the display device 200.
The memory 190 is used for storing the various operation programs, data, and applications that drive and control the control apparatus 100, under the control of the controller 110. The memory 190 may store various control signal commands input by the user.
The power supply 180 provides operational power support for the elements of the control apparatus 100 under the control of the controller 110, and may include a battery and associated control circuitry.
Fig. 4 is a diagram schematically illustrating a functional configuration of the display device 200 according to an exemplary embodiment. As shown in fig. 4, the memory 290 is used to store an operating system, an application program, contents, user data, and the like, and performs system operations for driving the display device 200 and various operations in response to a user under the control of the controller 210. The memory 290 may include volatile and/or nonvolatile memory.
The memory 290 is specifically configured to store an operating program for driving the controller 210 in the display device 200, and to store various application programs installed in the display device 200, various application programs downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used to store system software such as an OS kernel, middleware, and applications, and to store input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers and related data such as the audio/video processors 260-1 and 260-2, the display 280, the communication interface 230, the tuning demodulator 220, the input/output interface of the detector 240, and the like.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290, for example, includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, and other applications 2912, a browser module, and the like. The controller 210 performs functions such as: a broadcast television signal reception demodulation function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal reception function, an electric power control function, a software control platform supporting various functions, a browser function, and the like.
A block diagram of a configuration of a software system in a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 5 a.
As shown in fig. 5a, the operating system 2911 includes operating software for handling various basic system services and for performing hardware-related tasks, and acts as an intermediary for data processing between application programs and hardware components. In some embodiments, parts of the operating system kernel may contain a series of software to manage the display device hardware resources and provide services to other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display screen, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application program to achieve accessibility and operability of the application program for displaying content.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
The user interface module 2911-3 is configured to provide an object for displaying a user interface, so that each application program can access the object, and user operability can be achieved.
Control applications 2911-4 for controllable process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911 or within the application program 2912. In some embodiments, it is implemented partly within the operating system 2911 and partly within the application program 2912. It is configured to listen for various user input events and, according to the recognition result of various types of events or sub-events, to invoke handlers that perform one or more sets of predefined operations.
The event monitoring module 2914-1 is configured to monitor an event or a sub-event input by the user input interface.
The event identification module 2914-2 is configured to hold the definitions of the various types of events for the various user input interfaces, recognize the various events or sub-events, and transmit them to the process that executes the one or more corresponding sets of handlers.
An event or sub-event refers to an input detected by one or more sensors in the display device 200, or an input from an external control device (e.g., the control apparatus 100). Examples include: various sub-events input by voice, gesture sub-events input through gesture recognition, and sub-events input through remote-control key commands of the control device. Illustratively, one or more sub-events from the remote control take various forms, including but not limited to one or a combination of pressing the up/down/left/right keys and the OK key, key presses, and the like, as well as non-physical key operations such as move, hold, and release.
The interface layout manager 2913 directly or indirectly receives the user input events or sub-events monitored by the event transmission system 2914 and updates the layout of the user interface, including but not limited to the position of each control or sub-control in the interface and the size, position, and level of containers, as well as other operations related to the interface layout.
As shown in fig. 5b, the application layer 2912 contains various applications that may also be executed at the display device 200. The application may include, but is not limited to, one or more applications such as: live television applications, video-on-demand applications, media center applications, application centers, gaming applications, and the like.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
A video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides a video display from some storage source. For example, the video on demand may come from a server side of the cloud storage, from a local hard disk storage containing stored video programs.
The media center application can provide various multimedia content playing applications. For example, the media center may provide services other than live television or video on demand, letting the user access various images or audio through the media center application.
The application program center can provide and store various application programs. The application may be a game, an application, or some other application associated with a computer system or other device that may be run on the smart television. The application center may obtain these applications from different sources, store them in local storage, and then be operable on the display device 200.
The embodiment of the application can be applied to various types of display devices (including but not limited to smart televisions, tablet computers and the like). The technical solution will be explained below in relation to a relevant UI interface for performing a draw line erase in a tv whiteboard application.
Fig. 6A to 6G are schematic diagrams illustrating UI interfaces for erasing a drawn line in an electronic whiteboard application of the display device according to the embodiment of the present application.
Fig. 6A shows a schematic diagram of a UI interface for erasing a drawn line in a tv whiteboard application according to an embodiment of the present application.
A display device provided herein includes a display configured to display a user interface and receive an erase path. As shown, the main window in the display device UI is the electronic whiteboard UI, and the track 601 is a drawn line that the user has written on the electronic whiteboard. In some embodiments, electronic whiteboards generally use two superimposed layers: the drawn-line layer displayed on top and the background layer displayed underneath. During writing, the writing area formed by the drawn lines is displayed as opaque, in the drawn-line color. After a drawn line is written, it enters the writing queue, and the drawn lines in the queue are calculated to update the drawn-line layer, which implements the line-writing function of the electronic whiteboard.
In some embodiments, taking the drawn line 601 as an example: on a touch-screen terminal, the whiteboard is usually presented to the user as a canvas within the application. The user touches the canvas with a finger or moves a touch tool across the screen; the terminal receives and recognizes the user's sliding track, collects the touch points, connects them to form a drawn-line path (Path), and stores the touch data of the drawn line. In some embodiments, the first controller of the display device abstracts the drawn line into a command object, and the touch data of the drawn line, such as the touch point coordinates, track thickness, and color, are all saved in the command object. The upper-layer application then accesses and edits the touch data through the command object; for example, when a drawn line in the canvas needs to be edited, its touch data is modified through the command object.
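A minimal Java sketch of such a command object follows; the field and class names are hypothetical, since the patent does not fix a concrete structure:

    // Hedged sketch: a drawn line abstracted as a command object that owns its
    // touch data, from which the drawable Path can be rebuilt or edited.
    import android.graphics.Path;
    import java.util.ArrayList;
    import java.util.List;

    final class StrokeCommand {
        final List<float[]> touchPoints = new ArrayList<>(); // {x, y} per sample
        float strokeWidth;   // track thickness
        int color;           // ARGB drawn-line color

        /** Rebuilds the drawable Path from the stored touch points. */
        Path toPath() {
            Path path = new Path();
            for (int i = 0; i < touchPoints.size(); i++) {
                float[] p = touchPoints.get(i);
                if (i == 0) path.moveTo(p[0], p[1]);
                else path.lineTo(p[0], p[1]);
            }
            return path;
        }
    }

Editing a line then means mutating its StrokeCommand and re-rendering, rather than touching pixels directly.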
In some embodiments, the electronic whiteboard UI also includes whiteboard operation controls for operating and configuring the electronic whiteboard, which may include, for example, a brush control for input, an eraser control for erasure, a palette control for selecting the background color and brush color, and the like.
In some embodiments, while the display device is running the electronic whiteboard application, the display device is further configured to present other interactive elements, which may include, for example, television home page controls, search controls, message button controls, mailbox controls, browser controls, favorites controls, signal bar controls, voice controls, and the like.
To improve the convenience and visualization of the television UI, the display device provided in the embodiments of the present application includes a first controller that controls the television and its UI in response to operations on the interactive elements. For example, when a user clicks a search control through a controller such as a remote control, the search UI can be exposed on top of the other UIs; that is, the UI of the application component to which an interactive element is mapped can be enlarged, or run and displayed in full screen.
In some embodiments, the interactive elements of the television may also be operated through sensors, which may be, but are not limited to, acoustic input sensors (such as microphones) and pressure sensors (such as touch screens); these can detect voice commands or touch instructions that include an indication of the desired interactive element.
Fig. 6B shows a UI interface diagram illustrating erasing of a drawn line in a tv whiteboard application according to another embodiment of the present application.
The display device provided by the present application includes a display configured to receive an erase path. The figure shows the motion trajectory of a user erasing the drawn line 601 with an eraser, where the eraser moves vertically from an initial position 602 to a final position 603.
In some embodiments, while the user is using the eraser, the first controller records the erase path input by the user into the first layer and sets the first layer to the transparent attribute.
The electronic whiteboard of the display device comprises the drawn-line layer displayed on top, the background layer displayed underneath, and a first layer for recording the erase track. When the whiteboard application starts, the first controller initializes the first layer in the system's native (local framework) layer, so that communication with the upper-layer whiteboard application code takes place through the native layer.
That is, in the embodiments of the present application, the track of a drawn line is recorded into the drawn-line layer while the user is writing, whereas during erasing, the erase path of the eraser is recorded into the first layer provided by the embodiments of the present application.
In some embodiments, the first controller of the display device acquires erase key points from the user's touch gestures on the display, connects the erase key points to form an erase path, stores the erase path data, and records the erase path into the first layer; the first layer may be implemented, for example, as a canvas bitmap.
The eraser of the electronic whiteboard may be configured as a touch object or the palm of the user, and the shape of the eraser may be configured as a rectangle as shown in fig. 6B, and may also be configured as a circle, a triangle, or another shape.
The first controller acquires the user's erase key points and then connects them smoothly to form the erase path. In some embodiments, the first controller may also insert supplementary points between the erase key points. For example, when the erase path is built from erase points, whether supplementary points are required is decided by the minimum spacing Dmin between adjacent points, where Dmin is the minimum radius of the rectangular eraser in the figure. When the distance between adjacent erase points is greater than Dmin, supplementary points are inserted between them until the distance between any two adjacent erase points is less than Dmin, as the sketch below illustrates.
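A hedged Java sketch of this point-supplement step, with illustrative names:

    // Insert intermediate points so that consecutive erase points are never
    // farther apart than dMin (the minimum radius of the eraser shape).
    import java.util.ArrayList;
    import java.util.List;

    final class ErasePathBuilder {
        static List<float[]> densify(List<float[]> keyPoints, float dMin) {
            List<float[]> out = new ArrayList<>();
            for (float[] p : keyPoints) {
                if (!out.isEmpty()) {
                    float[] prev = out.get(out.size() - 1);
                    float dx = p[0] - prev[0], dy = p[1] - prev[1];
                    double dist = Math.hypot(dx, dy);
                    // Evenly spaced supplements keep every gap below dMin.
                    int extra = (int) Math.floor(dist / dMin);
                    for (int k = 1; k <= extra; k++) {
                        float t = (float) k / (extra + 1);
                        out.add(new float[] { prev[0] + t * dx, prev[1] + t * dy });
                    }
                }
                out.add(p);
            }
            return out;
        }
    }

Dense points guarantee that the swept eraser rectangles overlap, leaving no gaps in the erase region.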
After the erase path input by the user has been recorded into the first layer, the first controller determines the erase region in the first layer based on the initial position and final position of the eraser.
In some embodiments, when the initial position and the final position of the eraser do not intersect, the first controller configures the erase region as the erase path region excluding the final position.
As shown in fig. 6B, the eraser moves vertically downward along the erase path from the initial position 602 to the final position 603. From the shape of the eraser and the erase path, the first controller can calculate the erase region as the initial position 602 plus the area swept by the eraser along the erase path, as shown in the figure. Note that the erase region does not include the final position 603 of the eraser, so that the content being erased always remains visible to the user during erasing, which improves the accuracy of line erasing. A sketch of this region computation follows.
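A hedged Java sketch of the swept-region computation, assuming Android's Region API; the exclusion of the final position matches the behavior described above:

    // The erase region is the union of eraser rectangles placed at every path
    // point except the last one: the final eraser position stays un-erased.
    import android.graphics.Rect;
    import android.graphics.Region;
    import java.util.List;

    final class EraseRegionCalculator {
        static Region sweep(List<float[]> pathPoints, int eraserW, int eraserH) {
            Region region = new Region();
            for (int i = 0; i < pathPoints.size() - 1; i++) { // skip final point
                float[] p = pathPoints.get(i);
                Rect r = new Rect(
                        (int) (p[0] - eraserW / 2f), (int) (p[1] - eraserH / 2f),
                        (int) (p[0] + eraserW / 2f), (int) (p[1] + eraserH / 2f));
                region.op(r, Region.Op.UNION); // accumulate the swept area
            }
            return region;
        }
    }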
Fig. 6C shows a UI interface diagram illustrating erasing of a drawn line in a tv whiteboard application according to another embodiment of the present application.
Following the user's erasing action in fig. 6B, this figure shows the UI after the eraser has erased part of the drawn line 601. It can be seen that the display effect of the erased area is the same as that of the background layer, which implements the erasure of the drawn line.
After the erase region in the first layer is determined, the first controller fills the erase region with the opaque background picture and then superimposes the first layer on the other layers for display on the display, as shown in fig. 6C.
In some embodiments, in an erased state of the electronic whiteboard, after the first controller obtains the erased area, the attribute of the pixel included in the erased area in the first layer is changed to be opaque, and the erased area is set as a background map, that is, the pixel included in the erased area is set as the color setting of the pixel in the background layer.
For example, the drawing line 601 in the upper drawing line layer has a black color, the parameter configuration is (0, 0, 0), the color presented in the lower background layer is white, and the parameter configuration is (255, 255, 255). Since the pixels included in the drawing line 601 in the drawing line layer on the upper layer are opaque, the black drawing line 601 can be seen, and the pixels in the rest area of the drawing line layer are all transparent, so that the white background can be seen. The first controller sets the color of the erasing area in the first layer as the color (255, 255, 255) of the background layer, sets the attribute of the pixel as opaque, and sets the attribute of the pixel of other areas in the first layer as transparent by default.
Finally, the first controller superimposes the first layer, with the erasing area thus set, on the other layers and displays the result on the display. For example, the first layer may be superimposed on top of the background layer and the drawn-line layer, so as to erase the drawn line of the electronic whiteboard, as shown in the figure.
Because the first layer is arranged on top of the other layers and its erasing area is configured as a background image, the erasing effect for the erasing area of the drawn line 601 can be achieved through the superposition, synthesis, and display of the layers.
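To make the fill step concrete, the following is a minimal Android sketch that keeps the first layer as a transparent bitmap-backed canvas and paints the erasing area opaque white; the FirstLayer name and the fixed white background color are illustrative assumptions only:

    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.graphics.Paint;
    import android.graphics.Path;
    import android.graphics.PorterDuff;
    import android.graphics.PorterDuffXfermode;

    class FirstLayer {
        private final Bitmap bitmap;   // the "canvas bitmap" backing the first layer
        private final Canvas canvas;
        private final Paint erasePaint = new Paint(Paint.ANTI_ALIAS_FLAG);

        FirstLayer(int width, int height) {
            bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
            canvas = new Canvas(bitmap);        // all pixels start fully transparent
            erasePaint.setColor(Color.WHITE);   // background-layer color (255, 255, 255)
            erasePaint.setStyle(Paint.Style.FILL);
            // SRC mode overwrites the transparent pixels with opaque white.
            erasePaint.setXfermode(new PorterDuffXfermode(PorterDuff.Mode.SRC));
        }

        /** Fills the erasing area with the opaque background color. */
        void fillEraseArea(Path eraseArea) {
            canvas.drawPath(eraseArea, erasePaint);
        }

        Bitmap getBitmap() { return bitmap; }
    }

When this bitmap is composited above the drawn-line layer, the opaque white region covers the drawn line 601 while the transparent remainder leaves it visible, which is the layered erasing effect described here.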
It can be understood that, because the erasing path is recorded into an independent first layer, the erasing area can be obtained quickly: there is no need to calculate intersection points between the erasing area and the existing drawn lines, no need to split the existing drawn lines, and erasing the whiteboard content through a redrawing mechanism is avoided. This increases the whiteboard erasing speed, reduces the computation load of the display device, and keeps the whiteboard erasing synchronized with the user's erasing action.
It should be noted that the drawn line within the final position 603 is still retained in the drawn-line layer, so that accurate erasing of the electronic whiteboard is realized.
In some embodiments, while the erasing path is being recorded into the first layer, the first controller suspends the native system refresh mechanism of the display device, and a custom Native layer refresh mechanism is used to locally refresh the erasing area.
After detecting that the electronic whiteboard is in the erasing state, the first controller temporarily suspends the native system refresh mechanism of the display device set in the whiteboard application program. The native system refresh mechanism periodically refreshes the screen of the electronic whiteboard; through such refreshes, content such as an advertisement page or a dialog box may at times pop up on the user interface of the electronic whiteboard screen for the user to view. For example, if a user inserts a USB device into the electronic whiteboard, the native system refresh mechanism detects the insertion, and a dialog box prompting the USB insertion is displayed on the screen of the electronic whiteboard, which clutters the user interface and interferes with the user's operation and viewing of the electronic whiteboard.
While the first controller continuously renders and displays the first layer in this superimposed manner, it refreshes the erasing area of the first layer with the custom Native layer refresh mechanism and displays it superimposed with the other layers, thereby achieving the whiteboard erasing effect.
In some embodiments, after the first controller detects that the electronic whiteboard has completed the erasing action, it restarts the native system refresh mechanism of the display device, so as to avoid affecting the look and feel and the other functions of the user interface.
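One plausible realization of this suspend/locally-refresh/resume cycle, assuming the whiteboard is rendered on an Android SurfaceView and reusing the FirstLayer sketch above, is shown below; the LocalRefresher name and the flag-based suspension are assumptions, not the disclosed implementation:

    import android.graphics.Canvas;
    import android.graphics.Rect;
    import android.view.SurfaceHolder;

    class LocalRefresher {
        private final SurfaceHolder holder;
        private volatile boolean nativeRefreshSuspended = false;

        LocalRefresher(SurfaceHolder holder) { this.holder = holder; }

        void beginErase() { nativeRefreshSuspended = true; }  // suspend system-driven refresh
        void endErase()   { nativeRefreshSuspended = false; } // restore normal refresh

        /** Redraws only the dirty rectangle bounding the newly erased area. */
        void refreshEraseArea(Rect dirty, FirstLayer firstLayer) {
            Canvas canvas = holder.lockCanvas(dirty); // lock just the dirty region
            if (canvas == null) return;
            try {
                // Composite the first layer's bitmap over the content on screen.
                canvas.drawBitmap(firstLayer.getBitmap(), dirty, dirty, null);
            } finally {
                holder.unlockCanvasAndPost(canvas);
            }
        }
    }

Restricting the redraw to the dirty rectangle is what keeps the erasing feedback in step with the user's hand, consistent with the local-refresh rationale above.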
In some embodiments, when the initial position of the eraser intersects the final position, the first controller configures the erasing area as the initial position with the intersection area removed.
Fig. 6D shows a UI interface diagram illustrating erasing of a drawn line in a TV whiteboard application according to another embodiment of the present application.
In the figure, the eraser moves from the initial position 602 to the final position 603, and the initial position and the final position intersect in an intersection area 604. It can be seen that the end point of the drawn line 601 at the center is located in the intersection area 604. Based on the shape of the eraser and its erasing path, the first controller sets the erasing area in the first layer to the area to be filled 605.
Then, the first controller fills the erasing area, that is, the area to be filled 605, with the opaque background map: the color of the pixels included in the area 605 is set to the background layer color (255, 255, 255), the attribute of those pixels is set to opaque, and the attributes of the pixels in the other areas of the first layer are all left transparent by default.
Finally, the first controller superimposes the first layer on the other layers and displays it on the display, as shown in fig. 6E.
Fig. 6E shows a UI interface diagram illustrating erasing of a drawn line in a TV whiteboard application according to another embodiment of the present application.
Following the user's erasing action in fig. 6D, this figure shows the UI after the drawn line 601 has been erased by the eraser; the display effect of the erased area is the same as that of the background layer, thereby implementing the erasing of the drawn line.
After determining the erasing area, that is, the area to be filled 605, in the first layer, the first controller fills the area 605 with the opaque background image, then superimposes the first layer on the other layers and displays it on the display. For example, the first layer may be superimposed on top of the background layer and the drawn-line layer, so as to erase the drawn line of the electronic whiteboard, as shown in the figure.
In some embodiments, when the initial position of the eraser extends beyond the area of the whiteboard application window, the first controller updates the initial position of the eraser to the intersection of the eraser and the whiteboard application window.
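A minimal sketch of this clamping rule, assuming both the eraser footprint and the whiteboard window are axis-aligned rectangles; the EraserClamp name is hypothetical:

    import android.graphics.Rect;

    class EraserClamp {
        /**
         * If the eraser's initial footprint extends beyond the whiteboard
         * application window, replace it with the intersection of the two
         * rectangles; returns null when the eraser lies entirely outside.
         */
        static Rect clampToWindow(Rect eraserRect, Rect windowRect) {
            Rect clamped = new Rect(eraserRect);
            // Rect.intersect() narrows `clamped` to the overlap, returning
            // false when the rectangles do not intersect at all.
            return clamped.intersect(windowRect) ? clamped : null;
        }
    }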
Fig. 6F shows a UI interface diagram illustrating erasing of a drawn line in a TV whiteboard application according to another embodiment of the present application.
As shown in the figure, the electronic whiteboard UI is a small window within the screen, and the initial position of the eraser lies on the frame 606 of the electronic whiteboard UI, so the first controller configures the initial position 602 of the eraser as the intersection area of the eraser and the electronic whiteboard application window.
The eraser moves from the initial position 602 to the final position 603, and the initial position and the final position intersect in an intersection area 604. Based on the shape of the eraser and its erasing path, the first controller sets the erasing area in the first layer to the area to be filled 605.
Then, the first controller fills the erasing area, that is, the area to be filled 605, with the opaque background map: the color of the pixels included in the area 605 is set to the background layer color (255, 255, 255), the attribute of those pixels is set to opaque, and the attributes of the pixels in the other areas of the first layer are all left transparent by default.
Finally, the first controller superimposes the first layer on the other layers and displays it on the display, as shown in fig. 6G.
Fig. 6G shows a UI interface diagram illustrating erasing of a drawn line in a TV whiteboard application according to another embodiment of the present application.
Following the user's erasing action in fig. 6F, this figure shows the UI after the drawn line 601 has been erased by the eraser; the display effect of the erased area is the same as that of the background layer, thereby implementing the erasing of the drawn line.
After determining the erasing area, that is, the area to be filled 605, in the first layer, the first controller fills the area 605 with the opaque background image, then superimposes the first layer on the other layers and displays it on the display. For example, the first layer may be superimposed on top of the background layer and the drawn-line layer, so as to erase the drawn line of the electronic whiteboard, as shown in the figure.
Fig. 7 is a logic diagram illustrating erasing of a drawn line in a TV whiteboard application according to an embodiment of the present application.
In step S701, the first layer is initialized and transferred to the Native layer.
The electronic whiteboard of the display device comprises a drawn-line layer displayed on the upper layer, a background layer displayed on the lower layer, and a first layer for recording the erasing track; for example, the first layer may be implemented as a canvas bitmap. When the whiteboard application program is started, the first controller initializes the first layer to the Native (local framework) layer of the system, so as to communicate with the upper-layer whiteboard application program code through the Native layer.
In step S702, touch key points are collected according to the touch gesture.
The first controller collects the user's input gestures on the display screen and acquires the touch key points, namely the erasing key points. The erasing key points may include a start point, inflection points, an end point, and the like.
In step S703, the touch key points are connected to form a touch Path, and meanwhile the touch data is stored and recorded into the first-layer canvas bitmap.
After acquiring the user's erasing key points, the first controller smoothly connects them to form an erasing path. In some embodiments, the first controller may also insert supplementary points between the erasing key points to obtain a more accurate erasing path.
In step S704, the native system refresh mechanism of the Android system of the display device is suspended.
After detecting that the electronic whiteboard is in the erasing state, the first controller temporarily suspends the native system refresh mechanism of the display device set in the whiteboard application program, so as to prevent the native system refresh mechanism from periodically refreshing the screen of the electronic whiteboard and popping up content such as advertisement pages or dialog boxes that would interfere with viewing and operating the whiteboard erasure.
In step S705, the erasing area of the first layer is refreshed using the custom Native layer refresh mechanism.
In step S706, the first layer is superimposed on the other layers.
While the first controller continuously renders and displays the first layer in the superimposed manner, it locally refreshes the erasing area in the first layer, namely the canvas bitmap, with the custom Native layer refresh mechanism, and then superimposes the refreshed first layer on top of the other layers.
In step S707, a Panel screen is displayed.
The display device shows on the display the composite layer formed by superimposing the first layer on the other layers, thereby achieving the whiteboard erasing effect.
In step S708, after the erasing is finished, the screen is again refreshed using the Android native system refresh mechanism.
After the first controller detects that the electronic whiteboard has finished the erasing action, it restarts the native system refresh mechanism of the display device to restore the normal display of the other functional applications of the display device UI.
In some embodiments, when the drawn lines of the whiteboard are erased, the first controller of the display device calculates an alpha value for the erased area of the first layer and applies that alpha value when the first layer is superimposed.
For example, when the first layer src is superimposed on the drawn-line layer dst, the pixels of the finally output composite layer outcome may be calculated as follows:
outcome.alpha = src.alpha + dst.alpha * (1 - src.alpha);
outcome.red = src.red * src.alpha + dst.red * (1 - src.alpha);
outcome.blue = src.blue * src.alpha + dst.blue * (1 - src.alpha);
outcome.green = src.green * src.alpha + dst.green * (1 - src.alpha);
where outcome.alpha represents the transparency of the composite layer; src.alpha represents the transparency of the first layer; outcome.red/blue/green represent the color configuration parameters of the pixels of the composite layer; src.red/blue/green represent the color configuration parameters of the pixels of the first layer; and dst.red/blue/green represent the color configuration parameters of the pixels of the drawn-line layer.
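For reference, these equations are the standard source-over composite; they can be sketched in Java as follows, with a hypothetical Pixel type whose channels are normalized to the range 0 to 1:

    /** Hypothetical pixel with channels normalized to [0, 1]. */
    class Pixel {
        final double alpha, red, green, blue;

        Pixel(double alpha, double red, double green, double blue) {
            this.alpha = alpha; this.red = red; this.green = green; this.blue = blue;
        }

        /** Source-over composite of src (first layer) onto dst (drawn-line layer). */
        static Pixel compose(Pixel src, Pixel dst) {
            double outA = src.alpha + dst.alpha * (1 - src.alpha);
            return new Pixel(
                outA,
                src.red   * src.alpha + dst.red   * (1 - src.alpha),
                src.green * src.alpha + dst.green * (1 - src.alpha),
                src.blue  * src.alpha + dst.blue  * (1 - src.alpha));
        }
    }

Note that an opaque white src pixel (alpha = 1) yields white regardless of dst, which is exactly the erasing effect, while a fully transparent src pixel (alpha = 0) leaves the drawn-line layer unchanged.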
Based on the above description, with reference to figs. 6A to 6G, of the drawn-line erasing UI interface in the TV whiteboard application and the corresponding operations of the display device, the present application also provides a whiteboard erasing method.
Fig. 8 shows a flowchart of a whiteboard erasing method according to an embodiment of the present application.
In step 801, an erasing path input by a user is recorded into a first layer, and the first layer is set to be transparent;
in step 802, determining an erasing area in the first layer based on the initial position and the final position of the eraser;
in step 803, the erasing area is filled with an opaque background image;
in step 804, the first layer is superimposed on other layers for display.
The specific content of the whiteboard erasing method has been described in detail in the above description of the whiteboard erasing UI interface in figs. 6A to 6G and of the related operations of the display device, and is not repeated here.
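Read together, steps 801 to 804 amount to the following high-level flow; this sketch merely strings together the hypothetical helpers from the earlier sketches and is not the claimed implementation:

    import android.graphics.Region;
    import java.util.List;

    class WhiteboardEraseFlow {
        /** One erase pass tying steps 801 to 804 together. */
        static void erase(List<ErasePoint> keyPoints, float dMin,
                          int eraserHalfW, int eraserHalfH,
                          FirstLayer firstLayer, LocalRefresher refresher) {
            refresher.beginErase();                      // suspend native refresh
            // Step 801: record the erasing path (with point supplementing).
            List<ErasePoint> path = ErasePathBuilder.interpolate(keyPoints, dMin);
            // Step 802: determine the erasing area from the eraser positions.
            Region area = EraseRegionBuilder.build(path, eraserHalfW, eraserHalfH);
            // Step 803: fill the erasing area with the opaque background color.
            firstLayer.fillEraseArea(area.getBoundaryPath());
            // Step 804: locally refresh and composite the first layer on top.
            refresher.refreshEraseArea(area.getBounds(), firstLayer);
            refresher.endErase();                        // restore native refresh
        }
    }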
The present application also provides an electronic whiteboard device, which comprises a memory, a processor, and a computer program stored on the memory; when executing the computer program, the processor performs the whiteboard erasing method provided by the embodiments of the present application.
By constructing the first layer, the present application can build a dedicated layer for the erasing path alone. Further, filling the erasing area with the background picture can increase the drawn-line erasing speed; superimposing and displaying the first layer can simplify the whiteboard erasing computation; determining the initial position of the eraser can avoid misoperation on the screen outside the whiteboard application window; suspending the native system refresh mechanism during erasing can prevent the native system UI from interfering with the whiteboard erasing operation; and locally refreshing the erasing area can further increase the erasing speed, reduce stutter during whiteboard erasing, and solve the problem of the erasing effect being out of sync with the user's erasing action.
Moreover, those skilled in the art will appreciate that aspects of the present application may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufacture, or materials, or any new and useful improvement thereon. Accordingly, various aspects of the present application may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combination of hardware and software. The above hardware or software may be referred to as "data block", "controller", "engine", "unit", "component", or "system". Furthermore, aspects of the present application may be represented as a computer product, including computer readable program code, embodied in one or more computer readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the operation of various portions of the present application may be written in any one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, or Python, a conventional procedural programming language such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, or ABAP, a dynamic programming language such as Python, Ruby, or Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service, such as software as a service (SaaS).
Additionally, the order in which elements and sequences of the processes described herein are processed, the use of alphanumeric characters, or the use of other designations, is not intended to limit the order of the processes and methods described herein, unless explicitly claimed. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as requiring more features than are expressly recited in each claim. Indeed, the claimed embodiments may have fewer than all of the features of a single embodiment disclosed above.
The entire contents of each patent, patent application publication, and other material cited in this application, such as articles, books, specifications, publications, and documents, are hereby incorporated by reference into this application, except for any prosecution-history document that is inconsistent with or in conflict with the present disclosure, and except for any document that would limit the broadest scope of the claims of this application (whether now or later appended). It is noted that if the description, definition, and/or use of a term in the material accompanying this application is inconsistent with or contrary to what is stated in this application, the description, definition, and/or use of the term in this application shall control.

Claims (13)

1. A display device, comprising:
a display configured to display a user interface and to receive an erasing path;
a first controller configured to:
record the erasing path into a first layer, wherein the first layer is set to be transparent;
determine an erasing area in the first layer based on an initial position and a final position of an eraser;
fill the erasing area with an opaque background picture; and
superimpose the first layer on other layers and display it on the display.
2. The display device of claim 1, wherein when the initial position and the final position of the eraser do not intersect, the first controller configures the erasing area as an erasing path region that does not include the final position.
3. The display device of claim 1, wherein when the initial position of the eraser intersects the final position, the first controller configures the erasing area as the initial position with the intersection area removed.
4. The display device of claim 1, wherein the first controller updates the initial position to be an intersection area of the eraser and the whiteboard application window when the initial position of the eraser exceeds an area of the whiteboard application window.
5. The display device of claim 1, wherein the first controller initializes the first layer to a Native layer of a system upon startup of a whiteboard application.
6. The display device of claim 1, wherein, during entry of an erasing path into the first layer, the first controller suspends a native system refresh mechanism of the display device, and a Native layer refresh mechanism is used to locally refresh the erasing area.
7. A whiteboard erasing method, the method comprising:
recording an erasing path input by a user into a first layer, wherein the first layer is set to be transparent;
determining an erasing area in the first layer based on an initial position and a final position of an eraser;
filling the erasing area with an opaque background picture; and
superimposing the first layer on other layers for display.
8. The whiteboard erasing method of claim 7, wherein when the initial position and the final position of the eraser do not intersect, the erasing area is configured as an erasing path region that does not include the final position.
9. The whiteboard erasing method of claim 7, wherein when the initial position of the eraser intersects the final position, the erasing area is configured as the initial position with the intersection area removed.
10. The whiteboard erasing method of claim 7, wherein when the initial position of the eraser exceeds the area of the whiteboard application window, the initial position is updated to be the intersection area of the eraser and the whiteboard application window.
11. The whiteboard erasing method of claim 7, wherein the first layer is initialized to a Native layer of a system when a whiteboard application is started.
12. The whiteboard erasing method of claim 7, wherein, during entry of an erasing path into the first layer, a native system refresh mechanism is suspended, and a Native layer refresh mechanism is used to locally refresh the erasing area.
13. An electronic whiteboard device comprising a memory, a processor and a computer program stored on the memory, the processor performing the method according to any of claims 7 to 12 when executing the computer program.
CN202010680573.9A 2020-07-15 2020-07-15 Display equipment, electronic whiteboard device and whiteboard erasing method Pending CN114089852A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010680573.9A CN114089852A (en) 2020-07-15 2020-07-15 Display equipment, electronic whiteboard device and whiteboard erasing method

Publications (1)

Publication Number Publication Date
CN114089852A true CN114089852A (en) 2022-02-25

Family

ID=80294796

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010680573.9A Pending CN114089852A (en) 2020-07-15 2020-07-15 Display equipment, electronic whiteboard device and whiteboard erasing method

Country Status (1)

Country Link
CN (1) CN114089852A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009037464A (en) * 2007-08-02 2009-02-19 Sharp Corp Image display device and computer program
CN103578553A (en) * 2013-10-14 2014-02-12 江苏敏行信息技术有限公司 Quick vector linetype erasing method
CN104992460A (en) * 2015-05-28 2015-10-21 深圳市创易联合科技有限公司 Method for erasing vector scripts
CN107544730A (en) * 2017-08-25 2018-01-05 广州视源电子科技股份有限公司 Image display method, device and readable storage medium storing program for executing
WO2020010775A1 (en) * 2018-07-10 2020-01-16 广州视源电子科技股份有限公司 Method and device for operating interface element of electronic whiteboard, and interactive intelligent device

Similar Documents

Publication Publication Date Title
CN109618206B (en) Method and display device for presenting user interface
CN110659010A (en) Picture-in-picture display method and display equipment
CN111970549B (en) Menu display method and display device
CN110519628A (en) A kind of picture-in-picture display methods and display equipment
CN114157889B (en) Display equipment and touch control assisting interaction method
CN114079829A (en) Display device and generation method of video collection file watermark
CN111176603A (en) Image display method for display equipment and display equipment
CN112165641A (en) Display device
CN113395556A (en) Display device and method for displaying detail page
CN112437334A (en) Display device
CN112203154A (en) Display device
CN112473121B (en) Display device and avoidance ball display method based on limb identification
CN113630569B (en) Display apparatus and control method of display apparatus
CN113485613A (en) Display equipment and method for realizing free-drawing screen edge painting
CN111984167B (en) Quick naming method and display device
CN113473241A (en) Display equipment and display control method of image-text style menu
CN113141528B (en) Display device, boot animation playing method and storage medium
RU2697835C2 (en) Display device and display method
CN110572519A (en) Method for intercepting caller identification interface and display equipment
CN111259639B (en) Self-adaptive adjustment method of table and display equipment
CN112040299B (en) Display device, server and live broadcast display method
CN112073777B (en) Voice interaction method and display device
CN114089852A (en) Display equipment, electronic whiteboard device and whiteboard erasing method
CN115185392A (en) Display device, image processing method and device
CN112367550A (en) Method for realizing multi-title dynamic display of media asset list and display equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination