CN114442849A - Display device and display method - Google Patents

Display device and display method

Info

Publication number
CN114442849A
Authority
CN
China
Prior art keywords
track
layer
display
drawing board
interface
Prior art date
Legal status
Granted
Application number
CN202210101365.8A
Other languages
Chinese (zh)
Other versions
CN114442849B (en)
Inventor
申静
张振宝
李保成
杨会芹
肖博
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202210101365.8A
Publication of CN114442849A
Application granted
Publication of CN114442849B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a display device and a display method. Based on interaction with the display device, a user inputs a first track on an electronic drawing board interface. In response to the user's input, an acceleration layer stores the first track in segments: the stored first track comprises at least two segmented tracks, and each pair of adjacent segmented tracks is joined by overlapping endpoints, forming an overlap area. The acceleration layer is then controlled to preprocess the stored first track, eliminating the endpoint of one of the two segmented tracks in each overlap area, after which the electronic drawing board interface is refreshed so that the preprocessed first track is displayed on it. This solves the prior-art problem that, when drawing on an electronic drawing board with a brush control corresponding to a highlighter, the input track cannot be segmented and the drawing speed is therefore slow, and it improves the user experience.

Description

Display device and display method
Technical Field
The application relates to the technical field of smart television drawing boards, in particular to a display device and a display method.
Background
A display device is a terminal device capable of outputting a specific display picture, such as a smart television, a mobile terminal, a smart advertising screen, or a projector. Taking the smart television as an example: built on Internet application technology, it has an open operating system and chip as well as an open application platform, supports bidirectional human-machine interaction, and integrates audio-video, entertainment, education, data, and other functions, making it a television product that meets the diversified and personalized needs of users.
For example, a user may leave a touch track on the display device through interaction with it, implementing the device's drawing board function. However, when the color of the touch track is a fluorescent color, that color has the particularity of a transparent effect: if the touch track is displayed by storing it in segments and refreshing the display interface, the finally displayed picture shows a superposition (double-drawn) effect wherever segments meet, which harms the user experience.
Disclosure of Invention
The application provides a display device and a display method to solve the prior-art problem that, when a brush control corresponding to a highlighter is used to draw on an electronic drawing board, the input track cannot be segmented and the drawing speed is therefore slow; the user experience is thereby improved.
In one aspect, the present application provides a display device, including:
a display for displaying a user interface;
a touch assembly for receiving instructions input by a user through touch, wherein the touch assembly and the display form a touch screen;
a controller configured to:
presenting an electronic drawing board interface, wherein the electronic drawing board interface is used for displaying a target track stored in an acceleration layer, and the acceleration layer is used for storing the target track input by a user in the electronic drawing board interface within a preset time;
responding to a first track input by a user on the electronic drawing board interface, and storing the first track in segments to the acceleration layer, wherein the first track stored to the acceleration layer comprises at least two segmented tracks, and two adjacent segmented tracks are connected through superposition of end points to form a superposition area;
controlling the acceleration layer to preprocess the stored first track and eliminate the end point of any one segmented track in each superposition area;
refreshing the electronic drawing board interface to display the preprocessed first track on the electronic drawing board interface.
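The segmented storage and endpoint elimination described above can be sketched as follows. This is a minimal illustration of the technique, not the patent's actual implementation; the function names, the segment size, and the modeling of the acceleration layer as a list of point-lists are all assumptions introduced here.

```python
# Minimal sketch of segmented track storage with overlap elimination.
# A track is a list of (x, y) points; the "acceleration layer" is modeled
# simply as a list of segment point-lists. All names are illustrative.

def store_in_segments(points, segment_size=4):
    """Split an input track into segments whose adjacent ends overlap:
    each segment begins with the last point of the previous segment,
    forming the 'superposition area' at every join."""
    segments = []
    start = 0
    while start < len(points) - 1:
        end = min(start + segment_size, len(points))
        segments.append(points[start:end])
        start = end - 1  # next segment re-uses this endpoint -> overlap
    return segments

def preprocess(segments):
    """Eliminate the duplicated endpoint in each superposition area, so a
    translucent (highlighter) brush is not drawn twice at the joins."""
    if not segments:
        return []
    merged = list(segments[0])
    for seg in segments[1:]:
        merged.extend(seg[1:])  # drop the repeated first point
    return merged

track = [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4), (5, 5), (6, 6)]
segs = store_in_segments(track)
flat = preprocess(segs)
```

Storing in segments lets each segment be flushed to the acceleration layer as it arrives (faster than redrawing the whole stroke), while the preprocessing step removes exactly one of the two coincident endpoints per overlap area before the interface is refreshed.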
In another aspect, the present application provides a display method, including:
presenting an electronic drawing board interface, wherein the electronic drawing board interface is used for displaying a target track stored in an acceleration layer, and the acceleration layer is used for storing the target track input by a user in the electronic drawing board interface within a preset time;
responding to a first track input by a user on the electronic drawing board interface, and storing the first track in segments to the acceleration layer, wherein the first track stored to the acceleration layer comprises at least two segmented tracks, and two adjacent segmented tracks are connected through superposition of end points to form a superposition area;
controlling the acceleration layer to preprocess the stored first track and eliminate the end point of any one segmented track in each superposition area;
refreshing the electronic drawing board interface to display the preprocessed first track on the electronic drawing board interface.
According to the technical scheme above, a user can input a first track on the electronic drawing board interface based on interaction with the display device. In response to the user's input, the acceleration layer stores the first track in segments; the stored first track comprises at least two segmented tracks, and adjacent segmented tracks are joined by overlapping endpoints, forming a superposition area. The acceleration layer is controlled to preprocess the stored first track, eliminating the endpoint of one segmented track in each superposition area, and the electronic drawing board interface is then refreshed to display the preprocessed first track. This solves the prior-art problem that, when a brush control corresponding to a highlighter draws on the electronic drawing board, the input track cannot be processed in segments and the drawing speed is therefore slow, and it improves the user experience.
Drawings
In order to explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below; those skilled in the art can derive other drawings from these drawings without creative effort.
FIG. 1 illustrates an operational scenario between a display device and a control apparatus, according to some embodiments;
FIG. 2 is a block diagram illustrating a hardware configuration of a display device according to some embodiments;
FIG. 3 is a diagram illustrating a software configuration in a display device according to some embodiments;
fig. 4 is a schematic diagram of an electronic palette interface shown in some embodiments of the present application;
FIG. 5 is a schematic diagram illustrating a touch trajectory of a user in some embodiments of the present application;
FIG. 6 is a schematic diagram illustrating the preprocessing of stored traces by the acceleration layer shown in some embodiments of the present application;
fig. 7 is a schematic diagram of an electronic palette interface shown in some embodiments of the present application;
fig. 8 is a schematic diagram of an electronic palette interface shown in some embodiments of the present application;
FIG. 9 is a flow chart of a display method shown in some embodiments of the present application.
Detailed Description
To make the purpose and embodiments of the present application clearer, the exemplary embodiments are described below clearly and completely with reference to the attached drawings. Obviously, the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-range communication methods, and the remote controller controls the display device 200 wirelessly or by wire. The user may input user instructions through keys on the remote controller, voice input, control panel input, etc., to control the display apparatus 200.
In some embodiments, the smart device 300 (e.g., mobile terminal, tablet, computer, laptop, etc.) may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device.
In some embodiments, the display device 200 may also be controlled in a manner other than the control apparatus 100 and the smart device 300, for example, the voice instruction control of the user may be directly received by a module configured inside the display device 200 to obtain a voice instruction, or may be received by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display apparatus 200, and may be one cluster or multiple clusters, including one or more types of servers.
Fig. 2 shows a hardware configuration block diagram of a display device 200 according to an exemplary embodiment.
In some embodiments, the display device comprises a touch component through which it realizes a touch interaction function: the user can operate the host simply by touching the display lightly with a finger, avoiding keyboard, mouse, and remote-control operation and making human-machine interaction more direct. On the touch display, the user can input different control instructions through touch operations, for example click, slide, long-press, and double-click touch commands, and different touch commands can represent different control functions.
To distinguish these touch actions, the touch component generates different electrical signals for different inputs and sends them to the controller 250, which performs feature extraction on the received signal to determine the control function the user intends. For example, when the user inputs a click at any program-icon position in the application interface, the touch component senses the action and generates an electrical signal. On receiving it, the controller 250 first determines the duration of the level corresponding to the touch action; when this duration is below a preset time threshold, it recognizes a click touch command. The controller 250 then extracts the position features of the signal to determine the touch position; when that position falls within the display range of an application icon, it determines that the user has clicked on that icon. Since in this scene the click command runs the corresponding application, the controller 250 starts that application.
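The click-recognition logic above can be sketched as follows. The 0.3-second threshold, the function names, and the bounds representation are assumptions for illustration only; the patent specifies no concrete values.

```python
# Illustrative sketch of click recognition: a touch whose level duration
# is below a preset threshold is a click, and the click activates an icon
# only if the touch position falls inside that icon's display range.
# The threshold value and all names here are hypothetical.

CLICK_THRESHOLD_S = 0.3  # assumed preset time threshold

def classify_click(duration_s, pos, icon_bounds):
    """Return the name of the icon hit by a click, or None."""
    if duration_s >= CLICK_THRESHOLD_S:
        return None  # held too long to count as a click
    x, y = pos
    for name, (left, top, right, bottom) in icon_bounds.items():
        if left <= x <= right and top <= y <= bottom:
            return name  # the controller would launch this application
    return None

icons = {"sketchpad": (100, 100, 200, 160)}
hit = classify_click(0.1, (150, 130), icons)   # short tap inside the icon
miss = classify_click(0.5, (150, 130), icons)  # too long: not a click
```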
As another example, when the user inputs a sliding motion in the media asset display page, the touch component likewise sends the sensed electrical signal to the controller 250. The controller 250 first determines the duration of the signal corresponding to the touch action; when that duration exceeds the preset time threshold, it examines how the position at which the signal is generated changes. Since an interactive sliding action changes that position, the controller determines that the user has input a sliding touch instruction, derives the sliding direction from the position change, and controls page turning in the media asset display page so that more media asset options are shown. Further, the controller 250 may extract features such as the sliding speed and sliding distance and use them for page-turning control, achieving a follow-the-hand effect.
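The slide analysis can be sketched in the same style: from timestamped touch samples, derive the duration, distance, dominant direction, and speed that drive the page turn. The minimum duration and all names are assumptions, not values from the patent.

```python
import math

# Hedged sketch of sliding-gesture analysis: a touch longer than the
# threshold whose position changes is a slide; direction, distance, and
# speed are derived from the first and last samples. Names are illustrative.

def analyze_slide(samples, min_duration_s=0.3):
    """samples: list of (t, x, y) tuples. Returns a dict, or None if the
    input does not qualify as a sliding touch instruction."""
    if len(samples) < 2:
        return None
    t0, x0, y0 = samples[0]
    t1, x1, y1 = samples[-1]
    duration = t1 - t0
    dx, dy = x1 - x0, y1 - y0
    if duration < min_duration_s or (dx == 0 and dy == 0):
        return None  # too brief, or the position never changed
    direction = ("right" if dx > 0 else "left") if abs(dx) >= abs(dy) \
        else ("down" if dy > 0 else "up")
    distance = math.hypot(dx, dy)
    return {"direction": direction, "distance": distance,
            "speed": distance / duration}

slide = analyze_slide([(0.0, 100, 300), (0.2, 220, 305), (0.5, 400, 310)])
```

A real implementation would process intermediate samples too (e.g. to animate the page with the finger), but the endpoint-based summary above is enough to show the duration-then-position decision order described in the text.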
Similarly, for the touch instruction such as double click, long press, etc., the controller 250 may execute the corresponding control function according to the preset interaction rule by extracting different features and determining the type of the touch instruction through feature judgment. In some embodiments, the touch component also supports multi-touch, such that a user can input a touch action on the touch screen through multiple fingers, e.g., multi-finger click, multi-finger long press, multi-finger slide, etc.
The touch control action can be matched with a specific application program to realize a specific function. For example, after the user opens the drawing board application, the display 260 may present a drawing area, the user may draw a specific touch action track in the drawing area through a sliding touch instruction, and the controller 250 determines a touch action pattern through a touch action detected by the touch component and controls the display 260 to display in real time to satisfy the demonstration effect.
In some embodiments, the display apparatus 200 further comprises at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, a user interface.
In some embodiments the controller comprises a processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, a first interface to an nth interface for input/output.
In some embodiments, the display 260 includes a display screen component for presenting a picture and a driving component for driving image display; it receives image signals output from the controller and displays video content, image content, a menu manipulation interface, and a user manipulation UI.
In some embodiments, the display 260 may be a liquid crystal display, an OLED display, or a projection display, and may also be a projection device with a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the external control device 100 or the server 400 through the communicator 220.
In some embodiments, the user interface may be configured to receive control signals from the control apparatus 100 (e.g., an infrared remote control).
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the tuner-demodulator 210 receives broadcast television signals by wired or wireless reception and demodulates audio/video signals and EPG data signals from a plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink, an icon, or other actionable control. The operations related to the selected object are: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon.
In some embodiments, the controller comprises at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), Random Access Memory (RAM), Read-Only Memory (ROM), first through nth input/output interfaces, a communication bus, and the like.
The CPU executes the operating system and application instructions stored in the memory, and runs various applications, data, and content according to interactive instructions received from external input, so as to finally display and play various audio-video content. The CPU may include multiple processors, e.g., a main processor and one or more sub-processors.
In some embodiments, a graphics processor generates various graphics objects, such as icons, operation menus, and graphics displayed for user input instructions. It comprises an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for display on the display.
In some embodiments, the user may input a user command on a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that the user can receive. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
In some embodiments, a system of a display device may include a Kernel (Kernel), a command parser (shell), a file system, and an application program. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, a scheduler, signals and interprocess communication (IPC) are operated and maintained. And after the kernel is started, loading the Shell and the user application program. The application program is compiled into machine code after being started, and a process is formed.
Referring to fig. 3, in some embodiments, the system is divided into four layers, which are an Application (Applications) layer (abbreviated as "Application layer"), an Application Framework (Application Framework) layer (abbreviated as "Framework layer"), an Android runtime (Android runtime) and system library layer (abbreviated as "system runtime library layer"), and a kernel layer, respectively, from top to bottom.
In some embodiments, at least one application program runs in the application program layer, and the application programs may be windows (windows) programs carried by an operating system, system setting programs, clock programs or the like; or an application developed by a third party developer. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an Application Programming Interface (API) and a programming framework for applications, and includes a number of predefined functions. It acts as a processing center that decides the actions of the applications in the application layer. Through the API interface, an application can access system resources and obtain system services during execution.
As shown in fig. 3, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the activity manager manages the lifecycle of the various applications as well as general navigation fallback functions, such as controlling application exit, opening, and back navigation. The window manager manages all window programs, for example obtaining the display screen size, determining whether a status bar exists, locking the screen, capturing the screen, and controlling changes of the display window (for example, shrinking the window, or displaying shake, distortion, or deformation).
In some embodiments, the system runtime library layer provides support for the layer above it, i.e., the framework layer: when the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 3, the kernel layer comprises at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, WiFi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint, temperature, and pressure sensors), power driver, and the like.
In some embodiments, controller 250 controls the operation of display device 200 and responds to user operations associated with display 260 by running various software control programs (e.g., an operating system and/or various application programs) stored in memory. For example, the controller presents a user interface on the display, the user interface including a number of UI objects; in response to a received user command for a UI object on the user interface, the controller 250 may perform an operation related to the object selected by the user command.
The present application provides a display device with a drawing board function. The function can be realized by an application related to the drawing board that is installed on the display device; for convenience of description, this is referred to as the "drawing board application" (or "sketchpad application"). After the display device starts the drawing board application, an electronic drawing board interface is presented on the display. Areas on this interface display user interface objects, information, and/or input content corresponding to one or more functions of the drawing board application. The aforementioned user interface objects are the objects constituting the electronic drawing board interface, and may include, but are not limited to, text, images, icons, soft keys (or "virtual buttons"), drop-down menus, radio buttons, check boxes, selectable lists, and the like. The displayed user interface objects may include non-interactive objects that convey information or form the appearance of the user interface, interactive objects for user interaction, or a combination of both. The user interacts with a user interface object by touching the touch screen at the location corresponding to the interactive object they wish to use; the display device detects the contact and responds by performing the operation corresponding to that interactive object, thereby enabling drawing in the drawing board application.
In some embodiments, some or all of the steps involved in the embodiments of the present application are implemented within the operating system or within an application program. In some embodiments, the application program implementing some or all of these steps is the above-mentioned drawing board application stored in the memory, and the controller 250 controls the operation of the display device 200 and responds to user operations related to that application by running it in the operating system.
It should be noted that the display device of the embodiments of the present application includes, but is not limited to, the display device 200 described above, and may also be another terminal device having image display, data processing, and information transceiving functions, such as a portable mobile terminal, e.g., a mobile phone or a tablet computer. The embodiments of the present application are described in detail below taking a display device as an example.
In some embodiments, the electronic drawing board interface includes a drawing area and a control area. The drawing area is an area in which content can be input, and the control area displays, in one place, the user interface objects and information corresponding to one or more functions of the drawing board application. The user interface objects include, but are not limited to, a brush control, an eraser control, a color drawing control, and the like, and the information includes the parameter information corresponding to the brush control, such as the current input color, the selectable colors, the thickness, and the line style.
Fig. 4 is a diagram of an electronic drawing board interface shown in some embodiments of the present application. As shown in fig. 4, the electronic drawing board interface includes a drawing area 610 and a control area 620. The drawing area 610 receives content input by the user through the controls in the control area 620 and displays the received content, such as lines, graphics, and text. The control area 620 displays various functional controls, including brush controls 621a, 621b, 621c, 621d, and 621e of different styles, an erasing control 622, a deleting control 623, a recording control 624, a "more" control 625, an undo control 626, and a redo control 627. The brush control 621a may be a brush control with a highlighter effect. When a brush control is selected, for example the highlighter brush control 621a, a toolbar corresponding to that brush control is displayed, in which the brush color, thickness, type, and the like may be selected. When the electronic drawing board interface is displayed, the user clicks a brush control to pick up the brush; with a brush control picked up, the user can input content through contact with the drawing area, the input content being the user's contact track on the drawing area.
In some embodiments, a user may draw a specific touch track, that is, a target track, in the drawing area through a sliding operation. The controller may store the target track into the acceleration layer in segments according to the touch actions detected by the touch component, and may refresh the electronic drawing board interface each time the acceleration layer receives a segment of the target track, so that the electronic drawing board interface displays the target track stored in the acceleration layer.
However, when the user selects the brush control corresponding to the "highlighter" to draw on the electronic drawing board, the highlighter has the particularity of a transparent effect. If the content in the acceleration layer is copied to the display layer in segments for refreshing, then, because both ends of each segmented track are drawn with semicircular caps, the finally displayed picture shows a superposition effect at the segment joints as shown in fig. 5: the translucent color is applied twice where the caps overlap, so the line is not smooth and visually appears to contain breakpoints. One workaround is to copy all of the content drawn by the user to the display layer again for each refresh, without segmenting the track; that is, the whole track is recopied to the display layer while it is being drawn. This, however, means that when drawing on the electronic drawing board with the highlighter brush control, the input track cannot be processed in segments, the drawing speed is slow, and the user experience suffers.
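The darkening at the segment joints follows directly from standard translucent ("source-over") alpha compositing: wherever the semicircular caps of two adjacent segments overlap, the highlighter's alpha is applied twice. A minimal sketch of the arithmetic, assuming a hypothetical 40% brush opacity (the patent does not specify a value):

```python
# Source-over compositing: wherever a translucent cap is painted twice,
# the resulting coverage is visibly darker than a single pass.

def over(src_alpha: float, dst_alpha: float) -> float:
    """Coverage after painting a translucent source over a destination."""
    return src_alpha + dst_alpha * (1.0 - src_alpha)

highlighter_alpha = 0.4                            # assumed 40% opacity

single_pass = over(highlighter_alpha, 0.0)         # one segment covers the pixel
double_pass = over(highlighter_alpha, single_pass) # two caps overlap at the joint

# single_pass is about 0.4, double_pass about 0.64: a visible seam
```

This is why the superposition areas at the segment endpoints must be eliminated before the track is displayed.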
To address this, in some embodiments, the controller stores a target track into the acceleration layer in segments based on the sliding operation of the user in the drawing area detected by the touch component, the target track being the user's touch track in the drawing area. For example, referring further to fig. 5, the user slides from position A in the drawing area to form a touch track. When the track reaches position B, the controller stores the touch track from position A to position B into the acceleration layer as the A-B track; as the sliding operation continues and reaches position C, the controller 250 stores the touch track from position B to position C into the acceleration layer as the B-C track. Since both the A-B track and the B-C track pass through position B (their shared endpoint), the two tracks are superimposed at position B, so that the A-B track and the B-C track form a continuous track, and the superimposed part at position B is the superposition area (shown shaded in fig. 5).
In some embodiments, when the touch component detects that the user is performing a sliding operation, the controller may store the touch track formed in each preset time period into the acceleration layer once that period elapses. For example, if the preset time is 0.01 s, the touch track formed in each 0.01 s is stored into the acceleration layer every 0.01 s. If the user's sliding operation lasts 1 s, that is, the user leaves a continuous touch track over 1 s, the controller divides the touch track into 100 track segments and stores them into the acceleration layer; the endpoints of any two adjacent segments among the 100 segments form superposition areas, and the number of superposition areas is 99.
In some embodiments, from the moment the touch component detects that the user starts the sliding operation, the controller may store the touch track of each preset distance into the acceleration layer once that distance is covered. For example, if the preset distance is 0.1 cm, a 0.1 cm touch track is stored into the acceleration layer each time the user's sliding track advances 0.1 cm. If the user leaves a continuous touch track 5 cm long, the controller stores 50 track segments into the acceleration layer; the endpoints of any two adjacent segments among the 50 segments form superposition areas, and the number of superposition areas is 49.
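The segment counts in the two examples above (100 segments with 99 superposition areas for a 1 s stroke sampled every 0.01 s; 50 segments with 49 superposition areas for a 5 cm stroke sampled every 0.1 cm) follow from simple division, since every pair of adjacent segments shares exactly one endpoint. A sketch of this arithmetic:

```python
# Segment-count arithmetic for both segmentation strategies described above.

def segment_count(total: float, step: float) -> int:
    # number of whole segments the stroke is divided into
    return round(total / step)

time_segments = segment_count(1.0, 0.01)   # 1 s stroke, stored every 0.01 s
time_overlaps = time_segments - 1          # adjacent segments share one endpoint

dist_segments = segment_count(5.0, 0.1)    # 5 cm stroke, stored every 0.1 cm
dist_overlaps = dist_segments - 1
```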
In some embodiments, after the acceleration layer stores the target track in segments, the controller controls the acceleration layer to preprocess the stored target track, eliminating the content of one of the two segmented tracks in each superposition area so as to eliminate that superposition area. For example, referring to fig. 6, when the A-B track and the B-C track form a superposition area (shaded) at position B, either the part of the A-B track in the superposition area (endpoint B) or the part of the B-C track in the superposition area (endpoint B) is eliminated, which removes the superposition area.
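The preprocessing step can be sketched as follows: each incoming segment that begins at the previous segment's endpoint has that duplicated joint point dropped before being appended, so the merged track contains each point exactly once. This is an illustrative reconstruction, not the patent's implementation; points are modeled as (x, y) tuples:

```python
# Minimal sketch of superposition-area elimination in the acceleration layer.

def append_segment(trajectory: list, segment: list) -> None:
    """Append a new segment, eliminating the overlap at the shared endpoint."""
    if trajectory and segment and trajectory[-1] == segment[0]:
        segment = segment[1:]      # drop the repeated joint point
    trajectory.extend(segment)

stroke = []
append_segment(stroke, [(0, 0), (1, 1), (2, 2)])   # A-B segment
append_segment(stroke, [(2, 2), (3, 3)])           # B-C segment, superposed at B
# stroke is now one continuous track in which point (2, 2) appears only once
```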
In some embodiments, if the user's sliding operation stops or ends after passing position C, the controller may control the acceleration layer to detect whether any superposition area remains in the input target track and, if so, eliminate the endpoint of one of the segmented tracks in each remaining superposition area so as to eliminate it.
In some embodiments, as further shown in fig. 5, if the controller detects through the touch component that the user continues the sliding operation after passing position C, passing position D, position E, and so on in sequence, the controller stores the touch track from position C to position D (denoted the C-D track) and the touch track from position D to position E (denoted the D-E track) into the acceleration layer in input order. At this time the B-C track and the C-D track are superimposed at position C (their shared endpoint) to form a superposition area, and the C-D track and the D-E track are superimposed at position D (their shared endpoint) to form another. The acceleration layer eliminates these superposition areas in turn, first the one at position C and then the one at position D, until all superposition areas on the target track stored in the acceleration layer have been eliminated.
In some embodiments, the acceleration layer may store the target track within a preset time, and the controller may continuously refresh the electronic drawing board interface as the stored content of the acceleration layer is updated, so that the content stored in the acceleration layer is displayed on the electronic drawing board interface. The acceleration layer is essentially a layer in which the target track is temporarily stored; by continuously refreshing the electronic drawing board interface, the controller renders the touch track the user inputs through the sliding operation in near real time, visually producing the effect that the target track appears gradually, following the user's touch operation. As further shown in fig. 5, after the user passes positions A, B, and C in sequence through the sliding operation, the controller stores the A-B track and the B-C track into the acceleration layer in sequence. After the acceleration layer preprocesses the A-B track and the B-C track, that is, after the superposition area at position B is eliminated, and if the user continues to slide, then before the controller stores the C-D track into the acceleration layer (which would form a new superposition area at position C), the controller may refresh the electronic drawing board interface according to the preprocessed A-B track and B-C track stored in the acceleration layer, displaying them on the electronic drawing board interface as shown in fig. 7. Because the interval from storing the A-B track and the B-C track into the acceleration layer to refreshing the interface to display them is short, essentially instantaneous, the visual effect is that the target track corresponding to the user's touch operation is displayed gradually, following the touch operation.
In some embodiments, after the controller refreshes the electronic drawing board interface according to the acceleration layer so that the interface displays the A-B track and the B-C track, if the controller detects through the touch component that the user passes position D, the C-D track is then stored into the acceleration layer. The C-D track and the B-C track already stored in the acceleration layer are superimposed at position C (their shared endpoint); the acceleration layer eliminates the endpoint of either the C-D track or the B-C track in that superposition area to obtain an A-D track, which has no superposition area and is a continuous, smooth track. Before the controller stores a new track into the acceleration layer and forms a new superposition area, the controller may again refresh the electronic drawing board interface according to the A-D track stored in the acceleration layer, so as to display the A-D track on the interface.
In some embodiments, each sliding operation of the user produces one complete target track. The acceleration layer stores the complete target track and, after its preprocessing is finished, sends it to the display layer. After the display layer receives and stores the target track, the controller refreshes the electronic drawing board interface according to the content stored in the display layer, so that the interface displays the target track stored in the display layer.
In some embodiments, before the controller refreshes the electronic drawing board interface to display the target track stored in the display layer, the controller may clear the target track stored in the acceleration layer. After the acceleration layer is cleared, the controller may refresh the interface according to both the display layer and the acceleration layer; because the acceleration layer no longer holds the target track, the refreshed interface displays only the target track stored in the display layer. This avoids the track-overlapping problem that would arise if the refreshed interface simultaneously displayed the target track stored in the display layer and the same target track still present in the acceleration layer, which would visually deepen the color and/or reduce the transparency of the track on the electronic drawing board interface.
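The hand-off described above (committing the finished track to the display layer, then emptying the acceleration layer before the next refresh) can be sketched with an illustrative class; all names are assumptions, not from the patent:

```python
# Sketch of the acceleration-layer / display-layer hand-off.

class PaletteCompositor:
    def __init__(self):
        self.display_layer = []   # committed (finished) tracks
        self.accel_layer = []     # points of the in-progress track

    def commit(self):
        # move the finished track into the display layer, then empty the
        # acceleration layer so the next refresh draws each track only once
        self.display_layer.append(list(self.accel_layer))
        self.accel_layer.clear()

    def refresh(self):
        # a frame shows every committed track plus any in-progress one
        frame = [list(t) for t in self.display_layer]
        if self.accel_layer:
            frame.append(list(self.accel_layer))
        return frame

board = PaletteCompositor()
board.accel_layer.extend([(0, 0), (1, 1)])   # in-progress first track
board.commit()                               # committed; acceleration layer emptied
frame = board.refresh()                      # the track appears exactly once
```

Emptying `accel_layer` before refreshing is what prevents a translucent track from being composited twice.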
In some embodiments, if the user performs another sliding operation after inputting the target track, then, to distinguish the two, the controller designates the target track as the first track and the later-input track as the second track. After the controller clears the first track stored in the acceleration layer, it may store the second track into the acceleration layer in segments based on the second track detected by the touch component, so that the acceleration layer holds only the second track (or the portion of it input so far). The acceleration layer stores the complete second track and preprocesses it. After the superposition areas of the second track are eliminated, if the electronic drawing board interface has not yet been refreshed, that is, not all content of the first track stored in the display layer has been displayed, the controller does not refresh the interface a second time based on the content stored in the acceleration layer. In other words, until all content of the first track has been displayed on the electronic drawing board interface, the interface does not display the second track stored in the acceleration layer.
In some embodiments, once the electronic drawing board interface has been refreshed, that is, the first track stored in the display layer is fully displayed on the interface, the controller refreshes the interface according to the content stored in both the display layer and the acceleration layer, as shown in fig. 8, so that the refreshed interface displays both the first track stored in the display layer and the second track stored in the acceleration layer. This avoids the screen-flashing problem that would be caused by the controller refreshing the interface a second time based on the acceleration layer while the first track is not yet fully displayed.
In some embodiments, the acceleration layer receives and stores the complete second track, preprocesses it, and sends the preprocessed second track to the display layer. After the display layer receives and stores the second track, it may synthesize the first track and the second track to obtain a composite track. On detecting that the content stored in the display layer has been updated, the controller refreshes the electronic drawing board interface according to that content, so that the refreshed interface displays the composite track.
In some embodiments, the display layer further includes a plurality of transparent layers stacked on one another. The first track is stored in the bottom layer; when the display layer receives the second track, it stores the second track in the layer above the one holding the first track, so as to form the composite track. The controller may refresh the electronic drawing board interface according to the layers in which the first and second tracks are stored, so that the refreshed interface displays the composite track obtained by stacking the first track and the second track.
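The stacked transparent layers can be sketched as follows: the first track occupies the bottom layer, each later track a layer above it, and compositing applies each covering layer's translucency exactly once per pixel. The per-track alpha value and the compositing rule here are illustrative assumptions:

```python
# Sketch of a display layer built from stacked transparent sub-layers.

TRACK_ALPHA = 0.4    # assumed per-track opacity; the patent gives no value

class LayeredDisplay:
    def __init__(self):
        self.layers = []                 # index 0 = bottom-most (first track)

    def add_track(self, pixels):
        self.layers.append(set(pixels))  # each new track gets the layer above

    def coverage(self, pixel):
        """Composite alpha at one pixel, bottom-up, once per covering layer."""
        alpha = 0.0
        for layer in self.layers:
            if pixel in layer:
                # standard source-over compositing of a translucent layer
                alpha = TRACK_ALPHA + alpha * (1.0 - TRACK_ALPHA)
        return alpha

display = LayeredDisplay()
display.add_track([(0, 0), (1, 1)])      # first track on the bottom layer
display.add_track([(1, 1), (2, 2)])      # second track on the layer above
# where the tracks cross, both translucent layers contribute (about 0.64,
# versus about 0.4 where only one track covers the pixel)
```

Because each track lives on its own layer, its alpha is applied once per track rather than once per segment, which is the desired highlighter behavior.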
As can be seen from the above technical solution, the present application provides a display device. A user may input a first track on the electronic drawing board interface through interaction with the display device; in response, the acceleration layer stores the first track in segments, the first track stored in the acceleration layer comprising at least two segmented tracks, adjacent segments being joined by superimposed endpoints that form superposition areas. The acceleration layer is controlled to preprocess the stored first track, eliminating the endpoint of one segmented track in each superposition area, and the electronic drawing board interface is refreshed so that the preprocessed first track is displayed on it. This solves the prior-art problem that, when drawing on the electronic drawing board with the brush control corresponding to a highlighter, the input track cannot be processed in segments and the drawing speed is therefore slow, and it improves the user experience.
Based on the display device provided by the above embodiments, the present application also provides a display method applied to the above display device. As shown in fig. 9, the method may include:
s101: and presenting an electronic drawing board interface, wherein the electronic drawing board interface is used for displaying a target track stored in an acceleration layer, and the acceleration layer is used for storing the target track input by a user in the electronic drawing board interface within a preset time.
S102: responding to a first track input by a user on the electronic drawing board interface, storing the first track segment to the acceleration layer, wherein the first track stored to the acceleration layer comprises at least two segment tracks, and two adjacent segment tracks are connected through superposition of end points to form a superposition area.
S103: and controlling the acceleration layer to preprocess the stored first track and eliminate the end point of any one of the segmented tracks in each superposition area.
S104: refreshing the electronic drawing board interface to display the preprocessed first track on the electronic drawing board interface.
In some embodiments, the refreshing of the electronic drawing board interface to display the preprocessed first track further includes: refreshing the electronic drawing board interface according to the acceleration layer, so as to display the preprocessed first track stored in the acceleration layer on the electronic drawing board interface.
In some embodiments, the electronic drawing board interface is further used for displaying a target track stored in a display layer, the display layer being used for receiving and storing the target track sent by the acceleration layer. After the electronic drawing board interface is refreshed to display the preprocessed first track stored in the acceleration layer, the method further includes: controlling the acceleration layer to send a stored original track to the display layer, wherein the original track is the preprocessed first track; controlling the display layer to receive and store the original track; and refreshing the electronic drawing board interface according to the display layer, so as to display the original track stored in the display layer on the electronic drawing board interface.
In some embodiments, the method further comprises: emptying the original track stored in the acceleration layer, so that the acceleration layer can store a second track input by the user on the electronic drawing board interface, the input time of the second track being later than that of the first track.
In some embodiments, after the emptying of the original track stored in the acceleration layer, the method further includes: in response to the second track input by the user on the electronic drawing board interface, storing the second track into the acceleration layer; and after the original track stored in the display layer is displayed on the electronic drawing board interface, refreshing the electronic drawing board interface according to the display layer and the acceleration layer, so as to display, on the electronic drawing board interface, the original track stored in the display layer and the second track stored in the acceleration layer.
In some embodiments, after the electronic drawing board interface is refreshed according to the display layer and the acceleration layer so that it displays the original track and the second track stored in the acceleration layer, the method further includes: controlling the acceleration layer to send the stored second track to the display layer; controlling the display layer to receive and store the second track sent by the acceleration layer; controlling the display layer to synthesize the original track and the second track to obtain a composite track; and refreshing the electronic drawing board interface according to the composite track stored in the display layer, so as to display the composite track on the electronic drawing board interface.
In some embodiments, the display layer includes a first layer and a second layer, the original track being stored in the first layer and the second track in the second layer; the controlling of the display layer to synthesize the original track and the second track includes: controlling the second layer to be superimposed on the first layer, wherein the first layer and the second layer are both transparent layers.
In some embodiments, after the electronic drawing board interface is refreshed to display the composite track stored in the display layer, the method further includes: emptying the second track stored in the acceleration layer; and refreshing the electronic drawing board interface according to the display layer and the acceleration layer, so as to display only the composite track stored in the display layer on the electronic drawing board interface.
In a specific implementation, the present invention further provides a computer storage medium, which may store a program; when executed, the program may perform some or all of the steps of the embodiments of the display method provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a necessary general-purpose hardware platform. Based on this understanding, the technical solutions of the embodiments of the present invention may be embodied, in essence or in part, in the form of a software product stored in a storage medium such as a ROM/RAM, magnetic disk, or optical disk, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments or in some parts of the embodiments.
The same and similar parts among the embodiments in this specification may be referred to one another. In particular, the embodiment of the display device is substantially similar to the embodiment of the method, so its description is brief; for relevant points, refer to the description in the method embodiment.
The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention.

Claims (9)

1. A display device, comprising:
a display for displaying a user interface;
the touch control assembly is used for receiving an instruction input by a user through touch control, wherein the touch control assembly and the display form a touch control screen;
a controller configured to:
presenting an electronic drawing board interface, wherein the electronic drawing board interface is used for displaying a target track stored in an acceleration layer, and the acceleration layer is used for storing the target track input by a user in the electronic drawing board interface within a preset time;
responding to a first track input by a user on the electronic drawing board interface, and storing the first track segment to the acceleration layer, wherein the first track stored to the acceleration layer comprises at least two segment tracks, and two adjacent segment tracks are connected through superposition of end points to form a superposition area;
controlling the acceleration layer to preprocess the stored first track and eliminate the end point of any one segmented track in each superposition area;
refreshing the electronic drawing board interface to display the preprocessed first track on the electronic drawing board interface.
2. The display device of claim 1, wherein the refreshing the electronic palette interface to display the preprocessed first trace on the electronic palette interface is further configured to:
and refreshing the electronic drawing board interface according to the acceleration layer so as to display the preprocessed first track stored in the acceleration layer on the electronic drawing board interface.
3. The display device according to claim 1, wherein the electronic palette interface is further configured to display a target trajectory stored in a display layer, the display layer is configured to receive and store the target trajectory transmitted by the acceleration layer, and the electronic palette interface is refreshed to display the preprocessed first trajectory on the electronic palette interface, and further configured to:
controlling the acceleration layer to send a stored original track to the display layer, wherein the original track is the first track after preprocessing;
controlling the display layer to receive and store the original trajectory;
and refreshing the electronic drawing board interface according to the display layer so as to display the original track stored in the display layer on the electronic drawing board interface.
4. The display device of claim 3, further configured to:
emptying the original track stored in the acceleration layer, so that the acceleration layer stores a second track input by a user on the electronic drawing board interface, wherein the input time of the second track is later than that of the first track.
5. The display device according to claim 4, wherein the emptying of the original trajectory stored by the acceleration layer to cause the acceleration layer to store a second trajectory input by a user at the electronic palette interface is further configured to:
in response to the second track input by the user on the electronic palette interface, storing the second track to the acceleration layer;
after the original track stored in the display layer is displayed on the electronic drawing board interface, refreshing the electronic drawing board interface according to the display layer and the acceleration layer so as to display the original track stored in the display layer and the second track stored in the acceleration layer on the electronic drawing board interface.
6. The display device according to claim 5, wherein the electronic palette interface is refreshed according to the display layer and the acceleration layer, so that after the electronic palette interface displays the original trajectory and the second trajectory stored by the acceleration layer, the electronic palette interface is further configured to:
controlling the acceleration layer to send the stored second track to the display layer;
controlling the display layer to receive and store the second track sent by the acceleration layer;
controlling the display layer to synthesize the original track and the second track to obtain a synthesized track;
and refreshing the electronic drawing board interface according to the synthetic track stored in the display layer so as to display the synthetic track stored in the display layer on the electronic drawing board interface.
7. The display device according to claim 6, wherein the display layer includes a first layer and a second layer, the original track is stored in the first layer, the second track is stored in the second layer, and the controlling of the display layer to synthesize the original track and the second track is further configured to:
and controlling to superpose the second image layer on the first image layer, wherein the first image layer and the second image layer are both transparent image layers.
8. The display device of claim 6, wherein the refreshing the electronic palette interface to display the composite trajectory stored by the display layer on the electronic palette interface is further configured to:
emptying the second trajectory stored in the acceleration layer;
and refreshing the electronic drawing board interface according to the display layer and the acceleration layer so as to only display the synthetic track stored in the display layer on the electronic drawing board interface.
9. A display method, comprising:
presenting an electronic drawing board interface, wherein the electronic drawing board interface is used for displaying a target track stored in an acceleration layer, and the acceleration layer is used for storing the target track input by a user in the electronic drawing board interface within a preset time;
responding to a first track input by the user on the electronic drawing board interface, and storing the first track to the acceleration layer, wherein the first track stored to the acceleration layer comprises at least two segment tracks, and adjacent segment tracks are joined by overlapping end points, forming an overlap region;
controlling the acceleration layer to preprocess the stored first track, removing, in each overlap region, the end point of one of the two overlapping segment tracks;
refreshing the electronic drawing board interface to display the preprocessed first track on the electronic drawing board interface.
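The preprocessing step of claim 9 amounts to removing the duplicated join point where adjacent segment tracks overlap. The sketch below is an illustrative assumption, not the patented code; the function name `preprocess` and the point-list representation are invented for the example.

```python
def preprocess(segments):
    """Concatenate segment tracks into one track, dropping the duplicated
    endpoint wherever a segment's first point equals the previous
    segment's last point (the overlap region of claim 9)."""
    if not segments:
        return []
    merged = list(segments[0])
    for seg in segments[1:]:
        if seg and merged and seg[0] == merged[-1]:
            merged.extend(seg[1:])   # skip the overlapping endpoint
        else:
            merged.extend(seg)       # no overlap: keep every point
    return merged


# A first track arriving as two segment tracks joined at (2, 2):
first_track = [[(0, 0), (1, 1), (2, 2)],
               [(2, 2), (3, 1), (4, 0)]]
print(preprocess(first_track))
# -> [(0, 0), (1, 1), (2, 2), (3, 1), (4, 0)]
```

Without this step the joined point would be stored (and drawn) twice, which is visible with semi-transparent brushes where overlapping points double the opacity.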
CN202210101365.8A 2022-01-27 2022-01-27 Display device and display method Active CN114442849B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210101365.8A CN114442849B (en) 2022-01-27 2022-01-27 Display device and display method

Publications (2)

Publication Number Publication Date
CN114442849A true CN114442849A (en) 2022-05-06
CN114442849B CN114442849B (en) 2024-05-17

Family

ID=81369234

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210101365.8A Active CN114442849B (en) 2022-01-27 2022-01-27 Display equipment and display method

Country Status (1)

Country Link
CN (1) CN114442849B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105096365A (en) * 2014-05-16 2015-11-25 TCL Group Co., Ltd. 3D interface local refreshing method and system
CN112672199A (en) * 2020-12-22 2021-04-16 Hisense Visual Technology Co., Ltd. Display device and multi-layer stacking method
CN112799627A (en) * 2021-02-08 2021-05-14 Hisense Visual Technology Co., Ltd. Display apparatus and image display method
CN113810746A (en) * 2021-09-14 2021-12-17 Hisense Visual Technology Co., Ltd. Display device and picture sharing method

Also Published As

Publication number Publication date
CN114442849B (en) 2024-05-17

Similar Documents

Publication Publication Date Title
CN114501107A (en) Display device and coloring method
CN112799627B (en) Display apparatus and image display method
CN113810746B (en) Display device and picture sharing method
CN112672199B (en) Display device and multi-layer overlapping method
CN112698905B (en) Screen saver display method, display device, terminal device and server
CN114501108A (en) Display device and split-screen display method
CN115129214A (en) Display device and color filling method
CN114115637A (en) Display device and electronic drawing board optimization method
CN114157889B (en) Display device and touch control assisted interaction method
CN113225602A (en) Display device and control method for loading and displaying television homepage content
CN112947800A (en) Display device and touch point identification method
CN112926420B (en) Display device and menu character recognition method
CN112650418B (en) Display device
CN112947783B (en) Display device
CN115562544A (en) Display device and revocation method
CN114442849B (en) Display device and display method
CN113485613A (en) Display device and method for realizing free-drawing screen edge painting
CN113485614A (en) Display apparatus and color setting method
CN112732120A (en) Display device
CN114296623A (en) Display device
CN114281284B (en) Display apparatus and image display method
CN113721817A (en) Display device and editing method of filling graph
CN115550716A (en) Display device and color mixing display method
CN115550717A (en) Display device and multi-finger touch display method
CN115268697A (en) Display device and line drawing rendering method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant