CN115562544A - Display device and revocation method

Display device and revocation method

Info

Publication number
CN115562544A
Authority
CN
China
Prior art keywords
record list
user
event
historical
operation record
Legal status
Pending
Application number
CN202210106451.8A
Other languages
Chinese (zh)
Inventor
董率
张振宝
王之奎
肖媛
李乃金
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Application filed by Hisense Visual Technology Co., Ltd.
Publication of CN115562544A

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 — Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0481 — Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 — Interaction with lists of selectable items, e.g. menus

Abstract

When a user inputs a cancel instruction for a historical operation event, options related to a first operation record list are presented in a control area, where the first operation record list contains at least one historical operation event input by the user on an electronic drawing board interface. In response to the user selecting a first historical operation event, the first operation record list is updated so that the first historical operation event is deleted from it, and the electronic drawing board interface is refreshed so that the operation events in the updated first operation record list are displayed on the interface. With the display device and revocation method, any previously executed operation step can be undone directly, improving the user experience.

Description

Display device and revocation method
This application claims priority to Chinese patent application No. 202110739293.5, entitled "A display device and revocation method", filed with the Chinese Patent Office on June 30, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The application relates to the technical field of smart television drawing boards, in particular to a display device and a revocation method.
Background
A display device is a terminal device capable of outputting a specific display picture, such as a smart television, a mobile terminal, a smart advertising screen, or a projector. Taking the smart television as an example, it is based on Internet application technology, has an open operating system and chip as well as an open application platform, supports bidirectional human-machine interaction, and integrates audio and video, entertainment, education, data, and other functions, making it a television product that meets the diversified and personalized needs of users. For example, drawing board application software can be installed on the smart television so that users can draw.
In the prior art, when drawing board application software recognizes one "pen down - move - pen up" sequence of the brush as one operation, only the most recent operation can be undone when an undo is executed; similarly, only the most recent operation can be restored when the user executes a restore. This is unfavorable for the user experience.
Disclosure of Invention
The application provides a display device and a revocation method to solve the problem that, when undoing operations, a user can only start from the most recent operation, thereby improving the user experience.
In a first aspect, the present application provides a display device comprising:
a display configured to display an electronic palette interface, the electronic palette interface including a control area and a drawing area, the control area including at least one control for performing an input operation on the drawing area, the drawing area being used to present input content;
the memory is configured to store a first operation record list, and the first operation record list is used for storing at least one historical operation event input by a user on the electronic drawing board interface;
a controller configured to:
in response to a cancel instruction, input by a user, for the historical operation event, presenting options related to the first operation record list in the control area;
in response to the user's selection of a first historical operation event, updating the first operation record list so as to delete the first historical operation event from the first operation record list;
and refreshing the electronic drawing board interface so as to display the operation events in the updated first operation record list on the electronic drawing board interface.
In a second aspect, the present application provides a revocation method, applied to a display device, including:
presenting options related to a first operation record list in a control area in response to a cancel instruction, input by a user, for a historical operation event;
updating the first operation record list in response to the user's selection of a first historical operation event, so as to delete the first historical operation event from the first operation record list;
and refreshing the electronic drawing board interface so as to display the operation events in the updated first operation record list on the electronic drawing board interface.
According to the above technical solution, when a user inputs a cancel instruction for a historical operation event, options related to a first operation record list are presented in the control area, where the first operation record list contains at least one historical operation event input by the user on the electronic drawing board interface. In response to the user selecting a first historical operation event, the first operation record list is updated so that the first historical operation event is deleted from it, and the electronic drawing board interface is refreshed so that the operation events in the updated first operation record list are displayed on the interface. With the display device and revocation method, any previously executed operation step can be undone directly, improving the user experience.
Drawings
In order to explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below. It is obvious that, for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a usage scenario of a display device in some embodiments;
fig. 2 is a block diagram of a hardware configuration of a display device in some embodiments;
FIG. 3 is a diagram of software configuration in a display device in some embodiments;
FIG. 4 is a diagram of a user interface displayed by a display device in some embodiments;
FIG. 5 is a schematic diagram of an application list in some embodiments;
FIG. 6 is a schematic diagram of an electronic drawing board interface in some embodiments;
FIG. 7 is a diagram of a brush toolbar in some embodiments;
FIG. 8 is a schematic diagram of an operation chain structure in some embodiments;
FIG. 9 is a diagram of an undo page in some embodiments;
FIG. 10 is a diagram of a confirmation undo page in some embodiments;
FIG. 11 is a diagram of a user interface after undo confirmation in some embodiments;
FIG. 12 is a diagram of a recovery page in some embodiments;
FIG. 13 is a diagram of a confirmation recovery page in some embodiments;
FIG. 14 is a diagram of a user interface after recovery confirmation in some embodiments;
FIG. 15 is a flowchart of a revocation method in some embodiments.
Detailed Description
Reference will now be made in detail to the embodiments, examples of which are illustrated in the accompanying drawings. In the following description, the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described below do not represent all embodiments consistent with the present application; they are merely examples of systems and methods consistent with certain aspects of the application, as detailed in the claims.
To make the purpose and embodiments of the present application clearer, exemplary embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
It should be noted that the brief descriptions of the terms in the present application are only for convenience of understanding of the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the foregoing drawings are used for distinguishing between similar or analogous objects or entities and are not necessarily intended to limit the order or sequence in which they are presented unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in fig. 1, the display apparatus 200 is also in data communication with a server 400, and a user may operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device 200 includes at least one of infrared protocol communication, Bluetooth protocol communication, and other short-distance communication methods, and the display device 200 is controlled wirelessly or by wire. The user may control the display device 200 by inputting user instructions through keys on the remote controller, voice input, control panel input, and the like.
In some embodiments, the smart device 300 may also be used to control the display device 200. For example, the display device 200 is controlled by an application running on the smart device.
In some embodiments, the smart device 300 and the display device 200 may also be used for communication of data.
In some embodiments, the display device 200 may also be controlled in a manner other than through the control apparatus 100 and the smart device 300. For example, a voice instruction from the user may be received directly by a module configured inside the display device 200 for obtaining voice instructions, or by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with the server 400. The display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers.
Fig. 2 illustrates a hardware configuration block diagram of a display device according to an exemplary embodiment.
In some embodiments, the display apparatus 200 includes at least one of a tuner 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments, the controller comprises a central processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, and first to nth interfaces for input/output.
In some embodiments, the display 260 includes a display screen component for presenting pictures and a driving component for driving image display, and is used for receiving image signals output from the controller and displaying video content, image content, menu manipulation interface components, a user manipulation UI interface, and the like.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of a control signal and a data signal with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. Or may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the controller 250 controls the operation of the display device 200 and responds to user operations through various software control programs stored in a memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of the selectable objects, such as a hyperlink, an icon, or another actionable control. The operation related to the selected object is, for example, displaying the page, document, or image linked to by a hyperlink, or running the program corresponding to an icon.
In some embodiments, the controller includes at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a Random Access Memory (RAM), a Read-Only Memory (ROM), first to nth interfaces for input/output, a communication bus (Bus), and the like.
The CPU processor is used for executing the operating system and application instructions stored in the memory, and for executing various applications, data, and contents according to interaction instructions received from the outside, so as to finally display and play various audio and video contents. The CPU processor may include a plurality of processors, for example a main processor and one or more sub-processors.
In some embodiments, the graphics processor is used for generating various graphics objects, such as at least one of icons, operation menus, and graphics displayed for user input instructions. The graphics processor includes an arithmetic unit, which performs operations on the interaction instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the objects obtained from the arithmetic unit; the rendered objects are then displayed on the display.
In some embodiments, the user may input a user command on a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between a camera application or operating system and a user that enables conversion between an internal form of information and a user-acceptable form. A common presentation form of a User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include at least one of an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc. visual interface elements.
In some embodiments, user interface 280 is an interface that may be used to receive control inputs (e.g., physical buttons on the body of display device 200, or the like).
In some embodiments, the system of the display device 200 may include a kernel (Kernel), a command parser (shell), a file system, and applications. The kernel, shell, and file system together form the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals, and inter-process communication (IPC) are operated and maintained. After the kernel is started, the shell and user applications are loaded. An application is compiled into machine code after being started, forming a process.
Referring to fig. 3, in some embodiments, the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (abbreviated as "application layer"), an Application Framework layer (abbreviated as "framework layer"), an Android runtime and system library layer (abbreviated as "system runtime library layer"), and a kernel layer.
In some embodiments, at least one application runs in the application layer. These applications may be a Window program of the operating system, a system setting program, a clock program, or the like, or may be applications developed by third-party developers. In particular, the application packages in the application layer are not limited to the above examples.
The framework layer provides an Application Programming Interface (API) and a programming framework for the applications of the application layer. The application framework layer includes some predefined functions and acts as a processing center that decides what the applications in the application layer should do. Through the API interface, an application can access the resources in the system and obtain the services of the system during execution.
As shown in fig. 3, in the embodiment of the present application, the application framework layer includes Managers, a Content Provider, and the like, where the managers include at least one of the following modules: an Activity Manager, used for interacting with all activities running in the system; a Location Manager, used for providing system services or applications with access to the system location service; a Package Manager, used for retrieving various information related to the application packages currently installed on the device; a Notification Manager, used for controlling the display and clearing of notification messages; and a Window Manager, used to manage the icons, windows, toolbars, wallpapers, and desktop components on the user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the various applications and the usual navigation back functions, such as controlling the exit, opening, and back operations of applications. The window manager is used to manage all window programs, for example obtaining the size of the display screen, judging whether there is a status bar, locking the screen, capturing the screen, and controlling changes of the display window (for example, shrinking the display window, or displaying it with shaking or distortion).
In some embodiments, the system runtime library layer provides support for the upper layer, i.e., the framework layer. When the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions to be implemented by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 3, the kernel layer comprises at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a WiFi driver, a USB driver, an HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, and pressure sensor drivers), a power driver, and the like.
In some embodiments, the display device 200 may have a touch interaction function. A user can operate the device simply by touching the display lightly with a finger, thus dispensing with keyboard, mouse, and remote-controller operations and making human-computer interaction more straightforward. The display device 200 can support the touch interaction function by adding a touch component; in general, the touch component and the display 260 together constitute a touch screen. A user can input different control instructions on the touch screen through touch operations, for example click, slide, long-press, and double-click touch commands, and different touch commands may represent different control functions.
To distinguish the different touch actions, the touch component generates different electrical signals when the user inputs different touch actions and transmits the generated electrical signals to the controller 250. The controller 250 may perform feature extraction on the received electrical signals to determine, based on the extracted features, the control function the user intends to perform.
For example, when a user inputs a click touch action at any program icon position in the application program interface, the touch component senses the touch action and generates an electrical signal. After receiving the electrical signal, the controller 250 may first determine a duration of a level corresponding to a touch action in the electrical signal, and when the duration is less than a preset time threshold, recognize that a click touch instruction is input by the user. The controller 250 extracts the position characteristics generated by the electrical signals to determine the touch position. And when the touch position is within the display range of the application icon, determining that the user inputs a click touch instruction at the position of the application icon. Accordingly, the click touch command is used to execute a function of running a corresponding application program in the current scene, so that the controller 250 may start running the corresponding application program.
For another example, when the user inputs a sliding motion in the media asset presentation page, the touch component also sends the sensed electrical signal to the controller 250. The controller 250 first determines the duration of the signal corresponding to the touch action in the electrical signal. When the determined duration is longer than the preset time threshold, the position change condition generated by the signal is judged, and obviously, for the interactive touch action, the generation position of the signal changes, so that the sliding touch instruction input by the user is determined. The controller 250 determines the sliding direction of the sliding touch instruction according to the change condition of the position of the signal generation, and controls to turn pages of the display frame in the media asset display page so as to display more media asset options. Further, the controller 250 may also extract features such as a sliding speed and a sliding distance of the sliding touch instruction, and perform image control of page turning according to the extracted features, so as to achieve a hand-following effect.
Similarly, for the touch instruction such as double click, long press, etc., the controller 250 may execute the corresponding control function according to the preset interaction rule by extracting different features and determining the type of the touch instruction through feature judgment. In some embodiments, the touch component also supports multi-touch, such that a user can input touch actions on the touch screen through multiple fingers, e.g., multi-finger clicks, multi-finger long presses, multi-finger swipes, and the like.
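As an illustrative sketch only (not code disclosed in this application), the click/slide classification described above can be modeled as follows; the class, the duration threshold, and the movement threshold are assumptions introduced for illustration.

```java
// Illustrative sketch: a press shorter than a threshold with little movement is treated
// as a click, while a press with noticeable position change is treated as a slide.
// The thresholds are assumed values, not figures disclosed in the application.
public final class TouchClassifier {

    public enum Gesture { CLICK, SLIDE, LONG_PRESS }

    private static final long CLICK_MAX_DURATION_MS = 200;   // assumed preset time threshold
    private static final float SLIDE_MIN_DISTANCE_PX = 24f;  // assumed movement threshold

    private float downX, downY;
    private long downTimeMs;

    public void onDown(float x, float y, long timeMs) {
        downX = x;
        downY = y;
        downTimeMs = timeMs;
    }

    public Gesture onUp(float x, float y, long timeMs) {
        long duration = timeMs - downTimeMs;
        float distance = (float) Math.hypot(x - downX, y - downY);
        if (duration < CLICK_MAX_DURATION_MS && distance < SLIDE_MIN_DISTANCE_PX) {
            return Gesture.CLICK;        // short press with little movement
        }
        if (distance >= SLIDE_MIN_DISTANCE_PX) {
            return Gesture.SLIDE;        // position changed: drawing or page turning
        }
        return Gesture.LONG_PRESS;       // long press without movement
    }
}
```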
The touch control action can be matched with a specific application program to realize a specific function. For example, after the user opens the drawing board application, the display 260 may present a drawing area, the user may draw a specific touch action track in the drawing area through a sliding touch instruction, and the controller 250 determines a touch action pattern through a touch action detected by the touch component and controls the display 260 to display in real time to satisfy the demonstration effect.
In some embodiments, multiple applications may be installed in the display device 200, and fig. 4 shows a user interface displayed by the display in some embodiments, and a user may click "my applications" in the user interface to trigger the display of the application list. All applications installed by the display apparatus 200 are included in the application list.
FIG. 5 illustrates a schematic diagram of an application list in some embodiments. As shown in FIG. 5, the display device 200 may have a drawing board application, a player application, a video chat application, a camera application, and a mirror application installed. When the application list is displayed, a user may select one of the applications and open it to trigger the display of its interface. For example, the user may select and open the drawing board application, triggering the display to show the interface corresponding to the drawing board application, and then perform drawing operations in that interface so that corresponding content is displayed in it.
In some embodiments, when a user launches the drawing board application, the controller may control the display to show the electronic drawing board. Interactive areas corresponding to one or more functions of the drawing board application are displayed on the electronic drawing board interface, and an interactive area can display text, images, icons, buttons, pull-down menus, check boxes, selectable lists, and the like. The user can touch the screen at the position where interaction is needed so as to interact with the interaction area. The display device 200 detects the contact and responds to it by performing the corresponding operation.
In some embodiments, FIG. 6 shows an exemplary electronic drawing board interface. As shown in FIG. 6, the interface includes a drawing area and a function area, where the drawing area is the area into which content can be input. The function area may display entries corresponding to one or more functions of the drawing board application, including a brush item, an undo item, a restore item, and the like. A user may click each item to trigger the display device 200 to show the corresponding item interface, where the parameters of some items may be set. For example, a user may click the brush item to trigger the display of the brush toolbar shown in FIG. 7, and, based on the brush toolbar, select items such as brush type, brush color, line type, and line thickness. After the selection is completed, the brush tool is in a picked-up state, and the user may control the brush tool to perform corresponding operations in the drawing area.
After the user selects the parameters of the brush tool, the brush tool can be used to input content in the drawing area. Each "pen down - move - pen up" or "pen down - pen up" sequence of the user is regarded as one operation event: when the user controls the brush tool through a "pen down - move - pen up" sequence, the content input in the drawing area is a touch trajectory; when the user controls the brush tool through a "pen down - pen up" sequence, the content input in the drawing area is a touch point.
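As an illustrative sketch with hypothetical field names, one such operation event could be represented as follows: a "pen down - move - pen up" sequence fills a trajectory of sampled points, while a "pen down - pen up" sequence leaves a single touch point.

```java
// Sketch of one operation event; the fields mirror the information the description later
// attributes to a node (brush type, color, width, input shape), but the names are assumptions.
import java.util.ArrayList;
import java.util.List;

public final class OperationEvent {
    public final String brushType;      // e.g. "pencil", "marker", "eraser"
    public final int brushColor;        // ARGB color value
    public final float brushWidth;      // line thickness in pixels
    public final List<float[]> points = new ArrayList<>(); // sampled {x, y} coordinates

    public OperationEvent(String brushType, int brushColor, float brushWidth) {
        this.brushType = brushType;
        this.brushColor = brushColor;
        this.brushWidth = brushWidth;
    }

    public void addPoint(float x, float y) {
        points.add(new float[]{x, y});
    }

    /** A single pen-down/pen-up with no movement leaves just a touch point. */
    public boolean isTouchPoint() {
        return points.size() == 1;
    }
}
```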
In the related art, when the user controls the brush tool to input content in the drawing area, each operation event of the user is stored in the memory, but when an undo operation is performed, the controller only reads the most recent operation event each time. For example, if the user has performed N operations, undo can only start from the Nth operation. If the user wants to undo a particular one of the previous operation events, for example the Mth operation, where M is between 1 and N, all operation events from N back to M must be undone in sequence.
In order to solve the above technical problem and improve the user experience, the display device 200 provided in the present application undoes only the selected operation event when the user performs an undo operation, while all other operation events are retained; moreover, the undo is recoverable. This makes operation convenient for the user and improves the user experience.
In some embodiments, the user may transmit an "undo"/"restore" instruction to the display apparatus 200 by operating a designated key of the remote controller. Illustratively, the correspondence between the "undo"/"restore" command and the remote control key is pre-bound. For example, a "cancel"/"restore" key is provided on the remote controller, when the user touches the key, the remote controller sends a "cancel"/"restore" command to the controller, and the controller executes a "cancel"/"restore" operation after receiving the "cancel"/"restore" command.
In some embodiments, the correspondence between the "undo"/"resume" command and the plurality of remote control keys may also be pre-bound, and when the user touches the plurality of keys bound to the "undo"/"resume" command, the remote control issues the "undo"/"resume" command. In some embodiments, the keys bound by the "cancel"/"restore" command are sequentially direction keys (left, down, left, down), that is, the remote controller sends the "cancel"/"restore" command to the controller only when the user continuously touches the keys (left, down, left, down) within a preset time. By adopting the binding method, the situation that 'revocation'/'recovery' is sent out due to misoperation of a user can be avoided. The embodiments of the present application are merely exemplary to provide several binding relationships between the "cancel"/"restore" instruction and the key, and the binding relationships between the "cancel"/"restore" instruction and the key may be set according to habits of a user, which is not limited herein.
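The following is a minimal sketch of such a multi-key binding, assuming illustrative key-code values and a 2-second window, neither of which is specified by the application.

```java
// Sketch: the undo/restore command is issued only when the bound key sequence
// (here LEFT, DOWN, LEFT, DOWN) is entered within a preset time window.
import java.util.ArrayDeque;
import java.util.Deque;

public final class KeySequenceDetector {
    public static final int KEY_LEFT = 21;   // illustrative key codes (assumed)
    public static final int KEY_DOWN = 20;

    private static final int[] BOUND_SEQUENCE = {KEY_LEFT, KEY_DOWN, KEY_LEFT, KEY_DOWN};
    private static final long WINDOW_MS = 2000; // assumed preset time for the whole sequence

    private final Deque<long[]> recent = new ArrayDeque<>(); // {keyCode, timestampMs}

    /** Returns true when the bound sequence has just been completed inside the window. */
    public boolean onKey(int keyCode, long timeMs) {
        recent.addLast(new long[]{keyCode, timeMs});
        while (recent.size() > BOUND_SEQUENCE.length) {
            recent.removeFirst();
        }
        if (recent.size() < BOUND_SEQUENCE.length) {
            return false;
        }
        long firstTime = recent.peekFirst()[1];
        if (timeMs - firstTime > WINDOW_MS) {
            return false;                    // sequence took too long
        }
        int i = 0;
        for (long[] entry : recent) {
            if (entry[0] != BOUND_SEQUENCE[i++]) {
                return false;                // order does not match the binding
            }
        }
        recent.clear();                      // consume the sequence, send "undo"/"restore"
        return true;
    }
}
```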
In some embodiments, the user may send a "undo"/"resume" instruction to the display device 200 by means of voice input using a sound collector, e.g., a microphone, of the display device 200 to control the display device 200 to enter into performing an "undo"/"resume" operation. The display device 200 may be provided with an intelligent voice system, and the intelligent voice system may recognize the voice of the user to extract the instruction content input by the user. The user can input a preset wake-up word through the microphone, so that the intelligent voice system is started, and the controller can respond to the instruction input by the user. For example, the user may enter "something classmate" to activate the intelligent speech system. And then, the 'undo'/'restore' is input, so that an 'undo'/'restore' instruction is sent to the display device 200.
In some embodiments, the user may also send an "undo"/"resume" instruction to the display device 200 through a preset gesture. The display apparatus 200 may detect the behavior of the user through an image collector, such as a camera. When the user makes a preset gesture, it may be considered that the user transmits an "undo"/"resume" instruction to the display apparatus 200. For example, it may be set as: when the user is detected to draw a V word, it is determined that the user has input a "cancel"/"restore" instruction to the display apparatus 200. The user may also send an "undo"/"resume" instruction to the display device 200 through a preset action. For example, it can be set as: when it is detected that the user lifts the left foot and the right hand at the same time, it is determined that the user has input an "undo"/"resume" instruction to the display device 200.
In some embodiments, when the user controls the display device 200 using a smart device, for example, using a cell phone, an "undo"/"resume" instruction may also be sent to the display device 200. For example, a control may be set in the mobile phone, and the control selects whether to execute the operation of "undo"/"restore", and when the user selects to execute the operation of "undo"/"restore", a "undo"/"restore" instruction is sent to the controller, and after the controller receives the "undo"/"restore" instruction, the operation of "undo"/"restore" is executed.
In some embodiments, when the user controls the display device 200 using the mobile phone, a continuous click command may be issued to the mobile phone. The continuous click command refers to: in a preset period, the number of times that a user clicks the same area of the mobile phone touch screen exceeds a preset threshold value. For example: when the user clicks a certain area of the mobile phone touch screen for 3 times in 1s, the user is regarded as a continuous clicking instruction. After receiving the continuous click command, the mobile phone may send a "cancel"/"restore" command to the display device 200, so that the controller performs a "cancel"/"restore" operation.
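A minimal sketch of this continuous-click rule is given below; the 3-click count and 1 s window follow the example above, while the "same area" radius is an assumption.

```java
// Sketch: count clicks that land in roughly the same area inside a sliding window;
// when the count reaches the threshold, a continuous-click instruction is recognized.
public final class ContinuousClickDetector {
    private static final int REQUIRED_CLICKS = 3;
    private static final long WINDOW_MS = 1000;
    private static final float SAME_AREA_RADIUS_PX = 50f; // assumed "same area" tolerance

    private int count;
    private long firstClickTimeMs;
    private float anchorX, anchorY;

    /** Returns true when this click completes a continuous-click instruction. */
    public boolean onClick(float x, float y, long timeMs) {
        boolean sameArea = Math.hypot(x - anchorX, y - anchorY) <= SAME_AREA_RADIUS_PX;
        boolean inWindow = count > 0 && (timeMs - firstClickTimeMs) <= WINDOW_MS;
        if (count == 0 || !sameArea || !inWindow) {
            count = 1;                  // start a new potential sequence at this click
            firstClickTimeMs = timeMs;
            anchorX = x;
            anchorY = y;
            return false;
        }
        count++;
        if (count >= REQUIRED_CLICKS) {
            count = 0;                  // sequence complete: send "undo"/"restore"
            return true;
        }
        return false;
    }
}
```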
In some embodiments, when the user uses the mobile phone to control the display device 200, the following may be set: when detecting that a touch pressure value of a certain area of the touch screen of the mobile phone by a user exceeds a preset pressure threshold, the mobile phone may send a "cancel"/"restore" instruction to the display device 200.
It should be understood that the aforementioned "revocation"/"restoration" is only an exemplary function name/mode name defined for convenience of description, which represents a certain function possessed by the display device 200, and does not limit the scope of the present disclosure.
In some embodiments, in response to an operation event input by a user at the electronic palette interface, the controller may store the operation event input by the user in the memory. The operation event input by the user on the electronic drawing board interface refers to an event that the content in the electronic drawing board interface is changed by the user through operation, for example, the user picks up a painting tool and leaves a touch track on the electronic drawing board interface, which can be regarded as an operation event; the user picks up the eraser tool, and partial content displayed in the electronic drawing board interface is erased to be regarded as an operation event; and clicking the undo item by the user to undo the content displayed on the electronic drawing board interface by a certain input operation of the user can also be regarded as an operation event.
In some embodiments, in response to an operation of opening the electronic drawing board application by a user, the display is controlled to display the electronic drawing board interface, and a first operation record list is generated in the memory, wherein the first operation record list is used for storing operation events input by the user on the electronic drawing board interface.
In some embodiments, in response to a target operation event input by a user on the electronic palette interface, the controller may control to store the target operation event in the first operation record list.
In some embodiments, the first operation record list may be an operation chain, the operation chain includes several nodes, and each operation event of the user may be stored at one node of the operation chain, for example, after the user completes operation event a, operation event a is stored at the first node of the operation chain, and then the user completes operation event B, operation event B is stored at the second node of the operation chain, and so on, after the user completes operation event, the operation event is sequentially stored at the nodes of the operation chain.
More specifically, if the user controls the brush tool to sequentially input graph A, graph B, and graph C in the drawing area, graph A may be stored at the first node of the operation chain, graph B at the second node, and graph C at the third node. FIG. 8 is a structure diagram of the operation chain; the information stored at each node may include the brush type information, brush color information, brush width information, input shape information, and the like, input by the user.
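As an illustrative sketch, the operation chain can be modeled as an ordered list of nodes, each holding one operation event; this reuses the hypothetical OperationEvent class sketched earlier and is not the application's actual data structure.

```java
// Sketch of the operation chain of FIG. 8: each completed operation event is appended to
// the next node of the chain in order (A, then B, then C).
import java.util.ArrayList;
import java.util.List;

public final class OperationChain {
    private final List<OperationEvent> nodes = new ArrayList<>();

    /** Called when the user finishes an operation event (pen up). */
    public void append(OperationEvent event) {
        nodes.add(event);                    // event A -> node 1, event B -> node 2, ...
    }

    public OperationEvent get(int index) {
        return nodes.get(index);
    }

    public OperationEvent removeAt(int index) {
        return nodes.remove(index);          // used when a selected event is undone
    }

    public int size() {
        return nodes.size();
    }

    public List<OperationEvent> snapshot() {
        return new ArrayList<>(nodes);       // copy used when redrawing the drawing layer
    }
}
```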
In some embodiments, the input shape information of the user's current operation may be obtained from the coordinates of the touch trajectory that the user inputs on the electronic drawing board interface while controlling the brush tool. For example, in response to a pen-down operation input by the user controlling the brush tool on the electronic drawing board interface, the pen-down point coordinate corresponding to the pen-down operation is obtained, and the coordinate of the current position is then stored once every preset time period, until the user performs a pen-up operation on the electronic drawing board interface and the pen-up point coordinate corresponding to the pen-up operation is obtained, completing the storage of the input shape information. For example, with the preset time period set to 60 ms, timing starts when the user performs the pen-down operation and the pen-down point coordinate is obtained; the coordinates of the user's current position are then stored at multiples of 60 ms, such as 60 ms, 120 ms, and 180 ms, so as to obtain the input shape information of the user's current operation.
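A minimal sketch of this sampling scheme, using the 60 ms period from the example above and the hypothetical OperationEvent class sketched earlier:

```java
// Sketch: record the pen-down point, then store the current coordinate every 60 ms
// until pen-up; the finished event holds the input shape information.
public final class ShapeSampler {
    private static final long SAMPLE_PERIOD_MS = 60;

    private OperationEvent current;
    private long lastSampleTimeMs;

    public void onPenDown(OperationEvent event, float x, float y, long timeMs) {
        current = event;
        current.addPoint(x, y);          // pen-down point coordinate
        lastSampleTimeMs = timeMs;
    }

    public void onPenMove(float x, float y, long timeMs) {
        if (current != null && timeMs - lastSampleTimeMs >= SAMPLE_PERIOD_MS) {
            current.addPoint(x, y);      // coordinates at 60 ms, 120 ms, 180 ms, ...
            lastSampleTimeMs = timeMs;
        }
    }

    public OperationEvent onPenUp(float x, float y, long timeMs) {
        if (current != null) {
            current.addPoint(x, y);      // pen-up point coordinate completes the shape
        }
        OperationEvent finished = current;
        current = null;
        return finished;
    }
}
```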
In some embodiments, the user clicks on the undo item to input an undo instruction for the historical operation events to the controller, and the controller, after receiving the undo instruction input by the user, obtains a first operation record list from the memory, where the first operation record list includes at least one historical operation event input by the user, and triggers the display device 200 to display an undo interface, where the undo interface includes the first operation record list, and the first operation record list includes all the historical operation events input by the user.
In some embodiments, the drawing area is drawn in a part or all of the area of the first layer, and the controller generates a second layer after receiving a cancel instruction input by a user, where the second layer is a layer used for drawing the first operation record list, and the first operation record list is drawn in a part or all of the area of the second layer.
In some embodiments, the second layer is arranged on the upper layer of the first layer in a floating manner, or the second layer and the first layer are arranged on the electronic drawing board interface in parallel, so that the first operation record list is displayed on the electronic drawing board interface.
In some embodiments, referring to fig. 9, the first operation record list may be a thumbnail list displayed in picture form. The thumbnail list shows, as thumbnails, all the operation events input by the user, where each entry corresponds to one operation event, i.e., one modification relative to the previous operation. For example, when the historical operation events input by the user are lines or graphs, each historical operation event is enclosed in a frame in its thumbnail to help the user identify the operation event corresponding to that thumbnail. Alternatively, when the historical operation event input by the user is adding a filter or filling a color, the thumbnail corresponding to that event displays the picture to which the filter was added or the color was filled. The thumbnail list is arranged along a time axis, with a scroll control on each side of the time axis; clicking a scroll control scrolls the time axis so that the complete thumbnail list can be browsed.
In some embodiments, when the user wants to undo the operation event of a certain step, the corresponding operation event options in the thumbnail list may be selected singly or in multiples. The thumbnail list is further provided with a confirm option; after selecting the operation event options in the thumbnail list, the user clicks the confirm option, triggering the display device 200 to display a confirmation undo page, see fig. 10, which includes a "confirm" option entry and a "cancel" option entry. When the user selects the "confirm" option entry, the controller sends an instruction to delete the selected operation events to the memory; after receiving the instruction, the memory updates the first operation record list in response to the operation events selected by the user, so as to delete the selected operation events from the first operation record list. The controller then redraws the first layer according to the operation events in the updated first operation record list, so that the display device 200 displays the content corresponding to the operation events in the updated first operation record list as shown in fig. 11, completing the undo of the selected operation events. When the user selects the "cancel" option entry, the controller closes the undo page.
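As an illustrative sketch of this undo flow, the selected events are deleted from the first operation record list and the first layer is redrawn from what remains; the layer redraw is abstracted behind a hypothetical interface because the drawing code itself is not described in the application.

```java
// Sketch: undo the events the user selected in the thumbnail list, then refresh the
// electronic drawing board by redrawing from the updated first operation record list.
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public final class UndoController {
    public interface LayerRenderer {
        void redraw(List<OperationEvent> remainingEvents);  // redraws the first layer
    }

    private final OperationChain firstRecordList;
    private final LayerRenderer renderer;

    public UndoController(OperationChain firstRecordList, LayerRenderer renderer) {
        this.firstRecordList = firstRecordList;
        this.renderer = renderer;
    }

    /** Called after the user clicks "confirm" on the confirmation undo page. */
    public void undoSelected(List<Integer> selectedIndices) {
        List<Integer> sorted = new ArrayList<>(selectedIndices);
        sorted.sort(Collections.reverseOrder());   // delete from the end so indices stay valid
        for (int index : sorted) {
            firstRecordList.removeAt(index);
        }
        renderer.redraw(firstRecordList.snapshot()); // refresh the drawing board interface
    }
}
```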
In some embodiments, in response to the user's selection of one or more first historical operation events in the first operation record list on the undo page, the controller controls the memory to store all the historical operation events selected by the user in a second operation record list, which is stored in the memory and used for storing the operation events deleted from the first operation record list.
Alternatively, in response to the user selecting one or more first historical operation events in the first operation record list on the undo page, the controller may determine whether a second operation record list is stored in the memory, where the second operation record list is generated when a second historical operation event is deleted from the first operation record list, and the second historical operation event was deleted from the first operation record list earlier than the first historical operation event. If the second operation record list is stored in the memory, the first historical operation event is stored in the second operation record list; if the second operation record list is not stored in the memory, it is generated from the first historical operation event.
In some embodiments, the second operation record list may be a recovery chain including a plurality of nodes, and the selected events deleted by the user from the first operation record list are sequentially stored at the nodes of the recovery chain according to the deletion time, for example, the user performs a revocation operation on the operation event a, the operation event B and the operation event C sequentially, and the operation event a, the operation event B and the operation event C are sequentially deleted from the first operation record list/operation chain and sequentially stored at the nodes of the recovery chain, that is, the operation event a is stored at the first node of the recovery chain, the operation event B is stored at the second node of the recovery chain, and the operation event C is stored at the third node of the recovery chain.
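A minimal sketch of maintaining such a recovery chain, including generating the second operation record list the first time an event is undone; the class name is an assumption.

```java
// Sketch: undone events are appended, in deletion order, to a second operation record list
// (the recovery chain) that is created on first use.
public final class RecoveryChainStore {
    private OperationChain secondRecordList;   // null until the first undo happens

    /** Called whenever an event is deleted from the first operation record list. */
    public void onEventUndone(OperationEvent deletedEvent) {
        if (secondRecordList == null) {
            secondRecordList = new OperationChain();   // generate the second record list
        }
        secondRecordList.append(deletedEvent);          // A -> node 1, B -> node 2, C -> node 3
    }

    public OperationChain getOrNull() {
        return secondRecordList;
    }
}
```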
In some embodiments, the user clicks the recovery item to input a recovery instruction to the controller, and the controller triggers the display device 200 to display a recovery interface after receiving the recovery instruction input by the user, where the recovery interface includes a second operation record list, and the second operation record list includes all the operation events deleted in the first operation record list.
In some embodiments, after receiving a recovery instruction input by a user, the controller may generate a third image layer, where the third image layer is used to draw a second operation record list, and the second operation record list is drawn in a part or all of the third image layer.
In some embodiments, the third layer is arranged on the upper layer of the first layer in a floating manner, or the third layer and the first layer are arranged on the electronic drawing board interface in parallel, so that the second operation record list is displayed on the electronic drawing board interface.
Referring to fig. 12, the second operation record list may be a thumbnail list displayed in picture form, which shows, as thumbnails, all the operation events the user may recover. Each entry corresponds to one historical operation event to be recovered in the second operation record list, and the content executed when that operation event is recovered is marked or displayed in the thumbnail. For example, when the historical operation events in the second operation record list are lines or graphs, each historical operation event to be recovered is displayed in dotted-line form to help the user identify the operation event corresponding to that thumbnail. Alternatively, when the historical operation event to be restored is a filter or color-filling event, the corresponding thumbnail displays a picture with the filter or color filling applied. The thumbnail list is arranged along a time axis, with a scroll control on each side of the time axis; clicking a scroll control scrolls the time axis so that the complete thumbnail list can be browsed.
In some embodiments, when the user wants to recover the operation event of a certain step, the corresponding undone operation event options in the thumbnail list may be selected singly or in multiples. The thumbnail list is further provided with a confirm option; after selecting the operation event options in the thumbnail list, the user clicks the confirm option, triggering the display device 200 to display a confirmation recovery page, see fig. 13. After the confirm option on the confirmation recovery page is clicked, an instruction to recover the selected operation events is sent to the memory. After the memory receives the instruction, in response to the user's selection of the operation events, the selected operation events are deleted from the recovery chain, that is, deleted from the second operation record list, and added to the first operation record list. The first layer is then redrawn, so that the display device 200 displays the content corresponding to the operation events in the first operation record list after the selected operation events have been added, as shown in fig. 14, completing the recovery of the selected operation events. When the user selects the "cancel" option entry, the controller closes the recovery page.
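As an illustrative sketch of this restore flow, reusing the hypothetical classes above: the selected events are deleted from the second operation record list, appended back to the first operation record list, and the first layer is redrawn.

```java
// Sketch: restore the events selected on the recovery page, then refresh the drawing board.
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public final class RestoreController {
    private final OperationChain firstRecordList;
    private final OperationChain secondRecordList;
    private final UndoController.LayerRenderer renderer;

    public RestoreController(OperationChain first, OperationChain second,
                             UndoController.LayerRenderer renderer) {
        this.firstRecordList = first;
        this.secondRecordList = second;
        this.renderer = renderer;
    }

    /** Called after the user clicks "confirm" on the confirmation recovery page. */
    public void restoreSelected(List<Integer> selectedIndices) {
        List<Integer> sorted = new ArrayList<>(selectedIndices);
        Collections.sort(sorted);
        // collect the selected events first, then delete from the end so indices stay valid
        List<OperationEvent> restored = new ArrayList<>();
        for (int index : sorted) {
            restored.add(secondRecordList.get(index));
        }
        for (int i = sorted.size() - 1; i >= 0; i--) {
            secondRecordList.removeAt(sorted.get(i));
        }
        for (OperationEvent event : restored) {
            firstRecordList.append(event);              // add the restored events back
        }
        renderer.redraw(firstRecordList.snapshot());    // refresh the first layer
    }
}
```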
According to the above embodiments, an embodiment of the present application provides a revocation method, as shown in fig. 15, the method includes:
and S101, presenting related options of a first operation record list in the control area in response to a cancel instruction of the historical operation event input by a user.
In some embodiments, the layer in which the drawing area is located is a first layer, and a second layer is generated in response to a cancel instruction of the historical operation event, which is input by a user, and the second layer is used for drawing the options related to the first operation record list; and the second layer is arranged on the upper layer of the first layer in a suspending manner, or the second layer and the first layer are arranged on the electronic drawing board interface in parallel, so that the related options of the first operation record list are displayed on the display.
In some embodiments, in response to a user-input undo instruction for the historical operating events, generating thumbnails corresponding to the historical operating events in the first operating record list in a one-to-one manner; controlling the display to display the thumbnails in one-to-one correspondence with the historical operating events in the first operating record list.
S102: in response to the user's selection of a first historical operation event, updating the first operation record list so as to delete the first historical operation event from the first operation record list.
In some embodiments, in response to a user selection operation on the first historical operation event, deleting the first historical operation event in the first operation record list; and refreshing the second image layer to display the first operation record list after the first historical operation event is deleted on the second image layer.
In some embodiments, in response to the user's selection of the first historical operation event, it is determined whether a second operation record list exists, where the second operation record list is generated when a second historical operation event is deleted from the first operation record list, and the second historical operation event was deleted from the first operation record list earlier than the first historical operation event;
and if the second operation record list exists, storing the first historical operation event into the second operation record list.
In some embodiments, if the second operation record list does not exist, generating the second operation record list, where the second operation record list is used to store the historical operation events deleted in the first operation record list;
and storing the first historical operation event into the second operation record list.
S103: refreshing the electronic drawing board interface so as to display the operation events in the updated first operation record list on the electronic drawing board interface.
In some embodiments, in response to a user-input recovery instruction for the historical operating event deleted from the first operating record list, presenting a second operating record list-related option in the control area; responding to the selection operation of a user on a third history operation event, and storing the third history operation event into the first operation record list, wherein the third history operation event is an event in the second operation record list; and refreshing the electronic drawing board interface to display the operation events in the first operation record list after the third history operation events are stored on the electronic drawing board interface.
In some embodiments, the layer in which the drawing area is located is a first layer, and a third layer is generated in response to a user-input instruction for recovering the historical operation event deleted from the first operation record list, where the third layer is used for drawing options related to the second operation record list; and the third layer is arranged on the upper layer of the first layer in a suspending manner, or the third layer and the first layer are arranged on the electronic drawing board interface in parallel, so that the options related to the second operation record list are displayed on the display.
In some embodiments, in response to the user's selection of the third history operation event, the third history operation event is deleted from the second operation record list.
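The recovery path is the mirror image of the undo path: the selected third history operation event is removed from the second operation record list and stored back into the first one before the interface is refreshed. A minimal sketch with a hypothetical `UndoRedoManager` name, again not taken from the patent:

```kotlin
// Hypothetical sketch of the recovery (redo) path; names are illustrative only.
data class OperationEvent(val id: Int, val description: String)

class UndoRedoManager(
    val firstRecordList: MutableList<OperationEvent> = mutableListOf(),
    val secondRecordList: MutableList<OperationEvent> = mutableListOf()
) {
    // Recovery sketch: the selected third history operation event is removed from
    // the second operation record list and stored back into the first list; the
    // caller then refreshes the electronic drawing board interface.
    fun recover(selected: OperationEvent): Boolean {
        if (!secondRecordList.remove(selected)) return false
        firstRecordList.add(selected)
        return true
    }
}
```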
In a specific implementation, the present invention further provides a computer storage medium. The computer storage medium may store a program, and the program, when executed, may perform some or all of the steps of the embodiments of the revocation method provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM) or a random access memory (RAM).
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented by means of software plus a necessary general-purpose hardware platform. Based on such an understanding, the technical solutions in the embodiments of the present invention may be embodied, essentially or in part, in the form of a software product. The software product may be stored in a storage medium, such as a ROM/RAM, a magnetic disk or an optical disk, and includes several instructions for enabling a computer device (which may be a personal computer, a server, a network device or the like) to execute the methods described in the embodiments, or in some parts of the embodiments, of the present invention.
Identical or similar parts of the various embodiments in this specification may be cross-referenced. In particular, the embodiment of the display device is described briefly because it is substantially similar to the method embodiment; for relevant details, refer to the description of the method embodiment.
The above-described embodiments of the present invention do not limit the scope of the present invention.

Claims (10)

1. A display device, comprising:
a display configured to display an electronic drawing board interface, wherein the electronic drawing board interface comprises a control area and a drawing area, the control area comprises at least one control for performing an input operation on the drawing area, and the drawing area is used for presenting input content;
a memory configured to store a first operation record list, wherein the first operation record list is used for storing at least one historical operation event input by a user on the electronic drawing board interface; and
a controller configured to:
present options related to the first operation record list in the control area in response to a cancel instruction, input by the user, for the historical operation event;
update the first operation record list in response to the user's selection of a first historical operation event, so as to delete the first historical operation event from the first operation record list; and
refresh the electronic drawing board interface to display the operation events in the updated first operation record list on the electronic drawing board interface.
2. The display device according to claim 1, wherein the layer in which the drawing area is located is a first layer, and presenting the options related to the first operation record list in the control area in response to the cancel instruction for the historical operation event input by the user further comprises:
generating a second layer in response to the cancel instruction, input by the user, for the historical operation event, wherein the second layer is used for drawing the options related to the first operation record list;
wherein the second layer floats above the first layer, or the second layer and the first layer are arranged side by side on the electronic drawing board interface, so that the options related to the first operation record list are displayed on the display.
3. The display device according to claim 2, wherein updating the first operation record list in response to the user's selection of the first historical operation event further comprises:
deleting the first historical operation event from the first operation record list in response to the user's selection of the first historical operation event; and
refreshing the second layer to display, on the second layer, the first operation record list after the first historical operation event is deleted.
4. The display device according to claim 1, wherein updating the first operation record list in response to the user's selection of the first historical operation event further comprises:
in response to the user's selection of the first historical operation event, determining whether a second operation record list exists, wherein the second operation record list is generated when a second historical operation event is deleted from the first operation record list, and the second historical operation event is deleted from the first operation record list earlier than the first historical operation event; and
if the second operation record list exists, storing the first historical operation event into the second operation record list.
5. The display device according to claim 4, wherein determining whether the second operation record list exists in response to the user's selection of the first historical operation event further comprises:
if the second operation record list does not exist, generating the second operation record list, wherein the second operation record list is used for storing the historical operation events deleted from the first operation record list; and
storing the first historical operation event into the second operation record list.
6. The display device according to claim 1, wherein presenting the options related to the first operation record list in the control area in response to the cancel instruction for the historical operation event input by the user further comprises:
generating thumbnails in one-to-one correspondence with the historical operation events in the first operation record list in response to the cancel instruction, input by the user, for the historical operation events; and
controlling the display to display the thumbnails in one-to-one correspondence with the historical operation events in the first operation record list.
7. The display device according to claim 5, wherein the controller is further configured to:
present options related to the second operation record list in the control area in response to a recovery instruction, input by the user, for the historical operation events deleted from the first operation record list;
store a third history operation event into the first operation record list in response to the user's selection of the third history operation event, wherein the third history operation event is an event in the second operation record list; and
refresh the electronic drawing board interface to display, on the electronic drawing board interface, the operation events in the first operation record list after the third history operation event is stored.
8. The display device according to claim 7, wherein the layer in which the drawing area is located is a first layer, and presenting the options related to the second operation record list in the control area in response to the recovery instruction input by the user further comprises:
generating a third layer in response to the recovery instruction, input by the user, for the historical operation events deleted from the first operation record list, wherein the third layer is used for drawing the options related to the second operation record list;
wherein the third layer floats above the first layer, or the third layer and the first layer are arranged side by side on the electronic drawing board interface, so that the options related to the second operation record list are displayed on the display.
9. The display device according to claim 7, wherein storing the third history operation event into the first operation record list in response to the user's selection of the third history operation event further comprises:
deleting the third history operation event from the second operation record list in response to the user's selection of the third history operation event.
10. A revocation method applied to a display device, comprising:
presenting options related to a first operation record list in a control area in response to a cancel instruction, input by a user, for a historical operation event;
updating the first operation record list in response to the user's selection of a first historical operation event, so as to delete the first historical operation event from the first operation record list; and
refreshing the electronic drawing board interface to display the operation events in the updated first operation record list on the electronic drawing board interface.
CN202210106451.8A 2021-06-30 2022-01-28 Display device and revocation method Pending CN115562544A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2021107392935 2021-06-30
CN202110739293 2021-06-30

Publications (1)

Publication Number Publication Date
CN115562544A (en) 2023-01-03

Family

ID=84736974

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210106451.8A Pending CN115562544A (en) 2021-06-30 2022-01-28 Display device and revocation method

Country Status (1)

Country Link
CN (1) CN115562544A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116883548A (en) * 2023-09-08 2023-10-13 福昕鲲鹏(北京)信息科技有限公司 Method and device for conveniently adding and modifying electronic image in electronic document
CN116883548B (en) * 2023-09-08 2023-12-19 福昕鲲鹏(北京)信息科技有限公司 Method and device for conveniently adding and modifying electronic image in electronic document

Similar Documents

Publication Publication Date Title
CN108089786B (en) User interface display method, device, equipment and storage medium
WO2021184375A1 (en) Method for execution of hand gesture commands, apparatus, system, and storage medium
CN107111496B (en) Customizable blade application
US8458615B2 (en) Device, method, and graphical user interface for managing folders
US9013366B2 (en) Display environment for a plurality of display devices
KR20140025494A (en) Edge gesture
KR20140025493A (en) Edge gesture
CN114501107A (en) Display device and coloring method
CN113625932B (en) Full-screen handwriting input method and device
CN113810746B (en) Display equipment and picture sharing method
WO2017113551A1 (en) System and method for operating system of mobile device
CN110413187B (en) Method and device for processing annotations of interactive intelligent equipment
CN114501108A (en) Display device and split-screen display method
CN115129214A (en) Display device and color filling method
CN114115637A (en) Display device and electronic drawing board optimization method
CN115562544A (en) Display device and revocation method
US11531719B2 (en) Navigation tab control organization and management for web browsers
CN114157889B (en) Display equipment and touch control assisting interaction method
WO2021219002A1 (en) Display device
CN113485614A (en) Display apparatus and color setting method
CN114296623A (en) Display device
CN114442849B (en) Display equipment and display method
CN112650418A (en) Display device
CN112926420A (en) Display device and menu character recognition method
CN112947783A (en) Display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination