CN114501108A - Display device and split-screen display method - Google Patents


Info

Publication number
CN114501108A
Authority
CN
China
Prior art keywords
area, control, user, drawing area, input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210133703.6A
Other languages
Chinese (zh)
Inventor
董率
崔尧尧
肖媛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Publication of CN114501108A
Priority to PCT/CN2022/109184 (published as WO2023065766A1)
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/8146 Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
    • H04N21/8153 Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics comprising still images, e.g. texture, background image
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/8166 Monomedia components thereof involving executable data, e.g. software
    • H04N21/8173 End-user applications, e.g. Web browser, game

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

When the electronic drawing board interface is displayed, a user can instruct, through an operation, the electronic drawing board interface to be partitioned, triggering the display of a first drawing board area and a second drawing board area on the interface. The first drawing board area comprises a first control area and a first drawing area, and the second drawing board area comprises a second control area and a second drawing area. By operating the controls in the first control area and the second control area respectively, the user can trigger the first drawing area to display content edited with the controls in the first control area, and trigger the second drawing area to display content edited with the controls in the second control area. With the display device and the split-screen display method of the present application, the electronic drawing board interface is partitioned, and each partition comprises a plurality of controls that independently control that partition, so that one or more users can simultaneously edit and draw in different partitions of the same electronic drawing board interface, improving the user experience.

Description

Display device and split-screen display method
The present application claims priority to the Chinese patent application entitled "A display device and a split-screen display method", filed with the Chinese Patent Office on October 22, 2021, with application number 202111235610.6, which is incorporated herein by reference in its entirety.
Technical Field
The application relates to the technical field of smart television drawing boards, in particular to a display device and a split-screen display method.
Background
The display device refers to a terminal device capable of outputting a specific display picture, such as a smart television, a mobile terminal, a smart advertising screen, or a projector. Taking the smart television as an example: built on Internet application technology, it has an open operating system and chip and an open application platform, supports bidirectional human-machine interaction, and integrates functions such as audio-video, entertainment, education, and data services, making it a television product that meets users' diversified and personalized needs. For example, a drawing board application may be installed on the display device, and a user may edit a drawing on the electronic drawing board interface by opening the drawing board application.
However, at any given time, only a single user can edit and draw on the electronic drawing board interface; multiple users cannot edit and draw on the same drawing board interface without interfering with each other, which degrades the user experience.
Disclosure of Invention
The present application provides a display device and a split-screen display method, aiming to solve the problem in the prior art that multiple users cannot edit and draw on the same drawing board interface without interfering with each other.
In one aspect, the present application provides a display device, comprising:
the display is used for displaying the electronic drawing board interface;
the touch control assembly is used for receiving an instruction input by a user through touch control, wherein the touch control assembly and the display form a touch control screen;
a controller configured to:
displaying a first drawing board area and a second drawing board area on an electronic drawing board interface in response to an instruction which is input by a user and indicates that the electronic drawing board interface is partitioned, wherein the first drawing board area comprises a first control area and a first drawing area, the first control area comprises at least one control used for carrying out a first input operation on the first drawing area, and the first drawing area is used for displaying first operation content corresponding to the first input operation; the second drawing board area comprises a second control area and a second drawing area, the second control area comprises at least one control used for carrying out second input operation on the second drawing area, and the second drawing area is used for displaying second operation content corresponding to the second input operation;
displaying the first operation content on the first drawing area in response to a first operation instruction input by a first user, wherein the first operation instruction is used for indicating that the first input operation is performed on the first drawing area according to a control selected by the user in the first control area;
while the first operation instruction input by the first user is being received, displaying the second operation content in the second drawing area in response to a second operation instruction input by a second user, wherein the second operation instruction is used for indicating that the second input operation is performed on the second drawing area according to a control selected by the user in the second control area.
In another aspect, the present application further provides a split-screen display method, comprising:
Displaying a first drawing board area and a second drawing board area on an electronic drawing board interface in response to an instruction which is input by a user and indicates that the electronic drawing board interface is partitioned, wherein the first drawing board area comprises a first control area and a first drawing area, the first control area comprises at least one control used for carrying out a first input operation on the first drawing area, and the first drawing area is used for displaying first operation content corresponding to the first input operation; the second drawing board area comprises a second control area and a second drawing area, the second control area comprises at least one control used for carrying out second input operation on the second drawing area, and the second drawing area is used for displaying second operation content corresponding to the second input operation;
displaying the first operation content on the first drawing area in response to a first operation instruction input by a first user, wherein the first operation instruction is used for indicating that the first input operation is performed on the first drawing area according to a control selected by the user in the first control area;
while the first operation instruction input by the first user is being received, displaying the second operation content in the second drawing area in response to a second operation instruction input by a second user, wherein the second operation instruction is used for indicating that the second input operation is performed on the second drawing area according to a control selected by the user in the second control area.
With this arrangement, when the electronic drawing board interface is displayed, a user can instruct, through an operation, the electronic drawing board interface to be partitioned, triggering the presentation of a first drawing board area and a second drawing board area on the interface, wherein the first drawing board area comprises a first control area and a first drawing area, and the second drawing board area comprises a second control area and a second drawing area. By operating the controls in the first control area and the second control area respectively, the user can trigger the first drawing area to present content edited with the controls in the first control area, and trigger the second drawing area to present content edited with the controls in the second control area. With the display device and the split-screen display method of the present application, the electronic drawing board interface is partitioned, and each partition comprises a plurality of controls that independently control that partition, so that one or more users can simultaneously edit and draw in different partitions of the same electronic drawing board interface, improving the user experience.
Drawings
To explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below; for those of ordinary skill in the art, other drawings can be derived from these drawings without creative effort.
FIG. 1 illustrates an operational scenario between a display device and a control apparatus, according to some embodiments;
FIG. 2 is a block diagram illustrating a hardware configuration of a display device according to some embodiments;
FIG. 3 is a diagram illustrating a software configuration in a display device according to some embodiments;
FIG. 4 is an electronic drawing board interface diagram of the present application in some embodiments;
FIG. 5 is an electronic drawing board interface diagram of the present application in some embodiments;
FIG. 6 is an electronic drawing board interface diagram of the present application in some embodiments;
FIG. 7 is an electronic drawing board interface diagram of the present application in some embodiments;
FIG. 8 is a selection interface diagram of the present application in some embodiments;
FIG. 9 is an electronic drawing board interface diagram of the present application in some embodiments;
FIG. 10 is a selection interface diagram of the present application in some embodiments;
FIG. 11 is an electronic drawing board interface diagram of the present application in some embodiments;
FIG. 12 is an electronic drawing board interface diagram of the present application in some embodiments;
FIG. 13 is an electronic drawing board interface diagram of the present application in some embodiments;
FIG. 14 is an electronic drawing board interface diagram of the present application in some embodiments;
FIG. 15 is an electronic drawing board interface diagram of the present application in some embodiments;
FIG. 16 is an electronic drawing board interface diagram of the present application in some embodiments;
FIG. 17 is a flowchart of the split-screen display method provided by the present application.
Detailed Description
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; they are merely examples of systems and methods consistent with certain aspects of the application, as recited in the claims.
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in fig. 1, the display device 200 is in data communication with a server 400, and a user may operate the display device 200 through the smart device 300 or the control device 100.
In some embodiments, the control device 100 may be a remote controller. Communication between the remote controller and the display device 200 includes infrared protocol communication, Bluetooth protocol communication, or other short-range communication methods, and the display device 200 is controlled wirelessly or by wire. The user may control the display device 200 by inputting user instructions through keys on the remote controller, voice input, control panel input, and the like.
In some embodiments, the smart device 300 may also be used to control the display device 200, for example through an application running on the smart device.
In some embodiments, the smart device 300 and the display device 200 may also communicate data with each other.
In some embodiments, the display device 200 may also be controlled in ways other than via the control device 100 and the smart device 300; for example, a user's voice instruction may be received directly through a module configured inside the display device 200 for obtaining voice instructions, or through a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with the server 400. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers.
Fig. 2 illustrates a hardware configuration block diagram of a display device according to an exemplary embodiment.
In some embodiments, the display device 200 includes at least one of a tuner-demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments, the controller comprises a central processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, and first through nth interfaces for input/output.
In some embodiments, the display 260 includes a display screen component for displaying pictures and a driving component for driving image display; it receives image signals output by the controller and displays video content, image content, menu manipulation interfaces, user manipulation UI interfaces, and the like.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example, the communicator may include at least one of a Wi-Fi module, a Bluetooth module, a wired Ethernet module, other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display device 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: a High-Definition Multimedia Interface (HDMI), an analog or digital high-definition component input interface (Component), a composite video input interface (CVBS), a USB input interface (USB), an RGB port, and the like. It may also be a composite input/output interface formed by a plurality of the above interfaces.
In some embodiments, the controller 250 controls the operation of the display device 200 and responds to user operations through various software control programs stored in the memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any selectable object, such as a hyperlink, an icon, or another actionable control. The operation related to the selected object is, for example, displaying the page, document, or image linked to by a hyperlink, or launching the program corresponding to an icon.
In some embodiments, the controller comprises at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a Random Access Memory (RAM), a Read-Only Memory (ROM), first through nth interfaces for input/output, a communication bus (Bus), and the like.
The CPU processor is used for executing the operating system and application instructions stored in the memory, and for executing various applications, data, and contents according to interaction instructions received from the outside, so as to finally display and play various audio-video contents. The CPU processor may include a plurality of processors, e.g., one main processor and one or more sub-processors.
In some embodiments, the graphics processor is used for generating various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. The graphics processor comprises an arithmetic unit, which performs operations by receiving the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for display on the display.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on display 260, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between a camera application or operating system and a user that enables conversion between an internal form of information and a user-acceptable form. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include at least one of an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc. visual interface elements.
In some embodiments, user interface 280 is an interface that may be used to receive control inputs (e.g., physical buttons on the body of display device 200, or the like).
In some embodiments, the system of the display device 200 may include a kernel, a command parser (shell), a file system, and applications. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals, and inter-process communication (IPC) are operated and maintained. After the kernel is started, the shell and user applications are loaded. An application is compiled into machine code after being started, forming a process.
Referring to fig. 3, in some embodiments the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (abbreviated as "application layer"), an Application Framework layer (abbreviated as "framework layer"), an Android runtime and system library layer (abbreviated as "system runtime library layer"), and a kernel layer.
In some embodiments, at least one application runs in the application layer. These applications may be a Window program of the operating system, a system setting program, a clock program, or the like, or applications developed by third-party developers. In specific implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an Application Programming Interface (API) and a programming framework for the applications of the application layer. The application framework layer includes some predefined functions and acts as a processing center that decides how the applications in the application layer act. Through the API, an application can access the resources of the system and obtain the services of the system during execution.
As shown in fig. 3, in the embodiment of the present application, the application framework layer includes Managers, a Content Provider, and the like, where the managers include at least one of the following modules: an Activity Manager, used for interacting with all activities running in the system; a Location Manager, used for providing system services or applications with access to the system location service; a Package Manager, used for retrieving various information about the application packages currently installed on the device; a Notification Manager, used for controlling the display and clearing of notification messages; and a Window Manager, used for managing the icons, windows, toolbars, wallpapers, and desktop components on the user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the various applications and the usual navigation and back functions, such as controlling the exit, opening, and back of applications. The window manager is used for managing all window programs, for example obtaining the size of the display screen, judging whether there is a status bar, locking the screen, capturing the screen, and controlling changes of the display window (for example, shrinking the window, shaking it, or distorting it).
In some embodiments, the system runtime library layer provides support for the upper framework layer; when the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions that the framework layer needs to implement.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 3, the kernel layer includes at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, Wi-Fi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, and pressure sensor), and power driver.
In some embodiments, the display device 200 may have a touch interaction function, allowing a user to operate the device by lightly touching the display with a finger, dispensing with the keyboard, mouse, and remote controller and making human-computer interaction more direct. The display device 200 can support the touch interaction function by adding a touch component. In general, the touch component and the display 260 together constitute a touch screen. On the touch screen, a user can input different control instructions through touch operations; for example, the user may input click, slide, long-press, or double-click touch instructions, and different touch instructions may represent different control functions.
To implement these different touch actions, the touch component generates different electrical signals when the user inputs different touch actions and transmits the generated signals to the controller 250. The controller 250 performs feature extraction on the received electrical signals to determine the control function intended by the user based on the extracted features.
For example, when the user inputs a click touch action at the position of a program icon in the application interface, the touch component senses the touch action and generates an electrical signal. After receiving the electrical signal, the controller 250 may first determine the duration of the level corresponding to the touch action in the electrical signal, and when the duration is less than a preset time threshold, recognize that the user has input a click touch instruction. The controller 250 then extracts the position features from the electrical signal to determine the touch position. When the touch position is within the display range of an application icon, it is determined that the user has input a click touch instruction at the position of that icon. Accordingly, the click touch instruction executes, in the current scene, the function of running the corresponding application, so the controller 250 starts that application.
For another example, when the user inputs a sliding action in a media asset presentation page, the touch component likewise sends the sensed electrical signal to the controller 250. The controller 250 first determines the duration of the signal corresponding to the touch action. When this duration is longer than the preset time threshold, the controller examines how the position at which the signal is generated changes; for a sliding touch action the generation position changes, so a sliding touch instruction input by the user is determined. The controller 250 then determines the sliding direction of the sliding touch instruction according to the change of the signal's generation position and controls page-turning of the display frames in the media asset presentation page to display more media asset options. Further, the controller 250 may extract features such as the sliding speed and sliding distance of the sliding touch instruction and perform the page-turning control according to the extracted features, achieving a follow-the-hand effect.
Similarly, for touch instructions such as double-click and long-press, the controller 250 can extract different features, determine the type of the touch instruction through feature judgment, and execute the corresponding control function according to preset interaction rules. In some embodiments, the touch component also supports multi-touch, so that the user can input touch actions on the touch screen with multiple fingers, e.g., multi-finger click, multi-finger long-press, or multi-finger slide.
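As a minimal illustration of the feature-extraction logic described in the preceding examples, the following Java sketch classifies a touch action from two assumed features: the duration of the signal and the displacement of its generation position. The threshold values, names, and simplified feature set are assumptions made for illustration, not details taken from this application.

// Illustrative sketch of the touch classification described above.
// Thresholds and class names are assumptions, not values from the patent.
public class TouchClassifier {

    enum Gesture { CLICK, LONG_PRESS, SLIDE }

    static final long CLICK_MAX_MS = 300;   // assumed preset time threshold
    static final double MOVE_EPSILON = 10;  // assumed minimum displacement for a slide

    // Classifies a touch action from the extracted features: how long the
    // level lasted and how far the signal's generation position moved.
    static Gesture classify(long durationMs, double dx, double dy) {
        boolean moved = Math.hypot(dx, dy) > MOVE_EPSILON;
        if (moved) {
            return Gesture.SLIDE;           // generation position changed: sliding instruction
        }
        return durationMs < CLICK_MAX_MS
                ? Gesture.CLICK             // short press at a fixed position
                : Gesture.LONG_PRESS;       // long press at a fixed position
    }

    public static void main(String[] args) {
        System.out.println(classify(120, 0, 0));   // CLICK
        System.out.println(classify(600, 0, 0));   // LONG_PRESS
        System.out.println(classify(400, 80, 5));  // SLIDE
    }
}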
Touch actions can also cooperate with a specific application to realize a specific function. For example, after the user opens the drawing board application, the display 260 may present a drawing area; the user draws a specific touch-action trajectory in the drawing area by means of sliding touch instructions, and the controller 250 determines the touch-action pattern from the touch actions detected by the touch component and controls the display 260 to render it in real time to achieve the demonstration effect.
In some embodiments, multiple applications may be installed in the display device 200. Fig. 3 shows a user interface displayed by the display in some embodiments; a user may click "my applications" in this user interface to trigger display of the application list, which contains all applications installed on the display device 200.
Figure 5 shows a schematic diagram of an application list in some embodiments. As shown in fig. 5, the display device 200 may have installed a drawing board application, a player application, a video chat application, a camera application, and a mirror application. When the application list is displayed, the user may select and open one of the applications to trigger display of its interface; for example, the user may open the drawing board application, triggering the display to show the interface corresponding to the drawing board application, and perform drawing operations in that interface so that corresponding content is displayed in it.
In some embodiments, when a user launches the drawing board application, the controller may control the electronic drawing board to be displayed on the display. Interactive areas corresponding to one or more functions of the drawing board application are displayed on the electronic drawing board interface, and an interactive area can display text, images, icons, buttons, pull-down menus, check boxes, selectable lists, and the like. The user can touch the touch screen at a position where interaction is needed in order to interact with the interactive area. The display device 200 detects the contact and responds to it by performing the corresponding operation.
In some embodiments, an electronic drawing board interface may include a drawing area and a control area. The drawing area is an area into which content can be input, and the control area is used for displaying, in one place, the user interface objects and information corresponding to one or more functions of the drawing board application. The user interface objects include, but are not limited to, a brush control, an erasing control, and a split-screen control, and the information includes various parameter information corresponding to the brush control, such as the current input color, the selectable colors, the thickness, and the line shape.
In some embodiments, the layer in which the drawing area is located is a first layer, and the layer in which the control area is located is a second layer. The second layer floats above the first layer, and the user can select the second layer and, by touch dragging, move it to any position above the first layer.
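The two-layer arrangement can be pictured with a minimal sketch. The following Java model is an illustrative assumption only: a bottom drawing layer and a floating control layer whose position is changed by touch dragging, not the application's actual implementation.

// Illustrative two-layer model (assumed names and structure).
public class LayerStack {

    // First layer: the drawing surface, always at the bottom.
    static class DrawingLayer {
        // holds the drawn content (strokes, graphics, text)
    }

    // Second layer: floats above the first layer and can be dragged.
    static class ControlLayer {
        double offsetX, offsetY; // current position above the drawing layer

        void dragBy(double dx, double dy) { // touch drag moves the control layer
            offsetX += dx;
            offsetY += dy;
        }
    }

    final DrawingLayer firstLayer = new DrawingLayer();
    final ControlLayer secondLayer = new ControlLayer(); // rendered on top
}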
Fig. 4 shows an electronic drawing board interface in some embodiments of the present application. As shown in fig. 4, the electronic drawing board interface includes a drawing area 610 and a control area 620. The drawing area 610 is the partial or whole area of its layer that receives content input by the user through the controls in the control area 620 and displays the received content; the content received by the drawing area 610 may be at least one of lines, graphics, and text. The control area 620 is the partial or whole area of its layer that displays the various functional controls; the functional controls displayed in the control area 620 include at least one of a brush control 621, an erasing control 622, a deleting control 623, a recording control 624, a toolbar control 625, an undo control 626, and a restore control 627.
In some embodiments, in response to an operation event input by the user on the electronic drawing board interface, the controller may store the operation event in the memory. An operation event input on the electronic drawing board interface is an event in which the user changes the content of the interface through an operation. For example, the user picking up the brush tool and leaving a touch trajectory on the interface can be regarded as an operation event; the user picking up the eraser tool and erasing part of the content displayed in the interface can be regarded as an operation event; and the user clicking the undo item to undo the content displayed by some previous input operation can also be regarded as an operation event.
In some embodiments, in response to the user opening the electronic drawing board application, the display is controlled to display the electronic drawing board interface, and a first operation list is generated in the memory; the first operation list is used for storing the operation events input by the user on the electronic drawing board interface.
In some embodiments, in response to a target operation event input by the user on the electronic drawing board interface, the controller may store the target operation event in the first operation list and refresh the first layer according to the target operation event and the historical operation events stored in the first operation list, so as to draw, in the first layer, the content corresponding to the operation events stored in the first operation list.
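The operation-list mechanism described above amounts to an ordered event log that is replayed on every refresh. The following Java sketch is an assumed, simplified model of that bookkeeping; the names OperationEvent and applyTo, and the list-of-strings stand-in for the layer content, are illustrative, not taken from this application.

import java.util.ArrayList;
import java.util.List;

// Sketch of the operation-list bookkeeping described above (assumed names).
public class PaletteHistory {

    // One stored operation event; replaying it re-applies its effect.
    interface OperationEvent {
        void applyTo(List<String> layerContent);
    }

    private final List<OperationEvent> firstOperationList = new ArrayList<>();
    private final List<String> firstLayerContent = new ArrayList<>();

    // Store the target operation event, then refresh the layer.
    void onUserOperation(OperationEvent event) {
        firstOperationList.add(event);
        refreshFirstLayer();
    }

    // Refresh = clear the layer and replay every stored event in order.
    void refreshFirstLayer() {
        firstLayerContent.clear();
        for (OperationEvent event : firstOperationList) {
            event.applyTo(firstLayerContent);
        }
    }
}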
In some embodiments, referring to fig. 5, which shows an electronic drawing board interface in some embodiments of the present application, when the toolbar control 625 is selected, the toolbar 700 corresponding to the toolbar control 625 is displayed. The toolbar 700 may include at least one of a rotation control 710, an insert control 720, a save control 730, a split-screen control 740, a share control 750, a collaborative drawing control 760, and a help control 770.
In some embodiments, when the user selects a control that triggers splitting, for example the split-screen control 740, the electronic drawing board interface is divided into at least two areas, i.e., the interface presents at least two areas. Referring to fig. 6, which shows a split-screen interface of an electronic drawing board in some embodiments of the present application, the electronic drawing board interface is divided into a first drawing board area 810 and a second drawing board area 820, wherein the first drawing board area 810 includes a first control area 811 and a first drawing area 812, and the second drawing board area 820 includes a second control area 821 and a second drawing area 822.
In some embodiments, when the user selects the split-screen control 740, the controller determines a first preset area of the first layer as the first drawing area 812 and a second preset area of the first layer as the second drawing area 822, refreshes the second layer, and displays the controls of the first control area in the first preset area of the second layer and the controls of the second control area in the second preset area of the second layer. It should be noted that the present application does not limit the position, size, and shape of the first drawing area 812 and the second drawing area 822, which may be set as needed.
In some embodiments, as shown in fig. 6, the first control area 811 includes a plurality of controls for editing the first drawing area 812, and may include at least one of a brush control 813, an erase control 814, a delete control 815, a record control 816, a toolbar control 817, an undo control 818, and a restore control 819. The second control area 821 includes a plurality of controls for editing the second drawing area 822, and may include at least one of a brush control 823, an erase control 824, a delete control 825, a record control 826, a toolbar control 827, an undo control 828, and a restore control 829.
In some embodiments, in response to the user selecting the split-screen control 740, a second operation list is generated in the memory. From this point on, the first operation list stores only the operation events that the user inputs in the first drawing area 812 through the controls in the first control area 811; after such an event is stored, the first layer is refreshed so that the operation content corresponding to the events stored in the first operation list is displayed in the first preset area (the first drawing area 812) of the first layer. Likewise, the second operation list stores only the operation events that the user inputs in the second drawing area 822 through the controls in the second control area 821; after such an event is stored, the first layer is refreshed so that the operation content corresponding to the events stored in the second operation list is displayed in the second preset area (the second drawing area 822) of the first layer.
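A minimal sketch of this split-screen bookkeeping might look as follows: two operation lists, each event routed to the list owned by the area that received it, and a refresh that redraws each preset region only from its own list. All names and the simplified Event type are illustrative assumptions.

import java.util.ArrayList;
import java.util.List;

// Sketch of the split-screen bookkeeping described above: after the split,
// each drawing area keeps its own operation list, and a refresh redraws each
// preset region of the first layer only from that region's list.
public class SplitScreenHistory {

    record Event(String description) {}

    private final List<Event> firstOperationList = new ArrayList<>();
    private final List<Event> secondOperationList = new ArrayList<>();

    // Store an event in the list owned by the area that received it.
    void store(Event e, boolean inFirstDrawingArea) {
        (inFirstDrawingArea ? firstOperationList : secondOperationList).add(e);
    }

    // Redraw each preset region from its own list only.
    void refreshFirstLayer() {
        replayInto("first preset region", firstOperationList);
        replayInto("second preset region", secondOperationList);
    }

    private void replayInto(String region, List<Event> events) {
        for (Event e : events) {
            System.out.println(region + " <- " + e.description());
        }
    }
}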
In some application scenarios, the same user may edit the first drawing area 812 and the second drawing area 822 through the controls in the first control area 811 and the second control area 821, either one after the other or at the same time; furthermore, different users may also edit the first drawing area 812 and the second drawing area 822 simultaneously through the controls in the first control area 811 and the second control area 821.
In some embodiments, when the user selects the brush control 813 in the first control area 811, the brush toolbar 910 corresponding to the brush control 813 is displayed; referring to fig. 7, a brush type, line color, line type, line thickness, and the like may be selected in the brush toolbar 910. The user selects the brush control 813 of the target type to pick up the brush and may then continue to select among the options in the brush toolbar 910, such as line color, line type, and line thickness. The line color, line type, and line thickness selected by the user in the brush toolbar 910 are configured as the input parameters of the brush control 813 of the target type. With the brush control 813 picked up, the user can input content based on contact with the first drawing area 812, i.e., the user's sliding path on the first drawing area 812.
Further, after the user has selected parameters such as line color, line type, and line thickness in the brush toolbar 910, the brush toolbar 910 may be hidden by touching the first drawing area 812 with the picked-up brush, so as not to reduce the area available for drawing; the electronic drawing board interface after the brush toolbar 910 is hidden is shown in fig. 6.
In some embodiments, after the user leaves a sliding path through contact with the first drawing area 812, the controller stores the coordinates of the pixel points passed by the sliding path in the first operation list and refreshes the first layer based on the first operation list, filling with color, in the first preset area (the first drawing area 812) of the first layer, the pixel points corresponding to the stored coordinates. The sliding path is thereby displayed in the first preset area (the first drawing area 812), completing the drawing of the sliding path.
In some embodiments, when the user selects the brush control 823 in the second control area 821, the brush toolbar 920 corresponding to the brush control 823 is displayed; referring to fig. 7, a brush type, line color, line type, line thickness, and the like may be selected in the brush toolbar 920. The user selects the brush control 823 of the target type to pick up the brush and may then continue to select among the options in the brush toolbar 920, such as line color, line type, and line thickness. The line color, line type, and line thickness selected by the user in the brush toolbar 920 are configured as the input parameters of the brush control 823 of the target type. With the brush control 823 picked up, the user can input content based on contact with the second drawing area 822, i.e., the user's sliding path on the second drawing area 822.
Further, after the user has selected parameters such as line color, line type, and line thickness in the brush toolbar 920, the brush toolbar 920 may be hidden by touching the second drawing area 822 with the picked-up brush, so as not to reduce the area available for drawing; the electronic drawing board interface after the brush toolbar 920 is hidden is shown in fig. 6.
In some embodiments, after the user leaves a sliding path through contact with the second drawing area 822, the controller stores the coordinates of the pixel points passed by the sliding path in the second operation list and refreshes the first layer based on the second operation list, filling with color, in the second preset area (the second drawing area 822) of the first layer, the pixel points corresponding to the stored coordinates. The sliding path is thereby displayed in the second preset area (the second drawing area 822), completing the drawing of the sliding path.
In some embodiments, the user may begin drawing after configuring the parameters of a brush control. Because the second layer, to which the first control area 811 and the second control area 821 belong, floats above the first layer, to which the first drawing area 812 and the second drawing area 822 belong, when the user picks up a brush control and starts drawing, the second layer receives the sliding operation from the drawing start point to the drawing end point before the first layer does. When the second layer receives the sliding trajectory corresponding to this sliding operation, the controller stores the coordinates of the pixel points passed by the sliding path and determines a target area from the coordinates of the drawing start point; the target area is the drawing area in which the drawing start point is located. If the drawing start point belongs to the first drawing area 812, the coordinates are stored in the first operation list, and the first layer is refreshed according to the brush parameters indicated by the first brush control and the first operation list stored in the memory, filling or replacing the color of the pixel points passed by the sliding path within the first drawing area. If the drawing start point belongs to the second drawing area 822, the coordinates are stored in the second operation list, and the first layer is refreshed according to the brush parameters indicated by the second brush control and the second operation list stored in the memory, filling or replacing the color of the pixel points passed by the sliding path within the second drawing area 822.
For example, after the user has configured the brush parameters of the brush control 813 and the brush control 823 respectively: if the drawing start point is in the first drawing area 812, the controller reads the brush parameters of the brush control 813, which is then in the picked-up state; if the user continues the touch movement across the electronic drawing board with the brush control 813, all pixel points of the touch movement trajectory that lie in the first drawing area 812 are filled according to the brush parameters of the brush control 813, while the pixel points of the trajectory that lie in the second drawing area 822 are not filled. Conversely, if the drawing start point is in the second drawing area 822, the controller reads the brush parameters of the brush control 823, which is then in the picked-up state; if the user continues the touch movement with the brush control 823, all pixel points of the trajectory that lie in the second drawing area 822 are filled according to the brush parameters of the brush control 823, while the pixel points that lie in the first drawing area 812 are not filled.
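The start-point routing in the two paragraphs above can be illustrated with a short sketch: the drawing area containing the stroke's first point owns the whole stroke, and only the points falling inside that area are filled. The Rect and Point types and method names below are simple stand-ins, assumed for illustration.

import java.util.ArrayList;
import java.util.List;

// Sketch of the start-point routing described above: the area that contains
// the drawing start point owns the stroke; points crossing into the other
// area are simply not filled.
public class StrokeRouter {

    record Point(double x, double y) {}

    record Rect(double left, double top, double right, double bottom) {
        boolean contains(Point p) {
            return p.x() >= left && p.x() < right && p.y() >= top && p.y() < bottom;
        }
    }

    // Returns the points of the sliding path that should actually be filled:
    // those inside the area containing the drawing start point.
    static List<Point> clipStroke(List<Point> path, Rect firstArea, Rect secondArea) {
        if (path.isEmpty()) return List.of();
        Rect target = firstArea.contains(path.get(0)) ? firstArea : secondArea;
        List<Point> filled = new ArrayList<>();
        for (Point p : path) {
            if (target.contains(p)) filled.add(p); // fill only inside the owning area
        }
        return filled;
    }
}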
In some embodiments, the user can erase the content in the first drawing area 812 and/or the second drawing area 822 through an eraser control. Specifically, in response to a sliding operation input by the user from an erasing start point to an erasing end point, the controller determines the erasing path from the start point to the end point and obtains a target drawing area from the coordinates of the pixel point at the erasing start point; the target drawing area is the drawing area in which the erasing start point is located. If the target drawing area is the first drawing area 812, the coordinates of the pixel points passed by the erasing path are stored in the first operation list, the first layer is refreshed according to those coordinates, and the colors of the pixel points passed by the erasing path in the first layer are removed or replaced, so as to erase the historical drawing trajectory in the first drawing area 812 that coincides with the erasing path. If the target drawing area is the second drawing area 822, the coordinates of the pixel points passed by the erasing path are stored in the second operation list, the first layer is refreshed according to those coordinates, and the colors of the pixel points passed by the erasing path in the first layer are removed or replaced, so as to erase the historical drawing trajectory in the second drawing area 822 that coincides with the erasing path.
It should be noted that the eraser control 814 in the first control area 811 can only delete content in the first drawing area 812, and the eraser control 824 in the second control area 821 can only delete content in the second drawing area 822. That is, when the user's erasing start point belongs to the first drawing area 812, the eraser control 814 is in the picked-up state, and the user deletes the content on the touch movement trajectory of the eraser control 814 by moving it; even if part of that trajectory lies in the second drawing area 822, the content in the second drawing area 822 is not deleted. Similarly, when the user's erasing start point belongs to the second drawing area 822, the eraser control 824 is in the picked-up state, and the user deletes the content on the touch movement trajectory of the eraser control 824 by moving it; even if part of that trajectory lies in the first drawing area 812, the content in the first drawing area 812 is not deleted.
In some embodiments, the user can delete the entire contents of the first drawing area 812 by selecting the delete control 815 in the first control area 811. For example, when the user selects the delete control 815, a selection interface as shown in FIG. 8 is displayed, which includes a confirm option and a cancel option; when the user selects the confirm option, the entire contents of the first drawing area 812 are deleted, while the entire contents of the second drawing area 822 are retained, as shown in FIG. 9.
In some embodiments, in response to the user selecting the deletion control 815 in the first control area 811, the controller deletes the operation contents stored in the first operation list and refreshes the first image layer according to the second operation list and the first operation list after the deletion, so as to empty all contents of the first drawing area 812 while retaining the contents of the second drawing area 822.
In some embodiments, the user can delete the entire contents of the second drawing area 822 by selecting the delete control 825 in the second control area 821. For example, when the user selects the delete control 825, a selection interface as shown in FIG. 10 is displayed, which includes a confirm option and a cancel option; when the user selects the confirm option, the entire contents of the second drawing area 822 are deleted, while the entire contents of the first drawing area 812 are retained, as shown in FIG. 11.
In some embodiments, in response to the user selecting the deletion control 825 in the second control area 821, the controller deletes the operation contents stored in the second operation list and refreshes the first image layer according to the first operation list and the second operation list after the deletion, so as to empty all contents of the second drawing area 822 while retaining the contents of the first drawing area 812.
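The symmetric delete behaviour of both regions can be condensed into one hypothetical routine: confirming a delete clears only the target region's operation list, and the shared first layer is then rebuilt by replaying both lists, which is why the other region's content survives. OperationList and refreshFirstLayer below are illustrative stand-ins, not the patent's API.

```kotlin
class OperationList(val name: String) {
    val operations = mutableListOf<String>() // simplified: one entry per recorded operation
}

// The single first layer is rebuilt by replaying both lists, so emptying
// one list removes only that region's content from the refreshed layer.
fun refreshFirstLayer(first: OperationList, second: OperationList) {
    println("replaying ${first.operations.size} ops for ${first.name} and " +
            "${second.operations.size} ops for ${second.name}")
}

fun onDeleteConfirmed(target: OperationList, other: OperationList) {
    target.operations.clear()        // drop everything drawn in the target region
    refreshFirstLayer(target, other) // the other region's content survives the refresh
}

fun main() {
    val first = OperationList("area 812").apply { operations += listOf("stroke A", "stroke B") }
    val second = OperationList("area 822").apply { operations += "stroke C" }
    onDeleteConfirmed(first, second) // replays 0 ops for area 812 and 1 op for area 822
}
```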
In some embodiments, each time the user inputs operation content in the first drawing area 812, the controller controls the first operation list to store the drawing operation data corresponding to that content. The user can edit and change the content input in the first drawing area 812 by selecting at least one of the erasing control 814, the deleting control 815, the canceling control 818, the restoring control 819, and the like in the first control area 811, so such editing operations can also be regarded as operation content input by the user in the first drawing area 812. Each time the user performs an operation, the controller stores the corresponding drawing operation data, in time order, into the drawing operation data set in the first operation list. Since this data set stores only the operation content input in the first drawing area 812, it can be regarded as the first drawing operation data set.
In some embodiments, the user can edit and change the content input in the second drawing area 822 by selecting at least one of the erasing control 824, the deleting control 825, the canceling control 828, the restoring control 829, and the like in the second control area 821, so such editing operations can also be regarded as operation content input by the user in the second drawing area 822. Each time the user performs an operation, the controller stores the corresponding drawing operation data, in time order, into the drawing operation data set in the second operation list. Since this data set stores only the operation content input in the second drawing area 822, it can be regarded as the second drawing operation data set.
It will be appreciated that the first drawing operation data set and the second drawing operation data set are independent of each other and do not affect each other.
In some embodiments, each time the user selects the undo control 818 in the first control area 811, the display device reads the first drawing operation data set, stores the most recently saved drawing operation data of the first drawing operation data set into the first recovery data set in the first operation list, and deletes that drawing operation data from the first drawing operation data set. Meanwhile, the first image layer is refreshed according to the first drawing operation data set and the second drawing operation data set, so as to remove from the first drawing area 812 the input content corresponding to that drawing operation data.
In some embodiments, each time the user selects the restore control 819 in the first control area 811, the display device reads the first recovery data set, stores the most recently saved drawing operation data of the first recovery data set into the first drawing operation data set, and deletes that drawing operation data from the first recovery data set. At the same time, the first image layer is refreshed according to the first drawing operation data set and the second drawing operation data set, so as to redisplay in the first drawing area 812 the input content corresponding to that drawing operation data.
In some embodiments, each time the user selects the undo control 828 in the second control area 821, the display device reads the second drawing operation data set, stores the most recently saved drawing operation data of the second drawing operation data set into the second recovery data set in the second operation list, and deletes that drawing operation data from the second drawing operation data set. At the same time, the first image layer is refreshed according to the first drawing operation data set and the second drawing operation data set, so as to remove from the second drawing area 822 the input content corresponding to that drawing operation data.
In some embodiments, each time the user selects the restore control 829 in the second control area 821, the display device reads the second recovery data set, stores the most recently saved drawing operation data of the second recovery data set into the second drawing operation data set, and deletes that drawing operation data from the second recovery data set. Meanwhile, the first image layer is refreshed according to the first drawing operation data set and the second drawing operation data set, so as to redisplay in the second drawing area 822 the input content corresponding to that drawing operation data.
It can be understood that, since the first recovery data set is stored in the first operation list, the second recovery data set is stored in the second operation list, and the first operation list and the second operation list are independent from each other, the first recovery data set and the second recovery data set are also independent from each other and do not affect each other.
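For illustration, this per-region undo/redo bookkeeping can be modelled with two stacks per operation list, as in the Kotlin sketch below. RegionHistory and its members are invented names; clearing the recovery set when a new operation arrives is a common convention that the patent does not state.

```kotlin
class RegionHistory {
    private val drawingOps = ArrayDeque<String>()  // the drawing operation data set
    private val recoveryOps = ArrayDeque<String>() // the recovery data set

    fun record(op: String) {
        drawingOps.addLast(op) // each user operation is appended in time order
        recoveryOps.clear()    // assumed convention: new input invalidates redo history
    }

    fun undo(): String? = drawingOps.removeLastOrNull()?.also { recoveryOps.addLast(it) }
    fun redo(): String? = recoveryOps.removeLastOrNull()?.also { drawingOps.addLast(it) }

    fun replay(): List<String> = drawingOps.toList() // used when refreshing the first layer
}

fun main() {
    val first = RegionHistory()  // independent of the second region's history
    val second = RegionHistory()
    first.record("stroke A"); first.record("stroke B"); second.record("stroke C")
    first.undo()                 // removes "stroke B" from area 812 only
    println(first.replay())      // [stroke A]
    println(second.replay())     // [stroke C] -- unaffected
    first.redo()
    println(first.replay())      // [stroke A, stroke B]
}
```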
In some embodiments, referring to fig. 12, when a user selects a toolbar control 817 in the first control area 811, a toolbar 910 corresponding to the toolbar control 817 is triggered and displayed, and the toolbar 910 may include at least one of a save control 911 and a share control 912. When the user selects the toolbar control 827 in the second control area 821, the toolbar 920 corresponding to the toolbar control 827 is triggered and displayed, and the toolbar 920 may include at least one of a save control 921 and a share control 922.
In some embodiments, the user can save the entire contents of the first drawing region 812 by selecting the save control 911 in the toolbar 910. For example, when the save control 911 is selected by the user, the controller controls the first operation list to store the target picture in the first drawing area 812 and the first drawing operation data set corresponding to the target picture, so that when the user opens the target picture again later, the display apparatus can display an editable target picture according to the stored first drawing operation data set.
In some embodiments, the user can save the entire contents of the second drawing region 822 by selecting the save control 921 in the toolbar 920. For example, when the save control 921 is selected by the user, the controller controls the second operation list to store the target picture in the second drawing area 822 and the second drawing operation data set corresponding to the target picture, so that the display apparatus can display the editable target picture according to the stored second drawing operation data set when the user opens the target picture again later.
In some embodiments, the user may select the sharing control 912 in the toolbar 910 to share all the contents in the first drawing area 812 to the target sink device, so that when the target sink device opens the target picture, the target sink device may display the editable target picture according to the received first drawing operation data set.
In some embodiments, the user may select the sharing control 922 in the toolbar 920 to share all the contents in the second drawing area 822 to the target sink device, so that when the target sink device opens the target picture, the target sink device may display the editable target picture according to the received second drawing operation data set.
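Saving and sharing can both be viewed as bundling the rendered picture with the region's drawing operation data set, so that the receiver, or a later session, can rebuild an editable picture. In the sketch below, SavedBoard, saveRegion, and shareToSink are invented names, and a file path stands in for the picture data.

```kotlin
data class DrawingOp(val kind: String, val payload: String)

// A saved or shared bundle: the rendered target picture plus the region's
// drawing operation data set, so the picture can be reopened in editable form.
data class SavedBoard(val picturePath: String, val operations: List<DrawingOp>)

fun saveRegion(picturePath: String, ops: List<DrawingOp>): SavedBoard =
    SavedBoard(picturePath, ops.toList()) // snapshot so later edits don't mutate the save

fun shareToSink(board: SavedBoard, send: (SavedBoard) -> Unit) = send(board)

fun main() {
    val ops = listOf(DrawingOp("stroke", "red, width 4"))
    val saved = saveRegion("target.png", ops)
    // The sink replays board.operations to present an editable target picture.
    shareToSink(saved) { board ->
        println("sharing ${board.picturePath} with ${board.operations.size} ops")
    }
}
```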
In some embodiments, referring to fig. 13, the first drawing board area 810 further includes an exit control 930. When the user selects the exit control 930 in the first drawing board area 810, the first operation list is controlled to store the target picture and the first drawing operation data set corresponding to the content already input in the first drawing area 812; at the same time, the split-screen state of the electronic drawing board interface is cancelled and the first image layer is refreshed according to the second operation list, so that only the content of the second drawing area 822 is presented on the electronic drawing board interface, adjusted to a preset size and position. For example, referring to fig. 14, the content of the second drawing board area 820 may be enlarged and moved to the center of the electronic drawing board interface.
In some embodiments, referring to fig. 15, the second drawing board area 820 further includes an exit control 940. When the user selects the exit control 940 in the second drawing board area 820, the second operation list stores the target picture and the second drawing operation data set corresponding to the content already input in the second drawing area 822; at the same time, the split-screen state of the electronic drawing board interface is cancelled and the first image layer is refreshed according to the first operation list, so that only the content of the first drawing area 812 is presented on the electronic drawing board interface, adjusted to a preset size and position. For example, referring to fig. 16, the content of the first drawing board area 810 may be enlarged and moved to the center of the electronic drawing board interface.
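One plausible reading of "adjusted to a preset size and position" is a uniform scale-and-centre fit of the remaining content, sketched below. The preset rectangle in main and the name fitToPreset are assumptions, not values taken from the patent.

```kotlin
data class RectF(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    val width get() = right - left
    val height get() = bottom - top
}

// Returns the uniform scale and the (dx, dy) translation that fit the
// content rectangle into the preset rectangle, centred.
fun fitToPreset(content: RectF, preset: RectF): Pair<Float, Pair<Float, Float>> {
    val scale = minOf(preset.width / content.width, preset.height / content.height)
    val dx = preset.left + (preset.width - content.width * scale) / 2 - content.left * scale
    val dy = preset.top + (preset.height - content.height * scale) / 2 - content.top * scale
    return scale to (dx to dy)
}

fun main() {
    val remaining = RectF(960f, 0f, 1920f, 1080f) // assumed right half before the exit
    val preset = RectF(240f, 135f, 1680f, 945f)   // assumed centred preset region
    val (scale, offset) = fitToPreset(remaining, preset)
    println("scale=$scale offset=$offset") // applied when the first layer is refreshed
}
```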
According to the display device provided by the foregoing embodiment, the present application also provides a split-screen display method, which is applied to the display device described above, and as shown in fig. 17, the method may include:
S171, in response to an input instruction for indicating an electronic drawing board interface to be partitioned, displaying a first drawing board area and a second drawing board area on the electronic drawing board interface, wherein the first drawing board area comprises a first control area and a first drawing area, the first control area comprises at least one control for performing first input operation on the first drawing area, and the first drawing area is used for displaying first operation content corresponding to the first input operation; the second drawing board area comprises a second control area and a second drawing area, the second control area comprises at least one control used for carrying out second input operation on the second drawing area, and the second drawing area is used for displaying second operation content corresponding to the second input operation.
In some embodiments, the electronic drawing board interface includes a drawing area and a control area, the layer in which the drawing area is located being a first layer and the layer in which the control area is located being a second layer floated above the first layer. Displaying the first drawing board area and the second drawing board area on the electronic drawing board interface in response to an input instruction indicating that the electronic drawing board interface is to be partitioned further includes:
refreshing the second layer, displaying the controls of the first control area in a first preset area of the second layer, and displaying the controls of the second control area in a second preset area of the second layer.
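As a hedged illustration of this two-layer arrangement, the sketch below keeps the controls on a layer of their own and re-lays them out into two preset regions when a split is requested; ControlLayer, refreshForSplit, and the region keys are invented names.

```kotlin
data class Control(val id: Int, val name: String)

class ControlLayer {
    val regions = mutableMapOf<String, List<Control>>()

    // Refreshing the control layer only re-lays-out controls; the drawing
    // content on the first layer underneath is untouched.
    fun refreshForSplit(firstControls: List<Control>, secondControls: List<Control>) {
        regions.clear()
        regions["firstPresetArea"] = firstControls   // e.g. the left control strip
        regions["secondPresetArea"] = secondControls // e.g. the right control strip
    }
}

fun main() {
    val layer = ControlLayer()
    layer.refreshForSplit(
        listOf(Control(813, "brush"), Control(814, "eraser"), Control(815, "delete")),
        listOf(Control(823, "brush"), Control(824, "eraser"), Control(825, "delete")),
    )
    println(layer.regions.keys) // [firstPresetArea, secondPresetArea]
}
```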
S172, in response to a first operation instruction input by a first user, displaying the first operation content in the first drawing area, wherein the first operation instruction is used for indicating that the first input operation is performed on the first drawing area according to a control in the first control area selected by the user.
S173, when the first operation instruction input by the first user is received, in response to a second operation instruction input by a second user, displaying the second operation content in the second drawing area, wherein the second operation instruction is used for indicating that the second input operation is performed on the second drawing area according to a control in the second control area selected by the user.
In some embodiments, the first control region comprises at least one first brush control and the second control region comprises at least one second brush control, and a sliding path from a drawing starting point to a drawing end point is determined in response to a sliding operation from the drawing starting point to the drawing end point input by a user; a target drawing area is acquired, the target drawing area being the drawing area in which the drawing starting point is located; if the target drawing area is the first drawing area, the sliding path is displayed in the first drawing area according to the brush parameters indicated by the first brush control; and if the target drawing area is the second drawing area, the sliding path is displayed in the second drawing area according to the brush parameters indicated by the second brush control.
In some embodiments, the brush parameters include at least a brush type, a brush color, and a brush width indicated by the brush control.
In some embodiments, the first control area comprises a first erasing control, the second control area comprises a second erasing control, and an erasing path between an erasing starting point and an erasing end point is determined in response to a sliding operation from the erasing starting point to the erasing end point input by a user; acquiring a target drawing area, wherein the target drawing area is a drawing area where the erasing starting point is located; if the target drawing area is the first drawing area, erasing a historical drawing track in the first drawing area, which is coincident with the erasing path; and if the target drawing area is the second drawing area, erasing a historical drawing track in the second drawing area, which is coincident with the erasing path.
In some embodiments, the first control area includes a first exit control, and in response to a user selecting the first exit control, the second layer is refreshed to display only the controls in the second control area in the refreshed second layer; refreshing the first image layer so as to only display the content in the second drawing area in the refreshed first image layer.
In some embodiments, the first image layer is refreshed, and the content in the second drawing area with a preset size is displayed at a preset position of the refreshed first image layer.
In some embodiments, in response to a user selecting the first exit control, a target picture corresponding to the content input in the first drawing area is stored.
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program; when executed, the program may perform some or all of the steps of each embodiment of the split-screen display method provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of a software product, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
The same and similar parts among the various embodiments in this specification may be referred to each other. In particular, for the embodiment of the display device, since it is substantially similar to the embodiment of the method, the description is simple, and for the relevant points, refer to the description in the embodiment of the method.
The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention.

Claims (8)

1. A display device, comprising:
the display is used for displaying the electronic drawing board interface;
the touch control assembly is used for receiving an instruction input by a user through touch control, wherein the touch control assembly and the display form a touch control screen;
a controller configured to:
displaying a first drawing board area and a second drawing board area on an electronic drawing board interface in response to an instruction which is input by a user and indicates that the electronic drawing board interface is partitioned, wherein the first drawing board area comprises a first control area and a first drawing area, the first control area comprises at least one control used for carrying out a first input operation on the first drawing area, and the first drawing area is used for displaying first operation content corresponding to the first input operation; the second drawing board area comprises a second control area and a second drawing area, the second control area comprises at least one control used for carrying out second input operation on the second drawing area, and the second drawing area is used for displaying second operation content corresponding to the second input operation;
displaying the first operation content on the first drawing area in response to a first operation instruction input by a first user, wherein the first operation instruction is used for indicating that the first input operation is performed on the first drawing area according to a control selected by the user in the first control area;
when the first operation instruction input by the first user is received, responding to a second operation instruction input by a second user, and displaying the second operation content on the second drawing area, wherein the second operation instruction is used for indicating that the second input operation is performed on the second drawing area according to a control in the second control area selected by the user.
2. The display device according to claim 1, wherein the electronic palette interface includes a drawing area and a control area, a layer in which the drawing area is located is a first layer, a layer in which the control area is located is a second layer, the second layer is arranged in a floating manner on an upper layer of the first layer, and the first palette area and the second palette area are displayed on the electronic palette interface in response to an input instruction indicating that the electronic palette interface is partitioned, further including:
in response to an input instruction for indicating an electronic drawing board interface to partition, determining a first preset area of the first image layer as a first drawing area, and determining a second preset area of the first image layer as a second drawing area;
refreshing the second layer, displaying the control in the first control area in a first preset area of the second layer, and displaying the control in the second control area in a second preset area of the second layer.
3. The display device of claim 2, wherein the first control region includes at least one first brush control, wherein the second control region includes at least one second brush control, and wherein the controller is further configured to:
in response to a sliding operation from a drawing starting point to a drawing end point input by a user, determining a sliding path from the drawing starting point to the drawing end point;
acquiring a target drawing area, wherein the target drawing area is a drawing area where the drawing starting point is located;
if the target drawing area is the first drawing area, controlling and displaying the sliding path in the first drawing area according to the painting brush parameter indicated by the first painting brush control;
and if the target drawing area is the second drawing area, controlling and displaying the sliding path in the second drawing area according to the painting brush parameter indicated by the second painting brush control.
4. The display device of claim 2, wherein the first control region comprises a first erase control, wherein the second control region comprises a second erase control, and wherein the controller is further configured to:
in response to a sliding operation from an erasing starting point to an erasing end point input by a user, determining an erasing path from the erasing starting point to the erasing end point;
acquiring a target drawing area, wherein the target drawing area is a drawing area where the erasing starting point is located;
if the target drawing area is the first drawing area, erasing a historical drawing track which is coincident with the erasing path in the first drawing area;
and if the target drawing area is the second drawing area, erasing the historical drawing track which is coincident with the erasing path in the second drawing area.
5. The display device of claim 2, wherein the first control region comprises a first exit control, the controller further configured to:
responding to the operation of a user for selecting the first quitting control, refreshing the second layer, and only displaying the control in the second control area in the refreshed second layer;
refreshing the first image layer so as to only display the content in the second drawing area in the refreshed first image layer.
6. The display device according to claim 5, wherein the first image layer is refreshed to display only the content in the second drawing region in the refreshed first image layer, further comprising:
refreshing the first image layer, and displaying the content in the second drawing area with the preset size at the preset position of the refreshed first image layer.
7. The display device of claim 6, wherein the refreshing the second layer in response to the user selecting the first exit control further comprises:
and responding to the selection operation of the user on the first quitting control, and storing the target picture corresponding to the content input in the first drawing area.
8. A split-screen display method, comprising:
displaying a first drawing board area and a second drawing board area on an electronic drawing board interface in response to an instruction which is input by a user and indicates that the electronic drawing board interface is partitioned, wherein the first drawing board area comprises a first control area and a first drawing area, the first control area comprises at least one control used for carrying out a first input operation on the first drawing area, and the first drawing area is used for displaying first operation content corresponding to the first input operation; the second drawing board area comprises a second control area and a second drawing area, the second control area comprises at least one control used for carrying out second input operation on the second drawing area, and the second drawing area is used for displaying second operation content corresponding to the second input operation;
displaying the first operation content on the first drawing area in response to a first operation instruction input by a first user, wherein the first operation instruction is used for indicating that the first input operation is performed on the first drawing area according to a control selected by the user in the first control area;
when the first operation instruction input by the first user is received, responding to a second operation instruction input by a second user, and displaying the second operation content on the second drawing area, wherein the second operation instruction is used for indicating that the second input operation is performed on the second drawing area according to a control in the second control area selected by the user.
CN202210133703.6A 2021-10-22 2022-02-11 Display device and split-screen display method Pending CN114501108A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/109184 WO2023065766A1 (en) 2021-10-22 2022-07-29 Display device and display method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111235610 2021-10-22
CN2021112356106 2021-10-22

Publications (1)

Publication Number Publication Date
CN114501108A true CN114501108A (en) 2022-05-13

Family

ID=81479615

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210133703.6A Pending CN114501108A (en) 2021-10-22 2022-02-11 Display device and split-screen display method

Country Status (1)

Country Link
CN (1) CN114501108A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114995731A (en) * 2022-07-29 2022-09-02 深圳润方创新技术有限公司 Display control method and system of drawing board and drawing board
WO2023065766A1 (en) * 2021-10-22 2023-04-27 海信视像科技股份有限公司 Display device and display method thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103207741A (en) * 2012-01-12 2013-07-17 飞宏科技股份有限公司 Multi-user touch control method and system of computer virtual object
CN106339135A (en) * 2016-08-30 2017-01-18 科盟(福州)电子科技有限公司 Infrared electronic whiteboard A/B screen splitting method capable of supporting independent operation by multiple persons
US20180181293A1 (en) * 2016-12-28 2018-06-28 Amazon Technologies, Inc. Interruption and resumption of feedback animation for touch-based interactions

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103207741A (en) * 2012-01-12 2013-07-17 飞宏科技股份有限公司 Multi-user touch control method and system of computer virtual object
CN106339135A (en) * 2016-08-30 2017-01-18 科盟(福州)电子科技有限公司 Infrared electronic whiteboard A/B screen splitting method capable of supporting independent operation by multiple persons
US20180181293A1 (en) * 2016-12-28 2018-06-28 Amazon Technologies, Inc. Interruption and resumption of feedback animation for touch-based interactions

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023065766A1 (en) * 2021-10-22 2023-04-27 海信视像科技股份有限公司 Display device and display method thereof
CN114995731A (en) * 2022-07-29 2022-09-02 深圳润方创新技术有限公司 Display control method and system of drawing board and drawing board
CN114995731B (en) * 2022-07-29 2022-10-04 深圳润方创新技术有限公司 Display control method and system of drawing board and drawing board

Similar Documents

Publication Publication Date Title
CN113810746B (en) Display equipment and picture sharing method
CN114501107A (en) Display device and coloring method
CN114501108A (en) Display device and split-screen display method
CN112799627B (en) Display apparatus and image display method
WO2017113624A1 (en) System and method for operating system of mobile device
CN113507646B (en) Display equipment and browser multi-label page media resource playing method
CN114157889B (en) Display equipment and touch control assisting interaction method
CN115129214A (en) Display device and color filling method
CN114115637A (en) Display device and electronic drawing board optimization method
CN112947800A (en) Display device and touch point identification method
CN115562544A (en) Display device and revocation method
CN115550717A (en) Display device and multi-finger touch display method
CN112926420B (en) Display device and menu character recognition method
CN112947783B (en) Display device
CN112650418B (en) Display device
CN113485614A (en) Display apparatus and color setting method
CN114442849B (en) Display equipment and display method
CN114296623A (en) Display device
CN112732120A (en) Display device
CN114727142B (en) Display equipment and collaborative drawing method
WO2023065766A1 (en) Display device and display method thereof
CN114281284B (en) Display apparatus and image display method
CN115550718A (en) Display device and display method
CN114727142A (en) Display device and collaborative drawing method
CN118119918A (en) Display device and display method thereof

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination