CN113485614A - Display apparatus and color setting method - Google Patents

Display apparatus and color setting method

Info

Publication number
CN113485614A
Authority
CN
China
Prior art keywords
color
filling
color value
user
area
Prior art date
Legal status
Pending
Application number
CN202110736313.3A
Other languages
Chinese (zh)
Inventor
董率
李乃金
张振宝
肖媛
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Application filed by Hisense Visual Technology Co Ltd
Priority to CN202110736313.3A
Publication of CN113485614A
Priority to PCT/CN2022/096009 (WO2023273761A1)
Status: Pending

Classifications

    • G06F 3/04845 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0488 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06T 11/40 — 2D [Two Dimensional] image generation; filling a planar surface by adding surface attributes, e.g. colour or texture
    • G06T 7/90 — Image analysis; determination of colour characteristics

Abstract

The application provides a display apparatus and a color setting method. The display apparatus includes a display and a controller. The display is configured to display an electronic drawing board, where the electronic drawing board includes a brush control and a drawing area. The drawing area can display a target picture, the target picture includes a color filling area and a region boundary, and the brush control can be used to input content in the color filling area. The controller is configured to detect the filling color indicated by the brush control selected by the user and determine the filling color value corresponding to that filling color. When the filling color value is the same as the color value of the pixels of the region boundary, the controller adjusts the filling color value. The filling color value then differs from the color value of the boundary pixels, so content input by the user cannot damage the original region boundary of the target picture, which improves the user experience.

Description

Display apparatus and color setting method
Technical Field
The present application relates to the field of display device technologies, and in particular, to a display device and a color setting method.
Background
A display device is a terminal device capable of outputting a specific display picture, such as a smart television, a mobile terminal, a smart advertising screen, or a projector. With the rapid development of display devices, their functions have become increasingly rich and their performance increasingly powerful. They can realize bidirectional human-computer interaction and integrate various functions such as audio and video, entertainment, and data, so as to meet users' diversified and personalized needs.
With the popularization of display devices and the continuous upgrading of multimedia education televisions, more and more educational, entertainment, and children's intellectual development applications can be used on televisions, so as to realize or assist functions such as education, teaching, training, and intellectual development. A drawing board application can be installed in the display device, so that the drawing board function it provides can be used. For example, a drawing board application can provide different types of pictures, and a user can perform operations such as painting on the pictures.
When a user paints a picture, if a painted portion has the same color as the picture boundary, that portion may be mistaken for the picture boundary. The original picture boundary is thus damaged, the painting effect is affected, and the user experience is poor.
Disclosure of Invention
The present application provides a display device and a color setting method, to solve the problem in the related art that the picture boundary is damaged, resulting in poor user experience.
In a first aspect, the present application provides a display device comprising a display and a controller. The display is configured to display an electronic drawing board, where the electronic drawing board includes a brush control and a drawing area, the drawing area is used to display a target picture, the target picture includes a color filling area and a region boundary, and the brush control is used to input content in the color filling area. The controller is configured to perform the steps of:
determining the filling color indicated by the brush control in response to a trigger operation of the brush control by a user; detecting the filling color value corresponding to the filling color; and, when the filling color value is the same as the color value of the pixels of the region boundary, adjusting the filling color value so that the filling color value differs from the color value of the pixels of the region boundary.
In some implementations, the controller is further configured to, prior to performing the step of determining the filling color indicated by the brush control:
in response to the trigger operation of the brush control by the user, control the display to display the filling color options, so that the user can select the filling color indicated by the brush control; the filling color options include all colors that the brush control can indicate and the filling color currently indicated by the brush control.
In some implementations, the controller is further configured to, in performing the step of adjusting the filling color value:
add a preset value to, or subtract a preset value from, the first color value component of the filling color value, so that the filling color value differs from the color value of the pixels of the region boundary.
In some implementations, the controller is further configured to leave the filling color value unadjusted when the filling color value is different from the color value of the pixels of the region boundary.
In some implementations, the display device further includes a touch component configured to detect a touch trajectory input by a user.
In some implementations, the controller is further configured to:
in response to an operation of the user in the color filling area, detecting a first touch trajectory input by the user; and updating the color values of all pixels in the first touch trajectory according to the filling color value.
In some implementations, the controller is further configured to, in the step of updating the color values of all pixels in the first touch trajectory according to the filling color value:
replace the color values of all pixels in the first touch trajectory with the filling color value.
In some implementations, the controller is further configured to, in the step of updating the color values of all pixels in the first touch trajectory according to the filling color value:
acquire the current first color value of a pixel in the first touch trajectory; superpose the first color value and the filling color value to obtain a second color value; and update the color value of the pixel in the first touch trajectory to the second color value.
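The patent does not specify the superposition formula. As a minimal sketch only, assuming "superposing" means standard source-over alpha compositing of the filling color onto the existing pixel color (packed ARGB ints as on Android; the method names here are illustrative, not from the patent):

    // Illustrative sketch: "superpose" the filling color onto the current
    // pixel color, assumed here to be source-over alpha compositing.
    // Colors are 32-bit packed ARGB values, one byte per channel.
    static int superpose(int firstColor, int fillColor) {
        float a = ((fillColor >>> 24) & 0xFF) / 255f;          // fill alpha in [0, 1]
        int r = blend((firstColor >>> 16) & 0xFF, (fillColor >>> 16) & 0xFF, a);
        int g = blend((firstColor >>> 8) & 0xFF,  (fillColor >>> 8) & 0xFF,  a);
        int b = blend(firstColor & 0xFF,          fillColor & 0xFF,          a);
        return (0xFF << 24) | (r << 16) | (g << 8) | b;        // result kept opaque
    }

    static int blend(int dst, int src, float srcAlpha) {
        return Math.round(src * srcAlpha + dst * (1f - srcAlpha));
    }

The second color value obtained this way is then re-checked against the boundary color, as described next.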
In some implementations, the controller is further configured to, after the step of updating the color value of the pixel in the first touch trajectory to the second color value:
detect whether the second color value is the same as the color value of the pixels of the region boundary; and, when the second color value is the same as the color value of the pixels of the region boundary, adjust the second color value to obtain a third color value and update the color value of the pixel in the first touch trajectory to the third color value.
In some implementations, the controller is further configured to determine the filling color value of the switched filling color in response to a user-input instruction to switch the filling color indicated by the brush control.
In some implementations, the controller is further configured to:
in response to an operation of the user in the color filling area, detecting a second touch trajectory input by the user; and updating the color values of all pixels in the second touch trajectory according to the switched filling color value.
In a second aspect, the present application provides a color setting method applied to a display device, the method including:
displaying an electronic drawing board, where the electronic drawing board includes a brush control and a drawing area, the drawing area is used to display a target picture, the target picture includes a color filling area and a region boundary, and the brush control is used to input content in the color filling area;
determining the filling color indicated by the brush control in response to a trigger operation of the brush control by a user; detecting the filling color value corresponding to the filling color; and, when the filling color value is the same as the color value of the pixels of the region boundary, adjusting the filling color value so that the filling color value differs from the color value of the pixels of the region boundary.
According to the above technical solutions, the display device and the color setting method of the present application can display an electronic drawing board that includes a brush control and a drawing area. The drawing area can display a target picture, the target picture includes a color filling area and a region boundary, and the brush control can be used to input content in the color filling area. The filling color indicated by the brush control selected by the user is detected, and the filling color value corresponding to the filling color is determined. When the filling color value is the same as the color value of the pixels of the region boundary, the filling color value is adjusted. The filling color value then differs from the color value of the boundary pixels, so content input by the user cannot damage the original region boundary of the target picture, improving the user experience.
Drawings
In order to explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below. It is obvious that those skilled in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 illustrates a usage scenario of a display device according to some embodiments;
FIG. 2 illustrates a hardware configuration block diagram of the control apparatus 100 according to some embodiments;
FIG. 3 illustrates a hardware configuration block diagram of the display apparatus 200 according to some embodiments;
FIG. 4 illustrates a software configuration diagram of the display apparatus 200 according to some embodiments;
FIG. 5 illustrates a user interface in a display in a possible embodiment;
FIG. 6 shows a schematic diagram of an application list in a possible embodiment;
FIG. 7 shows a schematic diagram of an electronic drawing board in a possible embodiment;
FIG. 8 is a schematic diagram of a target picture in a possible embodiment;
FIG. 9 is a schematic diagram of the toolbar corresponding to the brush control in a possible embodiment;
FIG. 10 is a schematic diagram of the color options in the toolbar in a possible embodiment;
FIG. 11 is a diagram illustrating content input by a user destroying the region boundary in the related art;
FIG. 12 is a diagram showing region boundary protection mode confirmation information displayed in the display in a possible embodiment;
FIG. 13 illustrates an interaction flow diagram of components of a display device in some embodiments;
FIG. 14 is a schematic diagram illustrating that content input by a user does not damage the region boundary in a possible embodiment;
FIG. 15 shows a schematic flow chart of an embodiment of a color setting method.
Detailed Description
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described exemplary embodiments are only some, not all, of the embodiments of the present application.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in fig. 1, the display apparatus 200 is in data communication with a server 400, and a user can operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes at least one of infrared protocol communication, Bluetooth protocol communication, and other short-distance communication methods, and the remote controller controls the display device 200 wirelessly or by wire. The user may control the display apparatus 200 by inputting user instructions through keys on the remote controller, voice input, control panel input, and the like.
In some embodiments, the smart device 300 may include any of a mobile terminal, a tablet, a computer, a laptop, an AR/VR device, and the like.
In some embodiments, the smart device 300 may also be used to control the display device 200, for example using an application running on the smart device.
In some embodiments, the smart device 300 and the display device may also be used for communication of data.
In some embodiments, the display device 200 may also be controlled in manners other than through the control apparatus 100 and the smart device 300. For example, a user's voice instruction may be received directly by a module for obtaining voice instructions configured inside the display device 200, or by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with the server 400. The display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display apparatus 200. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers.
In some embodiments, software steps executed by one step execution agent may be migrated on demand to another step execution agent in data communication therewith for execution. Illustratively, software steps performed by the server may be migrated to be performed on a display device in data communication therewith, and vice versa, as desired.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200.
In some embodiments, the communication interface 130 is used for external communication and includes at least one of a Wi-Fi chip, a Bluetooth module, an NFC module, or other modules.
In some embodiments, the user input/output interface 140 includes at least one of a microphone, a touchpad, a sensor, a key, or other modules.
Fig. 3 shows a hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment.
In some embodiments, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, a user interface.
In some embodiments, the controller includes a central processor, a video processor, an audio processor, a graphics processor, RAM, ROM, and first to nth interfaces for input/output.
In some embodiments, the display 260 includes a display screen component for presenting pictures and a driving component that drives image display, and is used for receiving image signals output by the controller and displaying video content, image content, menu manipulation interfaces, user manipulation UI interfaces, and the like.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the tuner demodulator 210 receives broadcast television signals via wired or wireless reception and demodulates audio/video signals, as well as EPG data signals, from a plurality of wireless or wired broadcast television signals.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in separate devices; that is, the tuner demodulator 210 may also be located in a device external to the main device containing the controller 250, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any selectable object, such as a hyperlink, an icon, or another actionable control. The operations related to the selected object are, for example, displaying the page, document, or image linked by a hyperlink, or running the program corresponding to an icon.
In some embodiments, the controller includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), random access memory (RAM), read-only memory (ROM), first to nth interfaces for input/output, a communication bus (Bus), and the like.
The CPU is used to execute the operating system and application instructions stored in the memory, and to execute various applications, data, and content according to the various interaction instructions received from outside, so as to finally display and play various audio and video content. The CPU may include a plurality of processors, e.g., a main processor and one or more sub-processors.
In some embodiments, the graphics processor is used to generate various graphics objects, such as icons, operation menus, and graphics displayed for user input instructions. The graphics processor includes an arithmetic unit, which performs operations by receiving the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for display on the display.
In some embodiments, the video processor is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, the video processor includes at least one of a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like. The demultiplexing module demultiplexes the input audio/video data stream. The video decoding module processes the demultiplexed video signal, including decoding and scaling. The image synthesis module superimposes and mixes the GUI signal, input by the user or generated by the graphics generator, with the scaled video image to generate an image signal for display. The frame rate conversion module converts the frame rate of the input video. The display formatting module converts the frame-rate-converted video output signal into a signal conforming to the display format, such as an output RGB data signal.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode it according to the standard codec protocol of the input signal, and perform at least one of noise reduction, digital-to-analog conversion, and amplification to obtain a sound signal that can be played by the speaker.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on display 260, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between a camera application or operating system and a user that enables conversion between an internal form of information and a user-acceptable form. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include at least one of an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc. visual interface elements.
In some embodiments, user interface 280 is an interface that may be used to receive control inputs (e.g., physical buttons on the body of the display device, or the like).
In some embodiments, the system of the display device may include a kernel (Kernel), a command parser (shell), a file system, and applications. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, the kernel space is activated, the hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals, and inter-process communication (IPC) are operated and maintained. After the kernel is started, the shell and user applications are loaded. An application is compiled into machine code after being started, forming a process.
Referring to fig. 4, in some embodiments, the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (abbreviated as "application layer"), an Application Framework layer (abbreviated as "framework layer"), an Android runtime and system library layer (abbreviated as "system runtime library layer"), and a kernel layer.
In some embodiments, at least one application runs in the application layer. These applications may be a Window program of the operating system, a system setting program, a clock program, or the like, or applications developed by third-party developers. The application packages in the application layer are not limited to the above examples.
The framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer. The framework layer includes some predefined functions and acts as a processing center that decides how the applications in the application layer act. Through the API interface, an application can access the resources in the system and obtain the services of the system during execution.
As shown in fig. 4, in the embodiment of the present application, the framework layer includes managers (Managers), a content provider (Content Provider), and the like, where the managers include at least one of the following modules: an activity manager (Activity Manager) for interacting with all activities running in the system; a location manager (Location Manager) for providing system services or applications with access to the system location service; a package manager (Package Manager) for retrieving various information about the application packages currently installed on the device; a notification manager (Notification Manager) for controlling the display and clearing of notification messages; and a window manager (Window Manager) for managing the icons, windows, toolbars, wallpapers, and desktop components on the user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the applications and the usual navigation fallback functions, such as controlling the exit, opening, and fallback of applications. The window manager is used to manage all window programs, such as obtaining the display screen size, determining whether there is a status bar, locking the screen, capturing the screen, and controlling changes of the display window (for example, shrinking the display window, or displaying it with shake or distortion).
In some embodiments, the system runtime library layer provides support for the layer above it, the framework layer: when the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer includes at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, Wi-Fi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, and pressure sensor), power driver, and the like.
Based on the above display device, the display device can support rotation and/or lifting functions by adding a driving assembly and a posture detection assembly. Typically, the driving assembly includes a rotation assembly and/or a lifting assembly, and the controller 250 may communicate with the rotation assembly and/or the lifting assembly, controlling the rotation assembly to rotate the display when the display needs to be rotated and controlling the lifting assembly to raise or lower the display when the display needs to be raised or lowered.
In a possible implementation, a GPIO interface is provided on the rotation assembly and/or the lifting assembly, and the controller changes the state of that GPIO interface. When the GPIO interface state changes, the rotation assembly and/or the lifting assembly drives the display to rotate and/or lift according to the changed state.
In a possible implementation, the rotation assembly and/or the lifting assembly includes an MCU chip with an integrated Bluetooth module, so that the assembly supports Bluetooth functionality, such as Bluetooth Low Energy (BLE); the controller 250 may then communicate with the rotation assembly and/or the lifting assembly based on the Bluetooth protocol.
In some embodiments, the detection assembly includes a sensor for detecting the rotation state of the display and a sensor for detecting the lifting state of the display. During rotation or lifting of the display, the controller monitors the rotation state or lifting state of the display in real time according to the data detected by the posture detection assembly. For example, while controlling the display to rotate, information such as the rotation angle and angular speed is acquired by monitoring the sensor data; while controlling the display to rise or fall, information such as the lifting distance and lifting speed is acquired by monitoring the sensor data.
In some embodiments, the detection assembly is integrated with the driving assembly. For example, the sensor for detecting the rotation state of the display is included in the rotation assembly, and the sensor for detecting the lifting state of the display is included in the lifting assembly.
A drawing board application can be installed in the display device, so that the drawing board function it provides can be used. For example, a drawing board application can provide different types of pictures, and a user can perform operations such as painting on the pictures. When a user paints a picture, if a painted portion has the same color as the picture boundary, that portion may be mistaken for the picture boundary, so that the original picture boundary is damaged, the painting effect is affected, and the user experience is poor.
A display device includes a display and a controller, where the display is used for displaying a user interface.
The display device can have a touch interaction function, allowing a user to operate the device simply by lightly touching the display with a finger, which avoids keyboard, mouse, and remote controller operation and makes human-computer interaction more direct. Based on the display device 200, touch interaction may be supported by adding a touch component 276. In general, the touch component 276 and the display 260 may constitute a touch screen. A user can input different control instructions on the touch screen through touch operations; for example, the user may input click, slide, long-press, or double-click touch instructions, and different touch instructions may represent different control functions.
To implement the different touch actions, the touch control component 276 may generate different electrical signals when the user inputs different touch actions, and transmit the generated electrical signals to the controller 250. The controller 250 may perform feature extraction on the received electrical signal to determine a control function to be performed by the user based on the extracted features.
For example, when a user inputs a click touch action at the position of any program icon in the application interface, the touch component 276 senses the touch action and generates an electrical signal. After receiving the electrical signal, the controller 250 may first determine the duration of the level corresponding to the touch action in the electrical signal and, when the duration is less than a preset time threshold, recognize that the user has input a click touch instruction. The controller 250 then extracts the positional features generated by the electrical signal to determine the touch position. When the touch position is within the display range of an application icon, it is determined that the user has input a click touch instruction at the position of that application icon. Since, in this scene, the click touch instruction is used to run the corresponding application program, the controller 250 may start running that application.
For another example, when the user inputs a slide action in a media asset display page, the touch component 276 likewise sends the sensed electrical signal to the controller 250. The controller 250 first determines the duration of the signal corresponding to the touch action. When that duration is greater than the preset time threshold, the controller examines how the position of the signal changes; for this kind of interactive touch action the signal's position clearly changes over time, so the controller determines that the user has input a slide touch instruction. The controller 250 then determines the slide direction from the change in the signal position, and controls the display frame in the media asset display page to turn pages so as to display more media asset options. Further, the controller 250 may extract features such as the slide speed and slide distance of the slide touch instruction and perform the page-turning control according to these features, so as to achieve a follow-the-hand effect.
Similarly, for touch instructions such as double click and long press, the controller 250 can extract different features, determine the type of the touch instruction through feature judgment, and then execute the corresponding control function according to preset interaction rules. In some embodiments, the touch component 276 also supports multi-touch, so that a user can input touch actions on the touch screen with multiple fingers, e.g., multi-finger clicks, multi-finger long presses, and multi-finger slides.
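As an illustration only (the patent specifies no concrete thresholds or APIs), the duration-and-displacement classification described above could be sketched as follows; the 500 ms threshold and 10 px movement tolerance are assumed values:

    // Illustrative sketch of the classification described above: short
    // contact with no movement -> click; longer contact without movement
    // -> long press; a changing position -> slide. Thresholds are assumptions.
    enum TouchKind { CLICK, LONG_PRESS, SLIDE }

    static TouchKind classify(long durationMs, float dx, float dy) {
        final long TIME_THRESHOLD_MS = 500;   // assumed preset time threshold
        final float MOVE_TOLERANCE_PX = 10f;  // assumed movement tolerance
        if (Math.hypot(dx, dy) > MOVE_TOLERANCE_PX) {
            return TouchKind.SLIDE;           // signal position changed
        }
        return durationMs < TIME_THRESHOLD_MS ? TouchKind.CLICK : TouchKind.LONG_PRESS;
    }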
Touch actions can cooperate with specific application programs to realize specific functions. For example, after the user opens the drawing board application, the display 260 may present a drawing area; the user can draw a specific touch trajectory in the drawing area through slide touch instructions, and the controller 250 determines the touch pattern from the touches detected by the touch component 276 and controls the display 260 to render it in real time, satisfying the demonstration effect.
The display device may also have a drawing board function. The user can install a touch application related to the drawing board function, such as a drawing board application, on the display device. In the touch interfaces of such applications, a user can perform operations such as writing and drawing, and the display device can generate a touch trajectory according to the user's touch actions, realizing drawing board demonstration or entertainment functions.
In some embodiments, multiple applications may be installed in the display device. Fig. 5 shows a user interface in a display in a possible embodiment. The user may click "my applications" in the user interface to view the application list, which includes all applications installed on the display device.
Fig. 6 shows a schematic diagram of an application list in a possible embodiment. The display device may be equipped with a drawing board application, a player application, a video chat application, a camera application, and a mirror application. When the application list is displayed, the user can select one of the applications and open it to use its functions.
In some embodiments, when a user launches the drawing board application, the controller may control the display to show the electronic drawing board. The electronic drawing board interface displays interactive areas corresponding to one or more functions of the drawing board application; an interactive area can display text, images, icons, buttons, pull-down menus, check boxes, selectable lists, and the like. The user can touch the touch screen at the position where interaction is needed in order to interact with the interactive area. The display device detects the contact and responds to it by performing the corresponding operation.
In some embodiments, the electronic drawing board includes a drawing area and a control area. The drawing area is an area in which content can be input. The control area collectively displays controls corresponding to one or more functions of the drawing board application, such as a brush control, an eraser control, and a paint bucket control. A user can perform corresponding operations using each control and can also set the parameters of each control, such as the filling color and line style indicated by the brush control. Fig. 7 shows a schematic diagram of an electronic drawing board in a possible embodiment.
In some embodiments, a target picture may also be displayed in the drawing area, and a user may perform operations such as painting on the target picture. Fig. 8 shows a schematic diagram of a target picture in a possible embodiment. The target picture includes a color filling area and a region boundary, and the user can paint the color filling area. The user can also use the controls in the control area to perform various operations on the target picture in the drawing area, such as: inputting lines and graphics in the color filling area of the target picture with the brush control; filling the color filling area with color using the paint bucket control; or erasing content input by the user with the eraser control.
In some embodiments, the user may perform various operations on the target picture using the controls in the control area. For example, the user may trigger the brush control, putting it in a picked-up state; the user can then use the brush control to perform corresponding operations in the target picture. When detecting that the brush control is selected, the controller may further control the display to show the toolbar corresponding to the brush control. Fig. 9 is a schematic diagram of the toolbar corresponding to the brush control in a possible embodiment. Parameter information such as the color, thickness, and style of the brush line can be selected in the toolbar.
After selecting the parameter information of the brush control, the user can input content by touching the drawing area; the input content is the user's touch trajectory on the drawing area.
In some embodiments, when the user triggers the brush control, for example by clicking on it, the toolbar corresponding to the brush control is displayed. The user can select the filling color indicated by the brush control through this toolbar. When the user clicks the color option in the toolbar, the controller may control the display to show the filling color options, which include all colors that the brush control can indicate and the filling color currently indicated by the brush control. Fig. 10 is a schematic diagram of the color options in the toolbar in a possible embodiment. The user may view the filling color currently indicated by the brush control and may also set the filling color it indicates.
In the related art, when the filling color indicated by the brush control selected by the user is exactly the same as the color of the region boundary of the target picture, the content input by the user may be determined to be part of the region boundary, so that the original region boundary of the target picture is damaged. This degrades the user's subsequent drawing and seriously affects the user experience. Fig. 11 is a diagram illustrating content input by a user destroying the region boundary in the related art. L1 is a line input by the user; the system determines this line to be a region boundary, thereby destroying the original region boundary of the target picture.
In view of this problem, the display device provided by the present application can protect the region boundary of the target picture from being damaged by the content input by the user.
The display device may be provided with a region boundary protection mode. When the display device is in the region boundary protection mode, content input by the user cannot be judged to be a region boundary, so the region boundary of the target picture is protected.
In some embodiments, the user may send the region boundary protection mode instruction to the display apparatus by operating a designated key of the remote controller. In practical applications, the correspondence between the region boundary protection mode instruction and the remote controller key is bound in advance. For example, a region boundary protection mode key is set on the remote controller; when the user presses this key, the remote controller sends a region boundary protection mode instruction to the controller, and the controller then controls the display device to enter the region boundary protection mode. When the user presses the key again, the controller may control the display device to exit the region boundary protection mode.
In some embodiments, the region boundary protection mode instruction may also be bound in advance to a combination of remote controller keys; when the user presses the keys bound to the instruction, the remote controller sends the region boundary protection mode instruction. In a feasible embodiment, the bound keys are, in sequence, the direction keys (left, down, left, down); that is, when the user presses (left, down, left, down) in succession within a preset time, the remote controller sends the region boundary protection mode instruction to the controller. This binding method prevents the instruction from being sent through user misoperation. The embodiments of the present application merely give several exemplary bindings between the region boundary protection mode instruction and the keys; in practical applications, the binding may be set according to the user's habits, which is not limited here.
In some embodiments, the user may send the region boundary protection mode instruction to the display device by voice input, using a sound collector of the display device such as a microphone, to control the display device to enter the region boundary protection mode. An intelligent voice system may be provided in the display device; it can recognize the user's voice and extract the instruction content input by the user. The user can input a preset wake-up word through the microphone to activate the intelligent voice system, so that the controller can respond to instructions input by the user, and then, within a certain time, input the region boundary protection mode instruction, for example by saying "enter region boundary protection mode", to make the display device enter the region boundary protection mode.
In some embodiments, the user may also send the region boundary protection mode instruction to the display device through a preset gesture. The display device may detect the user's behavior through an image collector, such as a camera. When the user makes the preset gesture, the user is considered to have sent the region boundary protection mode instruction to the display device. For example, it can be set that when the user is detected drawing a "V" shape, it is determined that the user has input the region boundary protection mode instruction. The user can also send the instruction through a preset action; for example, it can be set that when the user is detected lifting the left foot and the right hand at the same time, it is determined that the user has input the region boundary protection mode instruction to the display device.
In some embodiments, when the user controls the display device using a smart device, for example a mobile phone, the region boundary protection mode instruction may also be sent to the display device. In practical applications, a control can be provided in the mobile phone through which the user can choose whether to enter the region boundary protection mode; the phone then sends the region boundary protection mode instruction to the controller, and the controller controls the display device to enter the region boundary protection mode.
In some embodiments, when the user controls the display device using a mobile phone, a continuous click command may be issued to the phone. A continuous click command means that, within a preset period, the number of times the user clicks the same area of the phone's touch screen exceeds a preset threshold. For example, when the user clicks a certain area of the touch screen 3 times in succession within 1 s, it is regarded as a continuous click command. After receiving the continuous click command, the phone can send the region boundary protection mode instruction to the display device, so that the controller controls the display device to enter the region boundary protection mode.
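For illustration only, a sliding-window counter of this kind might look as follows; the class name and parameters are hypothetical, and the same-area check is omitted for brevity (with windowMs = 1000 and threshold = 3 it matches the 3-clicks-in-1-s example above):

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Illustrative sketch: counts clicks within a sliding time window and
    // reports when the count reaches the threshold for a continuous click
    // command. The check that clicks land in the same area is omitted.
    class ContinuousClickDetector {
        private final Deque<Long> clickTimes = new ArrayDeque<>();
        private final long windowMs;
        private final int threshold;

        ContinuousClickDetector(long windowMs, int threshold) {
            this.windowMs = windowMs;
            this.threshold = threshold;
        }

        /** Returns true when this click completes a continuous click command. */
        boolean onClick(long nowMs) {
            clickTimes.addLast(nowMs);
            while (!clickTimes.isEmpty() && nowMs - clickTimes.peekFirst() > windowMs) {
                clickTimes.removeFirst();  // drop clicks outside the window
            }
            return clickTimes.size() >= threshold;
        }
    }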
In some embodiments, when the user controls the display device with a mobile phone, it can also be set that when the touch pressure the user applies to a certain area of the phone's touch screen exceeds a preset pressure threshold, the phone sends the region boundary protection mode instruction to the display device.
A region boundary protection mode option may also be provided in the UI of the display device; when the user clicks this option, the display device is controlled to enter or exit the region boundary protection mode.
In some embodiments, to prevent the user from triggering the region boundary protection mode by mistake, when the controller receives the region boundary protection mode instruction it may control the display to show region boundary protection mode confirmation information, so that the user performs a secondary confirmation of whether to control the display device to enter the mode. Fig. 12 shows the region boundary protection mode confirmation information displayed in the display in a possible embodiment.
FIG. 13 illustrates an interaction flow diagram for components of a display device in some embodiments.
In some embodiments, when the display device enters the region boundary protection mode, the controller may first detect whether the content the user is about to input would damage the region boundary of the target picture.
In some embodiments, in detecting whether the content to be input would damage the region boundary of the target picture, the controller may detect the filling color indicated by the brush control selected by the user. After the filling color is determined, the controller may further detect the color value corresponding to it, named the filling color value in the embodiments of the present application.
After determining the filling color value, the controller may further determine the color value of the pixels of the region boundary of the target picture. If the filling color value is the same as the color value of the boundary pixels, the filling color indicated by the brush control is exactly the same as the color of the region boundary. In that case, if the user draws with the currently selected filling color, the input content may damage the region boundary of the target picture. The filling color value therefore needs to be adjusted so that it differs from the color value of the boundary pixels and the region boundary of the target picture cannot be damaged. Fig. 14 shows a schematic diagram of a possible embodiment in which the content input by the user does not destroy the region boundary: L1 is a line input by the user whose color value differs from the color value of the boundary pixels, so the system does not determine the user's input to be a region boundary, and the original region boundary of the target picture is protected.
In some embodiments, the color value may be an ARGB value. ARGB is a color mode consisting of the RGB color mode plus a transparency (Alpha) channel. The RGB color mode obtains various colors by varying the three color channels of red (R), green (G), and blue (B) and superimposing them on one another.
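For concreteness, an ARGB color value is commonly represented as a packed 32-bit integer with one byte per channel (this packing follows the usual Android convention; it is an illustration, and the helper names are not from the patent):

    // ARGB packed into a 32-bit int, one byte per channel (Android convention).
    static int packArgb(int a, int r, int g, int b) {
        return (a << 24) | (r << 16) | (g << 8) | b;
    }

    static int[] unpackArgb(int argb) {
        return new int[] {
            (argb >>> 24) & 0xFF,  // Alpha (transparency channel)
            (argb >>> 16) & 0xFF,  // Red
            (argb >>> 8) & 0xFF,   // Green
            argb & 0xFF            // Blue
        };
    }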
In some embodiments, once the region boundary of the target picture is determined, its color is also determined and is uniform. The controller may obtain the color value of the boundary pixels, (R0, G0, B0, Alpha0).
After the user selects the filling color indicated by the brush control, the controller can determine the filling color value currently corresponding to that filling color, (R1, G1, B1, Alpha1). Whether the content input by the user would damage the region boundary of the target picture can be determined by judging whether the color value of the boundary pixels is the same as the filling color value.
In some embodiments, when it is detected that the content input by the user may damage the region boundary of the target picture, that is, when the filling color value is the same as the color value of the pixel points of the region boundary, the controller can fine-tune the filling color value by adding a preset value to it or subtracting a preset value from it. For example, the value of one color component of the filling color value (R, G, B, Alpha) can be increased or decreased by one, ensuring that the filling color value differs from the color value of the pixel points of the region boundary.
In some embodiments, considering that the human eye is most sensitive to green (the G component), less sensitive to red (the R component), and least sensitive to blue (the B component), when the filling color value is the same as the color value of the pixel points of the region boundary, the controller can fine-tune the B component of the filling color value, adding to or subtracting from its numerical value. In this way the filling color indicated by the brush control stays as close as possible to the original color, while the content input by the user cannot form a new region boundary, thereby protecting the original region boundary of the target picture.
When it is detected that the filling color indicated by the brush control selected by the user and the color of the region boundary are both black, the corresponding color value is (0, 0, 0, Alpha), and the value of the B component of the filling color value needs to be increased by one. When both are white, the corresponding color value is (255, 255, 255, Alpha), and the value of the B component needs to be decreased by one. When the filling color indicated by the brush control selected by the user is any other color, the value of the B component may be either increased or decreased by one, as set by the user.
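A sketch of the fine-tuning rule just described, reusing the ArgbColor type from the earlier sketch; the one-step change to the B component and the black/white special cases follow the text, while the function name and the choice of +1 for other colors with room to spare are assumptions:

```python
def adjust_fill_color(fill: ArgbColor, boundary: ArgbColor) -> ArgbColor:
    """Nudge the B component by one when the filling color value equals
    the color value of the region boundary's pixel points; the human eye
    is least sensitive to blue, so the visible change is minimal."""
    if fill != boundary:
        return fill  # no collision with the boundary, nothing to adjust
    # Black (0, 0, 0, Alpha): B cannot go lower, so increase it by one.
    # White (255, 255, 255, Alpha): B cannot go higher, so decrease it.
    # Other colors: the text leaves the direction to the user; this
    # sketch picks +1 whenever there is room.
    new_b = fill.b + 1 if fill.b < 255 else fill.b - 1
    return ArgbColor(fill.r, fill.g, new_b, fill.alpha)
```

For example, a black brush over a black boundary yields (0, 0, 1, Alpha), which renders indistinguishably from black yet no longer matches the boundary's color value.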
In some embodiments, when the filling color value is fine-tuned, other components may be adjusted instead, and the adjustment amount may be set by the user; this application does not limit this.
In some embodiments, after the filling color value is adjusted, it is guaranteed to differ from the color value of the pixel points of the region boundary, so the content input by the user is no longer judged by the system to be the region boundary of the target picture. The controller then takes the adjusted filling color value as the color value of the filling color currently indicated by the brush control.
When the filling color value selected by the user already differs from the color value of the pixel points of the region boundary, the filling color value does not need to be adjusted and can be used directly. In that case the controller takes the filling color value selected by the user as the color value of the filling color currently indicated by the brush control.
In some embodiments, once the filling color value is determined, the controller may also detect touch operations by the user. The user can pick up the brush control and paint the target picture in the drawing area; for example, the user can tap or move a finger on the touch screen to paint the color filling area in the target picture.
In response to the user's operation in the color filling area, the controller may use the touch component to detect the touch trajectory input by the user, and determine from that trajectory the area of the target picture the user wants to paint.
The controller can then update the color values of all pixel points in the touch trajectory according to the filling color value, inputting the filling color indicated by the brush control into the touch trajectory and thereby painting the color filling area.
In some embodiments, when updating the color values of all pixel points in the touch trajectory, the controller may replace the color values of all pixel points in the touch trajectory with the filling color value currently corresponding to the brush control.
Specifically, when the user picks up the brush control and paints in the drawing area, the controller can replace the color values of the pixel points in every area touched by the user with the filling color value; that is, every area touched by the user is set to the color selected by the user. For example, if the color value of a pixel point in the touch trajectory is (R1, G1, B1, Alpha1) and the filling color value of the filling color selected by the user is (R', G', B', Alpha'), the color values of all pixel points in the touch trajectory are set to the filling color value (R', G', B', Alpha').
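In code, this replacement update amounts to overwriting every pixel on the trajectory with the filling color value; a minimal sketch, where the picture is assumed to be a dict mapping (x, y) coordinates to ArgbColor values and the trajectory a list of coordinates (both representations are assumptions for illustration):

```python
def replace_trajectory_colors(picture: dict, trajectory: list,
                              fill: ArgbColor) -> None:
    """Set every pixel point in the touch trajectory to the filling
    color value, e.g. (R1, G1, B1, Alpha1) -> (R', G', B', Alpha')."""
    for x, y in trajectory:
        picture[(x, y)] = fill
```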
In some embodiments, when updating the color values of all pixel points in the touch trajectory, the controller may instead perform color mixing on the area corresponding to the touch trajectory, that is, perform color mixing on all pixel points in the touch trajectory.
Specifically, the controller may first determine the current first color value of each pixel point in the touch trajectory. For each pixel point, the current first color value and the filling color value are superposed to obtain a second color value, which is the result of the color mixing for that pixel point. The controller can then update the color value of each pixel point in the touch trajectory to the corresponding second color value, realizing the color mixing of the area corresponding to the touch trajectory. For example, if the first color value of a pixel point in the touch trajectory is (R1, G1, B1, Alpha1) and the filling color value of the filling color selected by the user is (R', G', B', Alpha'), the two are superposed to obtain the second color value (R2, G2, B2, Alpha2), and the controller sets the color value of that pixel point to (R2, G2, B2, Alpha2).
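The embodiments say the first color value and the filling color value are superposed without fixing a formula; one common reading is source-over alpha compositing, sketched below (the compositing formula itself is an assumption, not stated in the disclosure):

```python
def blend(first: ArgbColor, fill: ArgbColor) -> ArgbColor:
    """Source-over compositing of the filling color onto the pixel
    point's current first color value, yielding the second color value."""
    a_fill = fill.alpha / 255.0
    a_first = first.alpha / 255.0
    a_out = a_fill + a_first * (1.0 - a_fill)
    if a_out == 0.0:
        return ArgbColor(0, 0, 0, 0)  # fully transparent result

    def channel(c_fill: int, c_first: int) -> int:
        # Premultiply, composite, then un-premultiply and round to 0-255.
        return round((c_fill * a_fill
                      + c_first * a_first * (1.0 - a_fill)) / a_out)

    return ArgbColor(channel(fill.r, first.r), channel(fill.g, first.g),
                     channel(fill.b, first.b), round(a_out * 255))
```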
In some embodiments, after the color mixing, the resulting color value of some pixel points may be the same as the color value of the pixel points of the region boundary, that is, the second color value (R2, G2, B2, Alpha2) may equal the boundary color value (R0, G0, B0, Alpha0). The parts of the touch trajectory containing such pixel points could be judged by the system to be a region boundary of the target picture, so the color values of the pixel points in those parts need to be adjusted. For the specific adjustment method, refer to the steps above; details are not repeated here.
Adjusting the second color value yields a third color value that differs from the color value of the pixel points of the region boundary. The controller may then update the color values of the pixel points in that part of the touch trajectory to the third color value, ensuring that the content input by the user is not judged by the system to be a region boundary of the target picture. For the specific updating method, refer to the steps above; details are not repeated here.
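Combining the sketches above, the color-mixing update blends each pixel point and then re-checks the result against the boundary color, nudging any collision into a third color value (same representation assumptions as before):

```python
def mix_trajectory_colors(picture: dict, trajectory: list,
                          fill: ArgbColor, boundary: ArgbColor) -> None:
    """Blend the filling color into each pixel point of the trajectory,
    then make sure no second color value coincides with the color value
    of the region boundary's pixel points."""
    for x, y in trajectory:
        second = blend(picture[(x, y)], fill)
        # If blending happened to reproduce the boundary color, adjust
        # it into a distinct third color value.
        picture[(x, y)] = adjust_fill_color(second, boundary)
```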
In some embodiments, the user may also switch the filling color indicated by the brush control. When such a switch is detected, the controller may determine the filling color value of the switched filling color.
When the user then continues to touch the touch screen, the controller may continue to monitor the touch trajectory input by the user and update the color values of all pixel points in the trajectory according to the switched filling color value. For the specific updating method, refer to the steps above; details are not repeated here.
In some embodiments, the user may also use a paint bucket control. After the user clicks the paint bucket control, it enters the picked-up state; the user can then select the color indicated by the paint bucket control and click a color filling area, whereupon the color of that area is updated to the color indicated by the paint bucket control, realizing color filling of the area.
When the color indicated by the paint bucket control selected by the user is exactly the same as the color of the region boundary of the target picture, that is, when the color value corresponding to the paint bucket control equals the color value of the pixel points of the region boundary, the color filling area might be judged by the system to be a region boundary after the filling.
Therefore, after the user selects the color indicated by the paint bucket control, the controller can judge whether the corresponding color value is the same as the color value of the pixel points of the region boundary. If so, the controller can adjust the color value corresponding to the paint bucket control so that it differs from the color value of the pixel points of the region boundary. For the specific adjustment method, refer to the steps above; details are not repeated here.
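The same boundary-safety check carries over to the paint bucket control; below is a sketch of a simple four-way flood fill that first adjusts the bucket color if it collides with the boundary color (the flood-fill algorithm itself is an assumption — the disclosure only requires that the filled color differ from the boundary's color value):

```python
from collections import deque

def bucket_fill(picture: dict, start: tuple,
                bucket: ArgbColor, boundary: ArgbColor) -> None:
    """Flood-fill the color filling area around `start`, stopping at
    boundary-colored pixels; the bucket color is nudged first so the
    filled area can never be mistaken for a region boundary."""
    safe_color = adjust_fill_color(bucket, boundary)
    target = picture[start]
    if target == safe_color or target == boundary:
        return  # nothing to fill, or the user clicked the boundary itself
    queue = deque([start])
    while queue:
        x, y = queue.popleft()
        if picture.get((x, y)) != target:
            continue  # outside the picture, already filled, or boundary
        picture[(x, y)] = safe_color
        queue.extend([(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)])
```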
An embodiment of the present application further provides a color setting method, which is applied to a display device, and as shown in fig. 15, the method includes:
step 1501, displaying an electronic drawing board, wherein the electronic drawing board comprises a brush control and a drawing area, the drawing area is used for displaying a target picture, the target picture comprises a color filling area and a region boundary, and the brush control is used for inputting content in the color filling area;
step 1502, responding to a trigger operation of a user on the brush control, and determining a filling color indicated by the brush control;
step 1503, detecting a filling color value corresponding to the filling color;
step 1504, when the filling color value is the same as the color value of the pixel point of the region boundary, adjusting the filling color value so that the filling color value is different from the color value of the pixel point of the region boundary.
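Read together, steps 1501-1504 reduce to a small piece of control logic; a sketch of the color-check portion of the method using the helpers above (display and UI plumbing omitted, the function name is assumed):

```python
def on_brush_selected(fill: ArgbColor, boundary: ArgbColor) -> ArgbColor:
    """Steps 1502-1504: determine the filling color indicated by the
    brush control, detect its filling color value, and adjust it when
    it matches the color value of the region boundary's pixel points."""
    return adjust_fill_color(fill, boundary)

# Example: the user picks a brush color equal to an opaque black
# boundary; the returned filling color value differs only in B.
safe = on_brush_selected(ArgbColor(0, 0, 0, 255), ArgbColor(0, 0, 0, 255))
assert safe == ArgbColor(0, 0, 1, 255)
```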
The same and similar parts in the embodiments in this specification may be referred to one another, and are not described herein again.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of software products, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and include instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method in the embodiments or some parts of the embodiments of the present invention.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, comprising:
the display is configured to display an electronic drawing board, the electronic drawing board comprises a brush control and a painting area, the painting area is used for displaying a target picture, the target picture comprises a color filling area and a region boundary, and the brush control is used for inputting content in the color filling area;
a controller configured to:
determining the filling color indicated by the brush control in response to the triggering operation of the brush control by a user;
detecting a filling color value corresponding to the filling color;
and when the filling color value is the same as the color value of the pixel point of the region boundary, adjusting the filling color value so as to enable the filling color value to be different from the color value of the pixel point of the region boundary.
2. The display device of claim 1, wherein the controller is further configured to:
prior to performing the step of determining the fill color indicated by the brush control,
responding to the triggering operation of the user on the brush control, and controlling the display to display filling color information so that the user selects the filling color indicated by the brush control; the filling color information includes all colors that the brush control can indicate and the filling color currently indicated by the brush control.
3. The display device of claim 1, wherein the controller is further configured to:
in performing the step of adjusting the fill color values,
and adding a preset value to or subtracting a preset value from the numerical value of a first color component of the filling color value, so that the filling color value is different from the color value of the pixel point of the region boundary.
4. The display device of claim 1, wherein the controller is further configured to:
and when the filling color value is different from the color value of the pixel point of the region boundary, not adjusting the filling color value.
5. The display device according to claim 1, further comprising:
a touch component configured to detect a touch trajectory input by a user;
the controller is further configured to:
responding to the operation of a user in the color filling area, and detecting a first touch trajectory input by the user;
and updating the color values of all the pixel points in the first touch trajectory according to the filling color value.
6. The display device of claim 5, wherein the controller is further configured to:
in the step of updating the color values of all the pixel points in the first touch trajectory according to the filling color value,
and replacing the color values of all the pixel points in the first touch trajectory with the filling color value.
7. The display device of claim 6, wherein the controller is further configured to:
in the step of updating the color values of all the pixel points in the first touch trajectory according to the filling color value,
acquiring a current first color value of a pixel point in the first touch trajectory;
superposing the first color value and the filling color value to obtain a second color value;
and updating the color value of the pixel point in the first touch trajectory to the second color value.
8. The display device of claim 7, wherein the controller is further configured to:
after the step of updating the color value of the pixel point in the first touch trajectory to the second color value is performed,
detecting whether the second color value is the same as the color value of the pixel point of the region boundary;
and when the second color value is the same as the color value of the pixel point of the region boundary, adjusting the second color value to obtain a third color value, and updating the color value of the pixel point in the first touch trajectory to the third color value.
9. The display device of claim 5, wherein the controller is further configured to:
in response to an instruction input by a user for switching the filling color indicated by the brush control, determining a filling color value of the switched filling color;
responding to the operation of the user in the color filling area, and detecting a second touch trajectory input by the user;
and updating the color values of all pixel points in the second touch trajectory according to the switched filling color value.
10. A color setting method applied to a display device, the method comprising:
the method comprises the steps that an electronic drawing board is displayed, the electronic drawing board comprises a drawing pen control and a drawing area, the drawing area is used for displaying a target picture, the target picture comprises a color filling area and an area boundary, and the drawing pen control is used for inputting content in the color filling area;
determining the filling color indicated by the brush control in response to the triggering operation of the brush control by a user;
detecting a filling color value corresponding to the filling color;
and when the filling color value is the same as the color value of the pixel point of the region boundary, adjusting the filling color value so as to enable the filling color value to be different from the color value of the pixel point of the region boundary.
CN202110736313.3A 2021-06-30 2021-06-30 Display apparatus and color setting method Pending CN113485614A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110736313.3A CN113485614A (en) 2021-06-30 2021-06-30 Display apparatus and color setting method
PCT/CN2022/096009 WO2023273761A1 (en) 2021-06-30 2022-05-30 Display device and image processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110736313.3A CN113485614A (en) 2021-06-30 2021-06-30 Display apparatus and color setting method

Publications (1)

Publication Number Publication Date
CN113485614A true CN113485614A (en) 2021-10-08

Family

ID=77937681

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110736313.3A Pending CN113485614A (en) 2021-06-30 2021-06-30 Display apparatus and color setting method

Country Status (1)

Country Link
CN (1) CN113485614A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115480651A (en) * 2022-11-04 2022-12-16 深圳润方创新技术有限公司 Control method of electronic drawing board with copy content analysis function and electronic drawing board


Similar Documents

Publication Publication Date Title
CN113810746B (en) Display equipment and picture sharing method
CN114501107A (en) Display device and coloring method
CN112799627B (en) Display apparatus and image display method
CN112672199B (en) Display device and multi-layer overlapping method
CN113268199A (en) Display device and function item setting method
CN112584211A (en) Display device
CN115129214A (en) Display device and color filling method
CN114501108A (en) Display device and split-screen display method
CN114115637A (en) Display device and electronic drawing board optimization method
CN113593488A (en) Backlight adjusting method and display device
CN113490024A (en) Control device key setting method and display equipment
CN113485614A (en) Display apparatus and color setting method
CN112860331A (en) Display device and voice interaction prompting method
CN112947800A (en) Display device and touch point identification method
CN112650418B (en) Display device
CN112926420B (en) Display device and menu character recognition method
CN112947783B (en) Display device
CN115562544A (en) Display device and revocation method
CN112732120A (en) Display device
CN114296623A (en) Display device
CN112668546A (en) Video thumbnail display method and display equipment
CN113542882A (en) Method for awakening standby display device, display device and terminal
CN113573112A (en) Display device and remote controller
CN114281284B (en) Display apparatus and image display method
CN115550718A (en) Display device and display method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination