CN114968049A - Display device and color rendering method - Google Patents

Info

Publication number
CN114968049A
CN114968049A
Authority
CN
China
Prior art keywords
parameter
color
value
user
path pixel
Prior art date
Legal status
Pending
Application number
CN202210564855.1A
Other languages
Chinese (zh)
Inventor
鲁好锦
张足刚
董率
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202210564855.1A
Publication of CN114968049A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0485 Scrolling or panning
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

The application provides a display device and a color rendering method. In response to a sliding operation input by a user in a drawing area, the device acquires brush parameters, action parameters, and the path pixel points through which the sliding operation passes; generates effect color values based on the brush parameters and the action parameters; and sets the colors of the path pixel points and of their associated regions according to the effect color values. The color, transparency, shading radius, and other effects therefore differ from one path pixel point to the next, so the device can display the realistic drawing effect of a brush with a color-shading effect, such as a writing brush, improving the user experience.

Description

Display device and color rendering method
Technical Field
The application relates to the technical field of smart television drawing boards, in particular to a display device and a color rendering method.
Background
The display device refers to a terminal device capable of outputting a specific display picture, such as a smart television, a mobile terminal, a smart advertising screen, or a projector. Taking a smart television as an example, the smart television is based on Internet application technology, has an open operating system and chip and an open application platform, supports bidirectional human-computer interaction, and integrates audio-video, entertainment, education, data, and other functions into a television product that meets the diversified and personalized needs of users. For example, the display device may have a drawing board function: the user selects a brush style through interaction with the display device and leaves a touch trajectory on it, and the display device fills the pixel points through which the touch trajectory passes with color, thereby implementing the drawing board function.
When the user uses the drawing board function of the display device, every pixel point along a single touch trajectory is filled with the same color. This reproduces the drawing effect of brushes such as pencils and pens well enough, but it cannot display the color-shading effect that brushes such as writing brushes produce during drawing, which detracts from the user experience.
Disclosure of Invention
The application provides a display device and a color rendering method to solve the problem that existing display devices cannot display the realistic drawing effect of a brush with a color-shading effect, such as a writing brush, thereby improving the user experience.
In one aspect, the present application provides a display device, comprising:
the display is used for displaying an electronic drawing board interface, the electronic drawing board interface comprises a control area and a drawing area, and the control area comprises at least one brush control;
the touch control assembly is used for detecting a touch control pressure value and touch control duration when a user performs touch control operation on the display;
a controller configured to:
in response to a sliding operation input by a user in the drawing area, acquire brush parameters including an initial color parameter, a water dipping amount parameter, and a drawing paper style parameter;
detect action parameters of the sliding operation and the path pixel points through which the sliding operation passes, the action parameters including a stroke direction parameter and a writing force parameter;
generate an effect color value according to the brush parameters and the action parameters; and
set the colors of the path pixel points and of the associated regions of the path pixel points according to the effect color value.
In another aspect, the present application provides a color rendering method, including:
in response to a sliding operation input by a user in a drawing area, acquiring brush parameters including an initial color parameter, a water dipping amount parameter, and a drawing paper style parameter;
detecting action parameters of the sliding operation and the path pixel points through which the sliding operation passes, the action parameters including a stroke direction parameter, a sliding time parameter, and a writing force parameter;
generating an effect color value according to the brush parameters and the action parameters; and
setting the colors of the path pixel points and of the associated regions of the path pixel points according to the effect color value.
According to the above technical solutions, the display device and the color rendering method provided by the application acquire, in response to a sliding operation input by the user in the drawing area, the brush parameters, the action parameters, and the path pixel points through which the sliding operation passes; generate effect color values based on the brush parameters and the action parameters; and set the colors of the path pixel points and of their associated regions according to the effect color values. The color, transparency, shading radius, and other effects therefore differ from one path pixel point to the next, so the realistic drawing effect of a brush with a color-shading effect, such as a writing brush, can be displayed, improving the user experience.
Drawings
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus provided by an example of the present application;
Fig. 2 is a block diagram of a hardware configuration of a display device exemplarily provided by the present application;
Fig. 3 is a diagram of the software configuration in a display device exemplarily provided by the present application;
Fig. 4 is a schematic diagram of an electronic drawing board interface shown in some embodiments of the present application;
Fig. 5 is a schematic diagram of an electronic drawing board interface shown in some embodiments of the present application;
Fig. 6 is a flowchart of a method for generating a water dipping amount parameter exemplarily provided by the present application;
Fig. 7 is an electronic drawing board interface displaying a drawing paper toolbar exemplarily provided by the present application;
Fig. 8 is a flowchart of displaying a touch trajectory exemplarily provided by the present application;
Fig. 9 is a schematic diagram of an electronic drawing board interface shown in some embodiments of the present application;
Fig. 10 is a schematic diagram of an electronic drawing board interface shown in some embodiments of the present application;
Fig. 11 is a flowchart of a method for generating a transparency value exemplarily provided by the present application;
Fig. 12 is a schematic diagram of the relationship between touch pressure values and transparency values exemplarily provided by the present application;
Fig. 13 is a flowchart of a method for generating a transparency value exemplarily provided by the present application;
Fig. 14 is a flowchart of a method for generating a transparency value exemplarily provided by the present application;
Fig. 15 is a graph of shading radius values versus water dipping amount parameters exemplarily provided by the present application;
Fig. 16 is a flowchart of a method for generating a shading radius value exemplarily provided by the present application;
Fig. 17 is a schematic diagram of a path pixel point association region exemplarily provided by the present application;
Fig. 18 is a flowchart of generating effect color values exemplarily provided by the present application;
Fig. 19 is a schematic diagram of path pixel points with set colors exemplarily provided by the present application.
Detailed Description
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described exemplary embodiments are only some, not all, of the embodiments of the present application.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of an operation scenario between a display device 200 and a control apparatus 100 according to an embodiment. As shown in fig. 1, the user may operate the display device 200 through the smart device 300 or the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote controller, and the communication between the remote controller and the display device 200 includes an infrared protocol communication or a bluetooth protocol communication, and other short-distance communication methods, and controls the display device 200 in a wireless or wired manner. The user may input a user instruction through a key on a remote controller, voice input, control panel input, etc., to control the display apparatus 200.
In some embodiments, the smart device 300 (e.g., mobile terminal, tablet, computer, laptop, etc.) may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device.
In some embodiments, the display device 200 may also be controlled in manners other than through the control apparatus 100 and the smart device 300. For example, a user's voice instruction may be received directly by a module configured inside the display device 200, or by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers.
Fig. 2 shows a hardware configuration block diagram of a display device 200 according to an exemplary embodiment.
In some embodiments, the display device 200 includes a touch component through which the display device 200 implements a touch interaction function: the user operates the device by simply touching the display 260 with a finger, without needing a keyboard, mouse, or remote controller, making human-computer interaction more direct. On the touch display 260, the user can input different control instructions through touch operations; when the user touches the screen, the touch component detects the touch action, the touch strength, and the continuous touch time. For example, the user may input touch actions such as clicking, sliding, long-pressing, and double-clicking, where different touch actions represent different touch instructions that trigger different control functions. The user may also input touch actions of different strengths: for example, when the display 260 shows a drawing interface, touching the display 260 may generate an instruction for displaying a function control, and touching it again may generate an instruction for selecting a drawing target in the drawing interface. Alternatively, the user may input touch actions of different durations: if the duration of touching a function control exceeds a threshold, an instruction for hiding the control is generated; if it does not, an instruction for selecting the touched control is generated.
In order to implement the different touch actions, the touch component may detect touch factors such as a touch action, a touch force, and a continuous touch time of the user when the user inputs a touch operation, generate different electrical signals based on the detected touch factors, and send the generated electrical signals to the controller 250. The controller 250 may perform feature extraction on the received electrical signal to determine a control function to be performed by the user based on the extracted features. For example, when a user inputs a click touch action at any program icon position in the application program interface, the touch component senses the touch action and generates an electrical signal. After receiving the electrical signal, the controller 250 may first determine a duration of a level corresponding to the touch action in the electrical signal, and when the duration is less than a preset time threshold, recognize that the user inputs a click touch command. The controller 250 extracts the position characteristics generated by the electrical signals to determine the touch position. And when the touch position is within the display range of the application icon, determining that the user inputs a click touch instruction at the position of the application icon. Accordingly, the click touch command is used to execute a function of running a corresponding application program in the current scene, so that the controller 250 may start running the corresponding application program.
For another example, when the user inputs a sliding motion on a media asset display page, the touch component also sends the sensed electrical signal to the controller 250. The controller 250 first determines the duration of the signal corresponding to the touch action in the electrical signal. When the determined duration is longer than the preset time threshold, the controller judges the position change of the signal; for this interactive touch action, the generation position of the signal changes, so the controller determines that the user has input a sliding touch instruction. The controller 250 then determines the sliding direction of the instruction from the change in the signal's position and controls the display frame in the media asset display page to turn pages so as to display more media asset options. Further, the controller 250 may extract features such as the sliding speed, sliding distance, and sliding force of the instruction, and control the page-turning animation according to the extracted features to achieve a hand-following effect.
Similarly, for touch instructions such as double-click and long-press, the controller 250 extracts different features, determines the type of the touch instruction through feature judgment, and executes the corresponding control function according to preset interaction rules, as shown in the sketch below. In some embodiments, the touch component also supports multi-touch, so that the user can input touch actions with multiple fingers, e.g., multi-finger click, multi-finger long-press, and multi-finger slide.
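By way of illustration only, the following Java sketch shows one way the duration- and displacement-based classification described above could look; the thresholds and all names are assumptions, not taken from the patent.

```java
/**
 * Minimal sketch of the touch-instruction classification described above.
 * Thresholds and names are illustrative assumptions, not from the patent.
 */
public final class TouchClassifier {

    public enum Instruction { CLICK, SLIDE, LONG_PRESS }

    private static final long PRESET_TIME_THRESHOLD_MS = 200; // assumed preset time threshold
    private static final double MOVE_EPSILON_PX = 10.0;       // assumed minimal displacement

    /**
     * Classifies a touch from the duration of its electrical signal and the
     * displacement between where the signal started and where it ended.
     */
    public static Instruction classify(long durationMs, double dx, double dy) {
        if (durationMs < PRESET_TIME_THRESHOLD_MS) {
            return Instruction.CLICK;            // short signal: click instruction
        }
        double displacement = Math.hypot(dx, dy);
        return displacement > MOVE_EPSILON_PX
                ? Instruction.SLIDE              // generation position changed: slide
                : Instruction.LONG_PRESS;        // long, stationary contact
    }
}
```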
A touch action can also cooperate with a specific application program to implement a specific function. For example, after the user opens the drawing board application, the display 260 presents a drawing area; the user draws a specific touch action trajectory in the drawing area through a sliding touch instruction, and the controller 250 determines the touch action pattern from the touch action detected by the touch component and controls the display 260 to render it in real time.
In some embodiments, the display apparatus 200 further comprises at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, a user interface.
In some embodiments, the controller 250 includes a processor, a video processor, an audio processor, a graphic processor, a RAM, a ROM, a first interface to an nth interface for input/output.
In some embodiments, the display 260 includes a display screen component for presenting a picture, and a driving component for driving image display, a component for receiving an image signal from the controller 250, performing display of video content, image content, and a menu manipulation interface, and a user manipulation UI interface.
In some embodiments, the display 260 may be a liquid crystal display, an OLED display, or a projection display, and may also be a projection device with a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the external control device 100 or the server 400 through the communicator 220.
In some embodiments, the user interface may be configured to receive control signals from the control apparatus 100 (e.g., an infrared remote control).
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the tuner demodulator 210 receives broadcast television signals via wired or wireless reception, and demodulates audio/video signals, as well as EPG data signals, from a plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 250 controls the operation of the display device 200 and responds to user operations through various software control programs stored in the memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink, an icon, or other actionable control. The operations related to the selected object are: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon.
In some embodiments, the controller 250 includes at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a Random Access Memory (RAM), a Read-Only Memory (ROM), first to nth interfaces for input/output, a communication bus (Bus), and the like.
The CPU processor is configured to execute operating system and application program instructions stored in the memory, and to execute various application programs, data, and content according to interactive instructions received from external input, so as to ultimately display and play various audio-video content. The CPU processor may include a plurality of processors, e.g., a main processor and one or more sub-processors.
In some embodiments, a graphics processor generates various graphical objects, such as icons, operation menus, and graphics displayed for user input instructions. The graphics processor includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays the resulting objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for display on the display 260.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on display 260, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that the user can receive. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
In some embodiments, the system of the display device 200 may include a kernel, a command parser (shell), a file system, and application programs. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel starts, activates kernel space, abstracts the hardware, initializes hardware parameters, and operates and maintains virtual memory, the scheduler, signals, and inter-process communication (IPC). After the kernel starts, the shell and user application programs are loaded. At startup, an application program is compiled into machine code, forming a process.
Referring to fig. 3, in some embodiments, the system is divided into four layers, which are an Application (Applications) layer (abbreviated as "Application layer"), an Application Framework (Application Framework) layer (abbreviated as "Framework layer"), an Android runtime (Android runtime) and system library layer (abbreviated as "system runtime library layer"), and a kernel layer, respectively, from top to bottom.
In some embodiments, at least one application program runs in the application layer. The application programs may be window programs carried by the operating system, system setting programs, clock programs, and the like, or applications developed by third-party developers. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an Application Programming Interface (API) and a programming framework for the applications. The application framework layer includes a number of predefined functions and acts as a processing center that decides how the applications in the application layer act. Through the API interface, an application can access system resources and obtain system services during execution.
As shown in fig. 3, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the activity manager manages the lifecycle of the various applications as well as general navigation fallback functions, such as controlling the exit, opening, and fallback of applications. The window manager manages all window programs, for example obtaining the display screen size, judging whether there is a status bar, locking the screen, taking screenshots, and controlling changes to the display window (for example, shrinking the window, or displaying shake or distortion effects).
In some embodiments, the system runtime library layer provides support for the framework layer above it; when the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions the framework layer needs.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 3, the kernel layer contains at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, WiFi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, pressure sensor, etc.), power driver, and the like.
In some embodiments, controller 250 controls the operation of display device 200 and responds to user operations associated with display 260 by running various software control programs (e.g., an operating system and/or various application programs) stored on a memory. For example, control presents a user interface on display 260, including several UI objects thereon; in response to a received user command for a UI object on the user interface, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the display device 200 may have a plurality of functions, for example a drawing board function, a karaoke function, a magic mirror function, a video call function, a player function, and the like, where the drawing board function may be implemented by an application related to that function installed on the display device 200. For convenience of description, this application is referred to as the "drawing board application"; after the display device 200 starts the drawing board application, an electronic drawing board interface is presented on the display 260. Areas of the electronic drawing board interface display the user interface objects, information, and/or input content corresponding to one or more functions of the drawing board application. The user interface objects are the objects that constitute the electronic drawing board interface, and may include, but are not limited to, text, images, icons, soft keys (or "virtual buttons"), drop-down menus, radio buttons, check boxes, selectable lists, and the like. The displayed user interface objects may include non-interactive objects for conveying information or forming the appearance of the interface, interactive objects available for user interaction, or a combination of both. The user interacts with a user interface object by contacting the touch screen at the location corresponding to the interactive object of interest. The display device 200 detects the contact and responds by performing the operation corresponding to the interaction with that object, thereby enabling drawing in the drawing board application.
In some embodiments, some or all of the steps involved in embodiments of the present application are implemented within an operating system and/or within an application program. A sketchpad application for implementing some or all of the steps of an embodiment of the present application may be stored in the memory, and the controller 250 controls the operation of the display device 200 and responds to user operations related to the application by running the application in an operating system.
It should be noted that the display device 200 according to the embodiment of the present application includes, but is not limited to, the display device 200 described in the above embodiment, and may also be other terminal devices having an image display function, a data processing function, and an information transceiving function, such as a portable mobile terminal, e.g., a mobile phone, a tablet computer, and the like. The embodiments of the present application will be described in detail below with reference to the display device 200 as an example.
In some embodiments, the electronic palette interface includes a drawing area 610 and a control area 620. The drawing area 610 is an area in which content can be input, and the control area 620 is used for collectively displaying user interface objects and information corresponding to one or more functions of the drawing board application. The user interface object includes, but is not limited to, a brush control, an eraser control, a color drawing control, and the like, and the information includes various parameter information corresponding to the brush control, such as a current input color, an optional color, a thickness, a line shape, and the like.
In some embodiments, a brush control may correspond to a pencil, a pen, or the like. When the user selects such a brush control, the touch trajectory left on the display device 200 has the drawing effect of that control, with the color and line size generally uniform along the trajectory. This makes it difficult to display the realistic drawing effect of a brush, such as a writing brush, that produces a color-shading effect during drawing, which detracts from the user experience.
Fig. 4 is an electronic drawing board interface diagram shown in some embodiments of the present application. As shown in fig. 4, the electronic drawing board interface includes a drawing area 610 and a control area 620. The drawing area 610 receives content input by the user through the controls in the control area 620 and displays the received content, such as lines, graphics, and text. The control area 620 displays various functional controls, and includes at least a plurality of brush controls 621a, 621b, 621c, and 621d of different styles, an erasing control 622, a drawing paper control 623, a removing control 624, and a restoring control 625. The brush control 621a may correspond to a writing brush effect. When the brush control 621a is selected, the display 260 is triggered to display the brush toolbar 6211 corresponding to the brush control 621a, shown in fig. 5. The brush toolbar 6211 includes items for setting the brush color and the brush thickness. After the user sets the brush color and thickness, the brush corresponding to the brush control 621a is picked up; in the pick-up state, the user may input content through contact with the drawing area 610, the input content being the user's touch trajectory on the drawing area 610.
In some embodiments, a user may perform a selection operation on any item for setting a brush color, and the controller 250 invokes an initial color parameter corresponding to the item for setting the brush color selected by the user in response to the selection operation of the user, and sets an input color of the brush control 621a according to the initial color parameter.
In some embodiments, the brush toolbar 6211 further includes a water dipping control 6212. The user may perform a selection operation on the water dipping control 6212; in response, the controller 250 detects the duration for which the control is selected and generates a water dipping amount parameter based on that duration. The input color of the brush control 621a may then be set according to the water dipping amount parameter, so that the output of the brush control 621a matches the real output of a brush, such as a writing brush, that needs to be dipped in water.
In some embodiments, a time threshold corresponding to the maximum water dipping amount parameter may be preset, together with a functional relationship between the water dipping amount parameter and the duration of the user's selection operation on the water dipping control before the maximum is reached. For example, let the maximum water dipping amount parameter be V_max and its corresponding time threshold be T_max. If V denotes the generated water dipping amount parameter and T denotes the duration of the selection operation, the relationship between V and T can be expressed by the following piecewise function:
V = A × T, for T < T_max;  V = V_max, for T ≥ T_max
where A is the water dipping amount gradient coefficient, 0 < A ≤ 1. The value of A can be preset to control how quickly the water dipping amount parameter grows: the larger A is, the faster the parameter increases, and the parameter is positively correlated with the duration. For example, with A = 0.2 and V_max = 100%, the parameter reaches its maximum when the duration reaches 5 s (T = 5, V = V_max = 100%). With A = 0.02, a duration of 20 s gives V = 40% (T = 20, V = 40%), and the parameter reaches its maximum when the duration reaches 50 s (T = 50, V = V_max = 100%).
Fig. 6 is a flowchart of a method for generating a water dipping amount parameter exemplarily provided by the present application. As shown in fig. 6, the method includes: step S601, when the user selects the water dipping control 6212, the controller 250 responds to the selection operation and detects the duration for which the control is selected; step S602, if the duration is less than the time threshold, the function V = A × T is invoked; step S603, if the duration is greater than or equal to the time threshold, the function V = V_max is invoked; and step S604, the water dipping amount parameter is generated based on the invoked function.
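A minimal Java sketch of steps S601 to S604, assuming V and T are represented as plain doubles (class and method names are illustrative, not from the patent):

```java
/** Sketch of steps S601-S604: V = A*T below the time threshold, V_max at or above it. */
public final class DipAmount {
    private final double a;     // water dipping amount gradient coefficient, 0 < A <= 1
    private final double vMax;  // maximum water dipping amount parameter (1.0 == 100%)

    public DipAmount(double a, double vMax) {
        this.a = a;
        this.vMax = vMax;
    }

    /** Generates the water dipping amount parameter V from the selection duration T (seconds). */
    public double fromDuration(double tSeconds) {
        return Math.min(a * tSeconds, vMax);
    }
}
```

With A = 0.2 and V_max = 1.0, `new DipAmount(0.2, 1.0).fromDuration(5)` returns 1.0 (100%), matching the worked example above.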
In some embodiments, after generating the water dipping amount parameter, the controller 250 may control it to be displayed at a preset position in the electronic drawing board interface to indicate to the user how much water the brush holds. For example, referring to fig. 5, the generated water dipping amount parameter may be displayed on the water dipping control.
In some embodiments, the user may select the drawing paper control 623. The controller 250 detects the selection and controls the electronic drawing board interface to display a drawing paper toolbar 6221. Referring to fig. 7, an electronic drawing board interface displaying the drawing paper toolbar as exemplarily provided by the present application, the drawing paper toolbar 6221 includes at least a plurality of drawing paper controls of different styles. By selecting different drawing paper controls, the user sets the drawing background, and the brush control 621a produces different effects on different backgrounds. The drawing paper styles include, but are not limited to, gouache paper, oil painting paper, sketch paper, rice paper, and the like.
In some embodiments, when the user selects a drawing paper control, the controller 250 invokes the drawing paper style parameter corresponding to that control and, based on it, controls the corresponding drawing paper style to be presented in the electronic drawing board interface.
In some embodiments, a default drawing paper style parameter may be preset; if the controller 250 does not detect a selection of a drawing paper control, it invokes the default parameter and controls the corresponding drawing paper style to be presented in the electronic drawing board interface.
In some embodiments, after detecting that the user has set the drawing parameters, the controller 250 controls the brush control 621a to enter the pick-up state; in this state, the user may input content through contact with the drawing area 610, the input content being the user's touch trajectory on the drawing area 610.
Fig. 8 is a flowchart of displaying a touch trajectory as exemplarily provided by the present application. As shown in fig. 8, the flow includes: S801, the user draws a specific touch trajectory in the drawing area through a sliding operation, and the controller 250 obtains the brush parameters and the action parameters in response to the sliding operation input in the drawing area 610; S802, the controller 250 sets the colors of the pixel points through which the sliding operation passes based on the brush parameters and the action parameters. The brush parameters include at least the initial color parameter and the water dipping amount parameter generated when the input color and water dipping amount of the brush control 621a were set, and the drawing paper style parameter generated when the corresponding drawing paper control was selected. The action parameters are generated when the controller 250 detects the sliding operation in the drawing area 610, and include at least a stroke direction parameter and a writing force parameter.
In some embodiments, when the user inputs a sliding operation in the drawing area 610 by touching the display 260, the touch component detects the positions the sliding operation passes through and generates their coordinates in the reference coordinate system of the display 260. In response to the sliding operation, the controller 250 acquires the starting coordinates and the real-time coordinates generated by the touch component, where the starting coordinates are those of the starting point of the sliding operation and the real-time coordinates are those of the path pixel points the sliding operation passes through, both in the reference coordinate system of the display 260. The controller 250 then determines the sliding trajectory between the starting point and each path pixel point from these coordinates and generates the stroke direction parameter from the trajectory. The stroke direction parameter contains the coordinates of every pixel point the sliding operation passed through, and thus reflects the sliding direction and sliding path of the user's operation.
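As an illustration of how the starting and real-time coordinates might be accumulated into a stroke direction parameter, consider the following hedged Java sketch; the class, method, and record names are assumptions for illustration only.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Illustrative sketch (names assumed): accumulating the starting coordinates and
 * the real-time coordinates reported by the touch component into a stroke
 * direction parameter, i.e. the list of path pixel coordinates.
 */
public final class StrokeTracker {

    /** A coordinate in the reference coordinate system of the display. */
    public record Point(double x, double y) {}

    private final List<Point> pathPixels = new ArrayList<>();

    /** Called once with the starting coordinates when the slide begins. */
    public void onSlideStart(double x, double y) {
        pathPixels.clear();
        pathPixels.add(new Point(x, y));
    }

    /** Called with the real-time coordinates of each path pixel the slide passes. */
    public void onSlideMove(double x, double y) {
        pathPixels.add(new Point(x, y));
    }

    /** The stroke direction parameter: every coordinate the slide has passed. */
    public List<Point> strokeDirectionParameter() {
        return List.copyOf(pathPixels);
    }

    /** Distance S from the starting point to the latest path pixel point. */
    public double distanceFromStart() {
        Point start = pathPixels.get(0);  // assumes onSlideStart was called first
        Point last = pathPixels.get(pathPixels.size() - 1);
        return Math.hypot(last.x() - start.x(), last.y() - start.y());
    }
}
```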
In some embodiments, when the user inputs a sliding operation in the drawing area 610 by touching the display 260, the touch component detects the touch pressure value at each path pixel point the sliding operation passes through. In response to the sliding operation, the controller 250 acquires these touch pressure values and generates the writing force parameter from them.
In some embodiments, in response to the sliding operation input by the user in the drawing area 610, the controller 250 may generate a blended color value from the initial color parameter and the water dipping amount parameter, and set the input color of the brush control 621a according to the generated blended color value. The relationship among the initial color parameter, the water dipping amount parameter, and the blended color value may be expressed as a function:
Color' = f(Color, V)  [the blending function appears only as an image in the original publication]
where Color' denotes the blended color value and Color denotes the initial color parameter. The blended color value decreases as the water dipping amount parameter increases; that is, the color corresponding to the blended color value is obtained by diluting the color corresponding to the initial color parameter according to the water dipping amount parameter. As shown in fig. 9, lines L1 and L2 have the same initial color parameter; the water dipping amount parameter of line L1 is 30% and that of line L2 is 50%, so in the electronic drawing board interface the color of line L2 is lighter than that of line L1.
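Since the blending function itself is only available as an image, the Java sketch below substitutes a simple linear dilution toward white. This specific formula is an assumption chosen purely to satisfy the stated property that more water yields a lighter color, and is not the patent's formula.

```java
/**
 * Hedged sketch of blending an initial ARGB color with the water dipping amount V.
 * The linear dilution toward white is an assumed stand-in for the patent's formula,
 * which appears only as an image; it merely satisfies "more water -> lighter color".
 */
public static int blendedColorValue(int initialArgb, double v) {
    int a = (initialArgb >>> 24) & 0xFF;
    int r = (initialArgb >>> 16) & 0xFF;
    int g = (initialArgb >>> 8) & 0xFF;
    int b = initialArgb & 0xFF;
    r += (int) Math.round((255 - r) * v);  // push each channel toward white
    g += (int) Math.round((255 - g) * v);  // in proportion to the dip amount V
    b += (int) Math.round((255 - b) * v);
    return (a << 24) | (r << 16) | (g << 8) | b;
}
```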
In some embodiments, in response to a sliding operation input by the user in the drawing area 610, the controller 250 may set the transparency value corresponding to the input color of the selected brush control 621a according to the stroke direction parameter. Specifically, the controller 250 may calculate the distance between a path pixel point and the starting point of the sliding operation from the stroke direction parameter, and generate the transparency value from that distance. As shown in fig. 10, U is the starting point of the sliding operation; the larger the distance, the larger the generated transparency value, i.e., the farther a path pixel point is from the starting point, the more transparent its color is set.
In some embodiments, the distance between a path pixel point and the starting point of the sliding operation at which the transparency value reaches its maximum may be preset as a threshold, together with a functional relationship between the transparency value and the distance below that threshold. For example, let the maximum transparency value be Q_max and its corresponding distance threshold be S_max. If Q denotes the generated transparency value and S denotes the distance between the path pixel point and the starting point of the sliding operation, the relationship between Q and S can be expressed by the following piecewise function:
Q = B × S, for S < S_max;  Q = Q_max, for S ≥ S_max
where B is the first transparency gradient coefficient, 0 < B ≤ 1. The user may preset the value of B to control how quickly the transparency value grows: the larger B is, the faster the transparency value increases, and the transparency value is positively correlated with the distance. For example, with B = 0.1 and Q_max = 100%, the transparency value reaches its maximum when the distance reaches 10 cm (S = 10, Q = Q_max); when S = 5 cm, Q = 50%, i.e., a distance of 5 cm yields a transparency value of 50%. Once the transparency value reaches its maximum, the remainder of the sliding path corresponding to the user's sliding operation is no longer displayed in the electronic drawing board interface.
Fig. 11 is a flowchart of generating a transparency value according to an example of the present application. As shown in fig. 11, the flow includes: step S111, the controller 250 invokes the stroke direction parameter in response to the sliding operation input by the user in the drawing area 610; step S112, the distance between the starting point of the sliding operation and each path pixel point is calculated from the invoked stroke direction parameter; step S113, if the distance is less than the distance threshold, the function Q = B × S is invoked; step S114, if the distance is greater than or equal to the distance threshold, the function Q = Q_max is invoked; and step S115, the transparency value is generated based on the invoked function.
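A compact Java sketch of steps S111 to S115 (method and parameter names assumed):

```java
/** Sketch of steps S111-S115: Q = B*S below the distance threshold, Q_max at or above it. */
public static double transparencyFromDistance(double s, double b, double sMax, double qMax) {
    // Positively correlated with the distance S, then saturated at Q_max.
    return (s < sMax) ? b * s : qMax;
}
```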
In some embodiments, in response to the sliding operation input by the user in the drawing area 610, the controller 250 may set the transparency value corresponding to the input color of the selected brush control 621a according to the writing force parameter. Specifically, the controller 250 obtains from the writing force parameter the touch pressure value at each path pixel point the sliding operation passes through, and generates the transparency value from that touch pressure value. To bring the color and effect of each path pixel point closer to the real effect of a brush, such as a writing brush, that needs to be dipped in water, the larger the touch pressure value, the smaller the generated transparency value should be.
In some embodiments, the touch pressure values corresponding to the minimum and maximum transparency values may be preset, together with a functional relationship between the transparency value and the touch pressure value between those bounds. For example, let the minimum transparency value be Q_min and the maximum transparency value be Q_max, where Q_min corresponds to the maximum touch pressure value P_max and Q_max corresponds to the minimum touch pressure value P_min. If Q denotes the generated transparency value and P denotes the touch pressure value, the relationship between Q and P can be expressed by the following piecewise function:
Q = Q_min, for P ≥ P_max;  Q = f(P), for P_min ≤ P ≤ P_max;  Q = Q_max, for P ≤ P_min
where the middle branch f(P) appears only as an image in the original publication and is governed by C, the second transparency gradient coefficient, 0 < C ≤ 1. Within the range P_min ≤ P ≤ P_max, the user may preset the value of C to control how quickly the transparency value changes as the touch pressure value increases; the larger C is, the faster the transparency value changes, and the transparency value Q is negatively correlated with the touch pressure value.
Fig. 12 is a schematic diagram of the relationship between touch pressure values and transparency values exemplarily provided by the present application. As shown in fig. 12, lines L3 and L4 have the same initial color parameter and water dipping amount parameter, but the writing force parameter of line L3 is greater than that of line L4, i.e., the touch pressure value of line L3 is greater than that of line L4; in the electronic drawing board interface, the color of line L4 is lighter than that of line L3.
Fig. 13 is a flowchart of generating a transparency value according to an example of the present application. As shown in fig. 13, the flow includes: step S131, the controller 250 invokes the writing force parameter in response to the sliding operation input by the user in the drawing area 610; step S132, the touch pressure value P at each path pixel point passed by the sliding operation is obtained from the invoked writing force parameter; step S133, if the touch pressure value P is greater than or equal to the maximum touch pressure value P_max, the function Q = Q_min is invoked; step S134, if the touch pressure value P is less than or equal to the minimum touch pressure value P_min, the function Q = Q_max is invoked; step S135, if the touch pressure value P lies between the minimum touch pressure value P_min and the maximum touch pressure value P_max, the middle-branch function f(P) is invoked; and step S136, the transparency value is generated based on the invoked function.
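A Java sketch of steps S131 to S136. Since the middle-branch function appears only as an image in the source, a clamped linear ramp controlled by C is assumed here purely for illustration; it is not the patent's formula.

```java
/**
 * Sketch of steps S131-S136. The middle branch is an image in the source; a clamped
 * linear ramp controlled by the second transparency gradient coefficient C is assumed.
 */
public static double transparencyFromPressure(double p, double pMin, double pMax,
                                              double qMin, double qMax, double c) {
    if (p >= pMax) return qMin;                      // hardest press: least transparent
    if (p <= pMin) return qMax;                      // lightest press: most transparent
    return Math.max(qMin, qMax - c * (p - pMin));    // assumed: Q falls linearly with P
}
```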
In some embodiments, Fig. 14 is a flowchart of generating a transparency value according to an example of the present application. As shown in Fig. 14, the process includes: step S141, in response to the sliding operation input by the user in the drawing area 610, the controller 250 may calculate the distance between a path pixel point and the starting point of the sliding operation according to the stroke direction parameter, and generate a transparency value Q1 associated with the stroke direction parameter according to that distance; step S142, the controller 250 may further obtain, according to the writing force parameter, the touch pressure value at each path pixel point that the sliding operation passes through, and generate a transparency value Q2 associated with the writing force parameter according to the touch pressure value; step S143, the controller calculates, from the transparency value Q1 and the transparency value Q2, the transparency value Q corresponding to the input color of the selected brush control 621a, for example Q = Q1 * Q2. For the way the controller 250 generates the transparency value Q1 from the distance and the transparency value Q2 from the touch pressure value, reference may be made to the foregoing embodiments, and the description is not repeated here.
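If both contributions are modeled as normalized factors in (0, 1], the multiplicative combination of step S143 can be sketched as follows; treating transparency as a normalized factor is an assumption made for illustration.

```python
def combine_transparency(q_stroke, q_force):
    """Combine the stroke-direction transparency Q1 and the
    writing-force transparency Q2, as in Q = Q1 * Q2.

    With both inputs in (0, 1], the product never exceeds either
    factor: a long stroke and a light touch together yield the
    most transparent (palest) color.
    """
    return q_stroke * q_force
```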
In some embodiments, the controller 250 may invoke the water dipping amount parameter in response to the sliding operation input by the user in the drawing area 610 and generate a shading radius value from it, so as to determine the associated region of each path pixel point according to the shading radius value and set the colors of each path pixel point and its associated region, thereby presenting the shading effect of a brush that must be dipped in water, such as a writing brush.
In some embodiments, the water dipping amount parameter corresponding to the maximum shading radius value, as well as the functional relationship between the water dipping amount parameter and the shading radius value, may be preset. For example, the maximum shading radius value may be set to R_Max, with the corresponding maximum water dipping amount parameter V_Max. If R denotes the shading radius value, the relationship between the shading radius value R and the water dipping amount parameter V may be represented by the following function:
R = D * V * R_Max / V_Max, when V < V_Max; R = R_Max, when V ≥ V_Max
where D is the first shading radius coefficient and 0 < D ≤ 1. The value of D may be preset by the user and controls how quickly the shading radius value grows: the larger the value of the first shading radius coefficient D, the faster the shading radius value grows. Fig. 15 is a graph of the relationship between the shading radius value and the water dipping amount parameter, in which V1, V2, V3, V4, V5 and V6 are all water dipping amount parameters with V1 > V2 ≥ V_Max > V3 > V4 > V5 > V6, and the corresponding shading radius values are R1, R1, R2, R3, R4 and R5, with R1 = R_Max > R2 > R3 > R4 > R5. That is, when the water dipping amount parameter V is less than the maximum water dipping amount parameter V_Max, the shading radius value is positively correlated with the water dipping amount parameter; when the water dipping amount parameter V is greater than or equal to V_Max, the shading radius value remains unchanged at R_Max.
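This capped relation can be sketched as follows, under the assumption that the below-saturation segment is linear in V, consistent with the R = F * K form used later for the drawing paper style parameter.

```python
def radius_from_water(v, v_max, r_max, d=1.0):
    """Shading radius value from the water dipping amount parameter V.

    Below v_max the radius grows with V, at a rate set by the first
    shading radius coefficient d (0 < d <= 1); at or above v_max the
    radius saturates at r_max, as in Fig. 15.
    """
    if v >= v_max:
        return r_max
    return d * v * r_max / v_max  # assumed linear ramp up to r_max
```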
In some embodiments, the controller 250 may invoke the writing force parameter in response to the sliding operation input by the user in the drawing area 610 and generate a shading radius value from it, so as to determine the associated region of each path pixel point according to the shading radius value and set the colors of each path pixel point and its associated region, thereby presenting the shading effect of a brush that must be dipped in water, such as a writing brush.
In some embodiments, the writing force parameter corresponding to the maximum shading radius value, as well as the functional relationship between the writing force parameter and the shading radius value, may be preset. For example, the maximum shading radius value may be set to R_Max, with the corresponding maximum writing force parameter L_Max. If R denotes the generated shading radius value, the relationship between the shading radius value R and the writing force parameter L may be represented by the following function:
R = E * L * R_Max / L_Max, when L < L_Max; R = R_Max, when L ≥ L_Max
where E is the second shading radius coefficient and 0 < E ≤ 1. The value of E may be preset by the user and controls how quickly the shading radius value grows: the larger the value of the second shading radius coefficient E, the faster the shading radius value grows. Referring further to Fig. 14, L1, L2, L3, L4, L5 and L6 are all writing force parameters with L1 > L2 ≥ L_Max > L3 > L4 > L5 > L6, and the corresponding shading radius values are R1, R1, R2, R3, R4 and R5, with R1 = R_Max > R2 > R3 > R4 > R5. That is, when the writing force parameter is less than the maximum writing force parameter L_Max, the shading radius value is positively correlated with the writing force parameter; when the writing force parameter is greater than or equal to L_Max, the shading radius value remains unchanged at R_Max.
In some embodiments, the controller 250 may invoke the drawing paper style parameter in response to the sliding operation input by the user in the drawing area 610 and generate a shading radius value from it, so as to determine the associated region of each path pixel point according to the shading radius value and set the colors of each path pixel point and its associated region, thereby presenting the shading effect of a brush that must be dipped in water, such as a writing brush.
In some embodiments, the drawing paper style parameter may reflect the water absorption capacity of the drawing paper of the corresponding style, and the relationship between the shading radius value R and the drawing paper style parameter K may be represented by the following function:
R=F*K;
where F is the third shading radius coefficient and 0 < F ≤ 1. The value of F may be preset by the user and controls how quickly the shading radius value grows: the larger the value of the third shading radius coefficient F, the faster the shading radius value grows. The shading radius value is positively correlated with the drawing paper style parameter.
In some embodiments, Fig. 16 is a flowchart of generating a shading radius value according to an exemplary embodiment of the present application. As shown in Fig. 16, the process includes: step S161, in response to the sliding operation input by the user in the drawing area 610, the controller 250 may generate a shading radius value R1 related to the water dipping amount parameter according to the water dipping amount parameter; step S162, the controller generates a shading radius value R2 related to the writing force parameter according to the writing force parameter; step S163, the controller generates a shading radius value R3 related to the drawing paper style parameter according to the drawing paper style parameter; step S164, the controller calculates the shading radius value R from R1, R2 and R3, for example R = R1 * R2 * R3. For the way the controller 250 generates R1 from the water dipping amount parameter, R2 from the writing force parameter, and R3 from the drawing paper style parameter, reference may be made to the foregoing embodiments, and the description is not repeated here.
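Steps S161 to S164 can be sketched as below, reusing the capped relations above; the function name and the treatment of the three factors as directly multipliable are assumptions, since in practice the factors would likely be normalized so that the product remains a usable radius.

```python
def shading_radius(v, v_max, l, l_max, k, r_max, d=1.0, e=1.0, f=1.0):
    """Combined shading radius value, as in R = R1 * R2 * R3.

    r1: from the water dipping amount parameter V (capped at r_max)
    r2: from the writing force parameter L (capped at r_max)
    r3: from the drawing paper style parameter K (R = F * K)
    """
    r1 = r_max if v >= v_max else d * v * r_max / v_max  # step S161
    r2 = r_max if l >= l_max else e * l * r_max / l_max  # step S162
    r3 = f * k                                           # step S163
    return r1 * r2 * r3                                  # step S164
```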
In some embodiments, in response to the sliding operation input by the user in the drawing area 610, the controller 250 may detect the coordinates of a path pixel point and calculate the boundary coordinates of the path pixel point association region from the path pixel point coordinates and the shading radius value, thereby obtaining the path pixel point association region. The boundary coordinates are the coordinates whose distance from the path pixel point coordinates equals the shading radius value. See Fig. 17, a schematic diagram of a path pixel point association region provided by the present application: as shown in Fig. 17, the path pixel point association region 161 is a circular region centered on the path pixel point 162, with the shading radius value R as its radius.
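On a raster display this circular region reduces to the set of pixels within the shading radius of the path pixel point; a sketch follows, where the integer-grid scan and the helper name are assumptions.

```python
import math

def association_region(cx, cy, r):
    """Pixels of the circular association region centered on the path
    pixel point (cx, cy) with shading radius value r: every grid point
    whose distance from the center does not exceed r."""
    pixels = []
    for x in range(int(cx - r), int(cx + r) + 1):
        for y in range(int(cy - r), int(cy + r) + 1):
            if math.hypot(x - cx, y - cy) <= r:
                pixels.append((x, y))
    return pixels
```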
In some embodiments, the controller 250 may generate an effect color value from the blended color value, the transparency value, and the shading radius value, and, after calculating the boundary coordinates of the path pixel point association region, set the colors of the path pixel point and the path pixel point association region based on the effect color value.
In some embodiments, the path pixel point association region includes a plurality of filling points. Fig. 18 is a flowchart of generating an effect color value according to an exemplary embodiment of the present application. As shown in Fig. 18, the process includes: step S181, in response to the sliding operation input by the user in the drawing area 610, the controller 250 may detect the filling point coordinates corresponding to each filling point; step S182, the controller 250 calculates the distance between each filling point and the path pixel point from the filling point coordinates and the path pixel point coordinates; step S183, the controller 250 adjusts the transparency value according to the distance, so as to update the effect color value according to the adjusted transparency value; step S184, the controller 250 sets the color of the path pixel point according to the effect color value before adjustment; step S185, the controller 250 sets the color of the corresponding filling point according to the adjusted effect color value. The farther a filling point is from the path pixel point, the greater the transparency value of the color it displays, so that, as shown in Fig. 19, the color becomes gradually lighter (its transparency value becomes larger) from the path pixel point out to the boundary of the path pixel point association region.
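A sketch of steps S181 to S185 for one path pixel point follows; the linear growth of the transparency value with distance is an assumption, as the disclosure only requires that transparency increase with distance from the center.

```python
import math

def fill_region(cx, cy, r, blended_color, base_q, q_max):
    """Color the path pixel point and its filling points.

    Returns ((x, y), color, q) entries where q grows from base_q at
    the center toward q_max at the region boundary, so the stroke
    fades outward as in Fig. 19. blended_color is the blended color
    value; base_q is the transparency value before adjustment.
    """
    entries = [((int(cx), int(cy)), blended_color, base_q)]  # step S184
    for x in range(int(cx - r), int(cx + r) + 1):
        for y in range(int(cy - r), int(cy + r) + 1):
            dist = math.hypot(x - cx, y - cy)                # step S182
            if 0 < dist <= r:
                # step S183: assumed linear increase with distance
                q = base_q + (q_max - base_q) * dist / r
                entries.append(((x, y), blended_color, q))   # step S185
    return entries
```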
In some embodiments, in response to the sliding operation input by the user in the drawing area 610, if at least one of the color parameter, the stroke direction parameter, and the writing force parameter is detected to be a null value, the effect color value is updated to a null value; and if the effect color value is null, the colors of the path pixel points and the path pixel point association regions that the sliding operation passes through do not change.
In some embodiments, if the controller 250 has already generated an effect color value and the user again performs a selection operation on a color item corresponding to the brush control, the controller 250, in response to the user's selection operation, invokes the initial color parameter corresponding to the selected color item, the current water dipping amount parameter, and the drawing paper style parameter, and obtains the painting brush parameters from these newly invoked values. The controller 250 also clears the action parameters acquired before the selection operation, so that when the user inputs a sliding operation in the drawing area again, the controller 250 responds by generating a new effect color value based on the painting brush parameters and the detected action parameters of the sliding operation, and sets the colors of the path pixel points and the path pixel point association regions that the sliding operation passes through according to the new effect color value.
According to the above technical solutions, the present application provides a display device 200 that can, in response to a sliding operation input by the user in the drawing area 610, acquire the painting brush parameters, the action parameters, and the path pixel points that the sliding operation passes through, generate an effect color value based on the painting brush parameters and the action parameters, and set the colors of the path pixel points and the path pixel point association regions according to the effect color value. The color, transparency, and shading radius of each path pixel point can thus all differ, reproducing the real drawing effect, including the color shading effect, of a brush such as a writing brush, which is beneficial to the user experience.
Based on the display device 200 provided by the foregoing embodiments, the present application further provides a color rendering method applied to the display device 200, and the method may include:
responding to a sliding operation input by a user in a drawing area 610, obtaining painting brush parameters, wherein the painting brush parameters at least comprise an initial color parameter, a water dipping amount parameter and a drawing paper style parameter;
detecting action parameters of the sliding operation and path pixel points passed by the sliding operation, wherein the action parameters at least comprise a stroke direction parameter, a sliding time parameter and a writing force parameter;
generating an effect color value according to the painting brush parameter and the action parameter;
and setting the colors of the path pixel points and the associated areas of the path pixel points according to the effect color values.
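Taken together, the four steps can be sketched as a single handler; every name below is an illustrative assumption, and the per-step helpers reuse the sketches given with the corresponding embodiments above.

```python
# Illustrative constants; all values are assumptions.
P_MIN, P_MAX = 0.0, 1.0   # touch pressure range
Q_MIN, Q_MAX = 0.1, 0.9   # transparency range
V_MAX, L_MAX, R_MAX = 1.0, 1.0, 8.0

def on_slide(event, brush, paper_k):
    """End-to-end color rendering for one path pixel point of a
    sliding operation, following the four method steps above."""
    # Steps 1-2: painting brush parameters and action parameters.
    q2 = pressure_to_transparency(event.pressure, P_MIN, P_MAX, Q_MIN, Q_MAX)
    q1 = min(1.0, 0.5 + event.distance_from_start / 100.0)  # assumed stroke-distance factor
    q = combine_transparency(q1, q2)                        # Q = Q1 * Q2
    # Step 3: effect color value via the shading radius.
    r = shading_radius(brush.water, V_MAX, event.pressure, L_MAX, paper_k, R_MAX)
    # Step 4: color the path pixel point and its association region.
    return fill_region(event.x, event.y, r, brush.color, q, Q_MAX)
```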
In some embodiments, in the step of responding to the sliding operation input by the user in the drawing area 610, the method further comprises: receiving a water dipping instruction input by the user; displaying a water dipping control on the electronic drawing board interface in response to the water dipping instruction; detecting, in response to the user's selection operation on the water dipping control, the duration for which the water dipping control is subjected to the selection operation; and generating the water dipping amount parameter according to the duration.
In some embodiments, in the step of generating the water dipping amount parameter, the method further comprises: if the duration is less than a preset duration, setting the water dipping amount parameter according to the duration, the water dipping amount parameter being positively correlated with the duration; and if the duration is greater than or equal to the preset duration, setting the water dipping amount parameter equal to a preset value.
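This clamped mapping can be sketched as follows; the linear growth below the preset duration is an assumption.

```python
def water_from_duration(t, t_preset, v_preset):
    """Water dipping amount parameter from how long the water dipping
    control is held: grows with the hold duration t, capped at the
    preset value once t reaches the preset duration t_preset."""
    return v_preset if t >= t_preset else v_preset * t / t_preset
```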
In some embodiments, in the step of detecting the action parameters of the sliding operation and the path pixel points passed by the sliding operation, the method further includes: acquiring the starting point coordinates of the sliding operation and the real-time coordinates corresponding to the path pixel point; acquiring the sliding track between the starting point and the path pixel point according to the starting point coordinates and the real-time coordinates; and generating the stroke direction parameter according to the sliding track.
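The distance along the sliding track, from which the transparency value Q1 above is derived, can be sketched as below; the list-of-coordinates representation of the track is an assumption.

```python
import math

def stroke_distance(track):
    """Length of the sliding track from the starting point to the
    current path pixel point, given as a list of (x, y) coordinates."""
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(track, track[1:]))
```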
In some embodiments, in the step of detecting motion parameters of the sliding operation and path pixel points passed by the sliding operation, the method further includes:
acquiring a touch pressure value when the sliding operation passes through the path pixel point;
and generating the writing force parameter according to the touch pressure value.
In some embodiments, the method further comprises: generating a blended color value according to the initial color parameter and the water dipping amount parameter; generating a transparency value according to the stroke direction parameter and the writing force parameter; generating a shading radius value according to the water dipping amount parameter, the writing force parameter and the drawing paper style parameter; and generating the effect color value according to the blended color value, the transparency value and the shading radius value.
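One way to bundle these three quantities into an effect color value is sketched below; the dictionary encoding and the opacity convention (alpha = 1 - Q) are assumptions, as the disclosure does not fix a concrete representation.

```python
def effect_color(blended_rgb, q, r):
    """Effect color value: the blended color value together with its
    transparency value q and shading radius value r, bundled for the
    rendering step."""
    return {"rgb": blended_rgb, "alpha": 1.0 - q, "radius": r}
```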
In some embodiments, in the step of setting the colors of the path pixel points and the path pixel point association regions according to the effect color value, the method further includes: detecting the path pixel point coordinates; calculating the boundary coordinates of the path pixel point association region according to the path pixel point coordinates and the shading radius value; setting the color of the path pixel point according to the blended color value; and setting the color of the path pixel point association region according to the blended color value and the transparency value.
In some embodiments, the path pixel point association region includes a plurality of filling points, and in the step of setting the colors of the path pixel point and the path pixel point association region according to the blended color value and the transparency value, the method further includes: detecting the filling point coordinates corresponding to the filling points; calculating the distance between the filling point and the path pixel point according to the path pixel point coordinates and the filling point coordinates; adjusting the transparency value according to the distance; and setting the color of the filling point according to the blended color value and the adjusted transparency value.
In some embodiments, the method further comprises: and if at least one of the color parameter, the stroke direction parameter and the writing force parameter is detected to be a null value, updating the effect color value to be a null value.
The above-described embodiments of the present invention do not limit the scope of the present invention.

Claims (10)

1. A display device, comprising:
a display configured to present an electronic drawing board interface, wherein the electronic drawing board interface comprises a control area and a drawing area, and the control area comprises at least one painting brush control;
a touch control assembly configured to detect a touch pressure value and a touch duration when a user performs a touch operation on the display;
a controller configured to:
responding to a sliding operation input by a user in the drawing area, and acquiring painting brush parameters, wherein the painting brush parameters comprise an initial color parameter, a water dipping amount parameter and a drawing paper style parameter;
detecting action parameters of the sliding operation and path pixel points passed by the sliding operation, wherein the action parameters comprise a stroke direction parameter and a writing force parameter;
generating an effect color value according to the painting brush parameter and the action parameter;
and setting the colors of the path pixel points and the path pixel point association area according to the effect color value.
2. The display device according to claim 1, wherein the controller is further configured to, in response to a sliding operation input by a user at the drawing region:
receiving a water dipping instruction input by a user;
responding to the water dipping instruction, and displaying a water dipping control on the electronic drawing board interface;
responding to a selection operation of the user on the water dipping control, and detecting the duration for which the water dipping control is subjected to the selection operation;
and generating the water dipping amount parameter according to the duration.
3. The display device of claim 2, wherein, in generating the water dipping amount parameter, the controller is further configured to:
if the duration is less than a preset duration, set the water dipping amount parameter according to the duration, wherein the water dipping amount parameter is positively correlated with the duration;
and if the duration is greater than or equal to the preset duration, set the water dipping amount parameter equal to a preset value.
4. The display device according to claim 1, wherein, in detecting the action parameters of the sliding operation and the path pixel points through which the sliding operation passes, the controller is further configured to:
acquiring a starting point coordinate of the sliding operation and a real-time coordinate corresponding to the path pixel point;
acquiring a sliding track between the starting point and the path pixel point according to the starting point coordinate and the real-time coordinate;
and generating the stroke direction parameter according to the sliding track.
5. The display device according to claim 1, wherein, in detecting the action parameters of the sliding operation and the path pixel points through which the sliding operation passes, the controller is further configured to:
acquiring a touch pressure value when the sliding operation passes through the path pixel point;
and generating the writing force parameter according to the touch pressure value.
6. The display device of claim 1, wherein the controller is further configured to:
generating a blended color value according to the initial color parameter and the water dipping amount parameter;
generating a transparency value according to the stroke direction parameter and the writing force parameter;
generating a shading radius value according to the water dipping amount parameter, the writing force parameter and the drawing paper style parameter;
and generating the effect color value according to the blended color value, the transparency value and the shading radius value.
7. The display device of claim 6, wherein, in setting the colors of the path pixel point and the path pixel point association region according to the effect color value, the controller is further configured to:
detecting the coordinates of the path pixel points;
calculating the boundary coordinates of the associated area of the path pixel points according to the coordinates of the path pixel points and the shading radius values;
setting the color of the path pixel point according to the blended color value;
and setting the color of the path pixel point association region according to the blended color value and the transparency value.
8. The display device of claim 7, wherein the path pixel point association region includes a plurality of filling points, and in setting the colors of the path pixel point and the path pixel point association region according to the blended color value and the transparency value, the controller is further configured to:
detecting filling point coordinates corresponding to the filling points;
calculating the distance between the filling point and the path pixel point according to the path pixel point coordinate and the filling point coordinate;
adjusting the transparency value according to the distance;
and setting the color of the filling point according to the blended color value and the adjusted transparency value.
9. The display device of claim 1, wherein the controller is further configured to:
and if at least one of the color parameter, the stroke direction parameter and the writing force parameter is detected to be a null value, updating the effect color value to be a null value.
10. A color rendering method, comprising:
responding to a sliding operation input by a user in a drawing area, and acquiring painting brush parameters, wherein the painting brush parameters comprise an initial color parameter, a water dipping amount parameter and a drawing paper style parameter;
detecting action parameters of the sliding operation and path pixel points passed by the sliding operation, wherein the action parameters comprise a stroke direction parameter, a sliding time parameter and a writing force parameter;
generating an effect color value according to the painting brush parameter and the action parameter;
and setting the colors of the path pixel points and the associated areas of the path pixel points according to the effect color values.
CN202210564855.1A 2022-05-23 2022-05-23 Display device and color rendering method Pending CN114968049A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210564855.1A CN114968049A (en) 2022-05-23 2022-05-23 Display device and color rendering method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210564855.1A CN114968049A (en) 2022-05-23 2022-05-23 Display device and color rendering method

Publications (1)

Publication Number Publication Date
CN114968049A true CN114968049A (en) 2022-08-30

Family

ID=82984832

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210564855.1A Pending CN114968049A (en) 2022-05-23 2022-05-23 Display device and color rendering method

Country Status (1)

Country Link
CN (1) CN114968049A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103164158A (en) * 2013-01-10 2013-06-19 深圳市欧若马可科技有限公司 Method, system and device of creating and teaching painting on touch screen
CN106708406A (en) * 2016-12-07 2017-05-24 南京仁光电子科技有限公司 Handwriting beautifying method and apparatus
CN108089761A (en) * 2018-01-02 2018-05-29 京东方科技集团股份有限公司 Electronic drawing board, remote teaching drawing board, drawing emulation mode and device
CN112015320A (en) * 2019-05-30 2020-12-01 京东方科技集团股份有限公司 Electronic color matching device, color matching method, drawing system and drawing method
CN113485613A (en) * 2021-06-30 2021-10-08 海信视像科技股份有限公司 Display equipment and method for realizing free-drawing screen edge painting
CN113485614A (en) * 2021-06-30 2021-10-08 海信视像科技股份有限公司 Display apparatus and color setting method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination